Computing security Research Papers - Academia.edu

2025

Recently, the approach of mining opinions on weblogs, forums and websites has gained the attention and interest of numerous researchers. In this regard, feature-based opinion mining has been extensively studied on English documents in order to identify implicit and explicit product features and the relevant opinions. However, for texts written in Persian, this task faces serious challenges. The objective of this research is to present an unsupervised method for feature-based opinion mining in Persian, an approach which does not require a labeled training dataset. The method proposed in this paper involves extracting explicit product features. Previous studies dealing with the extraction of explicit features often focus on the lexical roles of words, an approach which cannot distinguish between an adjective that is part of a noun phrase and one that acts as a sentiment word. In this study, in addition to lexical roles, syntactic roles are also considered in order to extract more relevant explicit features. The results demonstrate that the proposed method achieves higher recall and precision than prior studies.

2025

Tourism is one of the major sources of income for many countries, and with the advent of the internet, using this new channel to provide different tourism services, i.e., e-tourism, is growing rapidly. This is also the case in Iran, but it is clear that focusing on technology alone cannot lead the business to success. The key to a successful e-tourism business is to concentrate on customers and their behavior in order to build a better relationship with them. This study develops a model based on the Decomposed Theory of Planned Behavior (DTPB) that considers influential factors to predict and evaluate the intention of tourists and travelers when using e-tourism services and websites, such as www.expedia.com and www.priceline.com. The influential factors are investigated through a survey with 259 responses. Structural Equation Modeling (SEM) has also been utilized for testing the relevant hypotheses to either include or exclude specific factors which driv...

2025, Proceedings of the 2009 ACM workshop on Cloud computing security

Cloud computing is clearly one of today's most enticing technology areas due, at least in part, to its cost-efficiency and flexibility. However, despite the surge in activity and interest, there are significant, persistent concerns about cloud computing that are impeding momentum and will eventually compromise the vision of cloud computing as a new IT procurement model. In this paper, we characterize the problems and their impact on adoption. In addition, and equally importantly, we describe how the combination of existing research thrusts has the potential to alleviate many of the concerns impeding adoption. In particular, we argue that with continued research advances in trusted computing and computation-supporting encryption, life in the cloud can be advantageous from a business intelligence standpoint over the isolated alternative that is more common today.

2025

and weighted majority rule gave precisions of 0.972 and 0.974, respectively. The C4.5 classifier also has a lower false positive rate than the other classifiers. The study shows that the C4.5 decision tree classifier is more effective than the statistically based Support Vector Machine classifier and the weighted majority rule ensemble. This result is in line with earlier results reported in the literature.
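
A minimal sketch of this kind of comparison, under assumptions not taken from the paper: scikit-learn's CART decision tree stands in for C4.5, the data is synthetic, and the weighted majority rule is a soft-voting ensemble with illustrative weights.

```python
# Hypothetical sketch: compare a decision tree (CART as a stand-in for C4.5),
# an SVM, and a weighted majority vote, reporting precision and FPR.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier
from sklearn.metrics import precision_score, confusion_matrix

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(probability=True, random_state=0),
}
models["weighted_majority"] = VotingClassifier(
    estimators=[("dt", models["decision_tree"]), ("svm", models["svm"])],
    voting="soft", weights=[2, 1])

for name, model in models.items():
    model.fit(X_tr, y_tr)
    y_pred = model.predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
    print(name, "precision=%.3f" % precision_score(y_te, y_pred),
          "fpr=%.3f" % (fp / (fp + tn)))
```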

2025, Journal of Information and Data Management

2025

Traditional network intrusion detection systems (NIDS) mostly use individual classification techniques; such systems fail to provide the best possible attack detection rate. In this paper, we propose a new two-stage hybrid classification technique using a Support Vector Machine (SVM) for anomaly detection in the first stage and an Artificial Neural Network (ANN) for misuse detection in the second. The key idea is to combine the advantages of each algorithm to improve classification accuracy while keeping false positives low. The first (anomaly) stage classifies the network data into two classes, normal and attack. The second (misuse) stage further classifies the attack data into four classes: Denial of Service (DoS), Remote to Local (R2L), User to Root (U2R) and Probe. Training and testing datasets are obtained from the NSL-KDD datasets. Simulation results demonstrate that the proposed algorithm outperforms the conventional model and the individual SVM and ANN classifiers. The...
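
An illustrative sketch of the two-stage idea, not the authors' implementation: synthetic data stands in for NSL-KDD, label 0 is treated as normal and labels 1–4 as the four attack categories.

```python
# Two-stage hybrid NIDS sketch: SVM anomaly stage, then ANN misuse stage.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=3000, n_features=30, n_informative=10,
                           n_classes=5, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# Stage 1 (anomaly): SVM separates normal (0) from attack (1).
stage1 = SVC().fit(X_tr, (y_tr != 0).astype(int))

# Stage 2 (misuse): ANN classifies attack records into the four categories.
attack_mask = y_tr != 0
stage2 = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                       random_state=1).fit(X_tr[attack_mask], y_tr[attack_mask])

# Inference: records flagged as attacks by stage 1 are refined by stage 2.
pred = np.zeros(len(X_te), dtype=int)
flagged = stage1.predict(X_te) == 1
if flagged.any():
    pred[flagged] = stage2.predict(X_te[flagged])
print("two-stage accuracy: %.3f" % (pred == y_te).mean())
```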

2024

With the huge expansion of internet usage and the availability of many hacking tools to people intent on getting their hands on others' secret information for their own benefit, existing preventive procedures are not adequate for avoiding such activities. Network security is a very important issue. As long as computers are connected to the internet, intruders or hackers will always try to exploit network security imperfections using different types of attacks and techniques, so protecting the network and developing strong protection measures is essential to avoid organizational and personal loss or data damage. In this paper, a network intrusion detection system (IDS) based on a support vector machine (SVM) classifier and two different feature selection algorithms, the self-organizing map (SOM) and principal component analysis (PCA), is presented. The novelty of this paper is the combination of both feature selection algorithms using a voting technique; the combination provides better results than stand-alone feature selection based on the SVM classifier. For evaluating the proposed system, a random subset of the KDD99 dataset is chosen. The results of this work show that different feature selection algorithms can affect the classification output in different ways. A comparison between SOM and PCA based on a binary SVM classifier is presented with respect to their accuracy, and then an ensemble of SVM classifiers with a voting technique is applied so that the SVM can select the features which yield the best accuracy.
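
A rough sketch of the "combine two feature rankings by voting, then train an SVM" idea. Assumptions not taken from the paper: PCA-loading magnitudes and a mutual-information ranking stand in for the paper's PCA and SOM selectors, and the data is synthetic rather than KDD99.

```python
# Combine two feature rankings by voting before training an SVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=1500, n_features=40, n_informative=8,
                           random_state=2)
k = 10  # number of features to keep

# Ranking 1: importance of each feature from PCA loadings.
pca = PCA(n_components=5).fit(X)
rank_pca = np.argsort(-np.abs(pca.components_).sum(axis=0))

# Ranking 2: mutual information between each feature and the class label.
rank_mi = np.argsort(-mutual_info_classif(X, y, random_state=2))

# Voting: a feature gets one vote per ranking that places it in the top k.
votes = np.zeros(X.shape[1])
for rank in (rank_pca, rank_mi):
    votes[rank[:k]] += 1
selected = np.argsort(-votes)[:k]

score = cross_val_score(SVC(), X[:, selected], y, cv=5).mean()
print("selected features:", sorted(selected.tolist()), "accuracy: %.3f" % score)
```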

2024

Considering the dynamic and wireless topology of mobile ad hoc networks (MANETs), implementing security in such networks is an important problem. Key revocation is a solution to this challenge which prevents the continuous presence of an attacker in the network and the disclosure of confidential information. In some proposed key revocation protocols, such as Liu et al.'s scheme, each node monitors the network and sends an accusation packet to the authority center if it observes suspicious behavior from a neighboring node. Then, the authority center decides whether to revoke the key of the accused node based on the number of received accusations. Because the participation of the nodes is time consuming, the threshold on the number of accusations is an important factor in the key revocation procedure. In this paper, a stochastic model is presented to determine the optimal threshold for the number of accusations and the appropriate time for decision making in the ...

2024

This paper presents a semantic method for aerial image segmentation. Multi-class aerial images often feature large intra-class variations and inter-class similarities. Furthermore, shadows, reflections, changes in viewpoint, high and varying altitude and the variability of natural scenes pose serious problems for simultaneous segmentation. The main purpose of segmenting aerial images is to make the subsequent recognition phase straightforward. The present algorithm combines the two challenging tasks of segmentation and classification in a manner that requires no extra recognition phase. This algorithm is intended to be part of a system that will be developed to automatically locate appropriate sites for Unmanned Aerial Vehicle (UAV) landing. With this perspective, we focus on segregating natural and man-made areas in aerial images. We compared different classifiers and explored the best set of features for this task experimentally. In addition, a certainty-based method has been used to integrate color and texture descriptors more efficiently. Experimental results over a dataset of 25 high-resolution images show an overall binary segmentation accuracy of 91.34%.

2024, Intelligent Data Analysis

Active learning for semi-supervised clustering allows algorithms to solicit a domain expert for side information in the form of instance constraints, for example a set of labeled instances called seeds. The problem consists in selecting the queries to the expert that are most likely to improve either the relevance or the quality of the proposed clustering. However, these active methods suffer from several limitations: (i) they are generally tailored to only one specific clustering paradigm or cluster shape and size, (ii) they may be counter-productive if the seeds are not selected in an appropriate manner and (iii) they have to work efficiently with minimal expert supervision. In this paper, we propose a new active seed selection algorithm that relies on a k-nearest neighbors structure to locate dense potential clusters and to efficiently query and propagate expert information. Our approach makes no hypothesis on the underlying data distribution and can be paired with any clustering algorithm. Comparative experiments conducted on real data sets show the efficiency of this new approach compared to existing ones.
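
A conceptual sketch of density-based seed selection on a k-NN structure. The expert is simulated with the true labels of a toy dataset, and the greedy heuristic below is a simplification of the paper's algorithm rather than a reproduction of it.

```python
# Pick seeds in dense regions of a k-NN graph, query the "expert", then use
# the answers to initialize a clustering algorithm (here, k-means).
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.neighbors import NearestNeighbors
from sklearn.cluster import KMeans

X, y_true = make_blobs(n_samples=600, centers=4, random_state=3)
k = 10
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
dist, idx = nn.kneighbors(X)                 # first column is the point itself
density = 1.0 / (dist[:, 1:].mean(axis=1) + 1e-12)

# Greedily pick dense points that are not neighbors of already chosen seeds.
order, chosen, covered = np.argsort(-density), [], set()
for i in order:
    if i not in covered:
        chosen.append(i)
        covered.update(idx[i])               # propagate: skip its k-NN ball
    if len(chosen) == 4:
        break

seed_labels = y_true[chosen]                 # simulated expert answers
print("queried seeds:", chosen, "labels:", seed_labels.tolist())

# Use the labeled seeds as initial centers of the clustering algorithm.
init = np.vstack([X[np.array(chosen)[seed_labels == c]].mean(axis=0)
                  for c in np.unique(seed_labels)])
km = KMeans(n_clusters=len(init), init=init, n_init=1).fit(X)
print("cluster sizes:", np.bincount(km.labels_).tolist())
```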

2024

The paper discusses intrusion detection systems built using ensemble approaches, i.e., by combining several machine learning algorithms. The main idea is to exploit the strengths of each algorithm of the ensemble to obtain a robust classifier. Network attacks can be divided into four classes: probe, remote to local, denial of service, and user to root. Each module of the ensemble designed in this work is itself an ensemble created by using bagging of decision trees and is specialized on the detection of one class of attacks. Experiments highlighted the efficiency of the approach and showed that increased accuracy can be obtained when each class of attacks is treated as a separate problem and handled by specialized algorithms. In all experiments, the ensemble was able to decrease the number of false positives and false negatives. However, some limitations of the used dataset (KDD99) were observed. In particular, the distribution of examples of remote to local attacks between the training set and test set made it difficult to evaluate the ensemble for this class of attacks. Furthermore, the algorithms need to be trained with specific feature subsets selected according to their relevance to the class of attacks being detected.
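
A minimal sketch of the "one specialized module per attack class" idea, where each module is a bagging ensemble of decision trees trained one-vs-rest. Synthetic data replaces KDD99; class 0 plays the role of normal traffic and classes 1–4 the four attack classes.

```python
# One bagging-of-decision-trees detector per attack class, combined at
# prediction time by taking the most confident detector.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, n_features=25, n_informative=10,
                           n_classes=5, random_state=4)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=4)

modules = {}
for attack in [1, 2, 3, 4]:
    clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                            random_state=4)
    modules[attack] = clf.fit(X_tr, (y_tr == attack).astype(int))

# A record is labeled as class c if module c is the most confident detector;
# otherwise it is considered normal.
scores = np.column_stack([m.predict_proba(X_te)[:, 1] for m in modules.values()])
pred = np.where(scores.max(axis=1) > 0.5, scores.argmax(axis=1) + 1, 0)
print("accuracy: %.3f" % (pred == y_te).mean())
```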

2024, International Journal of Industrial Electronics, Control and Optimization (IECO)

An essential reason for implementing multilevel processing systems is to reduce the number of semiconductor elements and hence the complexity of the system. Multilevel processing systems are realized much more easily with carbon nanotube field effect transistors (CNTFETs) than with MOSFET transistors due to the CNTFET's adjustable threshold voltage. This paper presents an efficient quaternary full adder based on CNTFET technology that consists of two half-adder blocks, a quaternary decoder, and a carry generator circuit. The proposed architecture combines base-two and base-four circuit design techniques to take full advantage of both, namely the simple implementation and the low chip area occupation of the entire proposed quaternary full adder. The proposed structure is evaluated using the Stanford 32 nm CNTFET library in HSPICE. The simulation results for the proposed full-adder structure, which uses a 0.9 V supply voltage, show that the power consumption, propagation delay, and energy index are 2.67 μW, 40 ps, and 10.68 aJ, respectively. Nomenclature: chiral vector; CNT diameter; π-π junction energy of carbon; flat-band voltage; drain-to-source voltage; δ, drain-induced barrier lowering (DIBL) coefficient in CNTs.
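
A behavioral sketch of the arithmetic such a circuit implements, not a transistor-level model: two quaternary half adders plus a simple carry generator, checked exhaustively against base-4 addition.

```python
# Behavioral model of a quaternary full adder built from two half adders.
def q_half_adder(a, b):
    """Quaternary half adder: digits in {0,1,2,3} -> (sum mod 4, carry)."""
    s = a + b
    return s % 4, s // 4

def q_full_adder(a, b, cin):
    s1, c1 = q_half_adder(a, b)
    s2, c2 = q_half_adder(s1, cin)
    return s2, c1 | c2          # carry generator: either stage may carry

# Exhaustive check against ordinary base-4 addition.
for a in range(4):
    for b in range(4):
        for cin in range(2):
            s, c = q_full_adder(a, b, cin)
            assert 4 * c + s == a + b + cin
print("quaternary full adder verified for all inputs")
```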

2024

Jarvis and Nevins presented ETRU in 2013, which offers appealing performance with moderate key sizes and conjectured resistance to quantum computers. ETRU, an efficient NTRUEncrypt-like cryptosystem, works over the ring of Eisenstein integers and is faster, with smaller keys, for the same or better level of security than NTRUEncrypt, which is itself a desirable alternative to public-key cryptosystems based on factorization and the discrete logarithm problem. However, because of its construction, doubts have regularly arisen about its security. In this paper, we propose how to modify ETRU to make it provably secure, under our modified assumption of the quantum hardness of standard worst-case lattice problems, restricted to extended ideal lattices related to some extensions of cyclotomic field structures. We describe the structure of all generated polynomial quotient rings over the direct product of the Dedekind domains Z and Z[ζ3], where ζ3 is a complex cube root of unity. We give a detailed description to ...
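
A small sketch of arithmetic in the Eisenstein integers Z[ζ3], the ring underlying ETRU; it only illustrates the ring (elements written a + b·w with w = ζ3 and w² = −1 − w), not the cryptosystem itself.

```python
# Eisenstein integer arithmetic: addition, multiplication and the norm.
from dataclasses import dataclass

@dataclass(frozen=True)
class Eisenstein:
    a: int  # coefficient of 1
    b: int  # coefficient of w

    def __add__(self, other):
        return Eisenstein(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b*w)(c + d*w) = ac + (ad + bc)*w + bd*w^2, with w^2 = -1 - w
        a, b, c, d = self.a, self.b, other.a, other.b
        return Eisenstein(a * c - b * d, a * d + b * c - b * d)

    def norm(self):
        # N(a + b*w) = a^2 - a*b + b^2, which is multiplicative.
        return self.a * self.a - self.a * self.b + self.b * self.b

x, y = Eisenstein(2, -1), Eisenstein(1, 3)
assert (x * y).norm() == x.norm() * y.norm()
print(x * y, (x * y).norm())
```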

2024, 2013 IEEE International Workshop on Genetic and Evolutionary Fuzzy Systems (GEFS)

A multi-objective evolutionary fuzzy rule selection process extracts a subset of fuzzy rules from an initial set by applying a multi-objective evolutionary algorithm. Two approaches can be used to determine the number of terms (i.e. the granularity) associated with the linguistic variables that appear in the rules: a pre-established single granularity can be chosen, or a multiple-granularities approach can be preferred. The latter favors a reduction in the number of extracted rules, but it can also bring a loss of interpretability. To prevent this problem, suitable granularities can be determined by applying automatic techniques before the initial rule generation process. In this contribution, we investigate how the application of a single granularity learning approach influences the performance of fuzzy associative rule-based classifiers. The aim is to reduce the complexity of the obtained models while trying to maintain good classification ability.

2024

Multiple Classifier Systems are often found to be useful for improving individual results by combining a set of classifier decisions where a single base-level classifier may not achieve the same level of results. However, not every set of base classifiers improves results, therefore a selection of a set of classifiers is required. The process of selecting base-level classifiers for a multiple classifier system may be performed using a Genetic Algorithm. The aim of this work is the selection of optimal sets of base-level classifiers using an evolutionary computation approach. In addition, a comparative analysis is made of the performance of the generated ensembles against the individual base-level classifiers.
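
A toy sketch of the general idea, not the authors' setup: individuals are bitmasks over a small pool of fitted classifiers, and fitness is the validation accuracy of the resulting majority vote.

```python
# Genetic algorithm for selecting a subset of base classifiers.
import random
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

random.seed(5)
X, y = make_classification(n_samples=1200, n_features=20, random_state=5)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=5)

pool = [DecisionTreeClassifier(random_state=5), GaussianNB(),
        KNeighborsClassifier(), LogisticRegression(max_iter=1000), SVC()]
preds = [clf.fit(X_tr, y_tr).predict(X_val) for clf in pool]

def fitness(mask):
    active = [p for p, m in zip(preds, mask) if m]
    if not active:
        return 0.0
    vote = (np.mean(active, axis=0) >= 0.5).astype(int)   # majority vote
    return (vote == y_val).mean()

pop = [[random.randint(0, 1) for _ in pool] for _ in range(12)]
for _ in range(20):                                        # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:6]
    children = []
    for _ in range(6):                                     # crossover + mutation
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(pool))
        child = a[:cut] + b[cut:]
        if random.random() < 0.3:
            child[random.randrange(len(pool))] ^= 1
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print("best subset:", best, "validation accuracy: %.3f" % fitness(best))
```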

2024, 2016 IEEE Symposium Series on Computational Intelligence (SSCI)

Multiple Classifier Systems are often found to improve intrusion detection results by combining a set of classifier decisions where single classifiers may not achieve the same level of detection. However, not every set of classifiers is more capable, therefore the selection of more capable sets is required. A misclassification is a false positive or a false negative instance, and a set of classifiers may produce more of one than the other. An optimal set of classifiers is required to reduce both, so treating them as individual objectives allows a balance to be found. The aim of this work is the selection of optimal sets of base-level classifiers using an evolutionary computation approach. A comparative analysis is made of the performance of the generated ensembles against the individual base-level classifiers; it is shown that optimal ensembles can be found that perform better than the majority of the individual classifiers.

2024

An architect designs software using architectural styles and quality attributes. To satisfy quality attributes, architectural tactics are used. An architectural tactic determines how to implement a quality attribute in architectural styles. The interaction between architectural tactics and architectural styles plays an important role in meeting quality attributes. In this article, the interaction between architectural tactics, namely "availability" and "performance", and architectural styles, namely "pipe & filter", "layered", "blackboard", "client-server" and "broker", is evaluated, and then a new ranking scheme for the architectural styles is proposed. Through this scheme, the best architectural style may be obtained for every individual quality attribute or their combinations.

2024

Linear feedback shift registers (LFSRs) are used in many stream ciphers because of their maximal period and good statistical properties. Due to the linearity of the LFSR, its output cannot be directly used as the keystream. Different methods have been proposed to introduce nonlinearity into the LFSR output. Irregular clocking is one such method, but stream ciphers based on it are vulnerable to side-channel attacks. In addition, the generation rate of irregularly clocked LFSRs is lower than that of the corresponding regular ones. Jumping is a method of irregular clocking for LFSRs which may have none of the mentioned flaws, but its output period cannot be determined. In this paper, using jumping LFSRs, some new primitives are proposed. The structures of the new primitives have a determined lower bound on the period and resistance against side-channel attacks. In some of the proposed structures, the lower bound on the period can be determined without kn...
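
A simplified, didactic illustration of an LFSR and of "jumping" its state: instead of clocking once per output bit, the state is advanced several steps at a time by applying a power of the transition matrix over GF(2). This is not the jump-register construction used in the cited ciphers; the tap set and jump distance below are illustrative choices.

```python
# A 16-bit Fibonacci LFSR whose state is advanced five clocks per output bit.
import numpy as np

TAPS = [16, 14, 13, 11]        # taps of a maximal-length 16-bit LFSR
N = 16

def transition_matrix():
    M = np.zeros((N, N), dtype=np.uint8)
    for t in TAPS:
        M[0, t - 1] = 1        # feedback into the first cell
    for i in range(1, N):
        M[i, i - 1] = 1        # shift
    return M

def mat_pow_gf2(M, e):
    R = np.eye(N, dtype=np.uint8)
    while e:
        if e & 1:
            R = (R @ M) % 2
        M = (M @ M) % 2
        e >>= 1
    return R

state = np.zeros(N, dtype=np.uint8)
state[0] = 1
jump = mat_pow_gf2(transition_matrix(), 5)   # advance five clocks at once

keystream = []
for _ in range(16):
    keystream.append(int(state[-1]))         # output the last cell
    state = (jump @ state) % 2               # "jump" instead of a single clock
print("jumped keystream bits:", keystream)
```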

2024

Today, pervasive systems have become an inseparable part of computer science and engineering. These systems provide automated connections with remote access and seamless transmission of biological and other data upon request. The health domain is one of the most important applications of these systems. Moreover, the heart is the most important part of the human body and cardiac diseases are the second leading cause of death. Therefore, different tools and methods have been invented for the rapid investigation and early detection of cardiac diseases and cardiac operation. These methods aim to obtain structural and operational information about the heart. Any change in the form of cardiac signals can indicate a disease or abnormal behavior of the heart. Therefore, early detection of these changes can be significant for preventing and treating cardiac diseases. This paper proposes a method to detect atrial arrhythmia, which is one of the most common cardiac anomalies. The proposed approach can be...

2024

Recently, researchers have shown an increased interest in kinship verification via facial images in the field of computer vision. Kinship verification is performed according to the similarities between parent and child faces. To this end, more local feature extraction is needed. All the methods reviewed so far, however, suffer from the fact that they divide images into distinct blocks to extract more local features. The main problem has two aspects: aimless division and feature extraction from unnecessary regions, which lead to overlapping, noise and a reduction of the classification rate. In this paper, the main parts of the face such as the eyes, nose and mouth are first detected along with the whole face image. Then they are used for feature extraction. In order to reduce feature vector redundancy, a new feature selection method named Kinship Feature Selection (KinFS), based on the Random Subset Feature Selection (RSFS) algorithm, is proposed. This method reduces the redund...

2024

Face recognition systems suffer from different spoofing attacks such as photograph-based and video-based spoofing attacks. This paper presents a new method to harden face recognition systems against video-based spoofing by employing efficient scene texture analysis. To this end, the scenes of the input and reference images are divided into the same non-overlapping blocks and the texture pattern of each block is extracted with the local binary pattern (LBP) operator. To reduce the sensitivity of LBP to noise and to increase the reliability of the proposed method, the input image is first transformed to the YCbCr color space and then the similarity of the texture pattern in the Y, Cb and Cr channels is computed independently. The majority vote over the similarities of the three channels is used as the final similarity of each block. The ratio of matching blocks between the input image and the reference image is used as a measure for detecting video-based spoofing attacks. The performance of the proposed algorithm is evaluated using several sc...
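
An outline of the block-wise LBP similarity check described above, using random arrays as stand-ins for the input and reference frames. The RGB-to-YCbCr matrix, block size and similarity threshold are illustrative choices, not values from the paper.

```python
# Block-wise LBP histograms per YCbCr channel, majority vote across channels.
import numpy as np
from skimage.feature import local_binary_pattern

def to_ycbcr(rgb):
    m = np.array([[0.299, 0.587, 0.114],
                  [-0.169, -0.331, 0.5],
                  [0.5, -0.419, -0.081]])
    return rgb @ m.T + np.array([0.0, 128.0, 128.0])

def block_histograms(channel, block=32):
    lbp = local_binary_pattern(channel.astype(np.uint8), P=8, R=1,
                               method="uniform")
    h, w = lbp.shape
    hists = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            hist, _ = np.histogram(lbp[i:i + block, j:j + block],
                                   bins=10, range=(0, 10), density=True)
            hists.append(hist)
    return np.array(hists)

def same_scene_ratio(img_a, img_b, thresh=0.2):
    a, b = to_ycbcr(img_a), to_ycbcr(img_b)
    votes = []
    for c in range(3):                       # Y, Cb, Cr processed separately
        ha, hb = block_histograms(a[..., c]), block_histograms(b[..., c])
        votes.append(np.abs(ha - hb).sum(axis=1) < thresh)
    same = np.sum(votes, axis=0) >= 2        # majority over the three channels
    return same.mean()                       # ratio of matching blocks

rng = np.random.default_rng(6)
ref = rng.integers(0, 256, (128, 128, 3)).astype(float)
probe = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255)
print("matching-block ratio: %.2f" % same_scene_ratio(ref, probe))
```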

2024, Periodica Polytechnica Electrical Engineering and Computer Science

The Fuzzy Rule-Based Classifier (FRBC) design problem has been widely studied due to its many practical applications. Hedge Algebras based Classifier Design Methods (HACDMs) are outstanding and effective approaches because they are based on a formal mathematical formalism that allows fuzzy-set-based computational semantics to be generated from the inherent qualitative semantics of linguistic terms. HACDMs include a two-phase optimization process. The first phase optimizes the semantic parameter values by applying an optimization algorithm. Then, in the second phase, the optimal fuzzy rule-based system for the FRBC is extracted based on the optimal semantic parameter values provided by the first phase. The performance of FRBC design methods depends on the quality of the applied optimization algorithms. This paper presents our proposed co-optimization Particle Swarm Optimization (PSO) algorithm for designing FRBCs with trapezoidal fuzzy sets based computational semantics generated by ...

2024, Adaptive Agents and Multi-Agents Systems

This paper introduces a new negotiating agent for automated negotiation on continuous domains and without a specified deadline. The agent's bidding strategy relies on Monte Carlo Tree Search, a popular method since it has been used with success on games with high branching factors such as Go. It uses two opponent modeling techniques for its bidding strategy and its utility: Gaussian process regression and Bayesian learning. Evaluation is done by confronting the existing agents that are able to negotiate in such a context: Random Walker, Tit-for-Tat and Nice Tit-for-Tat. None of those agents succeeds in beating our agent; moreover, the modular and adaptive nature of our approach is a huge advantage when it comes to optimizing it in specific application contexts.

2024

CryptoPAi is a hybrid operational-epistemic framework for specification and analysis of security protocols with genuine support for cryptographic constructs. This framework includes a process algebraic formalism for the operational specification and an epistemic extension of modal µ-calculus with past for the property specification. In this paper, we extend CryptoPAi framework with more cryptographic constructs. The main practical motivation for this work came from the domain of e-voting protocols and then we investigate the applicability of the extended framework in this domain. The framework provides explicit support for cryptographic constructs, which is among the most essential ingredients of security and e-voting protocols. We apply our extended framework to the FOO e-voting protocol. We also promote the prototype model-checker of the framework in the Maude rewriting logic tool and apply it to model-check some specified properties on their corresponding models.

2024

CryptoPAi is a hybrid operational-epistemic framework for the specification and analysis of security protocols with genuine support for cryptographic constructs. This framework includes a process algebraic formalism for the operational specification and an epistemic extension of the modal mu-calculus with past for the property specification. In this paper, we extend the CryptoPAi framework with more cryptographic constructs. The main practical motivation for this work came from the domain of e-voting protocols, and we then investigate the applicability of the extended framework in this domain. The framework provides explicit support for cryptographic constructs, which are among the most essential ingredients of security and e-voting protocols. Some more advanced cryptographic constructs are provided to allow specifying the behavior of more protocols in our process language and then verifying properties expressed in our logic with both temporal and epistemic operators. We apply our extended framewor...

2024

sensors, which are used to collect information from the surrounding environment [1-3]. These networks are suitable for many purposes such as border security, military target tracking, and scientific research in dangerous environments [4-6]. Sensors commonly communicate with each other via a wireless channel. Since the sensors may be used in hostile environments, especially in military usage, security is a very important issue with these networks. Therefore, security services such as encryption and authentication should be used. Since authentication and encryption use a common key between nodes, key management is a

2024, Proceedings of the 15th ACM/IEEE-CS Joint Conference on Digital Libraries

Name ambiguity in the context of bibliographic citation records is a hard problem that affects the quality of services and content in digital libraries and similar systems. This problem occurs when an author publishes works under distinct names or distinct authors publish works under similar names. The challenges of dealing with author name ambiguity have led to a myriad of name disambiguation methods. In this tutorial, we characterize such methods by means of a proposed taxonomy, present an overview of some of the most representative ones and discuss open challenges.

2024, arXiv (Cornell University)

This study addresses the problem of performing clustering in the presence of two types of background knowledge: pairwise constraints and monotonicity constraints. To achieve this, the formal framework for clustering under monotonicity constraints is first defined, resulting in a specific distance measure. Pairwise constraints are integrated afterwards by designing an objective function which combines the proposed distance measure and a pairwise constraint-based penalty term, in order to fuse both types of information. This objective function can be optimized with an EM optimization scheme. The proposed method serves as the first approach designed to work with the two types of background knowledge mentioned above. Our proposal is tested on a variety of benchmark datasets and in a real-world case study.
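
A very small sketch of the "distance plus pairwise-constraint penalty" objective idea, in the style of penalized k-means. The monotonicity-based distance of the paper is replaced here by plain Euclidean distance; only the must-link / cannot-link penalty term and the EM-style alternation are illustrated, and the constraints and penalty weight are made up for the example.

```python
# Penalized assignment step: distance to center plus constraint violations.
import numpy as np
from sklearn.datasets import make_blobs

X, y = make_blobs(n_samples=300, centers=3, random_state=7)
k, w = 3, 5.0                       # number of clusters, penalty weight
must = [(0, 1), (10, 11)]           # example constraints (index pairs)
cannot = [(0, 150), (5, 200)]

rng = np.random.default_rng(7)
labels = rng.integers(0, k, len(X))
for _ in range(20):                                  # EM-style alternation
    centers = np.array([X[labels == c].mean(axis=0) if np.any(labels == c)
                        else X[rng.integers(len(X))] for c in range(k)])
    for i in range(len(X)):                          # assignment step
        cost = ((X[i] - centers) ** 2).sum(axis=1)
        for a, b in must:                            # penalize broken links
            j = b if i == a else a if i == b else None
            if j is not None:
                cost += w * (np.arange(k) != labels[j])
        for a, b in cannot:
            j = b if i == a else a if i == b else None
            if j is not None:
                cost += w * (np.arange(k) == labels[j])
        labels[i] = int(np.argmin(cost))

objective = sum(((X[i] - centers[labels[i]]) ** 2).sum() for i in range(len(X)))
print("final penalized-assignment objective: %.1f" % objective)
```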

2024, Information Sciences

The author name disambiguation task has to deal with uncertainties related to the possible many-to-many correspondences between ambiguous names and unique authors. Despite the variety of name disambiguation methods available in the literature to solve the problem, most of them are rarely compared against each other. Moreover, they are often evaluated without considering a time evolving digital library, susceptible to dynamic (and therefore challenging) patterns such as the introduction of new authors and the change of researchers' interests over time. In order to facilitate the evaluation of name disambiguation methods in various realistic scenarios and under controlled conditions, in this article we propose SyGAR, a new Synthetic Generator of Authorship Records that generates citation records based on author profiles. SyGAR can be used to generate successive loads of citation records simulating a living digital library that evolves according to various publication patterns. We validate SyGAR by comparing the results produced by three representative name disambiguation methods on real as well as synthetically generated collections of citation records. We also demonstrate its applicability by evaluating those methods on a time evolving digital library collection generated with the tool, considering several dynamic and realistic scenarios.

2023

vices take advantage of pervasive systems. Pervasive systems in non-hospital settings aim at better management of chronic-care patients. They also control health delivery costs and increase quality of life and the quality of health services. Furthermore, they can help avoid serious complications. To this end, it is mainly required to monitor the patient's vital signs (e.g. ECG, blood pressure, heart rate, breath rate, oxygen saturation and perspiration).

2023

Recently, researchers have shown an increased interest in kinship verification via facial images in the field of computer vision. Kinship verification is performed according to the similarities between parent and child faces. To this end, more local feature extraction is needed. All the methods reviewed so far, however, suffer from the fact that they divide images into distinct blocks to extract more local features. The main problem has two aspects: aimless division and feature extraction from unnecessary regions, which lead to overlapping, noise and a reduction of the classification rate. In this paper, the main parts of the face such as the eyes, nose and mouth are first detected along with the whole face image. Then they are used for feature extraction. In order to reduce feature vector redundancy, a new feature selection method named Kinship Feature Selection (KinFS), based on the Random Subset Feature Selection (RSFS) algorithm, is proposed. This method reduces redundancy and improves the verification rate by selecting effective features. The experimental results show that purposeful feature extraction with the proposed KinFS method is effective in improving the kinship verification rate.
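
A sketch of the random-subset idea behind RSFS-style selection, as a simplified stand-in for the KinFS procedure on synthetic data: features are scored by how much the random subsets containing them outperform a baseline, and the top-scoring features are kept.

```python
# Random-subset feature scoring followed by top-k selection.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(8)
X, y = make_classification(n_samples=600, n_features=40, n_informative=6,
                           random_state=8)
baseline = cross_val_score(KNeighborsClassifier(), X, y, cv=3).mean()

scores = np.zeros(X.shape[1])
for _ in range(60):                                  # random subsets
    subset = rng.choice(X.shape[1], size=8, replace=False)
    acc = cross_val_score(KNeighborsClassifier(), X[:, subset], y, cv=3).mean()
    scores[subset] += acc - baseline                 # reward useful subsets

selected = np.argsort(-scores)[:10]
final = cross_val_score(KNeighborsClassifier(), X[:, selected], y, cv=3).mean()
print("selected:", sorted(selected.tolist()),
      "baseline: %.3f  selected-only: %.3f" % (baseline, final))
```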

2023

Prediction of the peptide-binding sites of proteins is a significant and essential task in different processes such as understanding biological processes, protein functional analysis, comparison of functional sites, comprehension of transaction mechanisms, drug design, cellular signaling, and cancer treatment. Predictive analysis of protein-peptide binding sites is one of the most challenging bioinformatics problems, and experimental methods are time-consuming, costly, and laborious. Therefore, we propose a machine learning-based method for predicting protein-peptide binding sites by utilizing an enhanced feature vector obtained from three-dimensional protein structure and one-dimensional sequence data. To this end, the genetic programming technique is applied to the obtained basic features to extract a more discriminative feature vector. Then a support vector machine is employed to determine the binding residues among the amino acids. Finally, the binding sites are predicted using the ...

2023

In this paper, we introduce a new class of association rules (ARs) named "Multi-Relation Association Rules" in which, in contrast to primitive ARs (which are usually extracted from multi-relational databases), each rule item consists of one entity and several relations. These relations indicate an indirect relationship between entities. Consider the following Multi-Relation Association Rule, where the first item consists of the three relations live in, nearby and humid: "Those who live in a place which is nearby a city with a humid climate and who are also younger than 20 → their health condition is good". A new algorithm called MRAR is proposed to extract such rules from directed graphs with labeled edges, which are constructed from RDBMSs or semantic web data. Also, the question "how can RDBMS data or semantic web data be converted to a directed graph with labeled edges?" is answered. In order to evaluate the proposed algorithm, experiments are performed on a sample dataset and also on a real-world drug semantic web dataset. The obtained results confirm the ability of the proposed algorithm to mine Multi-Relation Association Rules.
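
A toy illustration of the "multi-relation item" idea: an item is an entity attribute reached through a chain of relations, e.g. lives_in.nearby.climate. The tiny graph, the relation names and the single rule checked here are made up for the example; the MRAR mining algorithm itself is not reproduced.

```python
# Derive multi-relation items per entity and check support/confidence of a rule.
lives_in = {"ali": "town_a", "sara": "town_b", "omid": "town_a"}
nearby = {"town_a": "city_x", "town_b": "city_y"}
climate = {"city_x": "humid", "city_y": "dry"}
age = {"ali": 18, "sara": 17, "omid": 25}
health = {"ali": "good", "sara": "good", "omid": "poor"}

def items(person):
    """Derive flat items, including one multi-relation item per person."""
    chain_climate = climate[nearby[lives_in[person]]]
    return {
        f"lives_in.nearby.climate={chain_climate}",
        "age<20" if age[person] < 20 else "age>=20",
        f"health={health[person]}",
    }

antecedent = {"lives_in.nearby.climate=humid", "age<20"}
consequent = "health=good"
people = list(lives_in)
covered = [p for p in people if antecedent <= items(p)]
support = len(covered) / len(people)
confidence = (sum(consequent in items(p) for p in covered) / len(covered)
              if covered else 0.0)
print(f"rule support={support:.2f} confidence={confidence:.2f}")
```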

2023

In this modern era, computer network security is a vital issue. Network security is supported by an efficient Intrusion Detection System (IDS), which is used to identify unauthorized access and malicious attacks and to give an alert when it observes any kind of unusual activity. Over the past 30 years, there has been a lot of work on intrusion detection systems using machine learning algorithms. By assessing the present status of the application of machine learning algorithms to the intrusion classification task, this review provides a proper guideline. This survey selected 84 papers with the highest citation counts from the years 2009-2018. This work gives an overview of different intrusion detection systems and a statistical comparison of different classifiers such as single, hybrid and ensemble learning. In addition, we discuss the best machine learning classifiers, the best datasets and some feature selection processes. This work is devoted to our...

2023, Applied Intelligence

Day by day, human-agent negotiation becomes more and more vital to reach a socially beneficial agreement when stakeholders need to make a joint decision together. Developing agents who understand not only human preferences but also attitudes is a significant prerequisite for this kind of interaction. Studies on opponent modeling are predominantly based on automated negotiation and may yield good predictions after exchanging hundreds of offers. However, this is not the case in human-agent negotiation, in which the total number of rounds does not usually exceed tens. For this reason, an opponent modeling technique is needed to extract the maximum information gained from limited interaction. This study presents a conflict-based opponent modeling technique and compares its prediction performance with the well-known approaches in human-agent and automated negotiation experimental settings. According to the results of the human-agent studies, the proposed model outperforms them despite the dive...

2023, Journal of Ambient Intelligence and Humanized Computing

Artificial intelligence and machine learning are in widespread use nowadays for developing automatic and precise models for different tasks, especially on the Internet. In this paper, an intrusion detection system is proposed using machine learning techniques. An intrusion detection system involves an extensive mass of data; such data is naturally characterized by repetition and noise, which reduce the stability and accuracy of the intrusion detection system. Hence, the issue of reducing feature dimensionality to achieve a smaller subset of features that can precisely express the status of network observations has attracted a lot of researchers' attention. In the proposed method, a gradual feature removal method is used to select 16 critical features for representing various network visits. By combining the ant colony algorithm and an ensemble of decision trees, we propose an efficient and stable classifier for judging whether a network visit is normal or not. Despite using only 16 features, a high accuracy of 99.92% and an average Matthews correlation coefficient of 0.91 are obtained.
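
A rough sketch of the evaluation pipeline only: reduce the data to 16 features, train an ensemble of decision trees, and report accuracy plus the Matthews correlation coefficient. Recursive feature elimination and a random forest stand in for the paper's gradual feature removal and ant-colony-driven tree ensemble, and the data is synthetic rather than real traffic.

```python
# Select 16 features, train a tree ensemble, report accuracy and MCC.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import RFE
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, matthews_corrcoef

X, y = make_classification(n_samples=4000, n_features=40, n_informative=12,
                           random_state=9)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=9)

selector = RFE(DecisionTreeClassifier(random_state=9),
               n_features_to_select=16).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=100, random_state=9)
forest.fit(selector.transform(X_tr), y_tr)

pred = forest.predict(selector.transform(X_te))
print("accuracy: %.4f  MCC: %.3f" % (accuracy_score(y_te, pred),
                                     matthews_corrcoef(y_te, pred)))
```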

2023

The success of a negotiation depends largely on the actors and the negotiation domain. It is common for negotiators to rely on an agenda to simplify the process and reach better deals. This is particularly the case when the preferences of the negotiators are complex and when multiple issues are at stake. Using an agenda to explore and decompose the interdependence relationships between the issues is one way to address this problem. In this paper, we propose to address this challenge by applying the classical divide-and-conquer approach to automated negotiations through utility decomposition and bottom-up agenda construction. The approach does not impose an agenda from the top level of the negotiation, but builds it bottom-up given the individual utility functions of the agents and the relationships between the issues. Our approach reduces the cost of exploring the utility spaces of the agents and the resulting bidding processes. We implement the approach in a novel protocol called the Decomposable Alternating Offers Protocol (DAOP). The experimental results show that our divide-and-conquer algorithm has a positive influence on the global performance of an automated negotiation system.

2023, TEXILA INTERNATIONAL JOURNAL OF ACADEMIC RESEARCH

Cloud Computing is a technology that provides on-demand computing services such as applications, storage, and processing to consumers over the internet (the cloud). It is centered on the pay-per-use model, meaning that a user has to pay only for the services utilized. The technology is based on the concept of virtualization. Such services allow companies and establishments to scale their in-house infrastructure up or down. Cloud computing has numerous advantages such as flexibility, efficiency, scalability, integration, capital reduction (lower upfront fixed costs), and shared resources. Moreover, it offers an advanced virtual space for companies to deploy their applications or run their operations. However, companies which consider embracing cloud-based services must also appreciate that, regardless of its benefits, the transition to this computing paradigm raises security concerns which are the subject of several research efforts. The goal of this article is to identify the main security issues and to draw the attention of both decision makers and users to the potential risks of moving data into "the cloud".

2023, Pattern Recognition

In this article, we address the problem of automatic constraint selection to improve the performance of constraint-based clustering algorithms. To this aim we propose a novel active learning algorithm that relies on a k-nearest neighbors graph and a new constraint utility function to generate queries to the human expert. This mechanism is paired with propagation and refinement processes that limit the number of constraint candidates and introduce a minimal diversity in the proposed constraints. Existing constraint selection heuristics are based on a random selection or on a min-max criterion and thus are either inefficient or more adapted to spherical clusters. Contrary to these approaches, our method is designed to be beneficial for all constraint-based clustering algorithms. Comparative experiments conducted on real datasets and with two distinct representative constraint-based clustering algorithms show that our approach significantly improves clustering quality while minimizing the number of human expert solicitations.
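
A schematic version of selecting informative constraint queries from a k-NN graph: candidate pairs are graph edges whose endpoints a preliminary clustering assigns to different clusters, ranked by a simple utility (shorter edges first), with the expert simulated by ground-truth labels. The propagation and refinement steps of the paper are omitted, and the utility function here is an illustrative simplification, not the one proposed in the article.

```python
# Rank ambiguous k-NN edges and query a simulated expert for constraints.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import KMeans

X, y_true = make_blobs(n_samples=400, centers=3, cluster_std=2.0,
                       random_state=10)
labels = KMeans(n_clusters=3, n_init=10, random_state=10).fit_predict(X)

graph = kneighbors_graph(X, n_neighbors=8, mode="distance").tocoo()
candidates = [(i, j, d) for i, j, d in zip(graph.row, graph.col, graph.data)
              if labels[i] != labels[j]]          # edges in ambiguous regions
candidates.sort(key=lambda e: e[2])               # shorter edge = higher utility

budget = 10
queries = candidates[:budget]
constraints = [("must-link" if y_true[i] == y_true[j] else "cannot-link", i, j)
               for i, j, _ in queries]
print("expert answers for the selected queries:")
for kind, i, j in constraints:
    print(f"  ({i}, {j}) -> {kind}")
```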

2023, British Journal of Neurosurgery

An important aim in bilateral negotiations is to achieve a win-win solution for both parties; therefore, a critical aspect of a negotiating agent's success is its ability to take the opponent's preferences into account. Every year, new negotiation agents are introduced with better learning techniques to model the opponent. Our main goal in this work is to evaluate and compare the performance of a selection of state-of-the-art online opponent modeling techniques in negotiation, and to determine under which circumstances they are beneficial in a real-time, online negotiation setting. Towards this end, we provide an overview of the factors influencing the quality of a model and we analyze how the performance of opponent models depends on the negotiation setting. This results in better insight into the performance of opponent models, and allows us to pinpoint well-performing opponent modeling techniques that did not receive much previous attention in literature.

2023

The work presented here aims to build a website that gathers information related to the field of computer security. The site does not have a popular-science purpose; rather, it is aimed especially at IT professionals, specifically those who work on security.

2023, Matrix Science Mathematic

Cloud computing is a platform for dynamically enhancing capabilities and increasing potential without requiring the addition of new hardware, employees, or software. Additionally, cloud computing began as a business notion and later became a successful IT innovation. Concerns have been expressed about the security of the cloud environment, though, considering that the cloud contains a lot of information about people and businesses. Although cloud computing has received a lot of attention, clients are still hesitant to move their businesses there, and the one significant issue preventing greater adoption of cloud computing is a lack of protection. Additionally, the market is wary of cloud computing due to the intricacy with which it controls data protection and confidentiality. When used in a cloud context, the cloud model architecture jeopardises the security of current technology. Therefore, cloud service customers should be aware of the risks associated with uploadin...

2023, Electronics

This paper introduces a method for optimizing non-recursive filtering algorithms. A mathematical model of a non-recursive digital filter is proposed and a performance estimate is given. A method for optimizing the structural implementation of the modular digital filter is described. The essence of the optimization is that, by using the properties of the residue ring and of the symmetric impulse response of the filter, it is possible to obtain a filter with almost half the impulse response length of a traditional modular filter. A difference equation is given for calculating the output sample of modules p1 … pn in the modified modular digital filter. The performance of the modular filters is compared with the performance of positional non-recursive filters implemented on a digital signal processor. An example of estimating the hardware costs required for implementing a modular digital filter with a modified structure is shown. This pap...
