Manas Ranjan Patra - Academia.edu
Papers by Manas Ranjan Patra
Abstract: Network Intrusion Detection has been an active area of research with the growing incidence of cybercrimes. This has led to continuous monitoring of network traffic and analysis, and to raising an alarm if any abnormality is noticed, so as to trigger an appropriate response and curb the possibility of an attack. One approach to the network intrusion problem is to classify network user behavior as normal or suspicious. Soft computing based techniques are being tried out to classify network users with a higher degree of accuracy and a low false alarm rate. In this paper, we propose a classification model for the detection of known as well as unknown network attacks based on artificial neural network based techniques, namely RBFN, SOM, LVQ3, SMO, and CNN. Further, in order to improve the performance of the classifier, Z-Score normalizat...
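The abstract is truncated at the Z-Score normalization step; as a minimal illustration of that step (assuming the usual per-feature standardization rather than the authors' exact preprocessing pipeline), a short Python sketch:

```python
import numpy as np

def z_score_normalize(X, eps=1e-12):
    """Standardize each feature (column) to zero mean and unit variance.

    X: 2-D array of shape (n_samples, n_features), e.g. numeric
    network-traffic features from an intrusion-detection dataset.
    """
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    return (X - mean) / (std + eps)  # eps guards against constant features

# Example: three connections described by two numeric features
X = np.array([[100.0, 0.2],
              [250.0, 0.8],
              [175.0, 0.5]])
print(z_score_normalize(X))
```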
Advances in Intelligent Systems and Computing, 2019
Privacy preservation is a serious concern in today's scenario. Anonymity is a way to preserve the privacy of different types of data publications, such as structural and descriptive data. In this paper, we discuss the preservation of both types of data. We use anatomy for descriptive data and sequential clustering for structural data anonymization. The proposed approach is based on assortativity and is called Anatomy Based Modified Sequential Clustering (ABMSC). We propose a new total information loss measure for cluster optimization. The performance of the proposed approach is measured by the information loss and the number of node movements that take place to meet the optimization. The experimental results show that our proposed algorithm outperforms two state-of-the-art algorithms.
Transactions on Networks and Communications, 2015
With the ever-growing number of web services on the Internet, the selection of a suitable web service has become a mind-boggling task. Several quality parameters are considered for a judicious selection of a web service in a given service context. One of the popular measures for classifying web services is the Web Service Relevance Function (WSRF). A number of classifiers have been used which have yielded accuracy up to 99% when WSRF is considered as one of the attributes of the QWS dataset, but the maximum accuracy achieved by any classifier without WSRF is only 89.99%. In this paper, a feed-forward backpropagation neural network (BPNN) with an adaptive momentum factor has been employed to further improve the accuracy. Furthermore, a similar neural network is modelled for determining the WSRF of the web services included in the QWS dataset. As an alternative, a Genetic Algorithm is used to find the weight factors associated with each parameter to calculate WSRF, assuming it to be a l...
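A minimal sketch of a feed-forward network trained by backpropagation with a momentum term; note this uses a fixed momentum coefficient, whereas the paper employs an adaptive momentum factor whose schedule is not given in the abstract, and the toy data here merely stand in for QWS features:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bpnn(X, y, hidden=8, lr=0.2, momentum=0.9, epochs=3000, seed=0):
    """One-hidden-layer feed-forward network trained by backpropagation
    with a (fixed) momentum term on the weight updates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, hidden))
    W2 = rng.normal(scale=0.1, size=(hidden, 1))
    vW1, vW2 = np.zeros_like(W1), np.zeros_like(W2)
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        # forward pass
        h = sigmoid(X @ W1)
        out = sigmoid(h @ W2)
        # backward pass (squared-error loss)
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        gW2 = h.T @ d_out / n
        gW1 = X.T @ d_h / n
        # momentum update: velocity = momentum * velocity - lr * gradient
        vW2 = momentum * vW2 - lr * gW2
        vW1 = momentum * vW1 - lr * gW1
        W2 += vW2
        W1 += vW1
    return W1, W2

# Toy example: two QoS-like features, binary class label
X = np.array([[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8]])
y = np.array([1, 1, 0, 0])
W1, W2 = train_bpnn(X, y)
pred = sigmoid(sigmoid(X @ W1) @ W2).ravel()
print(np.round(pred, 2))  # membership scores for the positive class
```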
Social network research relies on a variety of data sources, depending on the problem scenario and the questions the research is trying to answer or inform. Social networks are very popular nowadays, and understanding their inner structure appears to be a promising area. Cluster analysis has also become an increasingly interesting topic in computational intelligence and has been found suitable for analysing social network structure. In this chapter, we use k-cluster analysis with various performance measures to analyse some of the data sources obtained for social network analysis, as sketched below. Our proposed approach is intended to address the users of a social network, and will not only help an organization understand its external and internal associations but is also highly necessary for enhancing collaboration, innovation, and dissemination of knowledge.
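As a rough illustration of the kind of k-clustering with performance measures the chapter describes (the actual data sources and measures are not specified here; KMeans and a silhouette score are used only as stand-ins):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Toy node-level features, e.g. degree and activity counts for six users
X = np.array([[10, 2], [12, 3], [11, 2],
              [85, 40], [90, 38], [88, 42]], dtype=float)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster labels:", kmeans.labels_)
# One common cluster-quality measure; the chapter may use others as well
print("silhouette:", silhouette_score(X, kmeans.labels_))
```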
Transactions on Machine Learning and Artificial Intelligence, 2016
In recent years, advancements in communication technology have led to a wide range of Internet services. While an overwhelming number of Internet users have shown interest in such services, incidences of cyber-attacks by miscreants have thwarted their dependence on electronically accessible services. In order to deal with this alarming situation, intrusion detection systems (IDS) have emerged as a potential solution to analyse the network activities of users and report attempts at possible intrusions. Building an intrusion detection system is a complex and challenging task. It requires analysis of network data from several dimensions so as to develop a pragmatic system that can handle different forms of intrusive behaviour. In this paper, we propose a hybrid intrusion detection approach which combines techniques based on both fuzzy and rough set theories to classify network data as normal or anomalous. Our approach comprises two phases: in the first phase, the most relevant features are extracted using a set of rank- and search-based methods; in the second phase, we classify the reduced dataset as normal or anomalous using five different classifiers, namely Fuzzy Nearest Neighbour, Fuzzy-Rough Nearest Neighbour, Fuzzy-Rough Ownership NN, Vaguely Quantified Nearest Neighbours, and Ordered Weighted Average Nearest Neighbours. Experimental results show that the proposed hybrid approach is able to achieve a high intrusion detection rate and a low false alarm rate.
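For flavour, a minimal sketch of the classical fuzzy k-nearest-neighbour rule with crisp neighbour memberships and inverse-distance weighting (the fuzzy-rough variants used in the paper refine this with lower and upper approximations, which are not reproduced here):

```python
import numpy as np

def fuzzy_knn_predict(X_train, y_train, x, k=3, m=2):
    """Fuzzy k-NN: class memberships of x are a distance-weighted average
    of the (here crisp) memberships of its k nearest training points."""
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    classes = np.unique(y_train)
    # inverse-distance weights with exponent 2/(m-1), as in Keller's formulation
    w = 1.0 / (d[nn] ** (2.0 / (m - 1)) + 1e-12)
    memberships = {}
    for c in classes:
        u = (y_train[nn] == c).astype(float)   # crisp neighbour memberships
        memberships[c] = np.sum(w * u) / np.sum(w)
    return max(memberships, key=memberships.get), memberships

# Toy example: two features per connection, labels 0 = normal, 1 = anomalous
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array([0, 0, 1, 1])
label, u = fuzzy_knn_predict(X, y, np.array([0.85, 0.75]))
print(label, u)
```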
The rapid expansion of computer networks throughout the world has made security a crucial issue in a computing environment. In the recent past, several cyber-attacks have corrupted the data of many organizations, creating serious problems. An Intrusion Detection System, which is increasingly a key part of system defence, is used to identify abnormal activities in a computer system. The success of an intrusion detection system depends on the selection of appropriate features for detecting intrusion activity. Experiments have been conducted using four classifier techniques, viz. Fuzzy NN, Fuzzy Rough NN, VQNN, and Fuzzy Rough Ownership NN, and we have studied the accuracy, recall, precision, false alarm rate, and error rate of all the classifier techniques.
Transactions on Networks and Communications, 2015
Web services have become an innovative and accepted means of service delivery over the Internet. In recent years there has been astounding growth in the number of web services provisioned by businesses and corporate houses. In the presence of a plethora of web services, a service consumer faces the real challenge of making the right choice based on certain preferences. Therefore, it becomes necessary to classify a set of web services based on certain quality parameters in order to facilitate user choice of web services under different scenarios. Several classification techniques have been proposed by researchers to classify data sets in different application domains. In this work, we have employed three fuzzy classifiers, namely Fuzzy Nearest Neighbor, Fuzzy Rough Nearest Neighbor, and Fuzzy Rough Ownership Nearest Neighbor, to classify web services. We have used the standard QWS dataset for our experimentation. The accuracy of the classifiers has been computed with and without feature selection. In order to further improve classification accuracy, a Weighted Average Accuracy technique has been applied to the confusion matrix obtained after feature selection.
Transactions on Machine Learning and Artificial Intelligence, 2015
The confusion matrix is a useful tool for measuring the performance of classifiers in their ability to classify multi-class objects. Computation of classification accuracy for 2-class attributes using a confusion matrix is rather straightforward, whereas it is quite cumbersome in the case of multi-class attributes. In this work, we propose a novel approach to transform an n × n confusion matrix for n-class attributes into its equivalent 2 × 2 weighted average confusion matrix (WACM). The suitability of WACM has been shown for a classification problem using a web service data set. We have computed the accuracy of four classifiers, namely Naïve Bayes (NB), Genetic Programming (GP), Instance Based Lazy Learner (IB1), and Decision Tree (J48), with and without feature selection. Next, WACM has been employed on the confusion matrix obtained after feature selection, which further improves the classification accuracy.
Transactions on Machine Learning and Artificial Intelligence, 2014
The accuracy of a classifier or predictor is normally estimated with the help of a confusion matrix, which is a useful tool for analysing how well the classifier can recognize tuples of different classes. Calculating the classification accuracy of a predictor using a confusion matrix is simple for a two-class attribute. In the case of a multi-class attribute, the accuracy of every class must be taken into consideration and aggregated to arrive at the actual accuracy of the particular classifier or predictor for that attribute. To formulate this, a weighted average classification accuracy is introduced for the overall recognition rate of the classifier, which reflects how well the classifier recognizes tuples of the various classes. Classification accuracy is calculated for the classifiers BayesNet (BN), NaiveBayes (NB), J48, and Decision Table (DT) through the weighted average accuracy formulation, and the trend of the accuracy values for different numbers of instances is displayed in tables, illustrating the calculation.
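A small worked sketch of the idea, assuming the weighted average accuracy is a per-class one-vs-rest accuracy weighted by class support; the papers' exact WACM construction may differ:

```python
import numpy as np

def weighted_average_accuracy(cm):
    """Collapse an n x n confusion matrix (rows = actual, cols = predicted)
    into per-class 2 x 2 matrices, compute each class's accuracy, then
    average them weighted by class support."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    support = cm.sum(axis=1)
    accs = []
    for i in range(cm.shape[0]):
        tp = cm[i, i]
        fn = support[i] - tp
        fp = cm[:, i].sum() - tp
        tn = total - tp - fn - fp
        accs.append((tp + tn) / total)   # one-vs-rest accuracy for class i
    return float(np.dot(accs, support) / total)

# Toy 3-class confusion matrix
cm = [[50,  3,  2],
      [ 4, 40,  6],
      [ 1,  5, 30]]
print(round(weighted_average_accuracy(cm), 4))
```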
With recent advances in network-based technology and the increased dependence of our everyday life on this technology, assuring the reliable operation of network-based systems is very important. Signature-based intrusion detection systems cannot detect new attacks, yet these systems are the most widely used and developed ones. Current anomaly-based intrusion detection systems are also unable to detect all kinds of new...
International Journal of Computer Science, Engineering and Applications, 2014
Web portals have served as an excellent medium for facilitating user-centric services for organizations irrespective of the type, size, and domain of operation. The objective of these portals has been to deliver a plethora of services such as information dissemination, transactional services, and customer feedback. Therefore, the design of a web portal is crucial in order that it is accessible to a wide range of the user community irrespective of age group, physical abilities, and level of literacy. In this paper, we have studied the compliance with WCAG 2.0 of three different categories of Indian web sites which are most frequently accessed by a large section of the user community. We have provided a quantitative evaluation of different aspects of accessibility which we believe can pave the way for better design of web sites by addressing the deficiencies inherent in the web portals.
Contextual Advancements and New Methods
This chapter presents an overview of the field of recommender systems and describes the current generation of recommendation methods along with their limitations and possible extensions that can improve their capabilities and make the recommendations suitable for a wide range of applications. In recent years, machine learning algorithms have come to be considered an important part of the recommendation process for taking intelligent decisions. The chapter explores the application of such techniques in the field of network intrusion detection in order to examine the vulnerabilities of different recommendation techniques. Finally, the authors outline some of the major issues in building secure recommendation systems for identifying possible network intrusions.
Advances in Secure Computing, Internet Services, and Applications
Intrusion Detection and Prevention Systems (IDPS) are security systems that are widely implemented to detect and prevent suspicious threats in computer networks. In order to explain the security risks and IDPS, in this chapter the authors give a quick review of the classification of IDPSs and categorize them into certain groups. Further, in order to improve accuracy and security, data mining techniques have been used to analyze audit data and extract features that can distinguish normal activities from intrusions. Experiments have been conducted for building efficient intrusion detection and prevention systems by combining online detection and offline data mining. During online data examination, real-time data are captured and passed through a detection engine that uses a set of rules and parameters for analysis. During offline data mining, the necessary knowledge about the process of intrusion is extracted.
Procedia Engineering, 2012
Intrusion detection is an emerging area of research in computer security and networks with the growing use of the internet in everyday life. Most intrusion detection systems (IDSs) use a single classifier algorithm to classify network traffic data as normal or anomalous. However, such single-classifier systems fail to provide the best possible attack detection rate with a low false alarm rate. In this paper, we propose a hybrid intelligent approach that uses a combination of classifiers in order to make the decision intelligently, so that the overall performance of the resultant model is enhanced. The general procedure is to apply supervised or unsupervised data filtering with a classifier or clusterer first on the whole training dataset, and then feed the output to another classifier to classify the data. We use a 2-class classification strategy along with 10-fold cross-validation to produce the final classification results in terms of normal or intrusion. Experimental results on the NSL-KDD dataset, an improved version of the KDDCup 1999 dataset, show that our proposed approach is efficient, with a high detection rate and a low false alarm rate.
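A minimal sketch of this kind of two-stage arrangement in scikit-learn, assuming an unsupervised clusterer whose output is appended as a feature before a second-stage classifier and evaluated with 10-fold cross-validation; the data and the base learners are placeholders, not the ones used in the paper:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data standing in for preprocessed NSL-KDD features/labels
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # 0 = normal, 1 = intrusion (synthetic)

def add_cluster_feature(X, n_clusters=5):
    """First stage: unsupervised filtering; append the cluster id as a feature."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    return np.column_stack([X, labels])

X_aug = add_cluster_feature(StandardScaler().fit_transform(X))

# Second stage: a supervised classifier on the augmented data, 10-fold CV
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X_aug, y, cv=10)
print("mean 10-fold accuracy:", scores.mean().round(3))
```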
2010 Sixth International Conference on Information Assurance and Security, 2010
This paper applies discriminative multinomial Naïve Bayes with various filtering analyses in order to build a network intrusion detection system. For our experimental analysis, we used the new NSL-KDD dataset, which is considered a modified version of the KDDCup 1999 intrusion detection benchmark dataset. We perform 2-class classification with 10-fold cross-validation for building our proposed model. The experimental results show that the proposed approach is very accurate, with a low false positive rate, and takes less time in comparison to other existing approaches while building an efficient network intrusion detection system.
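A minimal sketch of the evaluation setup using scikit-learn's standard MultinomialNB as a stand-in (the discriminative multinomial Naïve Bayes used in the paper is a Weka implementation and is not reproduced here), with 2-class, 10-fold cross-validation on placeholder count features:

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB
from sklearn.model_selection import cross_val_score

# Placeholder non-negative count-like features standing in for NSL-KDD records
rng = np.random.default_rng(1)
X = rng.poisson(lam=3.0, size=(1000, 20))
y = (X[:, 0] > 3).astype(int)              # 0 = normal, 1 = attack (synthetic)

nb = MultinomialNB()
scores = cross_val_score(nb, X, y, cv=10)  # 2-class, 10-fold cross-validation
print("mean accuracy:", scores.mean().round(3))
print("per-fold:", np.round(scores, 3))
```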
2008 First International Conference on Emerging Trends in Engineering and Technology, 2008
... Engg., GIET, Gunupur-765022, India. E-Mail: mrutyunjaya.2007@rediffmail.com; Manas Ranjan Patra, Dept. ... 3. Cross-Validation Test: Cross-validation (CV) tests exist in a number of variants, but the general idea is to divide the training data into a number of partitions or folds. ...
10th International Conference on Information Technology (ICIT 2007), 2007
Distributed Multi-agent System Architecture for Mobile Traders. Ananta Charan Ojha, School of IT, ICFAI University, Hyderabad, INDIA, acojha2002@yahoo.co.in ... Several research efforts have been made towards agent-based e-commerce and can be found in the literature [5, 6, 7, 8, 9, 10]. ...
International journal of recent trends …, 2009
The growing dependence of modern society on telecommunication and information networks has become inevitable. Therefore, the security aspects of such networks play a strategic role in ensuring the protection of data against misuse. Intrusion Detection Systems (IDS) are meant to detect intruders who elude the "first line" of protection. Data mining techniques are being used to build effective IDS. In this paper we analyze the performance of some data classifiers in a heterogeneous environment using a voting ensemble system, with the purpose of detecting anomaly-based network intrusions. Experimental results using the KDDCup 1999 benchmark dataset demonstrate that the voting ensemble technique yields significantly better results in detecting intrusions compared to the other techniques.
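A minimal sketch of a majority-voting ensemble in scikit-learn; the base classifiers below are placeholders, since the abstract does not list the specific classifiers combined in the paper:

```python
import numpy as np
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Placeholder numeric features standing in for KDDCup 1999 records
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 15))
y = (X[:, :3].sum(axis=1) > 0).astype(int)   # 0 = normal, 1 = intrusion (synthetic)

ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ],
    voting="hard",  # majority vote over the base classifiers' predictions
)
scores = cross_val_score(ensemble, X, y, cv=10)
print("mean 10-fold accuracy:", scores.mean().round(3))
```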
IJCSNS, 2007
With the tremendous growth of network-based services and sensitive information on networks, network security is receiving more importance than ever. Intrusion poses a serious security risk in a network environment, and the ever-growing variety of new intrusion types poses a serious problem for their detection. Human labelling of the available network audit data instances is usually tedious, time consuming, and expensive. In this paper, we apply one of the efficient data mining algorithms, naïve Bayes, for anomaly-based network intrusion detection. Experimental results on the KDD Cup '99 data set show the novelty of our approach in detecting network intrusions. It is observed that the proposed technique performs better in terms of false positive rate, cost, and computational time when applied to the KDD '99 data set compared to a back propagation neural network based approach.