Jca Ksrce | K.S.R.COLLEGE OF ENGINEERING
Papers by Jca Ksrce
In this paper, classification of microcalcifications in digital mammograms based on the Discrete Shearlet Transform (DST) is proposed. Clusters of microcalcifications in digital mammograms are a significant sign of breast cancer. To assist radiologists in the diagnosis of mammographic clusters, a novel algorithm based on DST is developed. First, the mammogram image is decomposed using DST. Then the energy of each sub-band of the shearlet-transformed image is extracted. These energies are used as features to classify the given mammogram image as benign or malignant. The robust K-Nearest Neighbor (KNN) classifier is used in the classification stage. Microcalcification images from the Mammographic Image Analysis Society (MIAS) database are used to evaluate the proposed system. Experimental results show that the classification rate of the proposed shearlet-based system is satisfactory.
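As a rough illustration of this pipeline, the sketch below computes per-sub-band energies and feeds them to a KNN classifier. The `shearlet_decompose` helper is a hypothetical stand-in for a real DST implementation, and the training arrays are assumed to come from the MIAS images.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def subband_energies(subbands):
    # Energy of a sub-band = mean of the squared coefficients.
    return np.array([np.mean(np.square(b)) for b in subbands])

def extract_features(images, shearlet_decompose, scales=3):
    # shearlet_decompose(image, scales) is assumed to return a list of
    # sub-band coefficient arrays (one per scale/direction).
    return np.stack([subband_energies(shearlet_decompose(img, scales))
                     for img in images])

# X_train, y_train: energy vectors and labels (0 = benign, 1 = malignant)
# clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
# predictions = clf.predict(X_test)
```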
In this paper, a fully automated lesion detection method based on color histograms is proposed to help the expert make a more robust diagnosis. In the proposed method, a preprocessing step is applied to remove irrelevant information and specular reflection from the colposcopy cervical images before feature extraction. Then the color histogram features are extracted and used as input to KNN, Naive Bayes, and SVM classifiers to detect the Aceto White Region (AWR). Segmentation results are evaluated on 240 colposcopy images, and the accuracy of the proposed system is over 85%.
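A minimal sketch of the feature stage, assuming preprocessed RGB images and binary AWR labels; the bin count and classifier settings are illustrative, not the paper's exact configuration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

def color_histogram(image, bins=16):
    # image: H x W x 3 uint8 RGB array; concatenate per-channel histograms.
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 256),
                          density=True)[0] for c in range(3)]
    return np.concatenate(feats)

# X = np.stack([color_histogram(img) for img in images])
# y = labels (1 = acetowhite region present, 0 = absent)
# for clf in (KNeighborsClassifier(5), GaussianNB(), SVC(kernel="rbf")):
#     clf.fit(X_train, y_train); print(clf.score(X_test, y_test))
```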
Orthogonal Frequency Division Multiplexing (OFDM) has recently been applied in wireless communication systems due to its high data rate transmission capability with high bandwidth efficiency and its robustness to multi-path delay. UWB (Ultra Wide Band) OFDM communication was proposed for the physical layer in the IEEE 802.15.3a standard, which covers wideband communication in wireless personal area networks. Channel estimation is an important issue for wireless communication systems. Dynamic channel estimation is necessary before demodulation of UWB OFDM signals, since the radio channel is time varying and frequency selective for wideband systems. A channel estimation scheme using a Takagi-Sugeno (T-S) fuzzy-based neural network under time-varying velocity of the mobile station in a UWB OFDM system is proposed in this paper. In our proposal, by utilizing the learning capability of the adaptive neuro-fuzzy inference system (ANFIS), the ANFIS is trained with correct channel state information, and the trained network is then used as a channel estimator. To validate the performance of our proposed method, several simulation results are given and compared with the Least Squares (LS) method. The estimated coefficients have been tracked using the T-S fuzzy-based neural network, and it is found to give more accurate prediction of channel coefficients than the Least Squares (LS) channel estimation technique.
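The LS baseline that the paper compares against has a simple closed form: on each pilot subcarrier, the channel estimate is the received symbol divided by the known transmitted pilot. A toy sketch (synthetic data, not the paper's UWB simulation):

```python
import numpy as np

def ls_channel_estimate(y_pilot, x_pilot):
    # Least-squares estimate per pilot subcarrier: H_k = Y_k / X_k.
    return y_pilot / x_pilot

# Known QPSK pilots distorted by a random frequency-selective channel.
rng = np.random.default_rng(0)
n = 8
x = np.exp(1j * np.pi * rng.integers(0, 4, n) / 2)   # QPSK pilot symbols
h = rng.normal(size=n) + 1j * rng.normal(size=n)     # true channel (freq. domain)
y = h * x + 0.01 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(np.abs(ls_channel_estimate(y, x) - h).max())   # small estimation error
```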
In this paper, an information system for forecasting processes based on unsupervised and supervised neural networks is developed. The unsupervised learning process is performed using Kohonen's Neural Networks (KNN) for clustering the input space into a number of affinity classes. For each class, the supervised learning process is performed using Feedforward Neural Networks (FNN). A historical database containing the data of the attributes of the forecasting process, covering two years, is formed. The unsupervised process performs the role of front-end data compression. The ANN is used to learn the relationship among past, current, and future daily load and weather patterns obtained from an energy distribution company in Libya. All input pattern information is stored in distributed form among the various connection weights. A comparison of the values forecasted by the ANN, with and without KNN, against actual values is made to demonstrate that the forecasting accuracy with KNN is very encouraging. With the proposed information system, the computation time for neural network learning with KNN can be largely reduced.
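A compact sketch of the two-stage idea: cluster the input space first, then train one supervised network per cluster. Here scikit-learn's KMeans stands in for the Kohonen map, an MLP stands in for the FNN, and the data is synthetic rather than the Libyan load data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

# X: daily load/weather feature vectors, y: next-day load (toy data here).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=200)

# Front-end clustering of the input space (stand-in for the Kohonen map).
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# One supervised feed-forward network per cluster.
nets = {}
for c in range(4):
    mask = km.labels_ == c
    nets[c] = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                           random_state=0).fit(X[mask], y[mask])

x_new = rng.normal(size=(1, 6))
print(nets[km.predict(x_new)[0]].predict(x_new))  # forecast from matching net
```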
Most users process short tasks using MapReduce. In other words, most tasks handled by the Map and Reduce functions require low response time. Currently, few users use MapReduce for 2D to 3D image processing, which is highly complicated and requires long execution time. However, in our opinion, MapReduce is exactly suitable for processing applications of high complexity and high computation. This paper implements MapReduce on an integrated 2D to 3D multi-user scheme, in which Map is responsible for image processing procedures of high complexity and high computation, and Reduce is responsible for integrating the intermediate data processed by Map for the final output. Different from short tasks, when several users compete simultaneously to acquire data from MapReduce for 2D to 3D applications, data waiting to be processed by Map will be delayed by the current user, and Reduce has to wait until the completion of all Map tasks to generate the final result. Therefore, a novel scheduling scheme, the Dynamic Switch of Reduce Function (DSRF) algorithm, is proposed in this paper for MapReduce to switch dynamically to the next task according to the achieved percentage of tasks and reduce the idle time of Reduce. Using Hadoop to implement our MapReduce platform, we compare the performance of traditional Hadoop with our proposed scheme. The experimental results reveal that our proposed scheduling scheme efficiently enhances MapReduce performance in running 2D to 3D applications.
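A hypothetical sketch of the DSRF idea (not Hadoop's actual API): the reducer polls the map-completion percentage of each job and switches to whichever job has enough finished map output to consume, instead of idling. The job structure and threshold below are illustrative.

```python
def pick_job(jobs, threshold=0.8):
    # jobs: {job name: fraction of its map tasks completed}.
    # Reduce switches to the most-complete job above the threshold.
    ready = {j: p for j, p in jobs.items() if p >= threshold}
    return max(ready, key=ready.get) if ready else None

jobs = {"render_A": 0.95, "render_B": 0.40, "render_C": 0.85}
print(pick_job(jobs))  # -> "render_A": reduce works here rather than waiting on B
```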
A large amount of information is available on the internet in the form of text, images, audio, and video. Retrieving this information accurately is very essential. Different techniques are used for retrieving images; automatic image annotation techniques are the most widely used for image retrieval. Content-based image retrieval, the problem of searching large image repositories according to their content, has been the subject of a significant amount of computer vision research in the recent past. While early retrieval architectures were based on the query-by-example paradigm, which formulates image retrieval as the search for the best database match to a user-provided query image, it was quickly realized that the design of fully functional retrieval systems would require support for semantic queries. These are systems where the databases of images are annotated with semantic keywords, enabling the user to specify the query through a natural language description of the visual concepts of interest. This realization, combined with the cost of manual image labeling, generated significant interest in the problem of automatically extracting semantic descriptors from images.
Information is presented, transferred, and shared using natural language, even by naive users. The biggest challenge and research area has been to enable machines to understand and decipher what is communicated to them through natural language. Free-form text documents are found aplenty in this information era. In spite of various other mechanisms, people have always preferred expressing their ideas in an unstructured rather than a structured manner. Knowledge acquisition is also necessarily from text corpora, which may contain scientific concepts, historical happenings, literature reviews, product reviews, tourist reports, environmental impact reports, etc. Event reporting in case of disasters or special occasions is also generally done using free-form text rather than structured methods, since it allows more detailed descriptions to be added. Many of these aforesaid text documents acting as information sources, and the queries posed by users, implicitly have a geographic or spatial reference component present in them. This logically leads to the conclusion, supported by previous studies, that more than 80% of searches pertain to geographic locations. Text documents imply the usage of natural language and as such yield explicit vague fuzzy descriptions involving linguistic terms such as near to, far from, to the east of, and very close, as well as implicit vague spatial references. Fuzzy logic is an extension of Boolean crisp logic that accommodates the fuzziness of an element belonging to a set. This paper studies the feasibility of fuzzy logic techniques in resolving spatial uncertainty in text.
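To make the fuzzy treatment of a term like "near to" concrete, here is a minimal membership-function sketch; the 1 km/10 km breakpoints are invented for illustration, not taken from the paper.

```python
import numpy as np

def near_membership(distance_km, full=1.0, zero=10.0):
    # Degree to which a distance counts as "near": fully near within 1 km,
    # linearly decaying to 0 at 10 km (a simple trapezoidal-style shoulder).
    return float(np.clip((zero - distance_km) / (zero - full), 0.0, 1.0))

for d in (0.5, 3.0, 8.0, 12.0):
    print(d, "km ->", near_membership(d))
```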
The visualization of datasets with two or more dimensions plays a vital role in medical image analysis and research. It is normally based on the orientation, region, or progression (time) of structures in clinical and non-clinical datasets. Actual quantification of the data is typically required to evaluate regions-of-interest (ROIs) or volumes-of-interest (VOIs), which may span either one slice or multiple slices throughout the dataset. The look-up table (LUT) in an image dataset is used to highlight structures of interest, control the magnification level, and adjust the transfer function. This paper focuses on image datasets of the brain and explains the functional brain metabolism of healthy subjects and of Alzheimer's disease in terms of clinical variables.
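A small sketch of the LUT mechanism: a 256-entry table remaps slice intensities so that a chosen intensity window is stretched across the display range, highlighting the structures of interest. The window bounds are illustrative.

```python
import numpy as np

def apply_lut(image, lut):
    # image: 2-D uint8 slice; lut: 256-entry table mapping intensity -> display.
    return lut[image]

# Window/level-style LUT emphasizing intensities in [100, 180].
lo, hi = 100, 180
lut = np.clip((np.arange(256) - lo) * 255 // max(hi - lo, 1), 0, 255).astype(np.uint8)

slice_ = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
print(apply_lut(slice_, lut).max())  # remapped slice uses the full display range
```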
Construction of a 3-D face model from a 2-D face image is fundamentally important for face recognition and animation because the 3-D face model is invariant to changes of viewpoint, illumination, background clutter, and occlusions. Given a coupled training set that contains pairs of 2-D faces and the corresponding 3-D faces, we train a novel neural network (feed-forward back-propagation) to recover the 3-D face model from a single 2-D face image. Since a particular face can be reconstructed from its nearest neighbors, we can assume that the linear combination coefficients for a particular 2-D face image reconstruction are identical to those for the corresponding 3-D face model reconstruction. Therefore, we can reconstruct a 3-D face model from a 2-D face image using the neural network.
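The shared-coefficients assumption can be sketched directly: solve for the 2-D combination weights by least squares, then reuse them on the paired 3-D models. This illustrates the assumption itself, not the paper's feed-forward back-propagation network; all data below is synthetic.

```python
import numpy as np

def reconstruct_3d(face_2d, train_2d, train_3d):
    # Solve face_2d ~= train_2d.T @ w by least squares, then reuse the same
    # combination weights on the paired 3-D models (the stated assumption).
    w, *_ = np.linalg.lstsq(train_2d.T, face_2d, rcond=None)
    return train_3d.T @ w

rng = np.random.default_rng(0)
train_2d = rng.normal(size=(10, 50))    # 10 training 2-D faces, 50-dim features
train_3d = rng.normal(size=(10, 120))   # paired 3-D models, 120-dim shape vectors
w_true = rng.normal(size=10)
face_2d = train_2d.T @ w_true           # a 2-D face that is a known combination
print(np.allclose(reconstruct_3d(face_2d, train_2d, train_3d), train_3d.T @ w_true))
```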
Software as a Service (SaaS) provides access to applications to end users over the Internet without upfront investment in infrastructure and software. To serve their customers, SaaS providers utilize resources of internal data centers or rent resources from a public Infrastructure as a Service (IaaS) provider. In-house hosting can increase administration and maintenance costs, whereas renting from an IaaS provider can impact the service quality due to its variable performance. To overcome these limitations, innovative admission control and scheduling algorithms are required for SaaS providers to effectively utilize public Cloud resources to maximize profit by minimizing cost and improving customer satisfaction. Furthermore, we conduct an extensive evaluation study to analyze which solution suits best in which scenario to maximize the SaaS provider's profit.
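A hypothetical admission-control sketch in the spirit of the problem: accept a request only if some VM type can meet its deadline at positive profit. The VM catalogue and request fields are invented for illustration, not the paper's model.

```python
# (price $/hr, relative speed) per VM type -- illustrative values only.
VM_TYPES = {"small": (0.05, 1.0), "large": (0.20, 4.0)}

def admit(revenue, work_hours, deadline_hours):
    # Return the most profitable feasible VM type, or None to reject.
    best = None
    for name, (price, speed) in VM_TYPES.items():
        runtime = work_hours / speed
        profit = revenue - runtime * price
        if runtime <= deadline_hours and profit > 0:
            if best is None or profit > best[1]:
                best = (name, profit)
    return best

print(admit(revenue=1.0, work_hours=8, deadline_hours=3))  # -> ('large', 0.6)
```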
A computer, as the name suggests, computes complex mathematical operations and performs the desired operations. DNA computing is a new method of simulating bio-molecular DNA and computing by means of molecular technical computation. It is a new method of liquid computation that harnesses the enormous parallel computing ability and high memory density of bio-molecules, which brings potential challenges and opportunities to traditional cryptography. DNA cryptography is a new field of research in recent years which encodes the data to be transmitted in a more secure fashion by using a DNA cryptographic algorithm [1] over the underlying International Data Encryption Algorithm (IDEA) [9]. The cipher (unreadable) text produced will be in DNA sequences, making it more reliable for secure transmission of data, thus providing effective transaction of confidential data for military purposes, in cloud computing, and in many other applications, though it requires high-tech lab constraints and implementation. Index Terms: cipher, DNA cryptography, encryption, IDEA.
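A minimal sketch of the DNA-encoding layer on top of the IDEA ciphertext: map each 2-bit pair to a nucleotide. The A/C/G/T mapping shown is one common illustrative choice, not necessarily the paper's table, and the ciphertext bytes are a stand-in.

```python
BASES = {"00": "A", "01": "C", "10": "G", "11": "T"}
INV = {v: k for k, v in BASES.items()}

def to_dna(data: bytes) -> str:
    # Encode each byte as 8 bits, then each 2-bit pair as one base.
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(BASES[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(seq: str) -> bytes:
    bits = "".join(INV[c] for c in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

ct = b"\x9a\x3f"                  # stand-in for IDEA ciphertext bytes
print(to_dna(ct))                 # -> "GCGGATTT"
assert from_dna(to_dna(ct)) == ct # encoding is lossless
```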
In gene expression datasets, classification is a task involving high dimensionality and risk, since a large number of features are irrelevant and redundant. Classification requires a feature selection method and a classifier; hence this paper proposes a method of choosing a suitable combination of attribute selection and classification algorithms for good accuracy, in addition to computational efficiency, generalization performance, and feature interpretability. In this paper, a comparative study is done with some well-known feature selection methods such as FCBF, ReliefF,
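An illustrative sketch of pairing attribute-selection methods with classifiers and comparing accuracy. SelectKBest/chi2 stands in for FCBF/ReliefF (which are not in scikit-learn), and sklearn's built-in breast cancer data stands in for a gene expression dataset.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
for clf in (GaussianNB(), SVC(kernel="linear")):
    # Each pipeline = one (attribute selection, classifier) combination.
    pipe = make_pipeline(SelectKBest(chi2, k=10), clf)
    score = cross_val_score(pipe, X, y, cv=5).mean()
    print(type(clf).__name__, round(score, 3))
```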
The lifetime of Wireless Sensor Networks (WSNs) is restricted by the energy level of the nodes. In a traditional WSN, if a sensor node dies, the base station is not able to detect the death of the node. Even if it predicts it, it is not able to get the sensed information before the node dies. A node can die because of animal disturbance, natural disaster, or battery power consumption. The base station is then in a critical situation because it loses node information such as the identity of the node, the reason for its death, and the sensed information. In our proposed method, the WSN nodes are arranged in a cellular structure to optimize the coverage area; reliability in getting information from the nodes is improved, and loss of information is minimized.
Mobile Adhoc Networks (MANETs) are an emerging area of research in the communication network world. A MANET is a group of wireless mobile nodes dynamically establishing a short-lived network without any use of network infrastructure or centralized administration. In addition to their high degree of mobility, MANET nodes are distinguished by their restricted resources such as power, bandwidth, processing, and memory. Routing in MANETs is a serious issue because the topology changes due to node mobility. Routing algorithms use specific metrics to work out the optimal path between senders and receivers, such as shortest minimum cost, minimum total transmission power, and min-max power cost. Several routing protocols have been proposed in the past few years. Energy-efficient routing in particular is most important because all the nodes are battery-powered. Failure of one node may affect the entire network: if a node runs out of energy, the likelihood of network partitioning increases. Since each mobile node has a restricted power supply, energy depletion has become one of the main threats to the lifetime of a mobile ad hoc network. Therefore, routing in MANETs should use the remaining battery power in an efficient way to increase the lifetime of the network. In this paper we propose Edge Node Calculations (ENC) for the Optimized Link State Routing protocol (OLSR) using potential calculations in MANETs. This concept provides better performance in finding a path from source to destination and also improves throughput and reduces end-to-end delay.
This paper presents feature selection using integer- and binary-coded Genetic Algorithms to improve the performance of an SVM classifier. Data Mining (DM) is the process of exploration and analysis, by automatic or semi-automatic means, of large quantities of data in order to discover meaningful patterns and rules. DM methods can be divided into supervised and unsupervised learning techniques; classification is a supervised learning technique. In this paper, classification algorithms such as the Support Vector Machine (SVM) and the Genetic Algorithm (GA) are used to find the classification accuracy for the Wisconsin Breast Cancer dataset. SVM is a class of supervised learning methods; radial basis function and polynomial kernel functions are used. GA is a search and optimization method; integer- and binary-coded genetic algorithms are used. Feature selection is used to improve the accuracy of the SVM classifier. The Support Vector Machine and Genetic Algorithm are compared based on classification accuracy and run time.
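A compact sketch of binary-coded GA feature selection for SVM, assuming nothing beyond the abstract: each chromosome is a feature mask, fitness is cross-validated SVM accuracy, and the loop uses truncation selection with bit-flip mutation (crossover omitted for brevity; the paper's exact operators may differ).

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # Wisconsin breast cancer data
rng = np.random.default_rng(0)
pop_size, gens = 12, 10

def fitness(mask):
    # Fitness of a feature mask = cross-validated SVM accuracy on that subset.
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, (pop_size, X.shape[1])).astype(bool)
for _ in range(gens):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]        # keep the best half
    kids = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
    flips = rng.random(kids.shape) < 0.05                     # bit-flip mutation
    kids[flips] = ~kids[flips]
    pop = np.vstack([parents, kids])

best = pop[int(np.argmax([fitness(m) for m in pop]))]
print(f"{best.sum()} of {X.shape[1]} features selected")
```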
Attackers are usually busy launching malicious threats to damage compromised hosts. Botnets are a technology newly developed by attackers, whose duty is to increase the traffic in the DNS service to launch attacks. Due to the increased DNS traffic, botmasters create a new channel between server and client; it has the capability to command and control the operating system and automatically generate more queries over DNS to increase the traffic. Many botnet operators use HTTP servers to pass information. In this paper, we propose a viable approach called Wide Packet Inspection to analyze DNS traffic in order to control and avoid botnets. This paper provides a countermeasure against botnet operators to slow down bot activity.
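A hypothetical detection heuristic in the spirit of the abstract: flag hosts whose DNS query rate far exceeds the population median, a crude signal of bot-driven traffic. The threshold and log format are invented for illustration; the paper's actual inspection method is not specified here.

```python
from collections import Counter
from statistics import median

# (source host, queried domain) pairs from a DNS traffic capture.
queries = [("10.0.0.5", "evil.example"), ("10.0.0.5", "x1.example"),
           ("10.0.0.5", "x2.example"), ("10.0.0.7", "mail.example"),
           ("10.0.0.9", "web.example")]

per_host = Counter(src for src, _ in queries)
baseline = median(per_host.values())
suspects = [h for h, n in per_host.items() if n > 2 * baseline]
print(suspects)  # -> ['10.0.0.5']
```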
Breast cancer is the leading cause of cancer-related casualties among women all over the world. The mortality rate can be reduced significantly by detecting the disease at a premature stage. Among various screening programs, X-ray mammography is an effective screening tool for breast cancer detection. The detection process for the presence of microcalcifications in a mammogram is cumbersome and time-consuming for radiologists. The suspicion of breast cancer arises from the presence of microcalcification clusters. However, radiologists are prone to making false judgments of whether a suspected region is malignant or benign. We have developed a novel algorithm to assist radiologists in the diagnosis of microcalcification clusters. In this work we have investigated the performance of a Computer Aided Diagnosis (CAD) system for detecting clustered microcalcifications and classifying them as benign or malignant. DWT (Dyadic Wavelet Transformation) features are extracted from preprocessed images and passed through a support vector machine (SVM) classifier. The proposed methodology is carried out on mammogram images downloaded from the DDSM database. The results exhibit good classification accuracy.
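A minimal feature-extraction sketch using PyWavelets' standard 2-D discrete wavelet transform as a stand-in for the paper's dyadic transform; wavelet choice, decomposition level, and the energy statistic are assumptions.

```python
import numpy as np
import pywt  # PyWavelets
from sklearn.svm import SVC

def dwt_energy_features(image, wavelet="db2", level=2):
    # 2-D wavelet decomposition; one energy value per approx/detail band.
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    bands = [coeffs[0]] + [b for triple in coeffs[1:] for b in triple]
    return np.array([np.mean(np.square(b)) for b in bands])

# rois: preprocessed mammogram patches from DDSM (assumed available)
# X = np.stack([dwt_energy_features(roi) for roi in rois])
# clf = SVC(kernel="rbf").fit(X_train, y_train)   # 0 = benign, 1 = malignant
```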
Wireless Sensor Networks (WSNs) are often deployed in hostile environments where an adversary can physically capture some of the nodes, reprogram them as a first step, and then replicate them in a large number of clones, easily taking control over the network. A few distributed solutions have recently been proposed to address this fundamental problem. However, these solutions are not satisfactory. First, they are energy and memory demanding: a serious drawback for any protocol to be used in the resource-constrained WSN environment. Further, they are vulnerable to the specific adversary models introduced in this paper. The contributions of this work are threefold. First, we analyze the desirable properties of a distributed mechanism for the detection of node replication attacks. Second, we show that the known solutions for this problem do not completely meet our requirements. Third, we propose a new self-healing, Randomized, Efficient, and Distributed (RED) protocol and an ant colony algorithm for the detection of node replication attacks, and we show that it satisfies the introduced requirements. Finally, extensive simulations show that our protocol is highly efficient in communication, memory, and computation; is much more effective than competing solutions in the literature; and is resistant to the new kind of attacks introduced in this paper, while other solutions are not.
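A minimal sketch of the randomized-witness idea behind RED-style detection: each round, a fresh network-wide seed pseudo-randomly maps every node ID to witness nodes, so the adversary cannot predict the witnesses in advance; two conflicting location claims for the same ID reaching a witness reveal a clone. The parameters and message format below are illustrative, not the protocol's exact specification.

```python
import hashlib

def witnesses(node_id: str, round_seed: str, n_nodes: int, g: int = 2):
    # Derive g pseudo-random witness indices from (seed, node ID).
    out = []
    for i in range(g):
        h = hashlib.sha256(f"{round_seed}:{node_id}:{i}".encode()).digest()
        out.append(int.from_bytes(h[:4], "big") % n_nodes)
    return out

claims = {}  # (witness, node_id) -> claimed location
for node_id, loc in [("n17", (3, 4)), ("n17", (9, 1))]:  # a cloned ID, two places
    for w in witnesses(node_id, round_seed="round-42", n_nodes=100):
        key = (w, node_id)
        if key in claims and claims[key] != loc:
            print(f"witness {w}: clone of {node_id} detected")
        claims.setdefault(key, loc)
```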
Measuring Vocabulary Consistency aims at evaluating a paper, before it is published, for consistency of vocabulary. This is an application that helps authors document the papers they have worked on. The application works at the level of a pre-generated corpus that contains the words of a particular domain. The input paper is traversed and the words related to the corpus domain are collected into a "paper dictionary". The corpus holds each word with its corresponding synonyms graded as high, medium, and low. The words from the paper dictionary are graded and compared with the corpus, which provides appropriate words that fit the corresponding grade of vocabulary. Thus the final output is the original document along with suggested equivalent-grade vocabulary for maintaining the consistency of the paper.
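An illustrative sketch of the grading lookup: a corpus maps each domain word to graded synonyms, and words found in the paper are replaced with the alternative at the target grade. The corpus entries below are invented examples, not the application's actual data.

```python
# Pre-generated corpus: word -> synonym at each grade.
CORPUS = {
    "big":  {"high": "substantial", "medium": "large", "low": "big"},
    "show": {"high": "demonstrate", "medium": "illustrate", "low": "show"},
}

def suggest(paper_words, target_grade="high"):
    # Suggest a replacement for each corpus word not already at the target grade.
    return {w: CORPUS[w][target_grade]
            for w in paper_words if w in CORPUS and CORPUS[w][target_grade] != w}

print(suggest(["big", "show", "results"]))
# -> {'big': 'substantial', 'show': 'demonstrate'}
```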