Jyoti Nandimath - Academia.edu

Uploads

Papers by Jyoti Nandimath

A Survey Paper on Scalable & Routing Efficient Methods for Source Location Privacy in WSNs

International Journal of Engineering Research and Technology, Jan 3, 2014

The use of wireless sensor networks (WSNs) in many real-time applications is growing significantly due to their wide range of benefits to end users. The major issue with WSNs is security. Researchers have already presented various methods for WSN security, especially for privacy preservation. The literature shows that privacy-preserving security methods for WSNs influence performance parameters such as latency, energy efficiency, communication cost, and throughput. WSNs are resource constrained, meaning sensor nodes have limited resources. Most existing methods use a public key infrastructure (PKI) for security, but these methods consume more sensor-node power and are not scalable. To overcome these two limitations, a new privacy-preservation method was recently introduced. This method proposed criteria for quantitative source location privacy (SLP) metrics for routing-oriented methods in WSNs. Using this method, SLP is achieved with the goal of efficient energy utilization via two-phase routing: routing to a randomly selected intermediate node (RSIN) followed by a network mixing ring (NMR). However, this method is not as scalable as most real-life applications require, and it was not evaluated for other performance metrics such as throughput, packet delivery ratio, and end-to-end delay, which are vital for any routing scheme in a WSN. Therefore, in this paper we present an improved method that aims to achieve network scalability and efficient routing performance while maintaining source location privacy.
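The first phase of the two-phase routing described above can be sketched as picking a random intermediate node far enough from the source that an adversary backtracking packets cannot easily locate it. This is a minimal illustration, not the paper's protocol; the node coordinates and the minimum-distance parameter `d_min` are hypothetical.

```python
import math
import random

def pick_intermediate(source, candidates, d_min):
    """RSIN-style phase 1 (sketch): choose a random intermediate node
    at least d_min away from the source to obscure its location."""
    far_enough = [n for n in candidates if math.dist(source, n) >= d_min]
    if not far_enough:
        raise ValueError("no candidate satisfies the distance constraint")
    return random.choice(far_enough)

# Phase 1: source -> random intermediate. Phase 2 (not shown) would
# forward from the intermediate node through the network mixing ring.
source = (0.0, 0.0)
nodes = [(1.0, 1.0), (5.0, 5.0), (8.0, 2.0), (0.5, 0.5)]
hop = pick_intermediate(source, nodes, d_min=3.0)
print(hop in [(5.0, 5.0), (8.0, 2.0)])  # True: only far nodes qualify
```

Randomizing the intermediate hop is what decouples the observed traffic pattern from the true source position.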

Survey Paper on Fault Detection in Sensor Data

International Journal of Science and Research (IJSR), 2015

E-voting System Using Block-Chain

International Journal for Research in Applied Science and Engineering Technology

Voting in a democratic manner is extremely important in all countries. However, conventional voting techniques such as ballot papers and electronic voting machines suffer from problems with transparency, low voter turnout, and security. The adoption of digital voting systems depends on security, which guarantees defense against data breaches and cyberattacks, particularly during crucial decision-making processes. Blockchain technology provides a viable solution, delivering unmatched security and transparency. Thanks to its decentralized structure and encryption protocols, voter trust can be increased and votes can be protected from tampering. By utilizing blockchain technology, countries can bring their voting procedures up to date, promoting openness and integrity in democratic elections while resolving long-standing security issues.

Content-Based Image Retrieval Using Features Extracted From Halftoning-Based Block Truncation Coding

IEEE Transactions on Image Processing, 2015

This paper presents a technique for Content-Based Image Retrieval (CBIR) that exploits the low complexity of Ordered-Dither Block Truncation Coding (ODBTC) to generate an image content descriptor. In the encoding step, ODBTC compresses an image block into corresponding quantizers and a bitmap image. Two image features are proposed to index an image, namely the Color Co-occurrence Feature (CCF) and the Bit Pattern Feature (BPF), which are generated directly from the ODBTC encoded data streams without performing the decoding process. The CCF and BPF of an image are derived from the two ODBTC quantizers and the bitmap, respectively, with the help of a visual codebook. Experimental results show that the proposed method is superior to Block Truncation Coding (BTC) image retrieval systems and other former methods, proving that the ODBTC scheme is not only suited for image compression because of its simplicity, but also offers a simple and effective descriptor for indexing images in a CBIR system.
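The quantizers and bitmap mentioned above come from block truncation coding. The following sketch shows plain BTC on one grayscale block (ODBTC additionally thresholds against an ordered dither array, which is omitted here); the 2×2 block values are made up for illustration.

```python
import numpy as np

def btc_encode(block):
    """Plain BTC (sketch): split a block at its mean into a bitmap plus
    a low and a high quantizer. The CCF/BPF descriptors in the paper are
    built from exactly these two quantizers and this bitmap."""
    mean = block.mean()
    bitmap = block >= mean                                    # 1 = above mean
    high = block[bitmap].mean() if bitmap.any() else mean     # high quantizer
    low = block[~bitmap].mean() if (~bitmap).any() else mean  # low quantizer
    return low, high, bitmap

block = np.array([[10, 200], [20, 220]], dtype=float)
low, high, bitmap = btc_encode(block)
print(low, high)                     # 15.0 210.0
print(bitmap.astype(int).tolist())   # [[0, 1], [0, 1]]
```

Because the descriptor is computed from the encoded stream directly, no decoding pass is needed at retrieval time, which is the source of the method's low complexity.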

Efficiently Detecting and Analyzing Spam Reviews Using Live Data Feed

In recent years, online reviews have become the most important source of customer opinion. Existing research has focused on the extraction, classification, and summarization of opinions from reviews on websites, forums, and blogs. Nowadays, consumers can obtain information about products and services from online review resources, which can help them make decisions. The social tools provided by content-sharing applications allow online users to interact, express their opinions, and read the opinions of other users. However, spammers post comments written intentionally to mislead users by redirecting them to websites, inflating ratings, and promoting products that are less known on the market. Reading spam comments is a bad experience and a waste of time for most online users, and it can also harm the reader. Several researchers in this field have focused only on fake comments. Our goal, however, is to detect fake comments which are likely to represe...

A Network Intrusion Detection System Framework based on Hadoop and GPGPU

In the IT industry, business data grows exponentially, raising the need to enhance security by implementing an effective Network Intrusion Detection System (NIDS). A quick response to detected intrusions is an essential feature of any NIDS, but the huge amount of data obtained from organizations impacts NIDS performance. The causes range widely: network speed, the amount of data coming from servers, and algorithms that directly or indirectly affect performance. This paper deals with design considerations for an NIDS framework based on a Hadoop and GPGPU (General-Purpose Graphics Processing Unit) approach. The proposed system handles network traffic through the Hadoop framework, while the intrusion detection functionality is carried out by the GPGPU. The proposed approach improves NIDS performance and enables a quick response to various types of network attacks. We have configured our proposed system with Hadoop Data-...

An Ontology-based Comprehensive D-matrix Construction for Accurate FDD

There is a need to gather information regarding observable symptoms and failure modes in order to update the fault dependency matrix, which helps build accurate and efficient fault diagnosis. The dependency matrix (D-matrix) is an organized diagnostic model that captures graded system-level fault diagnostic information. Organizing this information is typically based on previously known knowledge and research. It is not enough to collect information such as symptoms and related diagnostic mechanisms only once, since technology improves day by day, and regularly updating the D-matrix for best results is a challenging task. To represent unstructured knowledge, ontology-based data mining can help by grouping unstructured data according to the similarities and differences between items. An ontology is constructed to describe the commonly observed correlations in fault diagnostic domains. Using different text mining algorithms, necessary artifacts like symptoms, fail...

Classification of High Dimensional Class Imbalanced Data Using Data mining Techniques

As data dimensionality increases, classification becomes harder. In industries and organizations, fault detection is an important task, and class imbalance causes problems for the classification process. Standard classification algorithms prioritize majority classes over minority classes, so fault detection effectively covers only the majority classes and largely misses the minority classes. Incremental clustering algorithms solve this problem, but they reduce the data attributes. To improve accuracy, time, and memory usage, we propose a feature selection algorithm for better classification and fault detection performance.
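The imbalance problem described above is commonly illustrated with a rebalancing baseline before any classifier is trained. The sketch below uses naive random oversampling of the minority ("fault") class; it is an illustrative baseline only, not the feature selection algorithm the paper proposes, and the sample data are made up.

```python
import random
from collections import Counter

def oversample_minority(samples, labels, seed=0):
    """Naive random oversampling (sketch): duplicate minority-class
    samples until both classes are the same size, so a standard
    classifier no longer ignores the minority class."""
    rng = random.Random(seed)
    counts = Counter(labels)
    (maj, n_maj), (mino, n_min) = counts.most_common(2)
    minority = [s for s, y in zip(samples, labels) if y == mino]
    extra = [rng.choice(minority) for _ in range(n_maj - n_min)]
    return samples + extra, labels + [mino] * (n_maj - n_min)

X = [[0], [1], [2], [3], [9]]
y = ["ok", "ok", "ok", "ok", "fault"]
X2, y2 = oversample_minority(X, y)
print(Counter(y2))   # both classes now have equal counts
```

Oversampling keeps all attributes intact, which is the property the abstract says incremental clustering sacrifices.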

Secure Distributed Accountability Framework for Data Sharing in Cloud Environment

International Journal of Computers & Technology, 2014

Cloud computing provides scalable services that are easily used over the internet as required. A major feature of cloud services is that users' data are processed remotely on unknown machines that users neither know nor operate, so while using these services users fear losing their own data, which may be financial, health-related, or personal. To resolve this problem, we use information accountability in a decentralized format to keep track of the usage of users' data in the cloud. It is an object-oriented approach that encloses our logging mechanism together with the users' data and applies access policies to that data. We use JAR programming, which provides dynamic and traveling-object functionality; any access to a user's data invokes the authentication and automated logging mechanism of the JAR files. Each access to users' cloud data gets recorded in separat...

Design consideration of Network Intrusion detection system using Hadoop and GPGPU

2015 International Conference on Pervasive Computing (ICPC), 2015

Modern computing has primarily shifted towards distributed environments built on commodity resources, which results in an increase in data and in security concerns. This paper deals with design considerations for a Network Intrusion Detection System (NIDS) based on the Hadoop framework, with its performance accelerated by a General-Purpose Graphics Processing Unit (GPGPU). The large volume of data from an entire infrastructure is assigned to the Hadoop framework, and intrusion detection is carried out on the GPGPU. This approach improves NIDS performance and enables a quick response to various attacks on the network. To perform general-purpose computation on the GPU, NVIDIA provides the Compute Unified Device Architecture (CUDA), a parallel programming model that performs high-end complex operations on the GPU. To process large volumes of data in distributed networks, the Hadoop framework has to be configured with supporting ecosystem components such as Flume, Pig, Hive, and HBase, which enable it to handle streaming data on the network and large log files on servers. The proposed system is capable of performing analytics over intrusion patterns and their behavior on the network, which helps a network administrator configure network security policies and settings. Analytics over intrusions is done using a score-weight approach called Pattern Frequency Inverse Cluster Frequency (PF-ICF). The design consideration of an accelerated NIDS addresses the performance issues that NIDSs face due to large volumes of network traffic.
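The PF-ICF name suggests a TF-IDF-style score-weight scheme over traffic clusters. The sketch below shows that analogy: a pattern's frequency within one cluster, discounted by how many clusters it appears in. This is an assumption about the scheme's shape, not the paper's exact formula, and the pattern names and clusters are invented.

```python
import math
from collections import Counter

def pf_icf(pattern, cluster, clusters):
    """TF-IDF-style sketch of a Pattern Frequency - Inverse Cluster
    Frequency score: frequent in this cluster, rare across clusters
    => high score (likely a distinctive intrusion pattern)."""
    pf = Counter(cluster)[pattern] / len(cluster)           # pattern frequency
    containing = sum(1 for c in clusters if pattern in c)   # clusters with it
    icf = math.log(len(clusters) / containing)              # inverse cluster freq.
    return pf * icf

clusters = [
    ["syn_flood", "syn_flood", "port_scan"],  # cluster of suspicious flows
    ["port_scan", "dns_query"],
    ["dns_query", "http_get"],
]
score = pf_icf("syn_flood", clusters[0], clusters)
print(round(score, 3))  # 0.732 = (2/3) * ln(3/1), rounded
```

A pattern that appears in every cluster scores zero (log of 1), so only cluster-specific behavior stands out to the administrator.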

Monitoring Information Leakage in a Web Browser

Communications in Computer and Information Science, 2013


Big data analysis using Apache Hadoop

2013 IEEE 14th International Conference on Information Reuse & Integration (IRI), 2013

The paradigm of processing huge datasets has shifted from centralized to distributed architecture. As enterprises faced issues gathering large chunks of data, they found that the data could not be processed by any existing centralized-architecture solution. Apart from time constraints, enterprises faced issues of efficiency, performance, and elevated infrastructure cost when processing data in a centralized environment. With the help of a distributed architecture, these large organizations were able to overcome the problem of extracting relevant information from a huge data dump. One of the best open-source tools on the market for harnessing a distributed architecture to solve data processing problems is Apache Hadoop. Using Apache Hadoop's components, such as data clusters, the MapReduce algorithm, and distributed processing, we resolve various location-based complex data problems and feed the relevant information back into the system, thereby improving the user experience.
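The MapReduce model mentioned above can be sketched locally: a mapper emits key/value pairs per record and a reducer aggregates per key, which is the shape a Hadoop job (or Hadoop Streaming script) takes at scale. The log format and location field below are hypothetical, chosen to echo the location-based problems the abstract mentions.

```python
from collections import defaultdict

def map_phase(records):
    """Map step (sketch): emit (location, 1) for each log record,
    the way a Hadoop mapper emits key/value pairs."""
    for rec in records:
        location = rec.split(",")[1]   # hypothetical "user,location" format
        yield location, 1

def reduce_phase(pairs):
    """Reduce step (sketch): sum counts per key, as a Hadoop reducer
    would after the framework shuffles pairs by key."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = ["u1,Pune", "u2,Mumbai", "u3,Pune"]
counts = reduce_phase(map_phase(logs))
print(counts)   # {'Pune': 2, 'Mumbai': 1}
```

In a real cluster the framework partitions the records across nodes and shuffles intermediate pairs by key, so the same two functions scale to the data volumes the abstract describes.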
