Dr Punam Bedi - Academia.edu

Papers by Dr Punam Bedi

Towards Optimal NLP Solutions: Analyzing GPT and LLaMA-2 Models Across Model Scale, Dataset Size, and Task Diversity

Engineering, Technology and Applied Science Research, Jun 1, 2024

This study carries out a comprehensive comparison of fine-tuned GPT models (GPT-2, GPT-3, GPT-3.5) and LLaMA-2 models (LLaMA-2 7B, LLaMA-2 13B, LLaMA-2 70B) in text classification, addressing dataset sizes, model scales, and task diversity. Since its inception in 2018, the GPT series has been pivotal in advancing NLP, with each iteration introducing substantial enhancements. Despite this progress, detailed analyses, especially against competitive open-source models such as the LLaMA-2 series in text classification, remain scarce. The current study fills this gap by fine-tuning these models across varied datasets, focusing on enhancing task-specific performance in hate speech and offensive language detection, fake news classification, and sentiment analysis. The learning efficacy and efficiency of the GPT and LLaMA-2 models were evaluated, providing a nuanced guide to choosing optimal models for NLP tasks based on architectural benefits and adaptation efficiency with limited data and resources. In particular, even with datasets as small as 1,000 rows per class, the F1 scores for the GPT-3.5 and LLaMA-2 models exceeded 0.9, reaching 0.99 with complete datasets. Additionally, the LLaMA-2 13B and 70B models outperformed GPT-3, demonstrating their superior efficiency and effectiveness in text classification. Both the GPT and LLaMA-2 series showed commendable performance on all three tasks, underscoring their ability to handle a diversity of tasks. Based on the size, performance, and resources required for fine-tuning, this study identifies LLaMA-2 13B as the most suitable model for these NLP tasks.
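
The fine-tuning setup compared in this study can be illustrated with a minimal sketch using the Hugging Face Transformers library. The snippet below uses GPT-2 as a small stand-in backbone (the study fine-tunes larger GPT and LLaMA-2 models); the CSV file names, two-class setup, and hyperparameters are illustrative assumptions, not the study's configuration.

```python
# Hedged sketch: fine-tuning a small causal-LM backbone (GPT-2 as a stand-in for the
# larger GPT/LLaMA-2 models used in the study) for binary text classification.
# File names, label count, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

dataset = load_dataset("csv", data_files={"train": "train.csv", "test": "test.csv"})

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("gpt2", num_labels=2)
model.config.pad_token_id = tokenizer.pad_token_id

args = TrainingArguments(
    output_dir="clf-out",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=2e-5,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=dataset["train"], eval_dataset=dataset["test"])
trainer.train()
print(trainer.evaluate())
```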

Session-aware recommender system using double deep reinforcement learning

Journal of Intelligent Information Systems

Versioning OWL Ontologies using Temporal Tags

World Academy of Science, Engineering and Technology, International Journal of Computer, Electrical, Automation, Control and Information Engineering, Mar 25, 2007

Ontologies play an important role in semantic web applications; they are often developed by different groups and continue to evolve over time. The knowledge in ontologies changes rapidly, which can leave applications outdated if they continue to use old versions, or unstable if they jump to new versions. Temporal frames using frame versioning and slot versioning are used to take care of the dynamic nature of ontologies. The paper proposes new tags and a restructured OWL format enabling applications to work with either the old or the new version of an ontology. Gene Ontology, a very dynamic ontology, has been used as a case study to explain the OWL Ontology with Temporal Tags.
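
As a rough illustration of the idea, the hedged sketch below attaches version-validity metadata to an OWL class using rdflib. The property names (versionTag, validFrom, validUntil) and namespaces are hypothetical stand-ins; the paper's actual temporal tags and restructured OWL format are not reproduced here.

```python
# Hedged sketch: attaching version-validity metadata to an OWL class with rdflib.
# The property names and vocabulary are hypothetical stand-ins, not the paper's tags.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, XSD

EX = Namespace("http://example.org/onto#")
TT = Namespace("http://example.org/temporal#")   # hypothetical temporal-tag vocabulary

g = Graph()
g.bind("ex", EX)
g.bind("tt", TT)

cls = EX.GeneProduct
g.add((cls, RDF.type, OWL.Class))
# Frame-level versioning: the class definition carries an explicit validity interval,
# so an application can select the definition matching the ontology version it expects.
g.add((cls, TT.versionTag, Literal("1.2")))
g.add((cls, TT.validFrom, Literal("2006-01-01", datatype=XSD.date)))
g.add((cls, TT.validUntil, Literal("2007-03-25", datatype=XSD.date)))

print(g.serialize(format="xml"))
```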

Trust Based Recommender System for the Semantic Web, Proc. IJCAI'07

This paper proposes the design of a recommender system that uses knowledge stored in the form of ontologies. The interactions amongst the peer agents for generating recommendations are based on the trust network that exists between them. Recommendations about a product given by peer agents are in the form of Intuitionistic Fuzzy Sets, specified using degrees of membership, non-membership, and uncertainty. In the literature, recommender systems typically use databases to generate recommendations. The presented design uses ontologies, a knowledge representation technique for creating annotated content for the Semantic Web. Seeing the potential and popularity of ontologies among researchers, we believe that ontologies will be built and maintained in numerous knowledge domains for the Semantic Web and future applications. In addition to the benefits of ontologies, the presented recommender system uses temporal ontologies that absorb the effect of changes in the ontologies due to the dynamic nature of domains. A case study of a tourism recommender system is chosen to generate recommendations for the selection of destination, travel agents, and flight schedule. A comparison of the generated recommendations with manual recommendations by peers establishes the validity of the presented recommender system.
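
A minimal sketch of the intuitionistic-fuzzy representation is given below: each peer recommendation carries degrees of membership and non-membership (with hesitation as the remainder), and the recommendations are combined using a trust-weighted average. The weighted-average aggregation and the toy scores are assumptions for illustration; the paper's exact aggregation operator is not specified in the abstract.

```python
# Hedged sketch: peer recommendations as intuitionistic fuzzy values
# (membership mu, non-membership nu, hesitation pi = 1 - mu - nu), combined with a
# trust-weighted average. The aggregator and scores are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class IFSRecommendation:
    mu: float   # degree of membership (how strongly the peer recommends the item)
    nu: float   # degree of non-membership

    @property
    def pi(self) -> float:
        return 1.0 - self.mu - self.nu  # hesitation / uncertainty

def aggregate(recs: list[IFSRecommendation], trust: list[float]) -> IFSRecommendation:
    total = sum(trust)
    mu = sum(t * r.mu for t, r in zip(trust, recs)) / total
    nu = sum(t * r.nu for t, r in zip(trust, recs)) / total
    return IFSRecommendation(mu, nu)

# Two peers rate a destination; the second peer is trusted more.
peers = [IFSRecommendation(0.7, 0.2), IFSRecommendation(0.5, 0.4)]
combined = aggregate(peers, trust=[0.4, 0.8])
print(combined, "hesitation:", round(combined.pi, 3))
```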

CSE-IDS: Using cost-sensitive deep learning and ensemble algorithms to handle class imbalance in network-based intrusion detection systems

Computers & Security, 2022

Encrypted Traffic Classification Using eXtreme Gradient Boosting Algorithm

Advances in Intelligent Systems and Computing, Aug 29, 2021

Dark Web

IGI Global eBooks, 2020

The World Wide Web is a part of the Internet that provides a data dissemination facility to people. The contents of the Web are crawled and indexed by search engines so that they can be retrieved, ranked, and displayed as a result of users' search queries. These contents, which can be easily retrieved using Web browsers and search engines, comprise the Surface Web. All information that cannot be crawled by search engines' crawlers falls under the Deep Web. Deep Web content never appears in the results displayed by search engines. Though this part of the Web remains hidden, it can be reached using targeted search over normal Web browsers. Unlike the Deep Web, there exists a portion of the World Wide Web that cannot be accessed without special software. This is known as the Dark Web. This chapter describes how the Dark Web differs from the Deep Web and elaborates on the commonly used software to enter the Dark Web. It highlights the illegitimate and legitimate sides of the Dark Web and specifies the role played by cryptocurrencies in the expansion of the Dark Web's user base.

Effect of Activation Functions on the Performance of Deep Learning Algorithms for Network Intrusion Detection Systems

Lecture Notes in Electrical Engineering, Sep 24, 2019

The increased capability and complexity of present-day networks is a product of advancements in technology that have strengthened inter-human connectivity like never before. But technological advancements empower both the developer and the attacker, and as a result the severity of network-based attacks has escalated immensely. The need of the hour is to develop sophisticated intrusion detection systems equipped with state-of-the-art technologies such as deep learning. Several deep learning architectures for anomaly-based network intrusion detection systems have been proposed in the literature, and different authors have worked with different activation functions using the same algorithm and obtained different results. Because of this, works based on the same algorithm cannot be compared directly. Moreover, the use of traditional intrusion detection datasets (DARPA, KDD98, KDD99) does not provide an accurate measure of the effectiveness of deep learning algorithms for intrusion detection, because these datasets lack many modern-day attacks and characteristics of real-time traffic. To fill these research gaps, this paper analyzes the effect of activation functions on the performance of two deep learning algorithms, a Deep Artificial Neural Network (DNN) and a Convolutional Neural Network (CNN), on two recent intrusion detection datasets: NSL-KDD and UNSW-NB15. The paper attempts to select the best activation function to tune DNN and CNN models to attain maximum accuracy in minimum time for network intrusion detection systems.
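
A minimal sketch of this kind of comparison is shown below: the same small fully connected network is trained with several candidate activation functions and the validation accuracy is reported for each. The layer sizes, activation list, and the random stand-in data (in place of the NSL-KDD / UNSW-NB15 feature vectors) are illustrative assumptions.

```python
# Hedged sketch: benchmarking candidate activation functions on a small fully connected
# network, in the spirit of the comparison described above. Layer sizes, activations,
# and the random stand-in data are illustrative assumptions.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(2000, 41)).astype("float32"), rng.integers(0, 2, 2000)
X_val, y_val = rng.normal(size=(500, 41)).astype("float32"), rng.integers(0, 2, 500)

def build_dnn(activation: str) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(41,)),
        tf.keras.layers.Dense(64, activation=activation),
        tf.keras.layers.Dense(32, activation=activation),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # binary attack / normal output
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

for act in ["relu", "tanh", "elu", "selu"]:
    model = build_dnn(act)
    hist = model.fit(X_train, y_train, validation_data=(X_val, y_val),
                     epochs=5, batch_size=64, verbose=0)
    print(act, "val accuracy:", round(hist.history["val_accuracy"][-1], 3))
```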

LIO-IDS: Handling class imbalance using LSTM and improved one-vs-one technique in intrusion detection system

Computer Networks, Jun 1, 2021

A Survey on Intrusion Detection and Prevention Systems

SN Computer Science, Jun 10, 2023

Smart Grids integrate the traditional power grid with information processing and communication technologies. In particular, substation intelligent devices can now communicate with each other digitally to enable remote information gathering, monitoring, and control. There have been many efforts to promote global communication standards. The IEC-61850 international standard addresses substation communication networks and systems. Despite the many benefits, this standardized communication poses new cyber-security challenges. Also, traditional Intrusion Detection Systems (IDSs) may not be suitable for digital substations, given their critical components and stringent time requirements. We present an in-depth analysis of attacks exploiting IEC-61850 substations and recent research efforts for detecting and preventing them. Our main contribution is an original taxonomy comprising design and evaluation aspects of substation-specific IDSs. This taxonomy includes IDS architectures, detection approaches, analysis, actions, data sources, detection range, validation strategies, and metrics. Additionally, we present a compilation of the detection rules deployed by state-of-the-art IDSs and assess their resiliency to five types of attacks. Our assessment reveals that some attacks are covered by currently deployed IDSs, but further advancement is necessary, particularly to deal with masquerade attacks. Finally, we discuss trends, open issues, and future research topics.

Bot Development for Military Wargaming Simulation

Communications in Computer and Information Science, 2018

Over the years, many techniques have been used by decision makers to test their theories in a near real-world simulated situation. Military war games are one such platform where these theories can be tested and the outcome recorded. War gaming has evolved from its origins as a board game to being widely played as a computer game. This paper focuses on a military war game simulation that is being used to develop a bot using artificial intelligence techniques. Various modules are developed for the working of the bot. Route planning is one of them; it is used by units to find a path in the game. An algorithm is therefore proposed using the A* algorithm as a base. The A* algorithm is modified to use influence maps in order to find a safe path for the units, which helps the bot succeed in the game. The modified A* algorithm is implemented and compared with the traditional A* algorithm, and the modified algorithm is found to perform better.
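
The route-planning idea can be sketched as grid-based A* whose step cost is inflated by an influence-map value representing enemy threat, so the planner prefers safer but possibly longer routes. The grid, threat values, and weighting factor below are illustrative assumptions; the game's actual map and influence-map construction are not reproduced.

```python
# Hedged sketch: grid-based A* whose step cost is inflated by an "influence" value
# representing enemy threat, so the planner prefers safer routes. The grid, threat
# values, and danger_weight are illustrative assumptions.
import heapq

def astar_with_influence(grid, influence, start, goal, danger_weight=5.0):
    """grid: 0 = free, 1 = blocked; influence: threat per cell in [0, 1]."""
    rows, cols = len(grid), len(grid[0])

    def h(a, b):  # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_heap = [(h(start, goal), 0.0, start)]
    parent = {start: None}
    g_cost = {start: 0.0}
    closed = set()
    while open_heap:
        _, g, node = heapq.heappop(open_heap)
        if node in closed:
            continue
        closed.add(node)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in closed:
                # Step cost = 1 plus a penalty proportional to the threat at the target cell.
                ng = g + 1.0 + danger_weight * influence[nr][nc]
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    parent[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc), goal), ng, (nr, nc)))
    return None  # no path found

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
threat = [[0.0, 0.9, 0.9, 0.0],
          [0.0, 0.0, 0.0, 0.0],
          [0.0, 0.0, 0.0, 0.0]]
# The direct top-row route is short but dangerous, so the planner detours through row 2.
print(astar_with_influence(grid, threat, (0, 0), (0, 3)))
```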

Session Based Recommendations using CNN-LSTM with Fuzzy Time Series

Artificial Intelligence and Speech Technology, 2022

Map construction and localization using Lego Mindstorms NXT

Journal of Automation, Mobile Robotics and Intelligent Systems, 2013

Maps are very useful for understanding unknown places before visiting them, as maps represent spatial relationships between various objects in a region. Using robots for map construction is an important field these days, as robots can reach places that may be inaccessible to human beings. This paper presents a method that uses the data obtained from a single ultrasonic sensor mounted on a robot to construct a map and localize the robot within that map. A map of the previously unknown environment is created with the help of a mobile robot built using Lego Mindstorms NXT assembled in a modified TriBot configuration. The robot is equipped with an ultrasonic sensor and is controlled from a computer system running a MATLAB program, which communicates with the NXT over a USB or Bluetooth connection and performs complex calculations that are not possible on the NXT itself. After the map construction, the robot finds its position in the map by using a particle filter. The implementation has been done in the MATLAB programming environment using the RWTH-Mindstorms NXT Toolbox and has been successfully tested for map construction of a room and localization within that room with the use of a TriBot.
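
The localization step can be sketched as a simple particle filter: particles are propagated with the odometry, weighted by the likelihood of the ultrasonic reading against the known map, and resampled. The one-dimensional corridor map, noise levels, and motion model below are illustrative assumptions, written in Python rather than the paper's MATLAB implementation.

```python
# Hedged sketch: a minimal one-dimensional particle filter of the kind used to localize
# a robot from ultrasonic range readings against an already-built map. The corridor map,
# noise levels, and motion model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
corridor_length = 400.0                 # cm; the robot drives towards the far wall
n_particles = 500
particles = rng.uniform(0, corridor_length, n_particles)   # candidate robot positions
weights = np.full(n_particles, 1.0 / n_particles)

def expected_range(pos):
    return corridor_length - pos        # distance the ultrasonic sensor should report

def pf_step(particles, weights, moved_cm, measured_range,
            motion_noise=2.0, sensor_noise=5.0):
    # Predict: apply the odometry with some noise.
    particles = np.clip(particles + moved_cm +
                        rng.normal(0, motion_noise, particles.size), 0, corridor_length)
    # Update: weight each particle by the likelihood of the ultrasonic reading.
    error = measured_range - expected_range(particles)
    weights = weights * np.exp(-0.5 * (error / sensor_noise) ** 2)
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights (multinomial resampling).
    idx = rng.choice(particles.size, particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

true_pos = 50.0
for _ in range(10):
    true_pos += 10.0                                        # robot advances 10 cm
    reading = expected_range(true_pos) + rng.normal(0, 5.0)
    particles, weights = pf_step(particles, weights, 10.0, reading)
print("estimated position (cm):", round(particles.mean(), 1), "true:", true_pos)
```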

18 Using deep learning for image-based plant disease detection

Internet of Things and Machine Learning in Agriculture, 2021

PlantGhostNet: An Efficient Novel Convolutional Neural Network Model to Identify Plant Diseases Automatically

2021 9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), 2021

Plants are an integral part of the food chain. They provide food to the majority of the world's population. Besides this, the economy of many countries depends on agriculture directly or indirectly. The productivity of agriculture is directly dependent on the health of crops. Therefore, the identification of diseases in plants during their initial growing phases provides more profit to farmers and improves the economies of many agrarian countries. This paper proposes a novel Convolutional Neural Network (CNN) model named PlantGhostNet for automatic plant disease detection. The proposed PlantGhostNet model significantly reduces the number of trainable parameters. To decrease the trainable parameters, the Ghost Module is used. This research work also uses the Squeeze-and-Excitation Module to improve the performance of the proposed model. To the best of our knowledge, there is no research work present in the literature that utilizes the combination of the Ghost Module and the Squeeze-and-Excitation Module for plant disease detection. The PlantGhostNet model is used to identify the Bacterial Spot disease of peach plants. However, the proposed work can be applied to diagnose other plant diseases as well. The PlantGhostNet model achieves 99.75 percent training accuracy and 99.51 percent validation accuracy in detecting Bacterial Spot disease of peach plants. High accuracy and a small number of trainable parameters make the PlantGhostNet model suitable for deployment on low-computational-power devices such as smartphones and tablets.
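
The two building blocks named in the abstract can be sketched in Keras as follows: a Ghost module that generates part of its output channels with a cheap depthwise convolution, followed by a Squeeze-and-Excitation block that reweights channels. Filter counts, kernel sizes, input size, and the reduction ratio are illustrative assumptions, not the PlantGhostNet architecture.

```python
# Hedged sketch: a Ghost module followed by a Squeeze-and-Excitation block in Keras,
# the two building blocks combined in the abstract. All sizes are illustrative
# assumptions, not the paper's exact architecture.
import tensorflow as tf
from tensorflow.keras import layers

def ghost_module(x, out_channels, ratio=2, kernel_size=1, dw_kernel_size=3):
    # The primary convolution produces a fraction of the channels ("intrinsic" maps)...
    intrinsic = out_channels // ratio
    primary = layers.Conv2D(intrinsic, kernel_size, padding="same", use_bias=False)(x)
    primary = layers.BatchNormalization()(primary)
    primary = layers.ReLU()(primary)
    # ...and a cheap depthwise convolution generates the remaining "ghost" maps.
    ghost = layers.DepthwiseConv2D(dw_kernel_size, padding="same", use_bias=False)(primary)
    ghost = layers.BatchNormalization()(ghost)
    ghost = layers.ReLU()(ghost)
    return layers.Concatenate()([primary, ghost])

def se_block(x, reduction=4):
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                    # squeeze
    s = layers.Dense(channels // reduction, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)       # excitation
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                          # channel reweighting

inputs = layers.Input(shape=(224, 224, 3))
x = ghost_module(inputs, out_channels=32)
x = se_block(x)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(2, activation="softmax")(x)            # e.g. healthy vs Bacterial Spot
model = tf.keras.Model(inputs, outputs)
model.summary()
```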

Multifaceted Reciprocal Recommendations for Online Dating

2021 9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), 2021

Recommender Systems (RS) aim to filter relevant items from the huge pool of information available to users and assist users by predicting their future preferences. In traditional item-to-user RS, items are recommended to a user, and the preferences of that user are considered for generating recommendations. Reciprocal Recommender Systems (RRS) are people-to-people systems which recommend users to each other. Thus, the preferences of both users should be satisfied in order to produce successful recommendations. In RRS, two unilateral user-to-user preference scores are aggregated to generate reciprocal recommendations that should be accepted by both users involved. In this paper, we propose a reciprocal recommendation algorithm that computes unilateral preference scores based on multiple aspects, including a user's multi-criteria preferences, popularity-awareness, demographic information, and availability of users. An experimental study conducted with a speed-dating experiment data set demonstrates the effectiveness of the proposed approach.
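
The aggregation of two unilateral scores into one reciprocal score can be sketched as below. The harmonic mean is shown because it penalizes one-sided matches and is a common choice in reciprocal recommenders; whether this paper uses it is not stated in the abstract, so the aggregator and the toy scores should be read as assumptions.

```python
# Hedged sketch: combining two unilateral preference scores into one reciprocal score.
# The harmonic-mean aggregator and the toy scores are illustrative assumptions.
def reciprocal_score(score_a_to_b: float, score_b_to_a: float) -> float:
    if score_a_to_b == 0 or score_b_to_a == 0:
        return 0.0
    return 2 * score_a_to_b * score_b_to_a / (score_a_to_b + score_b_to_a)

# One-sided interest (0.9 vs 0.1) scores much lower than mutual moderate interest.
print(reciprocal_score(0.9, 0.1))   # ~0.18
print(reciprocal_score(0.6, 0.6))   # 0.6
```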

Sentiment Analysis Using Fuzzy-Deep Learning

Proceedings of ICETIT 2019, 2019

Sentiment analysis, also termed opinion mining, is a classification process used to determine the polarity associated with user reviews in the form of text, images, or speech. Due to the rapid growth in the usage of social media such as forums, social networks, and microblogs, the need for sentiment analysis has simultaneously increased. Sentiment analysis can be helpful in improving the marketing strategy for a product or customer services, providing information about general public sentiment for a political party or candidate, etc. Over the years, various techniques have been developed to provide users with better sentiment classification. These techniques have evolved from lexicon-based approaches to machine learning and now to deep learning. But there is an inherent uncertainty in natural language that cannot be handled even by the most advanced deep learning techniques. Deep learning networks perform automatic feature extraction from given data, while fuzzy logic helps us deal with this uncertainty by providing decision-making capabilities in the presence of ambiguity. Our aim is to improve sentiment analysis prediction for textual data by incorporating fuzziness with deep learning. So, in this paper we combine the learning capabilities of deep learning and the uncertainty-handling abilities of fuzzy logic to provide more appropriate sentiment prediction to the user. We use LSTM, a type of Recurrent Neural Network (RNN), for sentiment prediction. These networks help improve prediction accuracy as they are capable of dealing with long-term dependencies in the data.
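
The LSTM backbone described above can be sketched in Keras as follows; the fuzzy-logic component that the paper layers on top is not reproduced. The vocabulary size, sequence length, and random stand-in data are illustrative assumptions.

```python
# Hedged sketch: an LSTM sentiment-classification backbone in Keras. The fuzzy-logic
# component is not reproduced; vocabulary size, sequence length, and the toy data are
# illustrative assumptions.
import numpy as np
import tensorflow as tf

vocab_size, max_len = 10_000, 100
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.LSTM(64),                       # captures long-term dependencies
    tf.keras.layers.Dense(1, activation="sigmoid"), # positive vs negative polarity
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Toy stand-in for tokenised, padded review sequences.
rng = np.random.default_rng(0)
X = rng.integers(1, vocab_size, size=(256, max_len))
y = rng.integers(0, 2, size=256)
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:2], verbose=0))
```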

Plant disease detection using hybrid model based on convolutional autoencoder and convolutional neural network

Artificial Intelligence in Agriculture, 2021

Plants are susceptible to various diseases in their growing phases. Early detection of diseases in plants is one of the most challenging problems in agriculture. If diseases are not identified in the early stages, they may adversely affect the total yield, resulting in a decrease in the farmers' profits. To overcome this problem, many researchers have presented different state-of-the-art systems based on Deep Learning and Machine Learning approaches. However, most of these systems either use millions of training parameters or have low classification accuracies. This paper proposes a novel hybrid model based on a Convolutional Autoencoder (CAE) network and a Convolutional Neural Network (CNN) for automatic plant disease detection. To the best of our knowledge, a hybrid system based on CAE and CNN to detect plant diseases automatically has not been proposed in any state-of-the-art system present in the literature. In this work, the proposed hybrid model is applied to detect Bacterial Spot disease present in peach plants using their leaf images; however, it can be used for the detection of any plant disease. The experiments performed in this paper use a publicly available dataset named PlantVillage to get the leaf images of peach plants. The proposed system achieves 99.35% training accuracy and 98.38% testing accuracy using only 9,914 training parameters. The proposed hybrid model requires a smaller number of training parameters compared to other approaches existing in the literature. This, in turn, significantly decreases the time required to train the model for automatic plant disease detection and the time required to identify the disease in plants using the trained model.
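
The two-stage idea can be sketched in Keras: a small convolutional autoencoder is first trained to reconstruct leaf images, and its encoder is then reused as a frozen feature extractor under a compact classifier head, keeping the trainable parameter count low. Layer sizes, image size, and the two-class setup are illustrative assumptions, not the paper's exact architecture.

```python
# Hedged sketch: a convolutional autoencoder whose encoder is reused as a frozen
# feature extractor for a compact disease classifier. All sizes are illustrative
# assumptions, not the paper's exact architecture.
import tensorflow as tf
from tensorflow.keras import layers

inp = layers.Input(shape=(64, 64, 3))
x = layers.Conv2D(8, 3, activation="relu", padding="same")(inp)
x = layers.MaxPooling2D()(x)
encoded = layers.Conv2D(4, 3, activation="relu", padding="same")(x)

x = layers.UpSampling2D()(encoded)
decoded = layers.Conv2D(3, 3, activation="sigmoid", padding="same")(x)

autoencoder = tf.keras.Model(inp, decoded)
autoencoder.compile(optimizer="adam", loss="mse")
# Stage 1 (assumed data): autoencoder.fit(leaf_images, leaf_images, epochs=10)

encoder = tf.keras.Model(inp, encoded)
encoder.trainable = False          # reuse learned features, keep the parameter count small

clf_in = layers.Input(shape=(64, 64, 3))
f = encoder(clf_in)
f = layers.Flatten()(f)
out = layers.Dense(2, activation="softmax")(f)  # e.g. healthy vs Bacterial Spot
classifier = tf.keras.Model(clf_in, out)
classifier.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                   metrics=["accuracy"])
classifier.summary()
```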

Dark Web

Encyclopedia of Criminal Activities and the Deep Web, 2020

The World Wide Web is a part of the Internet that provides a data dissemination facility to people. The contents of the Web are crawled and indexed by search engines so that they can be retrieved, ranked, and displayed as a result of users' search queries. These contents, which can be easily retrieved using Web browsers and search engines, comprise the Surface Web. All information that cannot be crawled by search engines' crawlers falls under the Deep Web. Deep Web content never appears in the results displayed by search engines. Though this part of the Web remains hidden, it can be reached using targeted search over normal Web browsers. Unlike the Deep Web, there exists a portion of the World Wide Web that cannot be accessed without special software. This is known as the Dark Web. This chapter describes how the Dark Web differs from the Deep Web and elaborates on the commonly used software to enter the Dark Web. It highlights the illegitimate and legitimate sides of the Dark Web and specifies the role played by cryptocurrencies in the expansion of the Dark Web's user base.

Blind Quantitative Steganalysis Using CNN–Long Short-Term Memory Architecture

Strategic System Assurance and Business Analytics, 2020
