Mohd Shoaib - Academia.edu

Book Chapters by Mohd Shoaib

Deep Learning-Based Plant Disease Detection Using Android Apps

Artificial Intelligence Applications in Agriculture and Food Quality Improvement

Disease detection in plants is crucial for preventing losses in yield and agricultural productivity. Historically, disease identification has been supported by agriculture extension organizations, which are difficult to access from villages, so farmers have to go to their fields and monitor plant disease manually. The aim of this work is to develop an Android application that provides an easy-to-use platform for farmers to identify diseases in their crops. The mobile application helps them take responsive action according to the disease detected in their plants and can be used by anyone interested in analyzing plant disease. This work reports on the classification of 26 diseases in 14 crop species using 54,306 images from the PlantVillage dataset with a convolutional neural network approach. The models used are Inception-v3 and MobileNet. Correct prediction of the crop-disease pair across the 38 classes serves as the performance criterion. ...
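
A minimal sketch of the kind of transfer-learning pipeline the chapter describes, assuming a PlantVillage-style folder layout with one subdirectory per crop-disease class; the directory path, batch size, and epoch count are illustrative choices, not values from the chapter.

```python
# Hypothetical sketch: transfer learning with MobileNet for crop-disease classification.
import tensorflow as tf

NUM_CLASSES = 38          # crop-disease pairs reported in the chapter
IMG_SIZE = (224, 224)     # MobileNet's default input resolution

# Load a PlantVillage-style directory; "plantvillage/" is an assumed local path.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "plantvillage/", image_size=IMG_SIZE, batch_size=32)

# MobileNet pretrained on ImageNet, used as a frozen feature extractor.
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),   # scale pixels to [0, 1] (simple stand-in preprocessing)
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```

The same overall structure applies if Inception-v3 is swapped in as the feature extractor.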

Cyber Forensic Lab Setup and Its Requirement

Advances in Digital Crime, Forensics, and Cyber Terrorism, 2020

The first computer-related crime took place in 1978; it involved the alteration or deletion of data. The level and intensity of cybercrime has strengthened day by day and continues to grow in the current era. So, to achieve accuracy during an investigation, an intensive investigation environment, or lab, is needed. This will help the investigation team in various ways. More advanced techniques and tools are used in a modern lab setup, and solutions are provided to forensically examine a variety of digital devices apart from computers.

Introduction to Email, Web, and Message Forensics

Advances in Digital Crime, Forensics, and Cyber Terrorism, 2020

With the advent of information and communication technologies, e-mail has become one of the prime tools for communication. Almost everyone uses email for business communication as well as for sharing personal information, and e-mail is vulnerable to cybercrime. This chapter focuses on the basic concepts of web forensics and then provides insight into e-mailing policies, email architecture, and existing investigation techniques used by forensic investigators. Most organizations implement standard e-mailing policies; however, these alone are not sufficient to handle and prevent digital crimes, so emails also need to be monitored and investigated. This chapter discusses some prominent tools and techniques through which forensic experts can gather and analyze data about suspected email accounts and produce evidence in a court of law.
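
As a small illustration of the kind of evidence gathering discussed here, the sketch below parses the headers of a saved message with Python's standard email library; the file name is hypothetical, and a real investigation would rely on dedicated forensic tools alongside scripts like this.

```python
# Minimal sketch: inspect originating metadata of a suspected message.
from email import policy
from email.parser import BytesParser

with open("suspect_message.eml", "rb") as fh:        # hypothetical saved message
    msg = BytesParser(policy=policy.default).parse(fh)

# Header fields commonly reviewed during an email investigation.
print("From:       ", msg["From"])
print("To:         ", msg["To"])
print("Date:       ", msg["Date"])
print("Message-ID: ", msg["Message-ID"])

# The Received chain (read bottom-up) traces the relays the message passed through.
for hop in reversed(msg.get_all("Received", [])):
    print("Received:", " ".join(hop.split()))
```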

Big Data and Analytics

Big Data and Analytics have become key concepts within the corporate world, both commercially and from an information technology (IT) perspective. This paper presents the results of a global quantitative analysis of 400 IT leaders from different industries, which examined their attitudes toward dedicated roles for an Information Architect and a Data Scientist. The results illustrate the importance of these roles at the intersection of business and technology. They also show that to build sustainable and quantifiable business results and define an organization's competitive positioning, both roles need to be dedicated, rather than shared across different people. The research also showed that those dedicated roles contribute actively to a sustainable competitive positioning, mainly driven by visualization of complex matters.

Big Data Analytics Tools and Platform in Big Data Landscape

Research Anthology on Big Data Analytics, Architectures, and Applications, 2022

The term big data analytics refers to the mining and analysis of the voluminous data in big data using various tools and platforms. Some of the popular tools are Apache Hadoop, Apache Spark, HBase, Storm, GridGain, HPCC, Cassandra, Pig, Hive, and NoSQL databases. Which tool is used depends on the parameters of the analysis, so a comparative analysis of these tools is needed to choose the best and simplest approach and to obtain better throughput and more efficient mining. This chapter contributes a comparative study of big data analytics tools covering aspects such as their functionality, pros, and cons, characteristics that can be used to determine the best and most efficient among them. Through this comparative study, readers can use such tools more efficiently.
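
Purely as an illustration of one of the listed tools, the sketch below runs a minimal aggregation in Apache Spark through PySpark; the input file and its column names are assumptions, not data from the chapter.

```python
# Minimal PySpark sketch: load a CSV and compute a grouped aggregate in parallel.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("big-data-analytics-demo").getOrCreate()

# "events.csv" and its columns ("category", "value") are hypothetical.
events = spark.read.csv("events.csv", header=True, inferSchema=True)
summary = (events.groupBy("category")
                 .agg(F.count("*").alias("rows"),
                      F.avg("value").alias("avg_value")))
summary.show()
spark.stop()
```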

Big Data Analytics Tools and Platform in Big Data Landscape

Hershey, PA: IGI Global, 2018

The term big data analytics refers to the mining and analysis of the voluminous data in big data using various tools and platforms. Some of the popular tools are Apache Hadoop, Apache Spark, HBase, Storm, GridGain, HPCC, Cassandra, Pig, Hive, and NoSQL databases. Which tool is used depends on the parameters of the analysis, so a comparative analysis of these tools is needed to choose the best and simplest approach and to obtain better throughput and more efficient mining. This chapter contributes a comparative study of big data analytics tools covering aspects such as their functionality, pros, and cons, characteristics that can be used to determine the best and most efficient among them. Through this comparative study, readers can use such tools more efficiently.

Papers by Mohd Shoaib

Detection of Malicious Node in MANET: Issues and Challenges on Intrusion Detection

e₹—The digital currency in India: Challenges and future prospects

BenchCouncil Transactions on Benchmarks, Standards and Evaluations, 2023

The Reserve Bank of India (RBI) has recently launched the country's first pilot project for a digital currency, known as the digital rupee or e-Rupee (e₹). The launch of the digital rupee represents a significant advancement in the "Digital India" revolution. It will be a fantastic opportunity for India, since it might make conducting business easier while enhancing the security and resilience of the overall payments system. The digital currency aims to advance monetary policy by disrupting physical money, lowering the cost of financial transactions, and reshaping how money circulates. Although the effects of a digital currency cannot be fully foreseen, it is extremely important to thoroughly research digital currency and its effects at the operational stage. The development of a digital currency infrastructure faces challenges in terms of performance, scalability, and different usage scenarios. The article clarifies what e₹ is, how it works, what makes it different from cryptocurrencies, and what its major challenges and prospects are in India.

URL based Phishing Detection using Machine Learning

2023 6th International Conference on Information Systems and Computer Networks (ISCON), 2023

Comparative Analysis of Lossless Image Compression techniques SPHIT, JPEG-LS and Data Folding

Today's era is all about big data. The stream of data continuously generated by the Internet requires huge storage capacity and, despite the availability of data centers and content delivery networks, still imposes constraints on them. In this scenario, we require a compression mechanism that can substantially reduce the bulk of the data while remaining able to reconstruct it perfectly. In this paper we work on the restoration of compressed images with a lossless feature using an existing data folding technique [1]. This approach is well suited to continuous-tone images in the spatial domain, reducing computational complexity while achieving competitive compression compared with existing techniques. The idea of data folding is to divide the image into rows and columns, fold the columns iteratively, and then fold the rows; this process is repeated until all rows and columns of the original image are covered, yielding a reduced-size image with higher potency. To assess the effectiveness of the approach, we compare it with standard image compression approaches such as SPHIT and JPEG-LS.

INTRODUCTION. An image generally conveys the visual impression of some object, and in some applications this purpose can be fulfilled by representing the image regardless of its size. Image compression has had a revolutionary impact on transmission and data storage, irrespective of limited storage size. Because of the importance of compression, various techniques and methods have been proposed to store and transmit images over networks. These algorithms also reduce compression time and ensure the integrity of the images during processing. Since most information is now stored in the cloud and can be viewed and downloaded through the web, compression makes images portable so that they can be fetched quickly. Various studies and experiments have been conducted to evaluate what makes a good image compression technique, and several factors have been proposed, such as compression ratio, compression size, and entropy. Compression size is the reduction in volume of the source file after compression. The ratio of the original file size (in bits) to the compressed image size (in bits) is called the compression ratio. Entropy defines the number of bits needed to represent the original image after coding and is generally given by Shannon's formula: H = -Σ_i p_i log2(p_i), where p_i is the probability of the i-th symbol (1). When designing a compression technique for an image, numerous parameters must be taken into account to achieve good compression efficiency and manageable computational complexity; these parameters vary from one environment to another and from one bandwidth to another. The paper is organized as follows. Section II describes basic image compression principles. Section III reviews the literature. Section IV presents the proposed compression method based on data folding and compares the results with lossless SPHIT and JPEG-LS, followed by results and discussion in Section V. The last section presents the conclusion.
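
The sketch below illustrates the folding idea schematically in Python: each fold halves the image along one axis and keeps a residual so the step can be undone without loss. It is a simplified stand-in, not the exact folding scheme of the cited technique [1].

```python
# Schematic sketch of column/row folding, not the method of the cited paper:
# each fold keeps one half of the image plus the residual needed to undo it losslessly.
import numpy as np

def fold_once(block: np.ndarray, axis: int):
    """Fold a 2-D block in half along `axis`; return the folded half and the residual."""
    n = block.shape[axis] // 2
    first = np.take(block, np.arange(n), axis=axis)
    second = np.flip(np.take(block, np.arange(n, 2 * n), axis=axis), axis=axis)
    folded = (first.astype(np.int32) + second) // 2    # reduced-size representation
    residual = first.astype(np.int32) - second          # side information for lossless undo
    return folded, residual

# Example: fold an 8x8 block once along columns, then once along rows.
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
col_folded, col_res = fold_once(img, axis=1)
row_folded, row_res = fold_once(col_folded, axis=0)
print(img.shape, "->", row_folded.shape)   # (8, 8) -> (4, 4)
```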

Comparative Analysis of Lossless Image Compression techniques SPHIT, JPEG-LS and Data Folding

International Journal of Engineering and Technology, 2017

Today's era is all about big data. The stream of data continuously generated by the Internet requires huge storage capacity and, despite the availability of data centers and content delivery networks, still imposes constraints on them. In this scenario, we require a compression mechanism that can substantially reduce the bulk of the data while remaining able to reconstruct it perfectly. In this paper we work on the restoration of compressed images with a lossless feature using an existing data folding technique [1]. This approach is well suited to continuous-tone images in the spatial domain, reducing computational complexity while achieving competitive compression compared with existing techniques. The idea of data folding is to divide the image into rows and columns, fold the columns iteratively, and then fold the rows; this process is repeated until all rows and columns of the original image are covered, yielding a reduced-size image with higher potency. To assess the effectiveness of the approach, we compare it with standard image compression approaches such as SPHIT and JPEG-LS.

Discovering Web through Crawler: A Review

Proceedings of National Conference on Recent Trends in Parallel Computing (RTPC - 2014)

There are billions of pages on the World Wide Web, each denoted by a URL. Finding relevant information from these URLs is not easy; the information sought has to be found quickly, efficiently, and with high relevance. A web crawler is used to find what information each URL contains. The web crawler traverses the World Wide Web in a systematic manner, downloads pages, and sends the information to the search engine so that it gets indexed. There are various types of web crawlers, and each provides some improvement over the others. This paper presents an overview of the web crawler and its architecture, and identifies types of crawlers with their architectures, namely incremental, parallel, distributed, focused, and hidden web crawlers.
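
A minimal breadth-first crawler along the lines sketched above can be written with the requests and BeautifulSoup libraries; the seed URL, page limit, and politeness delay below are illustrative choices rather than details from the paper.

```python
# Minimal breadth-first crawler sketch: download pages, collect text for indexing,
# and enqueue newly discovered links.
import time
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed: str, max_pages: int = 20, delay: float = 1.0):
    frontier, seen, pages = deque([seed]), {seed}, {}
    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        pages[url] = soup.get_text(" ", strip=True)       # text handed to the indexer
        for link in soup.find_all("a", href=True):        # enqueue newly discovered URLs
            nxt = urljoin(url, link["href"])
            if nxt.startswith("http") and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
        time.sleep(delay)                                  # basic politeness
    return pages

# pages = crawl("https://example.com")
```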

Detection of Malicious Node in MANET: Issues and Challenges on Intrusion Detection

Proceedings of National Conference on Recent Trends in Parallel Computing (RTPC - 2014)

The mobile ad hoc network is particularly exposed to attack; because of fundamental properties such as dynamic topology, limited power, and limited bandwidth, it is very hard to achieve absolute security in a mobile ad hoc network. Attack avoidance methods such as encryption and authentication are not enough to reduce the possibility of attacks, since they are designed to prevent a set of known attacks and cannot prevent newer attacks that originate within the existing security measures. So a mechanism is needed to detect and respond to these newer attacks. This paper explores and classifies current Intrusion Detection System (IDS) techniques for MANETs and studies various intrusion detection techniques in MANET.

A Comparative Review of CPU Scheduling Algorithms

Proceedings of National Conference on Recent Trends in Parallel Computing (RTPC - 2014)

A number of programs can be in memory at the same time, which allows overlap of CPU and I/O. CPU scheduling deals with the problem of choosing a process from the ready queue to be executed by the CPU. Developing a CPU scheduling algorithm and understanding its impact is difficult and time consuming because of the need to modify and test an operating system and measure the performance of real applications. As the processor is the most important resource, CPU scheduling becomes very important in accomplishing the operating system (OS) design goals. The goal of CPU scheduling is to minimize the average turnaround time and average waiting time so as to keep as many processes running as possible and make the best use of the CPU. This paper attempts to summarize the major CPU scheduling algorithms proposed to date. We look at algorithms such as FCFS, SJF, SRTF, Round Robin, priority scheduling, HRRN, and LJF.
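
To make the two metrics concrete, the short sketch below simulates FCFS over a made-up process list and reports the average waiting and turnaround times; it is an illustrative example, not code from the paper.

```python
# FCFS sketch: compute average waiting and turnaround times for a toy process list.
def fcfs(processes):
    """processes: list of (name, arrival_time, burst_time), sorted by arrival time."""
    clock, waits, turnarounds = 0, [], []
    for name, arrival, burst in processes:
        start = max(clock, arrival)
        waits.append(start - arrival)            # waiting time = start - arrival
        clock = start + burst
        turnarounds.append(clock - arrival)      # turnaround time = completion - arrival
    return sum(waits) / len(waits), sum(turnarounds) / len(turnarounds)

avg_wait, avg_turnaround = fcfs([("P1", 0, 5), ("P2", 1, 3), ("P3", 2, 8)])
print(avg_wait, avg_turnaround)   # 3.33..., 8.66... for this example
```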

URL Ordering based Performance Evaluation of Web Crawler

Proceedings of IEEE International Conference on Advances in Engineering & Technology Research (ICAETR - 2014)

There are billions of Web pages on the World Wide Web which can be accessed via the Internet. All of us rely on the Internet as a source of information, which is available on the web in various forms such as websites, databases, images, sound, and video. The search results returned by a search engine are ranked using many techniques, such as keyword matching, link analysis, and others. Search engines provide information gathered from their own indexed databases, which contain information downloaded from web pages; whenever a user issues a query, the information is fetched from these indexed pages. The web crawler is used to download and store web pages, and the crawlers of these search engines are expert at crawling various web pages to gather a huge amount of information. In this work, a web crawler is developed which orders URLs on the basis of their content similarity to a query and their structural similarity. Results are reported over five parameters: top URLs, precision, content similarity, structural similarity, and total similarity for a keyword.
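
A hedged sketch of the content-similarity part of such URL ordering is shown below, scoring crawled page text against a query with TF-IDF cosine similarity via scikit-learn; the structural-similarity component the paper also evaluates is omitted here.

```python
# Rank URLs by the cosine similarity of their page text to a query (TF-IDF features).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_urls(query: str, pages: dict) -> list:
    """pages: mapping url -> extracted page text; returns (url, score) pairs, best first."""
    urls, texts = list(pages), list(pages.values())
    tfidf = TfidfVectorizer(stop_words="english")
    matrix = tfidf.fit_transform(texts + [query])          # last row is the query vector
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return sorted(zip(urls, scores), key=lambda p: p[1], reverse=True)

# rank_urls("plant disease detection", {"http://a": "...", "http://b": "..."})
```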

Comparative Study of Different Web Mining Algorithms to Discover Knowledge on the Web

Proceedings of Elsevier Second International Conference on Emerging Research in Computing, Information, Communication and Application (ERCICA-2014)

Nowadays the World Wide Web (commonly called the Web) is used widely and has impacted almost every facet of our lives. Searching and retrieving information from the web requires effective and efficient techniques, and this has become a challenge due to the expanding size and complexity of the web. Web mining tackles this problem by gathering useful information from the web through its three categories: web structure mining, web content mining, and web usage mining. In this paper we discuss the area of web mining, its categories, and the algorithms associated with it. The algorithms discussed are PageRank, SimRank, TF-IDF, k-nearest neighbour, PageGather, and CDL4. We then summarize the algorithms over parameters such as their working, input parameters, complexity, and pros and cons. We also analyze the discussed algorithms over the parameters of relevance, technique, and regression analysis.
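
As one concrete example of the algorithms surveyed, the sketch below runs PageRank by power iteration on a toy link graph; the graph and the damping factor of 0.85 are the usual illustrative choices, not values from the paper.

```python
# PageRank by power iteration over an adjacency-list graph.
def pagerank(graph, damping=0.85, iterations=50):
    """graph: dict mapping node -> list of outgoing links."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for node, links in graph.items():
            if not links:                        # dangling node: spread its rank evenly
                for n in nodes:
                    new_rank[n] += damping * rank[node] / len(nodes)
            else:
                for target in links:
                    new_rank[target] += damping * rank[node] / len(links)
        rank = new_rank
    return rank

print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))
```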

A Comprehensive Survey of Page Replacement Algorithms

Efficient page replacement algorithms are needed in a virtual memory system to decide which pages to evict from memory when a page fault occurs. Many algorithms have been proposed over the years; each tries to incur minimum overhead while attempting to minimize the page fault rate. Formulating newer page replacement approaches that can adapt to changing workloads has been the main focus of research as newer memory access patterns are explored. This paper attempts to summarize the major page replacement algorithms proposed to date. We look at traditional algorithms such as LRU and CLOCK, and also study more recent approaches such as LIRS, CLOCK-Pro, Working Set, WSClock, and Dynamic Sort.
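
To make the idea concrete, the sketch below replays a reference string against a fixed number of frames under LRU and counts page faults; the reference string and frame count are arbitrary examples.

```python
# LRU sketch: count page faults for a reference string with a fixed number of frames.
from collections import OrderedDict

def lru_faults(reference_string, frames):
    memory, faults = OrderedDict(), 0
    for page in reference_string:
        if page in memory:
            memory.move_to_end(page)          # hit: mark page as most recently used
        else:
            faults += 1                       # miss: page fault
            if len(memory) == frames:
                memory.popitem(last=False)    # evict the least recently used page
            memory[page] = True
    return faults

print(lru_faults([7, 0, 1, 2, 0, 3, 0, 4, 2, 3, 0, 3, 2], frames=3))  # -> 9 page faults
```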
