Rajendra Purohit | JIET, Jodhpur
Papers by Rajendra Purohit
arXiv (Cornell University), Apr 9, 2024
International Journal of Engineering Research and Technology, Apr 24, 2018
A distributed application executes on multiple nodes at remote sites. Because multiple nodes are involved, it requires error-recovery algorithms. Traditional message-passing techniques were proposed in the past to design such error recovery, but they generate heavy network traffic and other network overheads. A distributed system also suffers from the "domino effect," as recovery may roll a transaction back to its initial state. In this paper we propose mobile agents as a solution for error recovery in a distributed environment and as a way to nullify the domino effect. This approach reduces network traffic and provides the most up-to-date system information for decision making.
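To make the recovery idea concrete, here is a minimal sketch (not the paper's actual protocol; all names are illustrative) of an agent that collects a coordinated checkpoint by visiting nodes, so a failure rolls every node back to one consistent cut instead of cascading:

```python
# Hypothetical sketch: a mobile "agent" visits each node in turn, forcing a
# coordinated checkpoint so recovery never cascades past the last consistent
# cut (the domino effect described above). Names are illustrative only.

class Node:
    def __init__(self, name):
        self.name = name
        self.state = 0          # application state (simplified to a counter)
        self.checkpoint = 0     # last saved state

    def work(self, steps):
        self.state += steps

    def save_checkpoint(self):
        self.checkpoint = self.state

    def rollback(self):
        self.state = self.checkpoint

class RecoveryAgent:
    """Travels node to node instead of broadcasting control messages."""
    def __init__(self, nodes):
        self.nodes = nodes

    def coordinate_checkpoint(self):
        # One traversal replaces many point-to-point control messages.
        for node in self.nodes:
            node.save_checkpoint()

    def recover(self):
        # Every node returns to the same consistent cut: no domino effect.
        for node in self.nodes:
            node.rollback()

nodes = [Node(f"n{i}") for i in range(3)]
agent = RecoveryAgent(nodes)
for n in nodes:
    n.work(5)
agent.coordinate_checkpoint()    # consistent global checkpoint
for n in nodes:
    n.work(3)                    # progress after the checkpoint
agent.recover()                  # a failure rolls back exactly one step
print([n.state for n in nodes])  # -> [5, 5, 5]
```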
Communications in Computer and Information Science, 2018
After an early requirements design, project managers mostly use the requirement specifications to estimate the functional size of software, which helps in estimating the effort required and the tentative cost of the software. An accurate estimate is necessary to negotiate the price of a software project and to plan and schedule project activities. The Function Point (FP) sizing method is used frequently to estimate functional size; another popular method is Use Case Points (UCP). The UCP method, although less used than FP-based estimation, is simpler than the FP-based method. One reason is that FP sizing often relies on COCOMO-II or another effort-estimation method to convert a size estimate in FPs into an effort estimate, whereas a direct conversion formula can convert size in UCPs into effort. This paper compares the results of both approaches for two mid-size business applications and examines the correlation between them.
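The "direct conversion" the abstract refers to is usually Karner's Use Case Points calculation; a worked toy example (with made-up counts and factors, not the paper's data) looks like this:

```python
# A minimal illustration of UCP-based estimation (Karner's method).
# All input values below are invented for the example.

uaw = 9          # Unadjusted Actor Weight (sum of actor weights)
uucw = 150       # Unadjusted Use Case Weight (sum of use-case weights)
tcf = 1.02       # Technical Complexity Factor
ecf = 0.95       # Environmental Complexity Factor

ucp = (uaw + uucw) * tcf * ecf
effort_hours = ucp * 20       # Karner's commonly cited 20 person-hours/UCP

print(f"UCP = {ucp:.1f}, estimated effort = {effort_hours:.0f} person-hours")
```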
International Journal of Smart Sensors and Ad Hoc Networks, 2012
The SCVM (Storage Concentrator Virtual Machine) creates IP SAN storage as a virtual machine, giving users the ability to consolidate their virtual data center. The SCVM can fully manage storage over local as well as networked VMs, attached through a virtual switch or a physical network connection. It provides an advanced, fully featured iSCSI SAN within a SAN server, so the user does not need a separate box for storage: one simply creates an iSCSI virtual SAN appliance alongside the server virtual machines on the same hardware platform. Loading an SCVM in a virtual machine not only increases productivity but also simplifies management and reduces power and rack-space requirements. By creating an iSCSI target within a virtual server, SCVM customers can reallocate existing hardware resources to build a business-continuity and disaster-recovery solution in either of two ways: synchronous mirroring within the datacenter or a distributed campus, or asynchronous mirroring (replication) between remote facilities.
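The difference between the two mirroring modes named above can be sketched in a few lines; this is illustrative semantics only, not SCVM's actual protocol or API:

```python
# Illustrative only: synchronous vs. asynchronous mirroring semantics.
# "Primary/replica" are plain dicts standing in for iSCSI volumes.

import queue
import threading

primary, replica = {}, {}
async_queue = queue.Queue()

def write_sync(key, value):
    # Synchronous mirroring: the write completes only after BOTH copies
    # are updated -- suited to a datacenter or distributed campus.
    primary[key] = value
    replica[key] = value

def write_async(key, value):
    # Asynchronous mirroring (replication): acknowledge locally, ship the
    # update to the remote site in the background -- tolerates WAN latency.
    primary[key] = value
    async_queue.put((key, value))

def replicator():
    while True:
        key, value = async_queue.get()
        replica[key] = value
        async_queue.task_done()

threading.Thread(target=replicator, daemon=True).start()
write_sync("a", 1)
write_async("b", 2)
async_queue.join()            # wait for replication to drain
print(primary, replica)       # both hold {'a': 1, 'b': 2}
```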
International Journal of Engineering Research and Technology, Apr 24, 2018
This paper presents the basics of digital image processing. Image processing is a popular topic in research and development and a large research area aimed at improving the visibility of an input image and extracting valuable information from it. In image processing, any form of signal processing takes a digital image as input; the output can be either an image or a set of characteristics or parameters related to the image. Most image-processing techniques treat the image as a two-dimensional signal and apply standard signal-processing techniques to it. Image processing was developed in the 1960s, and by 2000 digital image processing had become its most common form owing to its versatility and low cost. In a broader sense, image processing divides into two major branches: image enhancement and image restoration. The Fourier transform is the most popular image transform and is used in a wide range of applications. Image processing is also the act of examining images to identify objects and judge their significance: an image analyst studies remotely sensed data and attempts to detect, identify, classify, measure, and evaluate the significance of physical and cultural objects, their patterns, and their spatial relationships through logical processes.
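As a small concrete example of the two-dimensional Fourier transform mentioned above, the sketch below transforms a synthetic image, keeps only low frequencies, and inverts the transform — a basic frequency-domain smoothing (the image is random data so the snippet stays self-contained):

```python
# 2-D FFT low-pass filtering with numpy: a minimal frequency-domain example.

import numpy as np

img = np.random.rand(64, 64)             # stand-in for a grayscale image

F = np.fft.fftshift(np.fft.fft2(img))    # 2-D spectrum, DC at the center

# Ideal low-pass mask: keep a radius-8 disc of low frequencies.
rows, cols = img.shape
r, c = np.ogrid[:rows, :cols]
mask = (r - rows // 2) ** 2 + (c - cols // 2) ** 2 <= 8 ** 2

smoothed = np.fft.ifft2(np.fft.ifftshift(F * mask)).real
print(img.shape, smoothed.shape)         # same size; high frequencies removed
```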
International Journal of Engineering Research and Technology, Mar 13, 2013
As broadband gains worldwide popularity, the scope for connecting more people via broadband is increasing. Broadband wireless access networks are providing more capacity and coverage. Where wired connections are costly and sometimes impractical, wireless networking offers an alternative solution to the problem of information access; it has changed the way people communicate and share information by eliminating the troublesome factors of distance and location. This paper presents an outline of broadband technologies together with bandwidth management.
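One classic mechanism in the bandwidth-management space the survey covers is the token bucket; the sketch below is a generic illustration of that technique, not something taken from the paper:

```python
# Token-bucket rate limiting: a standard bandwidth-management primitive.

import time

class TokenBucket:
    def __init__(self, rate_bps, burst_bits):
        self.rate = rate_bps         # refill rate, bits/second
        self.capacity = burst_bits   # maximum burst size, bits
        self.tokens = burst_bits
        self.last = time.monotonic()

    def allow(self, packet_bits):
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bits:
            self.tokens -= packet_bits
            return True
        return False                 # packet must wait or be dropped

bucket = TokenBucket(rate_bps=1_000_000, burst_bits=64_000)  # 1 Mbps, 8 KB burst
print(bucket.allow(12_000))          # True: fits within the initial burst
```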
Social Science Research Network, 2019
A Mobile Ad hoc Network (MANET) is a self-configuring (autonomous) system of mobile routers (and associated hosts) connected by wireless links, the union of which forms an arbitrary topology. The routers are free to move randomly and organize themselves arbitrarily; thus, the network's wireless topology may change rapidly and unpredictably. Such a network may operate in a standalone fashion or may be connected to the larger Internet, operating as a hybrid fixed/ad hoc network. Swarm Intelligence (SI) is an emerging field that has recently drawn the attention of many researchers working on network routing. SI concerns complex behaviors that emerge from simple individual behaviors and interactions, as commonly observed in nature among social insects such as ants and bees, and in fish: individuals have little intelligence and follow basic rules using local information obtained from their surroundings. In this paper, traditional and SI-based routing protocols are discussed and performance comparisons are given, with the help of output graphs, for routing protocols such as Ant-based AODV (AAODV) and Honeybee-based AODV (HAODV) using the NS-2 simulation tool. The performance of the routing protocols is studied in terms of parameters such as throughput and packet delivery ratio (PDR).
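The two reported metrics are typically computed from a simulation trace; a tiny sketch (the event tuples are invented, only loosely modeled on NS-2 trace records) shows how:

```python
# Computing packet delivery ratio (PDR) and throughput from trace events.
# The tuples below are illustrative: (time_s, event, packet_id, size_bytes).

events = [
    (0.10, "sent", 1, 512), (0.12, "recv", 1, 512),
    (0.20, "sent", 2, 512), (0.25, "recv", 2, 512),
    (0.30, "sent", 3, 512),                          # packet 3 is lost
]

sent = sum(1 for e in events if e[1] == "sent")
recv = [e for e in events if e[1] == "recv"]

pdr = len(recv) / sent                               # packet delivery ratio
duration = max(t for t, *_ in events) - min(t for t, *_ in events)
throughput_bps = sum(size * 8 for *_, size in recv) / duration

print(f"PDR = {pdr:.2f}, throughput = {throughput_bps:.0f} bit/s")
```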
IJEIR, 2012
Storage replication is one of the backbones of networked environments. While many forms of Network Attached Storage (NAS), Storage Area Networks (SAN), and other network storage exist, there is a need for a reliable storage-replication technique between distant sites (> 1 km). Such technology sets new standards and removes the shortcomings of network failover and failback systems for virtual servers, specifically the growing storage needs of effective disaster recovery (DR) planning. The purpose of this manuscript is to identify growing technologies, such as IP SAN, that support remote storage-cluster replication for virtual servers. This study of cluster replication analyzes improvements to the uptime and availability of a SAN. For higher levels of availability, mirrored images maintained in Active/Active cluster mirroring can provide a system with no single point of failure, designed to improve the overall uptime of the storage system for organizations with 7x24x365 requirements.
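The "no single point of failure" property of an Active/Active pair can be illustrated with a toy failover model (hypothetical names throughout; no real SAN API is used):

```python
# Illustrative Active/Active mirror pair: either node can serve I/O,
# so one failure leaves no single point of failure.

class MirrorNode:
    def __init__(self, name):
        self.name = name
        self.up = True
        self.blocks = {}

    def write(self, lba, data):
        self.blocks[lba] = data

    def read(self, lba):
        return self.blocks.get(lba)

class ActiveActiveMirror:
    def __init__(self, a, b):
        self.nodes = [a, b]

    def write(self, lba, data):
        # Writes go to every live node so both copies stay current.
        for n in self.nodes:
            if n.up:
                n.write(lba, data)

    def read(self, lba):
        # Reads succeed as long as at least one node is up.
        for n in self.nodes:
            if n.up:
                return n.read(lba)
        raise IOError("both mirror nodes down")

pair = ActiveActiveMirror(MirrorNode("site-A"), MirrorNode("site-B"))
pair.write(0, b"data")
pair.nodes[0].up = False       # simulate losing one site
print(pair.read(0))            # still served: b'data'
```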
Smart Innovation, Systems and Technologies, 2023
Log-based analysis and troubleshooting has remained the prevalent and commonly used approach for centralized and time-sharing systems. However, for parallel and distributed systems, where happened-before relations between events are not directly available, it becomes a challenge to depend fully on log-based analysis. This article provides solutions using log-based performance analysis of a centralized system, demonstrates the results and their effectiveness, and presents the challenges and proposed solutions for performance analysis in distributed and parallel systems.
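A standard remedy for the missing happened-before relation (a textbook technique, not claimed to be this article's method) is to stamp log events with Lamport logical clocks, as in this sketch:

```python
# Lamport logical clocks: recovering a causally consistent order for
# log events across processes without synchronized wall clocks.

class Process:
    def __init__(self, name):
        self.name = name
        self.clock = 0
        self.log = []

    def local_event(self, what):
        self.clock += 1
        self.log.append((self.clock, self.name, what))

    def send(self, what):
        self.clock += 1
        self.log.append((self.clock, self.name, f"send {what}"))
        return self.clock, what

    def receive(self, stamped):
        ts, what = stamped
        self.clock = max(self.clock, ts) + 1   # merge the sender's clock
        self.log.append((self.clock, self.name, f"recv {what}"))

p, q = Process("P"), Process("Q")
p.local_event("write cache")
q.receive(p.send("flush"))                     # send happens-before receive
merged = sorted(p.log + q.log)                 # causally consistent order
print(merged)
```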
Journal of Information and Optimization Sciences
Multi-core design intends to serve a large market with user-oriented, high-productivity management, as opposed to other parallel systems. Small numbers of processors, a frequent feature of current multi-core systems, are ideal for future generations of CPUs, where automated parallelization succeeds on shared-address-space architectures. The multi-core compiler optimization platform CETUS (a high-level-to-high-level compiler) initiates automatic parallelization of compiled programs. The compiler's infrastructure is built with C programs in mind and is user-friendly and simple to use. It offers the significant parallelization passes along with the underlying enabling techniques, and it allows source-to-source conversions. The compiler has undergone numerous benchmark investigations and implementation iterations and can enhance programs' parallel performance. The main drawback of advanced optimizing compilers, however, is that the...
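CETUS rewrites C loops source-to-source; as a language-neutral illustration of the same transformation (this mirrors the idea, not CETUS's actual output), here is an independent-iteration loop run serially and then in parallel:

```python
# The transformation a parallelizing compiler performs, in miniature:
# prove iterations independent, then run them on a worker pool.

from multiprocessing import Pool

def body(i):
    # Each iteration depends only on i, so iterations are independent --
    # exactly the property the compiler must prove before parallelizing.
    return i * i

def serial(n):
    return [body(i) for i in range(n)]

def parallel(n):
    with Pool() as pool:
        return pool.map(body, range(n))

if __name__ == "__main__":
    assert serial(1000) == parallel(1000)  # same result, parallel schedule
    print("parallelized loop matches serial loop")
```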
Images in the real world are subject to various forms of degradation during capture, acquisition, storage, transmission, and reproduction. Images are everywhere in our daily life, not only because the image is a widely used medium of communication but also because it is an easy and compact way to represent the physical world. The processing of digital images with the help of digital computers is known as digital image processing. One of the most applicable aims of image-processing methods is to enhance pictorial information for human perception. Image restoration is a method of cleaning a degraded image to recover the original image, and for years researchers have been developing new techniques that can restore the original image from its degraded version. The aim of this paper is to demonstrate the different types of techniques for image restoration.
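The standard degradation model behind restoration techniques of this kind is g = h * f + n (blur plus additive noise); the sketch below degrades a synthetic image and restores it with a simple regularized (Wiener-like) inverse filter. The kernel and constants are illustrative, not drawn from the paper:

```python
# Degrade-then-restore in the frequency domain with numpy.

import numpy as np

f = np.zeros((64, 64)); f[24:40, 24:40] = 1.0       # synthetic "image"

h = np.zeros_like(f); h[:3, :3] = 1.0 / 9.0         # 3x3 box-blur kernel
H = np.fft.fft2(h)

g = np.fft.ifft2(np.fft.fft2(f) * H).real           # blurred image
g += 0.01 * np.random.randn(*g.shape)               # additive noise n

k = 0.01                                            # regularization constant
F_hat = np.fft.fft2(g) * np.conj(H) / (np.abs(H) ** 2 + k)  # Wiener-like
f_hat = np.fft.ifft2(F_hat).real

print("restoration error:", np.abs(f - f_hat).mean())
```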
Communications in Computer and Information Science, 2018
Software effort estimation is an onerous but inevitable task that project managers have to perform. Project managers often face the dilemma of selecting an estimation approach whenever a new project opportunity comes along. Estimation is required not only for setting a price and for bidding rounds but also for planning, budgeting, staffing, and scheduling project tasks. This paper reviews the major cost-estimation techniques relevant in the current scenario. The primary conclusion is that all estimation approaches have advantages and disadvantages and are often complementary in their characteristics; observing and evaluating several approaches can be insightful and can help in selecting the estimation technique, or combination of techniques, best suited to a particular project.
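Among the model-based estimators such reviews cover, the basic COCOMO form is Effort = a x KLOC^b person-months; the snippet below uses Boehm's published coefficients for the "organic" project class as a worked example:

```python
# Basic COCOMO: effort in person-months from size in KLOC.
# a=2.4, b=1.05 are Boehm's coefficients for organic-mode projects.

def cocomo_basic(kloc, a=2.4, b=1.05):
    return a * kloc ** b

print(f"10 KLOC organic project: {cocomo_basic(10):.1f} person-months")
```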
The processing of digital images with the help of digital computers is known as digital image processing. One of the most applicable aims of image-processing methods is to enhance pictorial information for human perception. Image restoration is a method of cleaning a degraded image to recover the original image, and for years researchers have been developing new techniques that can restore the original image from its degraded version. The aim of this paper is to demonstrate the implementation of different types of image-restoration techniques in MATLAB. MATLAB is a very powerful tool for image processing because it supports all common image formats and conversion between them, as well as all the relevant data types.
2019 2nd International Conference on Power Energy, Environment and Intelligent Control (PEEIC), 2019
Algorithms for Intelligent Systems, 2020
Advances in Intelligent Systems and Computing, 2021
Online reviews are helpful when designing a product, and products motivate consumers to post reviews. Whether a customer is satisfied with a purchased product can be understood from the review he or she posts: if the review is positive and recommends the product to other users, the customer is satisfied; if a negative review or comment is made and a warning experience is posted, the customer is dissatisfied with the purchase. Machine learning (ML) is among the fastest-growing areas of computer science, with numerous applications in many fields. Machine-learning tools can self-learn patterns, giving them the ability to adapt and learn. As the amount of data increases and becomes more easily available, smart analysis of data is often considered a key element of progressing technology. Users frequently make errors during analysis when relationships are established among multiple features. The process begins with preprocessing techniques for feature selection, which include stop-word elimination, tokenization, stemming, and lowercasing. In this paper two techniques are used: the Naïve Bayes (NB) algorithm and optimized feature selection with Naïve Bayes. The paper compares the accuracy, recall, precision, and F-measure of the NB and optimized-feature-selection NB text-classification algorithms on online product reviews to predict a product's trend in the market. Optimized feature selection plays an important part when working with data-mining algorithms: it can reduce the burden on the processor because the feature vector space is reduced. The results are discussed in detail, comparing the algorithms' performance metrics.
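A compact sketch of the pipeline described above — preprocessing, a bag-of-words model, Naïve Bayes, and the reported metrics — is shown below. The six reviews are toy data and scikit-learn is assumed available; this is an illustration of the general approach, not the paper's exact setup:

```python
# Naive Bayes review classification with basic preprocessing and metrics.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.metrics import precision_recall_fscore_support, accuracy_score

reviews = ["great product, works well", "terrible, broke in a week",
           "love it, highly recommend", "waste of money, avoid",
           "excellent quality and fast delivery", "poor build, disappointed"]
labels = [1, 0, 1, 0, 1, 0]      # 1 = satisfied, 0 = dissatisfied

# CountVectorizer lower-cases, tokenizes, and drops stop words; stemming
# (also used in the paper) would need an extra step, e.g. NLTK's PorterStemmer.
model = make_pipeline(CountVectorizer(stop_words="english"), MultinomialNB())
model.fit(reviews, labels)

pred = model.predict(reviews)
p, r, f, _ = precision_recall_fscore_support(labels, pred, average="binary")
print(f"accuracy={accuracy_score(labels, pred):.2f} "
      f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```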