International Journal of Engineering and Computer Science (IJECS)
www.ijecs.in | Vol. 5, Issue 4 | International Journal of Engineering and Computer Science (IJECS)
In the study of human-computer interaction (HCI), the design and use of computer technology focuses on the interfaces between people and computers, and research on HCI for visually impaired people is ongoing. This work builds on the thematic study "Blind and Visually Impaired People: Human-Computer Interaction and Access to Graphics", which surveys current research toward solutions for impaired people and brings together researchers and practitioners. Here we present one method that can help visually impaired people: recognising clothing patterns. Choosing clothing is a challenging task for the visually impaired. We developed a camera-based model that detects clothing patterns, categorised into five types (plaid, striped, patternless, horizontal-vertical, and irregular), and identifies 11 clothing colours. The system integrates a camera, a microphone, and a Bluetooth earpiece; its output is delivered as an audio signal. To recognise clothing patterns, we apply the Hough line transform for pattern detection and the Canny detector for edge detection, and we evaluate the method on the CCNY Clothing Pattern dataset as well as other pattern datasets; evaluation against further performance measures is ongoing. The images are captured using the OpenCV library. Such a system could support more independence in a blind person's daily life.
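As an illustration of the line-detection step described above, here is a minimal from-scratch Hough transform in Python. The system itself uses OpenCV's Canny and Hough implementations; this sketch only shows the voting idea, and all names and thresholds are our own:

```python
import numpy as np

def hough_peak_angle(edge_map, n_theta=180):
    """Vote each edge pixel into (rho, theta) bins and return the angle
    (in degrees) of the dominant line. A from-scratch sketch; the system
    itself uses OpenCV's Canny and Hough implementations."""
    ys, xs = np.nonzero(edge_map)
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.hypot(*edge_map.shape)) + 1
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for x, y in zip(xs, ys):
        # Normal form of a line: rho = x*cos(theta) + y*sin(theta).
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    _, theta_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return int(theta_idx)

# Edge map of horizontal stripe boundaries: rows of edge pixels.
edges = np.zeros((50, 50), dtype=bool)
edges[10, :] = True
edges[30, :] = True
print(hough_peak_angle(edges))  # → 90 (a horizontal line has a vertical normal)
```

A strong peak near 0° or 90° suggests a striped or horizontal-vertical pattern, while many weak, scattered peaks suggest an irregular one.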
In the cloud, to achieve access control and keep data confidential, data owners can adopt attribute-based encryption to encrypt the stored data. Users with limited computing power, however, are likely to delegate the bulk of the decryption task to cloud servers to reduce computing cost; as a result, attribute-based encryption with delegation has emerged. Still, caveats and open questions remain in previous work. For instance, during delegation the cloud servers could tamper with or replace the delegated ciphertext and return a forged computing result with malicious intent. They may also cheat eligible users by telling them they are ineligible, for the purpose of cost saving. Furthermore, the access policies used during encryption may not be flexible enough. Since a policy for general circuits achieves the strongest form of access control, our work considers a construction realizing circuit ciphertext-policy attribute-based hybrid encryption with verifiable delegation. In such a system, combined with verifiable computation and an encrypt-then-MAC mechanism, data confidentiality, fine-grained access control, and the correctness of the delegated computing results are guaranteed at the same time. Our scheme achieves security against chosen-plaintext attacks under the k-multilinear Decisional Diffie-Hellman assumption, and an extensive simulation campaign confirms the feasibility and efficiency of the proposed solution.
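The encrypt-then-MAC mechanism mentioned above can be sketched as follows. This is a toy illustration, not the paper's construction: the SHA-256 counter-mode keystream stands in for a real cipher, and all names are our own. The point is the ordering: the tag covers the ciphertext, so a tampered delegated result is rejected before any decryption happens.

```python
import hashlib
import hmac
import os

def _keystream(key, nonce, n):
    # Toy keystream: SHA-256 in counter mode. Illustrative only; a real
    # deployment would use an AES mode from a vetted library.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_then_mac(enc_key, mac_key, plaintext):
    # Encrypt first, then MAC the (nonce, ciphertext) pair.
    nonce = os.urandom(16)
    ks = _keystream(enc_key, nonce, len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, ks))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def verify_and_decrypt(enc_key, mac_key, nonce, ct, tag):
    # Check the MAC before decrypting: a tampered or forged result
    # (e.g. from a cheating server) is rejected outright.
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("forged or tampered ciphertext")
    ks = _keystream(enc_key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

nonce, ct, tag = encrypt_then_mac(b"enc-key", b"mac-key", b"attack at dawn")
print(verify_and_decrypt(b"enc-key", b"mac-key", nonce, ct, tag))  # → b'attack at dawn'
```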
Quiz competitions are increasingly common in our day-to-day life, at both the institute and the commercial level. To obtain accurate results, a Fastest Finger First (FFF) system is used to measure the players' response times; in earlier days it was tedious to determine who had pressed the buzzer first. Our solution to this problem uses the PIC16F877A microcontroller. The project is designed as a product, and it also has applications at the industrial level for security purposes.
The explosion of data and Internet devices has led to the rapid emergence of Big Data in the recent past. The service industry, a major consumer of Big Data applications, will see real change to its delivery processes and gain new insights into usage patterns and workflows, which in turn will support new global delivery models incorporating new technologies and the worldwide distribution of work. The service industry will use Big Data for decision-making information systems and for making workflows more optimal. Just as mass production led to the Industrial Revolution, Big Data is expected to drive new forms of economic activity in the service industry with connected human capital, reaching new levels of economic activity, innovation, and growth.
Using cloud storage, users store their data in the cloud without the burden of local storage and maintenance, and obtain services and high-quality applications from a shared pool of configurable computing resources. Cryptography is probably the most important aspect of communications security and is becoming increasingly important as a basic building block for computer security. Since data sharing is an important functionality in cloud storage, this paper shows how to share data securely, efficiently, and flexibly with others: a cloud-validation-based flexible distributed migration scheme using ciphertext with aggregate-key encryption for data stored in the cloud. The scheme provides secure data storage and retrieval, and additionally hides the access policy to protect the user's identity. It is powerful because it combines aggregate encryption and string-matching algorithms in a single scheme; it detects any change made to the original file and, if one is found, corrects the error. The algorithms used are simple, so large volumes of data can be stored in the cloud without problems, and the security, authentication, and confidentiality are comparable to centralized approaches. The scheme produces a set of constant-size ciphertexts such that efficient delegation of decryption rights over any set of ciphertexts is possible.
Visual Cryptography (VC) is a cryptographic technique that allows visual information to be encrypted in such a way that decryption can be performed by the human visual system (HVS), without the help of computers. A Visual Cryptography Scheme (VCS) eliminates complex computation in the decryption process: the secret image is restored simply by stacking the shares. This property makes VC especially useful where a low computational load is required. During encryption the image is encrypted and divided into two shares; superimposing the two shares reveals the secret image during decryption. The objective of our project is to obtain a better-quality decrypted image at the same size as the original. OR-based VCS degrades contrast because of its monotone property, whereas in an XOR-based scheme the share images are superimposed with the XOR operation, which yields perfect reconstruction. Hence the XOR operation is proposed in the decoding process to enhance contrast and quality and to reduce noise.
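The XOR-based reconstruction is easy to demonstrate. The following sketch of a (2, 2) scheme (variable names and sizes are illustrative) shows why XOR stacking recovers the secret exactly, while OR stacking is monotone and therefore lossy:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_shares(secret):
    # (2, 2) XOR-based scheme: share1 is a uniformly random binary mask,
    # share2 = secret XOR share1. Each share alone is random noise.
    share1 = rng.integers(0, 2, size=secret.shape, dtype=np.uint8)
    share2 = secret ^ share1
    return share1, share2

secret = np.array([[1, 0, 1],
                   [0, 1, 0]], dtype=np.uint8)
s1, s2 = make_shares(secret)
print(np.array_equal(s1 ^ s2, secret))  # XOR stacking: exact  → True
print(np.array_equal(s1 | s2, secret))  # OR stacking: monotone, loses contrast
```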
Data mining plays an essential role in discovering useful patterns hidden in large databases. The Apriori algorithm finds frequent itemsets in large databases by bottom-up generation of itemset combinations. Frequent-itemset mining, however, may discover a large number of frequent but low-revenue itemsets while losing information on valuable itemsets that sell infrequently. High-utility itemset mining instead identifies itemsets whose utility satisfies a given threshold, allowing users to quantify the usefulness or preference of items with different values. A high-utility itemset that is not included in another itemset having the same support is called a closed high-utility itemset. Our high-utility itemset mining uses the Apriori algorithm, taking as input the frequent itemsets from the transactional database together with profit and price, and producing the high-utility itemsets as output. To mine the closed high-utility itemsets, the system uses an efficient depth-first search algorithm named CHUD.
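A brute-force sketch of the utility measure may clarify the difference from frequency-based mining. CHUD itself is an optimised depth-first search; the function, data, and thresholds below are purely illustrative:

```python
from itertools import combinations

def high_utility_itemsets(transactions, profit, min_util):
    """Brute-force sketch of high-utility itemset mining. Utility of an
    itemset = sum, over transactions containing every item of the set,
    of quantity * unit profit for those items."""
    items = sorted(profit)
    result = {}
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            u = sum(sum(t[i] * profit[i] for i in combo)
                    for t in transactions if all(i in t for i in combo))
            if u >= min_util:
                result[combo] = u
    return result

# Each transaction maps item -> purchased quantity; profit is per unit.
transactions = [{"a": 2, "b": 1}, {"a": 1, "c": 3}, {"b": 2, "c": 1}]
profit = {"a": 5, "b": 2, "c": 1}
print(high_utility_itemsets(transactions, profit, min_util=15))  # → {('a',): 15}
```

Note that item "c" is the most *frequent* single item across transactions with quantity 4, yet its utility (4) is far below that of "a" (15): exactly the low-revenue-but-frequent case the abstract describes.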
Wireless sensor networks (WSNs) consist of a distributed set of sensor nodes that cooperatively monitor physical or environmental conditions and send their information to a "sink" node over multi-hop wireless communication links. Depletion of the sensor nodes' battery energy significantly limits the network lifetime of a WSN, so most researchers have aimed to design energy-aware routing protocols that minimize battery usage to prolong network lifetime. This paper proposes a sink relocation approach for efficient utilization of the sensors' battery energy, called the Energy Efficient Sink Relocation Scheme (EE-SRS), which regulates the transmission coverage range depending on the residual energy of a sensor node. We present the EE-SRS algorithm for finding the optimal place to relocate the sink, answering "when and where to relocate the sink". The algorithm is developed and simulated using a network simulator, and its performance is analyzed in terms of network lifetime, throughput, and packet delay.
Moving object detection and tracking are among the most important and challenging tasks in video surveillance and computer vision applications. Object detection is the procedure of finding the non-stationary entities in an image sequence, and is the first step towards tracking a moving object in video; object representation is the next important step. Tracking is the method of identifying the position of the moving object in the video, which is a much more challenging task than detecting it. Object tracking is applied in numerous applications such as robot vision, traffic monitoring, video surveillance, video in-painting, and simulation. Here we present a brief review of the available object detection, object classification, and object tracking algorithms.
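As a minimal example of the detection step such reviews cover, frame differencing flags pixels that change between consecutive frames. The reviewed literature includes far more robust methods (background modelling, optical flow); this sketch, with illustrative names and thresholds, only shows the simplest case:

```python
import numpy as np

def moving_mask(prev_frame, frame, threshold=25):
    """Frame differencing: mark pixels whose intensity changed by more
    than `threshold` between consecutive grayscale frames as belonging
    to a moving (non-stationary) object."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

prev = np.zeros((4, 4), dtype=np.uint8)   # empty scene
curr = prev.copy()
curr[1:3, 1:3] = 200                      # a bright object appears
print(moving_mask(prev, curr).sum())      # → 4 changed pixels
```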
In this paper, various techniques used for customer relationship management (CRM) in data mining are defined and compared with each other. Data mining is a useful and powerful tool for any organization, especially for marketing people, and is widely used in managing relationships with customers. The data mining process can also be extremely useful for medical practitioners in extracting hidden medical knowledge: without data mining techniques, it would be impossible for traditional pattern matching and mapping strategies to be as effective and precise in prognosis or diagnosis.
With the increase in the amount of data and information one has to deal with nowadays, going through all the documents is a time-consuming process. We are implementing an Android application that helps organizations such as law firms manage hundreds of documents and obtain summaries of them. We also use the concept of ontology for this application; an ontology is essentially the set of relationships between entities. The application allows users to search for files in the database, upload files, and summarize multiple documents.
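The abstract does not specify the summarization method, so as an illustration only, a naive extractive summarizer can sketch the idea: score each sentence by the frequency of its words across the document and keep the top-scoring sentences in their original order:

```python
import re
from collections import Counter

def summarize(text, n_sentences=1):
    """Naive extractive summarizer (illustrative; not the application's
    actual method): rank sentences by summed word frequency, keep the
    top n in original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    ranked = sorted(range(len(sentences)),
                    key=lambda i: -sum(freq[w]
                                       for w in re.findall(r"\w+", sentences[i].lower())))
    keep = sorted(ranked[:n_sentences])   # restore document order
    return " ".join(sentences[i] for i in keep)

print(summarize("Cats sleep. Cats purr and cats play.", 1))  # → Cats purr and cats play.
```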
An orifice plate is a mechanical element used for measuring flow rate by restricting the flow; hence it is often called a restriction plate. The flow exerts a force on the plate due to the impact of the jet, and the orifice plate acts as an obstacle to the flow. In this work we performed static analysis for three different orifice geometries, keeping the net impact area and the orifice area the same in all three cases. We then calculated the maximum stress and maximum deformation for all three geometries under the assumed working conditions, and identified the best geometry as the one with the minimum stress and minimum deformation.
Gate Diffusion Input (GDI) provides one of the most effective alternatives for low-power VLSI applications. With GDI, circuits that would otherwise require a large number of transistors can be realized with far fewer, which tends to optimize performance parameters such as area, delay, and power dissipation. In this paper the GDI cell is applied to realize various combinational circuits, and a novel design (an XOR circuit using only nMOS transistors) is proposed for low power consumption in digital circuits. Based on simulation results, the waveforms are analyzed and various performance parameters are calculated, then compared against standard CMOS logic. The schematic and layout are drawn on a 120 nm technology file in the Dsch tool, and the analysis is done with the Microwind 3.1 tool and the BSIM simulator.
Numerical integration is an important ingredient in many techniques of applied mathematics, engineering, and scientific applications, owing to the need for accurate and efficient integration schemes over complex domains with arbitrary functions as integrands. In this paper we propose a method to discretise a physical domain in the shape of a linear polyhedron into an assemblage of hexahedral finite elements. The idea is to generate a coarse mesh of tetrahedra for the given domain, divide each of these tetrahedra further into a refined mesh of tetrahedra if necessary, and finally divide each tetrahedron into four hexahedra; each of these hexahedra can in turn be subdivided into further hexahedra. This generates an all-hexahedral finite element mesh that can be used for various applications. To achieve this, we first establish a relation between an arbitrary linear tetrahedron and the standard tetrahedron, and then decompose the standard tetrahedron into four hexahedra. We transform each of these hexahedra into a 2-cube and find the interesting fact that the Jacobians of these transformations are the same, and that the transformations themselves are the same up to the order of coordinates, for all four hexahedra. This fact can be used with great advantage to generate a numerical integration scheme for the standard tetrahedron and hence for an arbitrary linear tetrahedron. We propose three numerical schemes that decompose an arbitrary linear tetrahedron into successively refined meshes of hexahedra, and apply them to typical integrals over a unit cube and an irregular heptahedron using Gauss-Legendre quadrature rules. Matlab codes are developed and appended to this paper.
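The Gauss-Legendre quadrature applied on each hexahedral element is a tensor-product rule. The paper's codes are in Matlab; as a language-neutral sketch (in Python, with illustrative names), here is the rule over the unit cube, with nodes mapped from [-1, 1] to [0, 1]:

```python
import numpy as np

def gauss_legendre_cube(f, n=4):
    """Tensor-product n^3-point Gauss-Legendre rule over the unit cube
    [0,1]^3. Nodes and weights come from numpy's leggauss on [-1,1],
    mapped to [0,1] with Jacobian 1/2 per axis."""
    x, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (x + 1.0)   # map nodes  [-1, 1] -> [0, 1]
    w = 0.5 * w           # scale weights by the Jacobian 1/2
    total = 0.0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                total += w[i] * w[j] * w[k] * f(x[i], x[j], x[k])
    return total

# Exact value of the integral of x*y*z over the unit cube is 1/8.
print(round(gauss_legendre_cube(lambda x, y, z: x * y * z), 10))  # → 0.125
```

The same rule is applied element by element: each hexahedron is mapped to the cube, and the element's Jacobian multiplies the integrand, which is why the identical-Jacobian property noted above is so convenient.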
For a PV array, system monitoring is important for analyzing stability and performance. A simple monitoring system involves data logging with wired cables for transmitting the data. To remove the drawbacks observed in the existing system, this work designs a high-precision wireless monitoring system for photovoltaic arrays. It measures the basic PV array characteristics, namely module temperature (T), open-circuit voltage (Voc), and short-circuit current (Isc), and wirelessly transmits the data to a real-time GUI on a computer. The GUI was developed using the Processing software. The commercially available WPAN hardware module, ZigBee, is used for the implementation, with the API protocol for exchanging information. A sensor node with an XBee module and a set of analog sensors (eliminating the use of controllers at the sensor node) measures the current and voltage generated by the PV array, and a coordinator node with an Atmel microcontroller and an XBee module is connected to a PC to analyze the parameters.
In the era of globalization, the Internet plays a vital role in all spheres of life and industry. The Internet is very popular nowadays for providing people with services in many different fields; it is a versatile facility that can help complete many tasks easily and conveniently with a few clicks, whether everyday chores or specific services that would otherwise require a lot of research and formalities, and marketing is no exception. Online marketing, also called Internet marketing, involves the use of interactive virtual spaces to promote and sell goods and services. In fact, new synchronous, Internet-based communication technologies have contributed to the restructuring of major economic sectors, including marketing. Being cost-effective, flexible, and fast, and enjoying an unprecedented global reach, Internet marketing has brought different businesses incredible gains; at the same time, issues such as security and privacy have emerged in the field of marketing from the use of virtual space.
In this work, a design procedure for PID and a modified form of PID (I-PD) controllers is proposed for highly non-linear benchmark electromechanical systems discussed in the literature, namely the Vehicle Active Suspension System (VASS) and the Magnetic Suspension System (MSS). The controller design is implemented using the most successful heuristic algorithms: Particle Swarm Optimization (PSO), Bacterial Foraging Optimization (BFO), and the Firefly Algorithm (FA). A weighted sum of objectives comprising the overshoot (Mp), settling time (Ts), integral square error (ISE), and integral absolute error (IAE) is chosen to guide the heuristic search for the controller parameters Kp, Ki, and Kd. The major aim of the work is to compare the performance of the considered heuristic algorithms on the controller design problem. The simulation work is implemented in Matlab, and the performance is validated using the Mp, Ts, ISE, and IAE values for reference tracking and disturbance rejection operations. The study confirms that FA offers faster convergence than the PSO and BFO algorithms.
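A bare-bones PSO of the kind used to search for (Kp, Ki, Kd) can be sketched as follows. The toy quadratic cost stands in for the paper's weighted sum of Mp, Ts, ISE, and IAE (evaluating those requires simulating the closed loop), and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def pso(cost, dim=3, n_particles=20, iters=60, lo=0.0, hi=10.0):
    """Minimal Particle Swarm Optimization: each particle tracks its
    personal best; all particles are attracted to the global best."""
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_cost = np.apply_along_axis(cost, 1, x)
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        c = np.apply_along_axis(cost, 1, x)
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

# Toy cost whose optimum plays the role of the ideal gains (2, 4, 1).
target = np.array([2.0, 4.0, 1.0])
gains = pso(lambda g: float(np.sum((g - target) ** 2)))
print(np.round(gains, 2))
```

In the paper's setting, the lambda would be replaced by a function that simulates the VASS or MSS closed loop with the candidate gains and returns the weighted Mp/Ts/ISE/IAE sum.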
This paper proposes a new overlay-network architecture for distributed hash tables (DHTs). We introduce the MultiChord protocol, a variant of the Chord protocol defined over a DHT overlay network. MultiChord inherits the basic properties of the Chord protocol with some added new features.
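Chord's key-placement rule, which MultiChord inherits, can be sketched with consistent hashing. The identifier-space size and all names below are illustrative:

```python
import hashlib
from bisect import bisect_left

M = 2 ** 16  # identifier space 2^m with m = 16 (illustrative)

def node_id(name):
    # Hash a node or key name onto the Chord identifier ring.
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % M

def successor(ring, key):
    # Chord's placement rule: a key belongs to the first node whose ID
    # is >= the key's ID, wrapping around the ring.
    ids = sorted(ring)
    i = bisect_left(ids, key % M)
    return ids[i % len(ids)]

nodes = [node_id(f"node{i}") for i in range(4)]
key = node_id("some-file.txt")
print(successor(nodes, key) in nodes)  # → True
```

Chord resolves this successor in O(log N) hops via finger tables; a MultiChord-style variant layers extra routing state on the same ring to cut lookup hops further.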
In the study of Human-computer-interaction (HCI) the design and use of technology using digitaliz... more In the study of Human-computer-interaction (HCI) the design and use of technology using digitalized computer systems mainly focusing on the particular interfaces between people and computers. There is an ongoing research that taking place till today using Human-computer-interactions especially on visually impaired people. This system mainly introduces the thematic study on " Blind and visually impaired people Human computer and access to Graphics " represents a current research study towards solution for impaired people and brings together a new researchers and practitioners. Here, we are approaching one of the methods which can be useful for the visually impaired people in the form of recognising the clothing patterns. Choosing clothing pattern is one of the challenging tasks for visually impaired people. We matured a camera-based model to notice the clothing patterns. The clothing patterns are categories as five types like (plaid, striped, pattern less, horizontal-vertical, irregular etc) and it identifies 11 clothing colors. The system mainly integrates with the microphone, camera, Bluetooth, earpiece for audio signal. The output of our system is given by audio signal. To recognize clothing patterns, we propose a Hough line Transformation for the detection of pattern and canny detection for detection of edges in the clothing pattern. we proposed the CCNY Clothing Pattern dataset and other different pattern datasets to our method. Using various other performances our method is under the study. In this project we are using OpenCV library for capture the images. Thought such a system would support more independence in blind person's daily life.
—In the cloud, for achieving access control and keeping data confidential, the data owners could ... more —In the cloud, for achieving access control and keeping data confidential, the data owners could adopt attribute-based encryption to encrypt the stored data. Users with limited computing power are however more likely to delegate the mask of the decryption task to the cloud servers to reduce the computing cost. As a result, attribute-based encryption with delegation emerges. Still, there are caveats and questions remaining in the previous relevant works. For instance, during the delegation, the cloud servers could tamper or replace the delegated ciphertext and respond a forged computing result with malicious intent. They may also cheat the eligible users by responding them that they are ineligible for the purpose of cost saving. Furthermore, during the encryption, the access policies may not be flexible enough as well. Since policy for general circuits enables to achieve the strongest form of access control, a construction for realizing circuit ciphertext-policy attribute-based hybrid encryption with verifiable delegation has been considered in our work. In such a system, combined with verifiable computation and encrypt-then-mac mechanism, the data confidentiality, the fine-grained access control and the correctness of the delegated computing results are well guaranteed at the same time. Besides, our scheme achieves security against chosen-plaintext attacks under the k-multilinear Decisional Diffie-Hellman assumption. Moreover, an extensive simulation campaign confirms the feasibility and efficiency of the proposed solution.
In our day today life quiz competition is rapidly increasing. So to get the appropriate results F... more In our day today life quiz competition is rapidly increasing. So to get the appropriate results Fastest finger first (FFF) is used to know the players respond time. It is rapidly used in institute level as well as commercial level. In early days it is very tedious work to know who has buzzer the alarm first in fastest finger first. So the solution for this problem is to use PIC microcontroller 16F877A. This project is designed as a product based. Also it has a vast impact on industrial level for security purpose.
The huge blast of information and Internet gadgets has prompted fast approach of Big Data in late... more The huge blast of information and Internet gadgets has prompted fast approach of Big Data in later past. Administration industry which is a noteworthy client for these Big Data applications will prompt real change to the conveyance process and new bits of knowledge into utilization example and work processes, which thusly will help with new worldwide conveyance models incorporating new innovations and dispersion of work comprehensively. The Service Industry will utilize Big Data for different choices making information framework and making the work process more ideal. The idea of large scale manufacturing lead to Industrial Revolution, likewise Big Data is relied upon to drive new types of financial movement in Service industry with connected human capital, achieving new level of monetary action, development, and development.
Using the cloud storage, users store their data on the cloud without the burden of data storage a... more Using the cloud storage, users store their data on the cloud without the burden of data storage and maintenance and services and high-quality applications from a shared pool of configurable computing resources. Cryptography is probably the most important aspect of communications security and is becoming increasingly important as a basic building block for computer security. As data sharing is an important functionality in cloud storage, In this paper we show that how to securely, efficiently and flexibly share data with others in cloud storage, Cloud-validation based Flexible Distributed, Migration, ciphertext with aggregate key encryption for data stored in cloud. This scheme provides secure data storage and retrieval. Along with the security the access policy is also hidden for hiding the user's identity. This scheme is so powerful since we use aggregate encryption and string matching algorithms in a single scheme. The scheme detects any change made to the original file and if found clear the error's. The algorithm used here are very simple so that large number of data can be stored in cloud without any problems. The security, authentication, confidentiality are comparable to the centralized approaches. A set of constant-size cipher texts such that efficient delegation of decryption rights for any set of cipher texts is possible the best.
—Visual Cryptography (VC), a cryptographic technique which allows visual information to be encryp... more —Visual Cryptography (VC), a cryptographic technique which allows visual information to be encrypted in such a way that the decryption can be performed by the Human Visual System (HVS), without the help of computers. Visual Cryptography Scheme (VCS) eliminates complex computation problem in decryption process, by stacking operation we can restore the secret image. This property makes VC especially useful for the low computation load requirement. During encryption, the image is encrypted and then it is divided into two shares. Two shares are superimposed to reveal the secret image in the decryption process. The objective of our project is to get the better quality of decrypted image with the same size as the original image. The OR based VCS degrades the contrast by its monotone property. In XOR based scheme, the share images are superimposed with the help of XOR operation which results in perfect reconstruction. Hence, the XOR operation is proposed in decoding process to enhance the contrast, quality, and to reduce noise.
Data Mining plays an essential role for mining useful pattern hidden in large databases. Apriori ... more Data Mining plays an essential role for mining useful pattern hidden in large databases. Apriori algorithm is used to find frequent itemsets in large databases. Apriori is a Bottom-up generation of Frequent item set combinations. Frequent itemset mining may discover the large amount of frequent but low revenue itemsets and lose the information on the valuable itemsets having low selling frequencies. High Utility Itemset mining identifies itemsets whose utility satisfies the given threshold. It allows the users to quantify the usefulness or preferences of items using different values. A High Utility Itemset which is not included in another itemset having the same support is called Closed High Utility Itemset. Mining High utility itemsets uses Apriori algorithm which takes the Input as Frequent Itemsets from the Transactional database, profit, and price and gives the High Utility Itemsets as the Output. To mine the Closed High Utility Itemsets the system addresses an efficient Depth-First search algorithm named CHUD.
Data Mining plays an essential role for mining useful pattern hidden in large databases. Apriori ... more Data Mining plays an essential role for mining useful pattern hidden in large databases. Apriori algorithm is used to find frequent itemsets in large databases. Apriori is a Bottom-up generation of Frequent item set combinations. Frequent itemset mining may discover the large amount of frequent but low revenue itemsets and lose the information on the valuable itemsets having low selling frequencies. High Utility Itemset mining identifies itemsets whose utility satisfies the given threshold. It allows the users to quantify the usefulness or preferences of items using different values. A High Utility Itemset which is not included in another itemset having the same support is called Closed High Utility Itemset. Mining High utility itemsets uses Apriori algorithm which takes the Input as Frequent Itemsets from the Transactional database, profit, and price and gives the High Utility Itemsets as the Output. To mine the Closed High Utility Itemsets the system addresses an efficient Depth-First search algorithm named CHUD.
Wireless sensor networks are embedded with distributed set of sensor nodes that are cooperatively... more Wireless sensor networks are embedded with distributed set of sensor nodes that are cooperatively monitor physical or environmental conditions, and send their information to a " sink " node over multi hop wireless communication links. The sensor nodes battery energy depletion will significantly affect the network lifetime of a WSN. Most researchers have aimed to design energy-aware routing protocol to minimize the usage of the battery energy to prolong network lifetimes. This paper proposes a sink relocation approach for efficient utilization of sensor " s battery energy called Energy Efficient Sink Relocation Scheme (E-SRS) which considers regulation of transmission coverage range depend on the residual energy of a sensor node. The EE-SRS proposed in this paper discusses the algorithm to find the optimal place for relocating the sink, and " when and where to relocate the sink ". The EE-SRS algorithm is developed and simulated using network simulator. The performance analysis has also been done in terms of the network lifetime, throughput and packet delay.
Moving object detection and tracking are among the more important and challenging tasks in video surveillance and computer vision applications. Object detection is the procedure of finding non-stationary entities in image sequences and is the first step towards tracking a moving object in video. Object representation is the next important step. Tracking is the method of identifying the position of the moving object in the video, which is a far more challenging task than detecting it. Object tracking is applied in numerous applications such as robot vision, traffic monitoring, video surveillance, video in-painting and simulation. Here we present a brief review of the available object detection, object classification and object tracking algorithms.
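One of the simplest detection approaches covered by such surveys is frame differencing between consecutive grayscale frames. The pure-Python sketch below is an illustration of that idea only (real systems use background modelling, e.g. with OpenCV); the function name and list-of-lists frame format are assumptions for the example.

```python
def detect_motion(prev_frame, curr_frame, threshold=25):
    """Return the bounding box (top, left, bottom, right) of pixels whose
    absolute intensity change between two grayscale frames exceeds the
    threshold, or None when no motion is found."""
    height, width = len(curr_frame), len(curr_frame[0])
    rows = [r for r in range(height)
            if any(abs(curr_frame[r][c] - prev_frame[r][c]) > threshold
                   for c in range(width))]
    cols = [c for c in range(width)
            if any(abs(curr_frame[r][c] - prev_frame[r][c]) > threshold
                   for r in range(height))]
    if not rows:
        return None
    return (rows[0], cols[0], rows[-1], cols[-1])
```

The returned box is what a tracker would then associate across frames.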
In this paper various techniques used for CRM in data mining are defined and compared with each other. Data mining is a useful and powerful tool for any organization, especially for marketing people, and is used in managing relationships with customers. The data mining process can also be extremely useful for medical practitioners in extracting hidden medical knowledge; without data mining techniques it would be impossible for traditional pattern matching and mapping strategies to be as effective and precise in prognosis or diagnosis.
With the increase in the amount of data and information one has to deal with nowadays, going through all the documents is a time-consuming process. We are implementing an Android application that helps organizations such as law firms to manage hundreds of documents and to obtain summaries of these documents. We also use the concept of ontology, which is basically the relationship between entities, in this application. The application allows users to search for files in the database, upload files and summarize multiple documents.
An orifice plate is a mechanical element used for measuring the rate of flow by restricting flow, hence it is often called a restriction plate. The flow exerts some force on the plate due to the impact of the jet, as the orifice plate acts as an obstacle for the flow. In our work we performed static analysis for three different orifice geometries, keeping the net impact area and orifice area the same in all three cases. We then calculated the maximum stress and maximum deformation for all three geometries under the assumed working conditions, and identified the best geometry, the one with the minimum stress and minimum deformation.
Gate Diffusion Input (GDI) provides an effective alternative for low power VLSI applications. With GDI, circuits requiring a large number of transistors can be realized with far fewer transistors. This approach tends to optimize various performance parameters such as area, delay and power dissipation. In this paper the GDI cell is applied to realize various combinational circuits, and a novel design (an XOR circuit using only nMOS) is proposed for providing low power in digital circuits. Based on simulation results, the waveforms are analyzed and various performance parameters are calculated, then compared with standard CMOS logic. The schematic and layout are drawn on a 120nm technology file in the Dsch tool, and their analysis is done with the Microwind 3.1 tool and BSIM simulator.
Numerical integration is an important ingredient within many techniques of applied mathematics, engineering and scientific applications, owing to the need for accurate and efficient integration schemes over complex integration domains with arbitrary functions as their integrands. In this paper we propose a method to discretise a physical domain in the shape of a linear polyhedron into an assemblage of hexahedral finite elements. The idea is to generate a coarse mesh of tetrahedra for the given domain, divide each of these tetrahedra further into a refined mesh of tetrahedra if necessary, and then finally divide each tetrahedron into four hexahedra. We further demonstrate that each of these hexahedra can be subdivided into hexahedra again, generating an all-hexahedral finite element mesh which can be used for various applications. To achieve this we first establish a relation between the arbitrary linear tetrahedron and the standard tetrahedron, and then decompose the standard tetrahedron into four hexahedra. We transform each of these hexahedra into a 2-cube and discover an interesting fact: the Jacobians of these transformations are the same, and the transformations themselves are the same but in a different order for all four hexahedra. This fact can be used to great advantage to generate the numerical integration scheme for the standard tetrahedron and hence for the arbitrary linear tetrahedron. We propose three numerical schemes which decompose an arbitrary linear tetrahedron into hexahedra. These schemes are applied to solve typical integrals over a unit cube and an irregular heptahedron using Gauss-Legendre quadrature rules. Matlab codes are developed and appended to this paper.
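The Gauss-Legendre quadrature mentioned above can be illustrated on the unit cube with a tensor-product 2-point rule. This is a sketch of the quadrature idea only, under the assumption of the standard 2-point nodes; the paper's Matlab codes handle general hexahedra via the tetrahedron decomposition.

```python
import math

def integrate_unit_cube(f):
    """Tensor-product 2-point Gauss-Legendre rule over [0,1]^3;
    exact for polynomials of degree <= 3 in each variable."""
    # 2-point nodes +-1/sqrt(3) on [-1, 1], mapped to [0, 1]; each 1D
    # weight is 1, scaled by the half-length of the interval (0.5)
    nodes = [(1 - 1 / math.sqrt(3)) / 2, (1 + 1 / math.sqrt(3)) / 2]
    w = 0.5
    return sum(w ** 3 * f(x, y, z) for x in nodes for y in nodes for z in nodes)
```

For example, the rule integrates x·y·z over the unit cube exactly, giving 1/8.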
For a PV array, system monitoring is considered important for analyzing stability and performance. A simple monitoring system involves data logging with wired cables for transmitting data. Removing the drawbacks observed in the existing system, the proposed work is designed for wireless monitoring of photovoltaic cells as a high precision solar array monitoring system. It measures the basic PV array characteristics, such as module temperature (T), open circuit voltage (Voc) and short circuit current (Isc), and wirelessly transmits the data to a real-time GUI on the computer. The GUI was developed using the Processing software. The commercially available WPAN hardware module ZigBee is used for implementation, with the API protocol for exchanging information. A sensor node with an XBee and a set of analog sensors (eliminating the use of controllers at the sensor node) for measuring the current and voltage generated in the PV array has been deployed. A coordinator node with an Atmel microcontroller and an XBee is connected to a PC to analyze the parameters.
In the era of globalization the internet plays a vital role in all spheres of life and industry. The internet is very popular nowadays for providing people with services related to many different fields. It is a versatile facility which can help in completing many tasks easily and conveniently with a few clicks, whether a task of daily usage or a specific service requiring a lot of research and formalities beforehand, and marketing is no exception. Online marketing, also called internet marketing, involves the use of interactive virtual spaces for promoting and selling goods and services. In fact, new synchronous, internet-based communication technologies have contributed to the restructuring of major economic sectors, including marketing. Being cost-effective, flexible and fast, and enjoying an unprecedented global reach, internet marketing has brought different businesses incredible gains; at the same time, concerns such as security and privacy have emerged in the field of marketing from the implementation of virtual spaces.
In this work, a design procedure for PID and a modified form of PID (I-PD) controllers is proposed for highly non-linear benchmark electromechanical systems discussed in the literature, namely the Vehicle Active Suspension System (VASS) and the Magnetic Suspension System (MSS). The proposed controller design is implemented using the most successful heuristic algorithms: Particle Swarm Optimization (PSO), Bacterial Foraging Optimization (BFO) and the Firefly Algorithm (FA). A weighted sum objective function comprising the overshoot (Mp), settling time (Ts), integral square error (ISE) and integral absolute error (IAE) is chosen to guide the heuristic search for the controller parameters Kp, Ki and Kd. The major aim of the work is to compare the performance of the considered heuristic algorithms on the controller design problem. The simulation work is implemented in Matlab, and the study is validated using Mp, Ts, ISE and IAE values for reference tracking and disturbance rejection operations. This study confirms that FA offers faster convergence than the PSO and BFO algorithms.
This paper proposes a new architecture for an overlay network based on a Distributed Hash Table (DHT). We introduce a new MultiChord protocol, a variant of the Chord protocol defined over a DHT overlay network. MultiChord inherits the basic properties of the Chord protocol with some added new features.
This system is implemented for the purpose of achieving a fully automated electricity billing system. It aims to measure and monitor the electricity consumed by consumers in a locality, transmit the consumed power to the station, and issue the bill for consumed power automatically. It also aims to find malpractices in the meter. Using this system the Electricity Board can access all data regarding the power consumed in each home and in each station whenever required. From this data the Board can detect power theft; the system also offers a way to charge extra payment for excess usage of power at peak time (6.00-10.00 pm). Online payment is also possible with the new system. GSM is used for automating the system: consumed unit transmission, alerts and bill reception are achieved by the GSM module on the client side as set by the user, and the server station is likewise served by a GSM module for transmission and reception of data.
In today's scenario, files are not secure. They can be fetched through many kinds of attack by an eavesdropper, such as cracking PINs or crashing the OS with viruses and malware. We cannot be sure today that file protection wizards are secure and that data cannot reach an attacker. But if files are encrypted, then even if the files are accessed the original data remains confidential. Therefore, this paper presents a file encryption system based on symmetric key cryptography. I propose a strategy to encrypt files (multiple files can be encrypted by compressing them into one 'rar' file) using Blowfish as the encryption/decryption standard, with Cipher Block Chaining (CBC) mode to perform the operations. I implemented a compression function for the 64-bit Initialization Vector (IV), and use CBC mode with Blowfish and RC4 for a 256-bit keystream. It is more efficient and secure than other general encryption processes.
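The CBC chaining used with Blowfish works as sketched below. Since Blowfish itself is not in the Python standard library, a toy XOR block function stands in for it here purely to show the chaining and padding structure; this is an illustration of the mode, not the paper's implementation, and all function names are my own.

```python
BLOCK = 8  # Blowfish's 64-bit block size

def xor_block(block, key):
    # toy stand-in for the Blowfish block function (XOR is its own inverse)
    return bytes(b ^ k for b, k in zip(block, (key * BLOCK)[:BLOCK]))

def cbc_encrypt(plaintext, key, iv):
    """Each plaintext block is XORed with the previous ciphertext block
    (the IV for the first block) before the block cipher is applied."""
    pad = BLOCK - len(plaintext) % BLOCK        # PKCS#7-style padding
    plaintext += bytes([pad]) * pad
    prev, out = iv, b""
    for i in range(0, len(plaintext), BLOCK):
        mixed = bytes(p ^ c for p, c in zip(plaintext[i:i + BLOCK], prev))
        prev = xor_block(mixed, key)
        out += prev
    return out

def cbc_decrypt(ciphertext, key, iv):
    prev, out = iv, b""
    for i in range(0, len(ciphertext), BLOCK):
        block = ciphertext[i:i + BLOCK]
        out += bytes(p ^ c for p, c in zip(xor_block(block, key), prev))
        prev = block
    return out[:-out[-1]]                        # strip the padding
```

The chaining makes identical plaintext blocks encrypt differently, which is the property CBC adds over simple per-block encryption.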
The cloud is redefining the IT architectures of various business domains. Organizations have clearly recognized the benefits of cloud, such as dynamic payloads on the technical side and elastic financial models on the commercial side, which guarantee greater efficiency than before. The benefits of cloud computing can be maximized fully by applying novel technologies in security and risk management processes. The security risk factors in public cloud computing are much greater than in traditional, datacenter-based computing. In a highly shared ad hoc cloud environment with instance-to-instance network connectivity, the security of applications and sensitive data is a big challenge faced by the cloud providers. As the entire stack of applications, platform and infrastructure of the cloud is designed and managed by service providers, cloud users are uncertain about security. This paper studies the generic security challenges in an ad hoc cloud environment: the security challenges and mitigations are discussed in Section I, and a survey conducted with cloud users from domains such as health care, education and retail business, together with analysis of the survey data and results, is discussed in Section II. Ad hoc clouds rely on scalable virtualization technology which gives users access to a set of non-dedicated computing resources. The users of an ad hoc cloud are in a highly shared environment where resources are dynamically provisioned and organizations share the same remotely located physical hardware with strangers. Security and privacy have been the biggest challenges both for organizations already using cloud as an end-to-end solution for their IT needs and for those still considering moving into the cloud.
In an SPI model of the cloud, SaaS demands security at all levels of user identity and data access, and must maintain the integrity and continuity of the applications; in IaaS, secure networking and trusted computing should be the prime concern; whereas PaaS demands protection at the resource-management level. [1]
Wireshark is a network protocol analyser. It is able to intercept packets transmitted over the network and compile statistics about network usage, allow the user to view content that is being accessed by other network users, and store usage information for offline access. This paper presents a comparison of Wireshark with one other similar tool, NetworkMiner, a Network Forensic Analysis Tool (NFAT), based on different parameters: basic graphical user interface, packet information and traffic analysis. NetworkMiner can be used as a passive network sniffer/packet capturing tool and can parse PCAP files for offline analysis.
As in the real world, in the virtual world too there are people who want to take advantage of you by exploiting you, whether for your money, your status or your personal information. Malware helps these people accomplish their goals. The security of modern computer systems depends on the ability of users to keep software, OS and antivirus products up-to-date. To protect legitimate users from these threats, I built a tool (Advanced Dynamic Malware Analysis Using API Hooking) that informs you about every task that a piece of software (malware) is performing on your machine at run-time.
Index Terms— API Hooking, Hooking, DLL injection, Detour
It has always been an arduous task for the Election Commission to conduct free and fair polls in our country, the largest democracy in the world. Crores of rupees have been spent to make sure that elections are riot free. But nowadays it has become common for some forces to indulge in rigging, which may eventually lead to a result contrary to the actual verdict given by the people. This paper presents a new voting system employing biometrics in order to avoid rigging and to enhance the accuracy and speed of the process. The system uses the thumb impression for voter identification, since the thumb impression of every human being has a unique pattern; thus it would have an edge over present day voting systems. As a pre-poll procedure, a database consisting of the thumb impressions of all eligible voters in a constituency is created. During elections, the thumb impression of a voter is entered as input to the system and compared with the available records in the database. If the pattern matches one in the available records, access to cast a vote is granted. But if the pattern does not match the records in the database, or in case of repetition, access to cast a vote is denied and the vote is rejected; the police station near the polling booth is also informed about the identity of the impostor. All the voting machines are connected in a network through which data transfer takes place to the main host. The result is instantaneous, and counting is finally done at the main host itself. The overall cost of conducting elections is reduced, and so is the maintenance cost of the systems.
This paper presents an interactive model-based system for the management of production and control processes in a spinning mill using an embedded system and the Internet of Things. The system consists of various sensors to measure and store the different parameters from which the production rate is calculated. Apart from a comprehensive presentation of the set of modules the system is composed of, together with their interrelationships, these characteristics are analyzed and their impact on the production control system is explained. The system also covers two control processes, an air cooler controller and a moisture mixer sprayer controller, which are currently operated manually and are being automated. By automating this system we can effectively control the quality of the yarn produced in such industries. The system's attributes are presented with the aid of data structure diagrams, the complete algorithm concerning the Arduino module is provided in algorithmic form, and a survey of all such systems is presented.
Most often in our daily life we have to carry a lot of cards, such as credit cards, debit cards and special cards for toll system ERP, parking and personal identification. Smart card implementations can currently be seen around the world, but they are not unified, i.e. each developer uses different programming standards and data structures, and a smart card provides service to the user only within a university campus or an organization. In order to make such multiple-application access available to every individual through a single card, we plan to use RFID technology, which is cost effective. As RFID technology is used in the proposed concept, the programming standards and data structures will be unified. Unlike a smart card, the RFID card can be used by every individual to access different applications. Thus a person need not carry a number of cards; a single card serves different purposes.
Nowadays people tend to seek health-related knowledge or information from the internet through online healthcare services. The basic aim of this system is to bridge the vocabulary gap between patients and health providers by providing instant replies to the questions posted by patients. Automatically generated content for healthcare services is chosen over traditional community generated systems because it is reliable, compatible and provides instant replies. This paper proposes a scheme to code medical records using local mining and global learning approaches. Local mining aims to code the medical records by extracting the medical concepts from each individual record and then mapping them to terminologies based on external authenticated vocabularies, establishing a tri-stage framework to accomplish this task. Global learning aims to learn missing key concepts and propagate precise terminologies among the underlying connected records over a large collection.
In India, many lives are affected every day because patients are not operated on in a timely and proper manner. Real-time parameter values are also not efficiently measured in clinics and hospitals: it is sometimes difficult for hospitals to check patients' conditions frequently, and continuous monitoring of ICU patients is not possible. Our system is designed to deal with these situations. It is intended for use in hospitals for measuring and monitoring various parameters such as temperature, ECG and heart beat. The results can be recorded using a Raspberry Pi and displayed on an LCD display, and can also be sent to a server using a GSM module, so that doctors can log in to a website and view them.
Image binarization is the procedure of separating pixel values into two parts, black as foreground and white as background. In binarization, a document image is converted into a binary image using thresholding techniques. Many binarization algorithms have been proposed for different types of degraded document images. The main objective of image enhancement is to modify the attributes of an image to make it more suitable for a given task. Binarization techniques are used to remove noise and improve the quality of the document, and thresholding is one such binarization technique used for this purpose. Thresholding is further divided into global and local thresholding techniques. For documents with uniform contrast distribution of background and foreground, global thresholding has been found to be the best technique. Local thresholding is an approach for situations in which single-value thresholding does not yield proper results; to overcome this, a hybrid approach is introduced which combines local thresholding with Otsu's thresholding.
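Otsu's method, the global component of the hybrid approach above, picks the gray level that maximizes the between-class variance of background and foreground. A pure-Python sketch over a 256-bin histogram (illustrative only, not the paper's code):

```python
def otsu_threshold(histogram):
    """Return the gray level maximizing between-class variance for a
    256-bin intensity histogram (Otsu's global thresholding)."""
    total = sum(histogram)
    grand = sum(i * h for i, h in enumerate(histogram))  # sum of all intensities
    best_t, best_var, w0, sum0 = 0, 0.0, 0, 0
    for t in range(256):
        w0 += histogram[t]      # background pixel count up to level t
        sum0 += t * histogram[t]
        w1 = total - w0         # foreground pixel count
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (grand - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal histogram the returned level falls between the two modes, which is what makes the method suitable for documents with uniform contrast.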
Cloud computing promises to significantly change the way we use computers and access and store our personal and business information. With these new computing and communication paradigms there arise new data security challenges. Even though existing techniques use security mechanisms, prevention of data theft attacks fails. To overcome this we can use decoy technology to secure data stored in the cloud. Although Fog Computing is defined as an extension of the Cloud Computing paradigm, its distinctive characteristics in location sensitivity, wireless connectivity and geographical accessibility create new security and forensics issues and challenges which have not been well studied in cloud security and cloud forensics. We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user's real data. Experiments conducted in a local file setting provide evidence that this approach may provide unprecedented levels of user data security in a cloud environment.
Material management is related to planning, controlling and organizing the flow of material from availability to requirement. Mapping excess material to an appropriate location considering multiple criteria is an administrative decision-making task. Material utilization is a multi-criteria decision-making problem consisting of several conflicting criteria that must be verified before taking any decision. A systematic methodology based on the TOPSIS and AHP methods is proposed in this paper. The paper discusses an important mechanism that provides guidelines to the decision maker for material mobilization and utilization using TOPSIS and AHP. The combination of TOPSIS and AHP provides a faster, more reliable and convenient way for the decision maker to evaluate the most important criteria, so that material management can be achieved easily.
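The TOPSIS step, ranking alternatives by their relative closeness to the ideal solution, can be sketched as follows. This is an illustrative Python sketch of the standard TOPSIS procedure; the AHP stage that would derive the criterion weights is not shown, and the example weights are assumptions.

```python
import math

def topsis(matrix, weights, benefit):
    """Score alternatives (rows of matrix) by closeness to the ideal
    solution; benefit[j] is True when criterion j is to be maximized."""
    m = len(weights)
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    # ideal and anti-ideal solutions per criterion
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ideal)))
        d_neg = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores
```

A higher closeness coefficient means the alternative is nearer the ideal and farther from the anti-ideal, so alternatives are ranked by score.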
Bluetooth technology unplugs our digital peripherals. In short, it is a wireless replacement for many of the cables we currently use to transmit voice and data signals. Bluetooth radio modules use Gaussian Frequency Shift Keying (GFSK) for modulation. Bluetooth employs an FHSS spreading technique, changing frequencies at a rate of 1600 times per second, 160 times the rate at which a wireless LAN changes frequencies. This paper focuses on the attacks and security issues in Bluetooth technology.
We describe ways to protect multimedia content in the cloud from redistribution. The web has billions of documents, including video, audio and images, but no central management system, so duplication of content is common; it is said that each and every document has a duplicate copy. This is most prevalent for videos, stored in multiple formats, versions and sizes, which content creators are often unaware have been modified and republished using video editing tools. This can lead to security problems, dilution of the owners' identity and loss of revenue to content creators, and it occupies an enormous amount of space on the web. Duplication is also common in cloud storage, involving both public and private clouds, though the private cloud is considered more secure than the public cloud. To avoid this situation, several techniques have been used to prevent duplication of content, focused mainly on 3D video content.
Steganography is used to conceal secret information such as passwords, text, pictures and audio behind an original cover file. The original message is converted into cipher text using a secret key and then hidden in the LSBs of the original image. The current work presents crypto-steganography of audio and video, a combination of image, audio and video steganography, making use of forensic techniques as a tool for authentication. The main aim is to hide secret data behind the image and audio of a video file. As video consists of many still frames of images and audio, any frame can be selected for hiding the secret information. A suitable algorithm such as AES is used for security, and image processing is used for authentication, hence data security can be increased; for data embedding, the 4LSB algorithm is used.
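The 4LSB embedding mentioned above stores each secret byte in the four least significant bits of cover bytes. A minimal sketch on raw byte arrays, illustrative only; the paper applies this to video frames, and the function names here are my own.

```python
def embed_4lsb(cover, secret):
    """Hide each secret byte in the 4 LSBs of two consecutive cover bytes."""
    assert len(cover) >= 2 * len(secret), "cover too small for payload"
    out = bytearray(cover)
    for i, byte in enumerate(secret):
        out[2 * i] = (out[2 * i] & 0xF0) | (byte >> 4)            # high nibble
        out[2 * i + 1] = (out[2 * i + 1] & 0xF0) | (byte & 0x0F)  # low nibble
    return bytes(out)

def extract_4lsb(stego, length):
    """Recover `length` secret bytes from the 4 LSBs of the stego bytes."""
    return bytes(((stego[2 * i] & 0x0F) << 4) | (stego[2 * i + 1] & 0x0F)
                 for i in range(length))
```

Because only the low nibble of each cover byte changes, the visual distortion of an image or frame carrier stays small, which is the usual motivation for LSB schemes.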
In the modern age of the Internet, usage of social media is growing rapidly, and organizing, interpreting and supervising user generated content (UGC) has become a major concern. Posting new topics on the internet is not a big task, but searching for topics on the web across a vast volume of UGC is one of the major challenges. In this paper we deal with web search result clustering for improving the results returned by search engines. Several algorithms already exist, such as Lingo and K-means. We work on a descriptive-centric algorithm for web search result clustering called the IFCWR algorithm. A maximum number of clusters is randomly selected using Forgy's strategy, and clusters are iteratively merged until the most relevant results are obtained. Every merge operation executes the Fuzzy C-means algorithm for web search result clustering: clusters are merged based on cosine similarity, and a new (current) solution is created with the new configuration of centroids. In this paper we investigate the Fuzzy C-means algorithm, perform pre-processing of the search query, and try to give the best solution.
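The cosine-similarity merge criterion applied between cluster centroids can be sketched as follows. This is an illustrative Python sketch only: the full IFCWR/Fuzzy C-means iteration is not shown, and `merge_closest` and its threshold are my own assumptions for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def merge_closest(centroids, threshold=0.9):
    """Average the pair of centroids whose cosine similarity exceeds the
    threshold (the most similar pair), returning the new centroid list."""
    best, pair = threshold, None
    for i in range(len(centroids)):
        for j in range(i + 1, len(centroids)):
            sim = cosine(centroids[i], centroids[j])
            if sim > best:
                best, pair = sim, (i, j)
    if pair is None:
        return centroids          # nothing similar enough to merge
    merged = [(x + y) / 2 for x, y in zip(centroids[pair[0]], centroids[pair[1]])]
    return [c for k, c in enumerate(centroids) if k not in pair] + [merged]
```

Repeating this merge until no pair exceeds the threshold shrinks the randomly over-provisioned set of clusters toward the relevant ones.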
Pilot contamination poses a fundamental limit on the performance of massive multiple-input multiple-output (MIMO) antenna systems owing to the failure of correct channel estimation. To address this drawback, we propose estimating only the channel parameters of the desired links in a target cell, but not those of the interference links from adjacent cells. The desired estimation is, nonetheless, an underdetermined system. In this paper, we show that if the propagation properties of massive MIMO systems are exploited, it is possible to obtain an accurate estimate of the channel parameters. Our strategy is inspired by the observation that for a cellular network, the channel from user equipment to a base station consists of only a few clustered paths in space. With a very large antenna array, signals can be observed in extremely sharp regions in space. As a result, if the signals are observed in the beam domain (using a Fourier transform), the channel is approximately sparse, i.e., the channel matrix contains only a small fraction of large components, and the other components are close to zero. This observation then permits channel estimation based on sparse Bayesian learning methods, where sparse channel components can be reconstructed using a small number of observations. Results illustrate that, compared to conventional estimators, the proposed approach achieves much better performance in terms of channel estimation accuracy and achievable rates in the presence of pilot contamination.
For a broad topic or ambiguous query, different users may have different search goals when they submit the query to a search engine. The inference and analysis of user search goals can be very useful in improving search engine relevance and user experience. In this paper, we propose a novel approach to infer user search goals by analyzing search engine query logs. First, we propose a framework to discover different user search goals for a query by clustering the proposed feedback sessions. Feedback sessions are constructed from user click-through logs (user responses) and can efficiently reflect the information needs of users. Second, we propose a novel approach to generate pseudo-documents to better represent the feedback sessions for clustering. Finally, we propose a new criterion, Classified Average Precision (CAP), to evaluate the performance of inferring user search goals. Experimental results using user click-through logs from a commercial search engine demonstrate the effectiveness of the proposed methods.
Data mining is the task of discovering useful and interesting patterns from huge amounts of data, where the data may be stored in databases, data warehouses, and other information repositories. Data mining integrates techniques from various disciplines such as data visualization, database technology, information retrieval, high-performance computing, machine learning, and pattern recognition. The classification of multi-dimensional data is one of the major challenges in data mining and data warehousing. In a classification problem, each object is defined by its attribute values in a multidimensional space. Some existing systems assume that the data analyst can identify the set of candidate data cubes for exploratory analysis based on domain knowledge. Unfortunately, such assumptions are often invalid, particularly for high-dimensional databases, where it is difficult or impossible to pre-compute the dimensions and cubes. The proposed system automatically finds the dimensions and cubes that hold informative and interesting data. In high-dimensional datasets, the data analysis procedures need to be integrated with each other. Information-theoretic measures such as entropy are used to filter out irrelevant data from the dataset in order to formulate a more compact, manageable, and useful schema.
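The entropy filter mentioned above can be sketched in a few lines (a minimal illustration, not the paper's implementation; the table, column names, and the 0.5-bit threshold are assumptions for the demo):

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of an attribute's value distribution."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def filter_attributes(table, threshold=0.5):
    """Keep only attribute columns whose entropy exceeds the threshold."""
    return [name for name, column in table.items() if entropy(column) > threshold]

# toy table: 'region' is constant (0 bits) and carries no information
table = {
    "region": ["EU"] * 8,
    "status": ["ok"] * 7 + ["fail"],
    "product": ["a", "b", "c", "d", "a", "b", "c", "d"],
}
print(filter_attributes(table))   # the constant column is filtered out
```

A constant attribute has zero entropy and is dropped, which is the sense in which entropy yields a more compact schema.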
Mobile ad hoc networks (MANETs) are considered a new paradigm of infrastructure-less mobile wireless communication systems. Routing in MANETs is a challenging task due to unpredictable changes in the network topology: nodes can dynamically join and leave the network without warning, all nodes in the network are energy dependent, and efficient energy utilization is one of the important criteria in a MANET. This research work proposes the Dynamic Efficient Power Consumption Congestion Control Scheme (DEPCCCS) for congestion control and for improving the quality of service in mobile networks. The dynamic network connections are not maintained for long periods, and nodes are often unaware of their own energy status, so in some situations routing packets consume excessive energy. In this paper, DEPCCCS is incorporated into the routing protocol to reduce the cost of destination discovery by maintaining a record of each node's location in the network relative to the wireless base station. The central base station stores the locations of the mobile nodes in a position table. The proposed protocol dynamically calculates every node's energy status, current location, and speed so as to minimize the energy consumed by mobile nodes. The proposed scheme thus addresses both congestion control and power consumption. Simulation results show that our proposed technique attains a better delivery ratio and throughput with less delay and energy consumption when compared with the existing technique.
Pipelining is a concept which improves the performance of a processor. A five-stage pipelined RISC processor has the stages instruction fetch, decode, execute, memory, and write-back. RISC has a simpler and faster instruction set architecture. The aim of this paper is to design the instruction fetch unit and the ALU, which are parts of the RISC processor architecture. The instruction fetch unit is designed to read the instructions present in memory. The ALU sits in the execution stage of the pipeline and performs all computations, i.e., arithmetic and logical operations. Xilinx 8.1i is used to simulate the design, which is written in VHDL.
Multi-atlas based methods are commonly used in image segmentation. In multi-atlas based image segmentation, atlas selection and combination are considered the two key factors affecting performance. Recently, manifold-learning-based atlas selection methods have emerged as very promising. However, due to the complexity of structures in raw images, it is difficult to obtain accurate atlas selection results by measuring only the distance between raw images on the manifold. Although the distance between the regions to be segmented across images can be readily obtained from the label images, it is infeasible to directly compute the distance between the test image (gray-scale) and the label images (binary). We attempt to solve this problem by proposing a label-image-constrained atlas selection method, which exploits the label images to constrain the manifold projection of raw images. Compared with related existing methods, experimental results on prostate segmentation show that the selected atlases are closer to the target structure and that more accurate segmentations are obtained using our proposed method. We present a multi-atlas-based framework for accurate, consistent, and simultaneous segmentation of a group of target images. Multi-atlas-based segmentation algorithms concurrently consider complementary information from multiple atlases to produce optimal segmentation outcomes. When segmenting a group of target images, most current methods treat these images independently, disregarding their correlation and thus producing inconsistent segmentations of the same structures across different target images.
With recent advances in web technology, many online shopping websites have emerged. Despite its advantages, however, online shopping presents certain drawbacks. One drawback is that it may be difficult for a person to visualize how a given article would look when worn, owing to the rich variation in body size and shape, hair and skin color, etc., in the human population. Testing the fit of clothes is highly important for both customer and trader. Our approach concentrates on how the selected garment fits the user's body and how it would appear in the real world. This is carried out by identifying critical points on the garment and the user's body dimensions using image processing techniques. In this paper, an application realizing a virtual dressing room was designed, implemented, and tested.
Data security in the cloud is an important issue. Important data may be stored in the cloud, and the security of that data then depends entirely on the cloud. The data might be exposed by malicious third-party users because of the wireless connection between client and cloud when proper authentication and protection are lacking. In this paper we examine the different security issues with the cloud. When data is stored in the cloud, the data should be properly managed, and the cloud has to provide adequate security for it. This paper discusses the different types of issues with the cloud and also mentions possible policies by which those issues can be addressed when considering the security provided by the cloud.
Rapidly advancing mobile communication technology and decreasing costs make it possible to incorporate mobile technology into smart home systems. We propose a mobile and internet-based smart home system that consists of a mobile phone with Android capabilities, an internet-based application, and a home server. The home appliances are controlled by an Arduino, which receives commands from the server computer; the server in turn operates according to the commands received from the mobile application via the wireless network or the internet. In our proposed system, the home server is built on Wi-Fi technology; it receives commands from the client, and the commands are processed via the Arduino, allowing a user to control and monitor any parameters related to the home using any Android-capable smartphone or via the internet. This paper presents an innovative low-cost design and implementation of automated control based on weather conditions, appliance control, and home security, together with the design of an Android application that enables the smartphone to send commands and receive alerts through the server-based system.
Powering a nation means developing it into a powerful nation, and that depends on energy. Energy is the chief gauge of all categories of work done by human beings and nature. Energy sources are of two kinds: renewable and non-renewable. Examples of renewable sources are solar, wind, hydropower, tidal power, and geothermal energy; these are generally sustainable and environmentally friendly. Non-renewable sources are extracted or created at considerable damage to the environment; examples are coal, petroleum, natural gas, and nuclear power. Geothermal energy is an abundant source of energy available from the earth's crust, and its use is an alternative to conventional energy resources. Geothermal energy is a clean energy resource: it not only produces electricity but also has many applications such as space heating, drying, and industrial processes. In the USA and New Zealand the usage of geothermal energy is very high. In India the use of geothermal energy is at an emerging stage.
The timing channel is a logical channel for exchanging information in which messages are encoded by the timing between events. Notably, while a jammer has the power to disrupt the data carried in the attacked packets, timing information cannot be jammed: even on a jammed channel, timing information can still reach the receiver and achieve its purpose. Transmission over the wireless medium is threatened by the well-known jamming attack. Under this attack, because the nodes and the jammer have conflicting interests, their interactions can be modeled by means of game theory. Accordingly, this paper presents a game-theoretic model of the interaction between nodes exploiting timing channels to achieve resilience to jamming attacks and a jammer. In particular, the Nash equilibrium is analyzed in terms of existence, uniqueness, and convergence under best-response dynamics. Moreover, the case in which the communicating nodes choose their strategy and the jammer responds accordingly is modeled and analyzed as a Stackelberg game, considering both perfect and imperfect knowledge of the jammer's utility function. Finally, the impact of the network framework on the achievable performance is shown.
Any large unstructured data set whose size is beyond the ability of software tools to manage and process within a tolerable elapsed time is rightly described as big data. Cloud computing is the delivery of on-demand computing resources, from applications to data centers, over the internet. Combining these two strong, reliable platforms helps in tackling demanding real-time problems and obtaining solutions for them. Cloud-embedded big data supports inexpensive, reliable storage and tools for analyzing structured, unstructured, semi-structured, streaming, click-stream, and various other types of data. The existing system tends to be costlier because of cloud deployment costs, and it is not elastic in nature. The subjective nature of cloud delivery for incoming data streams pulls back the efficiency of the system. This paper aims to minimize the cost of cloud adoption by determining the cloud adoption factors from a net present value computation, and derives a mathematical expression for α, the cloud adoption factor. It also addresses the performance issues of big data by implementing a subordinate virtual cloud mechanism to overcome the identified bottlenecks.
The watermark is embedded by modifying the third-level mid-frequency coefficients of the host image with multiple scaling factors (SFs). As many combinations of SFs are possible, it is difficult to obtain optimal solutions by trial and error. Hence, in order to achieve the highest possible transparency and robustness, optimization of the scaling factors is necessary. This work employs a Genetic Algorithm (GA) to obtain optimal SFs. A GA can search for multiple solutions simultaneously over a wide range, and an optimal solution can be obtained by combining the results appropriately. The aim of this work is to develop an optimal watermarking technique based on the DWT domain for gray-scale images. In this paper, a robust and oblivious image watermarking algorithm using maximum wavelet coefficient modulation is proposed. Simulation results show that the performance of the proposed method is superior in terms of Peak Signal-to-Noise Ratio (PSNR) and Normalized Correlation Coefficient (NCC).
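A generic sketch of SF-based embedding may help fix ideas. This is not the paper's exact algorithm: a single-level Haar DWT stands in for the third-level decomposition, the watermark is simply added to the HL band with strength alpha, and all sizes and the alpha value are illustrative assumptions.

```python
import numpy as np

def haar2d(x):
    """One level of an orthonormal 2-D Haar DWT."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return ((a + b + c + d) / 2, (a - b + c - d) / 2,   # LL, LH
            (a + b - c - d) / 2, (a - b - c + d) / 2)   # HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d."""
    x = np.empty((2 * LL.shape[0], 2 * LL.shape[1]))
    x[0::2, 0::2] = (LL + LH + HL + HH) / 2
    x[0::2, 1::2] = (LL - LH + HL - HH) / 2
    x[1::2, 0::2] = (LL + LH - HL - HH) / 2
    x[1::2, 1::2] = (LL - LH - HL + HH) / 2
    return x

def embed(host, watermark, alpha=2.0):
    """Add alpha-scaled watermark bits to the mid-frequency HL band."""
    LL, LH, HL, HH = haar2d(host)
    return ihaar2d(LL, LH, HL + alpha * watermark, HH)

def psnr(orig, marked):
    mse = np.mean((orig - marked) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(0)
host = rng.integers(0, 256, size=(64, 64)).astype(float)
wm = rng.choice([-1.0, 1.0], size=(32, 32))   # bipolar watermark bits
marked = embed(host, wm, alpha=2.0)
print(f"PSNR of watermarked image: {psnr(host, marked):.1f} dB")
```

A larger alpha makes the mark more robust but lowers the PSNR, which is exactly the transparency/robustness trade-off the GA is asked to optimize.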
Valuable information can be hidden in images; however, little research discusses data mining on them. Image retrieval means searching, browsing, and retrieving images from image databases. There are two different methodologies for image retrieval: text-based image retrieval and content-based image retrieval. The former is becoming obsolete; in the latter, many visual features such as texture, size, pixel intensity and frequency, and image color are extracted. In query-by-example search, the extracted features are compared with stored ones. In this work, an efficient method for extracting image features is considered using the intensity histogram of the gray-scale image. We present a general framework based on decision trees for mining and processing image data. Pixel-wise image features are extracted and transformed into a database-like table on which various data mining algorithms can be run. Finally, the resulting average gradient vectors are compared with a previously stored one-dimensional array of intensities to find similarities in the image data.
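The intensity-histogram feature described above can be sketched directly (a minimal illustration; random arrays stand in for real gray images, and the L1 comparison is one simple choice of similarity measure):

```python
import numpy as np

def intensity_histogram(gray_img):
    """Normalized 256-bin intensity histogram of an 8-bit gray image."""
    hist = np.bincount(gray_img.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def histogram_distance(h1, h2):
    """L1 distance between two normalized histograms (0 = identical)."""
    return float(np.abs(h1 - h2).sum())

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
brighter = np.clip(img.astype(int) + 40, 0, 255).astype(np.uint8)

h_img = intensity_histogram(img)
print(histogram_distance(h_img, intensity_histogram(img)))             # identical
print(histogram_distance(h_img, intensity_histogram(brighter)) > 0.1)  # differs
```

In a query-by-example system, the stored one-dimensional feature for each database image is exactly such a histogram, and the query's histogram is compared against each of them.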
Matrix converters are frequency converters which, unlike conventional frequency converters, do not contain a direct-current link circuit with passive components. Thus, matrix converters may provide a solution for applications where large passive components are not allowed, or where a purely semiconductor-based solution gives an economically more efficient result than conventional frequency converters. The matrix converter (MC) is an alternative AC-AC power converter that connects the input phases directly to the output phases through bidirectional switches, without using any dc link or energy-storing element; it is therefore called an all-silicon converter. Two matrix converter topologies are established: the direct topology and the indirect topology. This paper presents the topology of the Very Sparse Matrix Converter (VSMC) and focuses on Easy Commutation Space Vector Modulation (ECSVM) modeling applied to the VSMC.
This research investigates the impact on the power system of extensive penetration of photovoltaic (PV) generation. A model of PV generation suitable for studying its interactions with the power system was developed, and the dynamic response of a PV generation system to rapid changes in irradiance was investigated. An aggregated model of grid-connected PV generation was built and used to simulate the integration of PV generation on a large scale. A voltage control technique was investigated by simulation. Distributed Generation (DG) units are nowadays being connected to the grid in increasing numbers for several reasons. Most DG units are relatively small and connected to the distribution network, and a large share of them are connected to the grid via power electronic converters. The main task of the converters is to convert the power available from the prime source to the correct voltage and frequency of the grid. The general objective of this paper is to investigate how power electronic converters can support the grid and solve power quality problems. An IEEE 5-bus system is considered in this work to validate the power electronic converter using MATLAB/Simulink.
In the contemporary era, data is the most valuable resource, used in day-to-day life by everyone from individuals to large organizations. Databases contain useful and confidential information, so it becomes necessary to protect them from any unauthorized access. An unauthorized user may try to perform unauthorized activities on sensitive data at unauthorized times. To prevent the database from any misuse, different security mechanisms are applied to make it secure. This paper focuses on the challenges and security mechanisms in databases.
" The MHD flow and convective heat transfer from water functionalized CNTs over a static/moving w... more " The MHD flow and convective heat transfer from water functionalized CNTs over a static/moving wedge in the presence of heat source/sing are studied numerically. Thermal conductivity and viscosity of both single and multiple wall carbon nanotubes (CNTs) within a base fluid (water) of similar volume are investigated to determine the impact of these properties on thermo fluid performance. The governing partial differential equations are converted into nonlinear, ordinary, and coupled differential equations and are solved using bvp4c Matlab solver. The effects of volume fraction of CNTs and magnetic and wedge parameters are investigated and presented graphically. The numerical results are compared with the published data and are found to be in good agreement.
Instant fuzzy search is an important developing technique by which users can find results character by character, giving a better search experience. The results must be returned at high speed with good relevancy scores, and good ranking functions must be used to obtain the top results. Many ranking functions consider the proximity of keywords, which ultimately yields a good relevancy score. In this paper, proximity information is used to rank query results with good time and space complexity. Many previously proposed techniques exist for bringing proximity ranking into instant fuzzy search. Most of them first compute all results and then rank them according to some ranking function, but if the dataset is large, computing all results is very time consuming. Therefore, an early-termination technique is used to reduce space and time complexity. In this paper, an incremental computation algorithm is used to overcome the drawbacks of previous systems and compute relevant results. Query logs, which are very useful for query suggestion systems, are also used, further reducing time complexity. Experimental results are reported on space and time complexity and on the quality of results.
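The "fuzzy" part of such a system rests on edit distance between the typed prefix and indexed keywords. A minimal sketch of that underlying test (real instant-search systems compute it incrementally per keystroke; the plain dynamic program below just shows the core idea, and the 2-edit budget is an assumption):

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

def fuzzy_match(query: str, keyword: str, max_edits: int = 2) -> bool:
    """Accept a keyword whose spelling is within the edit budget."""
    return levenshtein(query, keyword) <= max_edits

print(levenshtein("kitten", "sitting"))         # 3
print(fuzzy_match("shakspear", "shakespeare"))  # True
```

Early termination fits naturally here: once a row of the dynamic program everywhere exceeds the edit budget, the keyword can be rejected without finishing the computation.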
Cloud computing is an emerging technology which offers various services, including infrastructure, platform, and software services. The services provided by the cloud are delivered over the internet and are available quickly, and the cloud is in high demand in the market. Most organizations and users prefer cloud storage, which is located at a remote place, and the user has no control over the data stored there. Cloud computing uses resources such as memory, storage, and processors which are not physically present at the user's location; rather, they are located outside the premises and managed by a service provider, and the user accesses them via the internet. The main focus of this paper is to check the integrity of files stored on remote cloud storage with low communication overhead. Security in terms of integrity is one of the most vital aspects of a cloud computing environment. In this paper, we focus on the cloud data security problem and try to achieve security with minimal computational overhead, because all computation is done over the internet. Different techniques for data integrity in cloud storage are discussed, and their performance is measured in terms of computational overhead.
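The baseline form of such an integrity check is a digest comparison: the client keeps a hash of the file and later verifies the remote copy against it. The sketch below shows only this baseline (the remote-integrity schemes the paper surveys avoid downloading the whole file; the file name and contents here are illustrative):

```python
import hashlib
import os
import tempfile

def file_digest(path: str) -> str:
    """SHA-256 digest of a file, streamed in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, stored_digest: str) -> bool:
    """True if the file still matches the digest recorded at upload time."""
    return file_digest(path) == stored_digest

# demo: a temporary file stands in for the cloud copy
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"patient-records-v1")
digest = file_digest(path)     # computed by the client before upload
print(verify(path, digest))    # True: file unmodified
with open(path, "ab") as f:
    f.write(b" tampered")
print(verify(path, digest))    # False: tampering detected
os.remove(path)
```

The communication-overhead question the paper studies is precisely how to obtain the same guarantee without transferring the entire file to recompute the digest.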
Intrusion detection has become an essential element of network administration due to the huge number of attacks that persistently threaten our computers. Traditional intrusion detection systems are limited and do not provide a complete solution to the problem. They look for potential malicious activity in network traffic, and they often succeed in finding true security attacks and anomalies. However, in many cases they fail to detect malicious behaviors (false negatives) or they fire alarms when nothing is wrong in the network (false positives). In addition, they require thorough manual processing and human expert intervention. Applying data mining (DM) techniques to network traffic data is a promising solution that helps develop better intrusion detection systems. Moreover, Network Behavior Analysis (NBA) is also an effective approach for intrusion detection. In this paper, we discuss DM and NBA approaches for network intrusion detection and suggest that a combination of the two approaches has the potential to detect intrusions in networks more effectively.
Cloud computing and the Internet of Things are tightly coupled with each other. The rapid growth of the Internet of Things (IoT) and the development of related technologies have created a widespread connection of "things". This results in the production of large amounts of data, which need to be stored, processed, and accessed. Cloud computing is a paradigm for big data storage and analytics, and while the Internet of Things is exciting on its own, the real innovation will come from combining it with cloud computing, enabling sensing services and powerful processing of sensing data streams. More things are being connected to address a growing range of business needs; in fact, by the year 2020, more than 50 billion things were expected to connect to the Internet, seven times the human population. Insufficient security will be a critical barrier to large-scale deployment of IoT systems and broad customer adoption of IoT applications using the cloud. Simply extending existing IT security architectures to the IoT and the cloud will not be sufficient. The IoT world requires new security approaches, creating fertile ground for innovative thinking and solutions. This paper discusses key issues believed to have long-term significance for IoT and cloud computing security and privacy, based on documented problems and exhibited weaknesses.
Digital equipment such as telecommunication systems, computer systems, and instruments use microprocessors that operate at high frequencies, allowing them to carry out billions of operations per second. A disturbance in the electrical supply lasting just a few milliseconds can affect millions of basic operations. The result may be malfunctioning and loss of data, with dangerous or costly consequences (e.g., loss of production). That is why many loads, called sensitive or critical loads, require a protected supply. Many manufacturers of sensitive equipment specify very strict tolerances for the supply of their equipment, much stricter than those of the distribution system; one example is the Computer Business Equipment Manufacturers Association specification protecting computer equipment against distribution system disturbances. The design of this uninterruptible power supply (UPS) for a personal computer (PC) is necessitated by the need for enhanced portability in the design of desktop workstations. Apart from its original functionality as a backup source of power, this design incorporates the unit within the system unit casing, thereby reducing the number of separate system components. Embedding the unit also removes the untidiness of connecting wires and makes the whole computer act like a laptop. Notable, too, is the choice of an Arduino as an important part of the circuitry: it eliminates the heavy, space-consuming components of the original design and places the UPS in the class of advanced technology devices.
Cancer is one of the most common diseases in developed countries, and early diagnosis plays a significant role in curing cancer patients. Every year, thousands of people die due to lung cancer. In this paper, a novel candidate group search algorithm based on an evolutionary approach is proposed. This optimization algorithm helps assistant doctors identify the nodules present in the lungs at early stages, since manual interpretation is time consuming and very critical. A genetic algorithm (GA) helps in identifying features that classify a patient's lung cancer status with notable predictive performance.
Digital electronic maps are used to track locations in outdoor and indoor environments. Most electronic maps are useful only for outdoor environments; there is no efficient technology for searching indoor locations. Many smartphones track location using electronic maps such as Google Maps, GPS navigation, Waze, and offline GPS maps, but these are only useful outdoors. An indoor location-tracking system can be implemented using the IndoorAtlas Android SDK, which provides an API for developers to create applications for inside-building navigation. This app is also useful for visually impaired people, because it offers speech recognition for the searched location, and it also includes event details in the indoor location-tracking application. The app can be used to find the shortest path to a desired location.
In the present system, a patient can contact a doctor and take an appointment only by going to that particular doctor's clinic. People also cannot get correct information about doctors, their details, and the different hospitals available in a particular city. The only way to get all of this is by directly contacting particular persons, which is a very big problem for someone new to the city. Those who want information in the medical field, or want to get an appointment with a particular doctor from their own place, cannot do so. Nowadays, in order to get correct information and the right treatment, a patient has to travel wherever needed. This is a lengthy process which takes a lot of time, costs more, and is limited to a certain extent; it is not possible to get all the information, nor can the user be satisfied through this process. So, this "E-Clinic" website will provide all the information regarding the different types of therapy available in the clinic, and patients can easily book appointments.
Wireless spoofing attacks are easy to launch and can significantly impact the performance of networks. Although the identity of a node can be verified through cryptographic authentication, conventional security approaches are not always desirable because of their overhead requirements. In this paper, we propose to use spatial information, a physical property associated with each node that is hard to falsify and does not rely on cryptography, as the basis for (1) detecting spoofing attacks; (2) determining the number of attackers when multiple adversaries masquerade as the same node identity; and (3) localizing multiple adversaries. We propose to use the spatial correlation of received signal strength (RSS) inherited from wireless nodes to detect spoofing attacks. We then formulate the problem of determining the number of attackers as a multi-class detection problem, and cluster-based mechanisms are developed to determine that number. When training data are available, we explore using the Support Vector Machines (SVM) method to further improve the accuracy of determining the number of attackers. In addition, we developed an integrated detection and localization system that can localize the positions of multiple attackers. We evaluated our techniques through two testbeds using both an 802.11 (WiFi) network and an 802.15.4 (ZigBee) network in two real office buildings. Our experimental results show that our proposed methods can achieve over 90% hit rate and precision when determining the number of attackers. Our localization results using a representative set of algorithms provide strong evidence of high accuracy in localizing multiple adversaries.
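The cluster-based intuition can be illustrated with a toy sketch (this is far simpler than the paper's method: it clusters scalar RSS readings for one identity into two groups and flags spoofing when the groups are far apart; the readings and the 8 dB gap threshold are made-up values):

```python
import statistics

def split_two_means(rss, iters=20):
    """1-D 2-means: partition readings around two centers."""
    c1, c2 = min(rss), max(rss)
    for _ in range(iters):
        g1 = [r for r in rss if abs(r - c1) <= abs(r - c2)]
        g2 = [r for r in rss if abs(r - c1) > abs(r - c2)]
        c1, c2 = statistics.mean(g1), statistics.mean(g2)
    return g1, g2, abs(c1 - c2)

def spoofing_suspected(rss, gap_db=8.0):
    """Two well-separated RSS clusters suggest two distinct transmitters."""
    _, _, gap = split_two_means(rss)
    return gap > gap_db

normal = [-62, -61, -63, -62, -60, -61]        # one transmitter
spoofed = [-62, -61, -80, -63, -79, -81, -60]  # two distinct RSS levels
print(spoofing_suspected(normal))   # False
print(spoofing_suspected(spoofed))  # True
```

The real system works with RSS vectors observed at multiple landmarks rather than scalars, which is what makes the spatial correlation hard for an attacker to forge.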
In this competitive world, exams set the parameters for a student's success and fate. However, these examination practices have proved flawed over time, and many malpractices have been introduced. Unduly difficult question papers and out-of-syllabus questions are problems faced in many examinations, and it is the students who bear the brunt of these mistakes. With the software designed here, examination question papers are made more flexible and transparent, such that the head of the institution has control over all the examination question papers. Currently used only in colleges, it can be extended to other institutions and to higher levels.
Artificial intelligence is a mechanism that makes human work much easier by making devices process information from the perspective of a human being. Predictive text is a feature of smartphones running operating systems such as iOS, Android, and Windows. It helps the user select the word that should follow the current word of a text message or statement. In the proposed system, the predictive mechanism is taken to the next level by predicting mathematical solutions and chronological data. The system automatically performs mathematical calculations in the middle of a chat, without making the user switch to the calculator application, and also finds chronological data such as the date and day without a switch to the calendar application.
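A minimal sketch of how such inline prediction might work, assuming a simple regex trigger and a restricted expression evaluator (the abstract does not specify the real system's design):

```python
import ast
import datetime
import operator
import re

# Safe evaluator for +, -, *, / expressions found in a chat message.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expr):
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval"))

def suggest(message, today=None):
    """Offer an inline suggestion: an arithmetic answer or today's weekday."""
    m = re.search(r"[\d.]+(?:\s*[-+*/]\s*[\d.]+)+", message)
    if m:
        return str(safe_eval(m.group()))
    if "what day" in message.lower():
        today = today or datetime.date.today()
        return today.strftime("%A")
    return None

print(suggest("split the bill: 240/4 each?"))                # 60.0
print(suggest("what day is it", datetime.date(2016, 4, 1)))  # Friday
```

Using `ast` rather than `eval` keeps arbitrary code out of the chat pipeline, which matters if the suggestion engine runs on every keystroke.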
Block compressive sensing has been proposed to exploit the sparse nature of medical images in a transform domain and thereby reduce storage space. Block-based compressive sensing is applied to a DICOM image: the original image is divided into blocks and each block is processed separately. The main advantage is that each block is processed independently, which, combined with parallel processing, reduces the processing time. Compressed sensing exploits the sparse nature of images to reduce the volume of data required for storage. Inspired by this, we propose a new image-compression algorithm that combines compressed sensing with different transforms. Different sparse bases, such as the discrete cosine transform (DCT), the discrete wavelet transform, and the contourlet transform, are used to compress the original input image. Among these, the DCT suffers from block artifacts [14]; the wavelet transform can overcome the block artifacts introduced in the reconstructed image; and the contourlet transform effectively captures smooth contours [4] and hence provides better reconstruction quality. To reconstruct the original image, techniques such as basis pursuit and orthogonal matching pursuit are used at the decoder.
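The decoder step can be illustrated with a toy orthogonal matching pursuit (OMP) on a generic sparse vector rather than image blocks; the matrix sizes and sparsity below are illustrative, and exact recovery depends on the incoherence of the measurement matrix:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A that best
    explain y, then least-squares fit on the selected support."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        corr = np.abs(A.T @ residual)
        corr[support] = 0                      # do not re-pick a column
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x, residual

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))                     # measurement matrix
x_true = np.zeros(40)
x_true[[5, 17]] = [3.0, -2.0]                         # 2-sparse signal
y = A @ x_true
x_hat, residual = omp(A, y, k=2)
print(np.linalg.norm(residual) < np.linalg.norm(y))   # residual shrinks
```

In the block-based scheme each image block would run this decoder independently, which is what makes the parallel-processing speedup possible.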
In this paper we discuss how to create an artificial environment. In this high-tech era, temperature plays an important part in our environment: changes in temperature can affect the behaviour of human beings, plants, and even materials such as semiconductors. This project controls the temperature of a given environment, such as a baby incubator or an industrial boiler, and can be used for automatic room-temperature control or for creating artificial weather. A microcontroller is used to control the temperature in a circuit where the temperature must be kept constant at a particular value. The system functions as stated in the program code of the ATmega8 in order to keep the temperature stable. A simple temperature controller with minimal circuitry has to be designed so that it saves space and is more reliable for an incubator. Present designs that use a microprocessor as the main controller for digital signal processing, combined with complex combinational logic circuits, are redundant and need to be improved in terms of functionality. Hence, replacing the microprocessor with a microcontroller is a prudent action because of its efficiency and reliability, especially in an incubator or a boiler.
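The control law itself can be as simple as on-off control with hysteresis; the sketch below simulates such a loop with an invented plant model (real ATmega8 firmware would read a temperature sensor and drive a heater pin):

```python
def simulate(setpoint=37.0, band=0.5, ambient=25.0, steps=200):
    """On-off (hysteresis) temperature control as microcontroller firmware
    would run it: heater on below setpoint - band, off above setpoint + band.
    The plant model (heating and cooling rates) is an illustrative assumption."""
    temp, heater = ambient, False
    history = []
    for _ in range(steps):
        if temp < setpoint - band:
            heater = True
        elif temp > setpoint + band:
            heater = False
        # Heating adds a fixed step; cooling follows Newton's law toward ambient.
        temp += 0.4 if heater else -(temp - ambient) * 0.01
        history.append(temp)
    return history

history = simulate()
print(min(history[100:]), max(history[100:]))  # oscillates around 37.0
```

The hysteresis band prevents the heater relay from chattering on and off at every sample, which is exactly why incubator controllers prefer it over a bare threshold.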
The aim of recommender systems (also called collaborative filtering systems) is to suggest items that a client is likely to order. In this paper we describe recommendation-system research and then introduce the various techniques and approaches used by recommender systems: the user-based approach, the item-based approach, and hybrid recommendation approaches, along with related research. Recommender systems are normally used online to propose items that users find interesting, thereby benefiting both the user and the merchant: the user receives suggestions for things he is likely to buy, and the business benefits from increased sales. We also explain the challenges and issues in data mining and how to build a recommendation system that improves accuracy by applying these techniques.
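The user-based approach mentioned above can be sketched minimally with cosine similarity over a toy ratings matrix (the users, items, and ratings are invented; real systems add mean-centering and neighbourhood selection):

```python
import math

ratings = {                      # toy user -> {item: rating} matrix
    "alice": {"a": 5, "b": 3, "c": 4},
    "bob":   {"a": 5, "b": 3, "c": 4, "d": 5},
    "carol": {"a": 1, "b": 5, "d": 2},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(u[i] ** 2 for i in common))
           * math.sqrt(sum(v[i] ** 2 for i in common)))
    return num / den

def predict(user, item):
    """User-based CF: similarity-weighted average of neighbours' ratings."""
    num = den = 0.0
    for other, r in ratings.items():
        if other != user and item in r:
            s = cosine(ratings[user], r)
            num += s * r[item]
            den += s
    return num / den if den else None

print(predict("alice", "d"))
```

Because alice's tastes match bob's almost exactly, the prediction for item "d" lands much nearer bob's rating of 5 than carol's rating of 2.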
Model Predictive Control (MPC) is a process-control technique, used in the process industries, that predicts the future behaviour of the process state from the predicted changes in the dependent variables of the modelled system. It computes the future input at each step by minimizing a cost function over the manipulated and controlled variables. The main goal of this paper is to design a model predictive controller for a multivariable process; here, a distillation column is used as the multivariable process. Finally, the settling time, overshoot, and ISE, IAE, and ITAE errors of the MPC controller are compared with those of a PID controller for both SISO and MIMO systems.
Power quality is a major issue for loads in distribution systems supplying industrial and domestic appliances. In this work, we propose a modified solid-state transformer system using a STATCOM to improve the power quality of loads in the distribution system. The proposed model eliminates voltage sag and swell. A matrix converter is adopted in the design of the modified solid-state transformer to reduce power loss, and the control strategy of the matrix converter uses sinusoidal pulse-width modulation techniques. In addition, harmonics are reduced by using a vector proportional-integral controller. The simulation is done in MATLAB/SIMULINK; several case studies are carried out, and the simulation results show that the proposed system has better voltage regulation than the conventional system.
Big data plays a major role in all aspects of business and IT infrastructure. Today many organizations, social-media networking sites, e-commerce platforms, educational institutions, satellite communications, aircraft, and others generate huge volumes of data on a daily basis, in structured, semi-structured, and unstructured forms. This huge, voluminous amount of data is what is coined big data, and it must be stored and processed effectively. In a traditional distributed system this data cannot be handled effectively because of a lack of resources, which is where Hadoop comes into the picture. Hadoop stores and processes huge volumes of data through its strong ecosystem, which contains many modules for processing data, storing data, allocating resources, configuration management, retrieving data, and providing a highly fault-tolerant mechanism. This paper focuses on big data concepts, characteristics, real-world examples of big data, and the Hadoop modules with their pros and cons.
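The map/shuffle/reduce pipeline that Hadoop's processing module implements can be sketched in a few lines (this simulates the stages in-process; Hadoop runs them distributed over HDFS blocks):

```python
from collections import defaultdict

def map_phase(chunk):
    """Map: emit (word, 1) pairs, as a Hadoop mapper would for word count."""
    return [(word.lower(), 1) for word in chunk.split()]

def shuffle(pairs):
    """Shuffle: group values by key across all mappers."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Each chunk stands in for an HDFS block processed by one mapper.
chunks = ["big data big ideas", "data is the new oil"]
pairs = [p for c in chunks for p in map_phase(c)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])   # 2
```

Because each mapper sees only its own block and each reducer only its own keys, the same code scales out simply by adding machines, which is the core of Hadoop's fault-tolerant design.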
The rapid growth of wireless content access implies the need for content placement and
scheduling at wireless base stations. We study a system under which users are divided into clusters based on
their channel conditions, and their requests are represented by different queues at logical front ends.
Requests might be elastic (implying no hard delay constraint) or inelastic (requiring that a delay target be
met). Correspondingly, we have request queues that indicate the number of elastic requests, and deficit
queues that indicate the deficit in inelastic service. Caches are of finite size and can be refreshed periodically
from a media vault. We consider two cost models that correspond to inelastic requests for streaming stored
content and real-time streaming of events, respectively. We design provably optimal policies that stabilize
the request queues (hence ensuring finite delays) and reduce average deficit to zero [hence ensuring that the
quality-of-service (QoS) target is met] at small cost. We illustrate our approach through simulations.
Index Terms—Content distribution network (CDN), delay-sensitive traffic, prediction, quality of service
(QoS), queueing.
In this paper, a model for broadcasting a digital video signal with embedded audio (SDI) using cloud computing is detailed for each transcoding stage of the signal. The SDI signal is coded and multiplexed into an ASI signal; a Multidecriptor transcodes the signal into a Transport Stream, and with GT-3 the program can be placed in a new TS. A CPC then produces chunks with HLS (HTTP Live Streaming) at different profiles. Anevia receives these chunks and delivers the signal to the cloud. Bandwidth and bit-rate testing is performed for each stage of signal processing, without losing the quality-control standards and without losing the metadata services (V-chip, closed captions, cue tones, DPI, GPI, watermarking, SCTE 35 and 104, etc.). How the stream is received by the end user is also shown.
The Internet is a collection of vast information, and searching web pages for a specific piece of information is a common practice. Search engines are used to search and retrieve information in response to a query raised by the user. Making this search process better and faster has always been an area of interest for researchers involved in web mining. The search process can be improved by web harvesting: the process by which specialized software collects data from the Internet and places it into files for the end user. It serves a function similar to, but more advanced than, the tasks a search engine performs. Web harvesting is also known as web scraping. In this article we explore the field of web harvesting and emphasize its use for fast and effective retrieval of information from the web.
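A minimal harvesting step using only the standard library: parse a fetched page and collect its links (the literal page string below stands in for a real HTTP response):

```python
from html.parser import HTMLParser

class LinkHarvester(HTMLParser):
    """Collect every href found in anchor tags: the core extraction step a
    web harvester performs before storing results in files for the end user."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# In a real harvester the page would come from urllib.request.urlopen(url).
page = '<html><body><a href="/docs">Docs</a><a href="http://example.com">Ex</a></body></html>'
harvester = LinkHarvester()
harvester.feed(page)
print(harvester.links)   # ['/docs', 'http://example.com']
```

Following the collected links recursively, with politeness delays and robots.txt checks, is what turns this extraction step into a full harvester.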
In today's world, our lifestyle accumulates too much stress, and this stress is a major factor in deteriorating health. Meditation is one way to reduce stress levels. Electroencephalography (EEG) is characterized by five signals: alpha, beta, gamma, delta, and theta. This paper focuses on the analysis of the EEG signals of meditating and non-meditating persons. The signals are analyzed using Matlab and Verilog (Xilinx 14.7) on an FPGA. It was found that the variation of the alpha wave in a person who meditates regularly was smaller, and the delta-wave density greater, compared with a person who does not meditate.
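Band-power comparison of this kind can be sketched with a naive DFT; the synthetic 10 Hz signal below is an assumption standing in for real EEG data, and a library FFT would be used in practice:

```python
import math

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Naive DFT band-power estimate: enough to tell which EEG band
    dominates a recording."""
    n = len(signal)
    powers = {b: 0.0 for b in BANDS}
    for k in range(1, n // 2):
        freq = k * fs / n
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        for band, (lo, hi) in BANDS.items():
            if lo <= freq < hi:
                powers[band] += re * re + im * im
    return powers

fs, n = 128, 256
alpha_like = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]  # 10 Hz tone
powers = band_powers(alpha_like, fs)
print(max(powers, key=powers.get))   # alpha
```

Comparing alpha and delta power between meditation and rest sessions is the software analogue of the comparison the paper performs in Matlab and on the FPGA.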
Network communication is mandatory and critical in our day-to-day life. Cryptography is the technique for securing data in the communication field so that the privacy of data is maintained. Cryptography has different types of algorithms, among them the Data Encryption Standard (DES), the Advanced Encryption Standard (AES), Rivest-Shamir-Adleman (RSA), and the Triple Data Encryption Standard (TDES). Multiprocessors are widely used in this era, since a single processor is subject to constraints while computing; multiprocessing is the processing of multiple tasks at a time using two or more central processing units in one system. In this paper, a multiprocessing processor and a lightweight DES algorithm are implemented using Xilinx 14.7. The simulation is done in ModelSim 6.3, and the code is verified by dumping it onto a Spartan-3 field-programmable gate array (FPGA).
The on-demand use, high scalability, and low maintenance cost of cloud computing have attracted more and more enterprises to migrate their legacy applications to the cloud environment. Although the cloud platform itself promises high reliability, ensuring a high quality of service is still a major concern, since enterprise applications are usually complicated and consist of a large number of distributed components. Improving the reliability of an application during cloud migration is therefore a challenging and critical research problem. To address this problem, we propose a reliability-based optimization framework, named ROCloud, that improves application reliability through fault tolerance. ROCloud includes two ranking algorithms: the first ranks components for applications whose components will all be migrated to the cloud; the second ranks components for hybrid applications in which only some of the components are migrated. Based on the ranking result, the optimal fault-tolerant strategy is selected automatically for the most significant components with respect to their predefined constraints. The experimental results show that by refactoring a small number of error-prone components and tolerating faults in the most significant components, the reliability of the application can be greatly improved.
The cloud is expanding from application aggregation and sharing to data aggregation and utilization. Making use of this data means handling tens of terabytes and beyond; such massive amounts of data are called big data. Range-aggregate queries apply an aggregate function to all tuples within given query ranges. FastRAQ first divides big data into independent partitions with a balanced partitioning algorithm and then generates a local estimation sketch for each partition. When a range-aggregate query request arrives, FastRAQ obtains the result directly by summarizing the local estimates from all partitions and returning the collective result. Conventional data mining can process only structured data, so the big-data approach is adopted throughout the paper. We use a three-tier architecture: (1) big-data implementation in a multi-system approach; (2) application deployment in banking/insurance; and (3) extraction of useful information from unstructured data. We implement this project for the banking domain with two major components. First, a bank server adds new clients and maintains their accounts; every user must provide an Aadhaar card as ID proof when registering an account at any bank. Second, an accounts-monitoring server monitors every user and their account status across different banks, and retrieves users who maintain and transact more than Rs. 50,000 per annum across three accounts in different banks using the same ID proof. Map and Reduce is thereby achieved.
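The partition-plus-local-sketch idea can be illustrated with exact prefix-sum sketches (FastRAQ itself uses estimation sketches; exact sums keep this example verifiable, and the account data is invented):

```python
import bisect

class Partition:
    """One balanced partition: sorted keys with a prefix-sum 'sketch' so a
    local range-sum is answered in O(log n) without scanning tuples."""
    def __init__(self, rows):                   # rows: sorted (key, value)
        self.keys = [k for k, _ in rows]
        self.prefix = [0]
        for _, v in rows:
            self.prefix.append(self.prefix[-1] + v)

    def range_sum(self, lo, hi):
        i = bisect.bisect_left(self.keys, lo)
        j = bisect.bisect_right(self.keys, hi)
        return self.prefix[j] - self.prefix[i]

def range_aggregate(partitions, lo, hi):
    """Summarize local estimates from all partitions, FastRAQ-style."""
    return sum(p.range_sum(lo, hi) for p in partitions)

# Two independent partitions of (account_id, yearly_transaction) rows.
p1 = Partition([(1, 10_000), (3, 60_000), (5, 20_000)])
p2 = Partition([(2, 55_000), (4, 5_000), (6, 70_000)])
print(range_aggregate([p1, p2], 2, 5))   # 140000
```

Because each partition answers independently, the per-partition work parallelizes across machines, which is the point of the balanced partitioning step.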
Cloud computing is becoming popular, and building high-quality cloud applications is a critical research problem. QoS rankings provide valuable information for making an optimal cloud-service selection from a set of functionally equivalent service candidates. Obtaining QoS values usually requires real-world invocations of the service candidates through the cloud broker. To avoid these time-consuming and expensive real-world service invocations, this paper proposes a QoS ranking prediction framework for cloud services that takes advantage of the past service-usage experiences of other consumers. The proposed framework requires no additional invocations of cloud services when the cloud broker or service provider makes a QoS ranking prediction. Two personalized QoS ranking prediction approaches are proposed to predict the QoS rankings directly, based on cost and ranking. Comprehensive experiments are conducted on real-world QoS data, covering 300 distributed users and 500 real-world web services around the world. The experimental results show that our approaches outperform other competing approaches.
The growth of software engineering can justifiably be attributed to advances in software testing, and the quality of the test cases used determines the quality of the testing. This is why test cases are primarily crafted manually. However, generating test cases manually is an intense, complex, and time-consuming task, so there is an immediate need for an automated test-data generator that accomplishes the task as effectively as manual crafting. The work presented here automates the process of test-path generation with the goal of attaining maximum coverage, using a technique based on Cellular Automata (CA) for generating test paths, and thereby opens the window of cellular automata to software testing. The approach has been verified on programs selected according to their lines of code and utility, and the results obtained have been verified.
In distributed applications, data centers process high volumes of data in order to serve user requests. Using a centralized SQL analyzer to process user queries makes it difficult to manage large data sets and to retrieve data from storage, and the system cannot execute in parallel by distributing data across a large number of machines. Systems that compute SQL analytics over geographically distributed data operate by pulling all the data to a central location, which is problematic at large data scales because of expensive transoceanic links. We therefore implement Continuous Hive (CHIVE), which facilitates querying and managing large datasets residing in distributed storage. Hive provides a mechanism to structure the data and to query it using a SQL-like language called HiveQL, and it optimizes query plans to minimize their overall bandwidth consumption. The proposed system optimizes query execution plans and data replication to minimize bandwidth cost.
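The bandwidth saving from pushing aggregation down to the data, rather than pulling every row to a central site, can be sketched as follows (the site names and salary figures are invented):

```python
# Two strategies for a distributed AVG(salary) query over remote sites:
# pull every row to the coordinator, or push the aggregation down and
# ship only a (sum, count) pair per site, as bandwidth-aware planners prefer.

sites = {
    "eu": [52_000, 48_000, 61_000],
    "us": [70_000, 66_000],
    "asia": [39_000, 41_000, 40_000],
}

def avg_pull_all():
    rows = [r for site in sites.values() for r in site]    # ships every row
    return sum(rows) / len(rows), len(rows)                # (result, rows sent)

def avg_pushdown():
    partials = [(sum(rows), len(rows)) for rows in sites.values()]
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count, len(partials)                    # (result, tuples sent)

a1, cost1 = avg_pull_all()
a2, cost2 = avg_pushdown()
print(a1 == a2, cost1, cost2)   # same answer, 8 rows vs 3 partial tuples
```

For decomposable aggregates (SUM, COUNT, AVG, MIN, MAX) the answer is identical, while the shipped volume drops from the row count to the site count; this is the kind of plan rewriting a bandwidth-minimizing optimizer performs.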
Cloud computing is an emerging paradigm that provides computing, communication, and storage resources as a service over a network. In existing systems, data outsourced to a cloud is unsafe because of eavesdropping and hacking, and security measures add network delays. In this paper we study data replication in cloud computing data centers. Unlike other approaches available in the literature, we consider both security and privacy preservation in the cloud. To overcome these problems we use the DROPS methodology. The data is encrypted using AES (the Advanced Encryption Standard). The data is then divided into multiple fragments, and the fragments are replicated over the cloud nodes, with each fragment stored on a different node in an individual location. We ensure a controlled replication of the file fragments: each fragment is replicated only once, for improved security. The results of the simulations reveal that the simultaneous focus on security and performance results in an improved security level of the data, accompanied by only a slight performance drop.
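The fragmentation-and-placement step can be sketched as below; a toy XOR cipher stands in for AES purely for illustration, and the node names and fragment size are invented:

```python
def toy_encrypt(fragment, key):
    """Placeholder cipher (single-byte XOR) standing in for AES, which DROPS
    uses; it only illustrates that fragments are stored encrypted."""
    return bytes(b ^ key for b in fragment)

def fragment(data, size):
    """Split the file into fixed-size fragments."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place(fragments, nodes):
    """Place each encrypted fragment on a distinct node, so no single
    compromised node reveals more than one fragment of the file."""
    assert len(nodes) >= len(fragments), "need one node per fragment"
    placement = {}
    for frag, node in zip(fragments, nodes):
        placement[node] = toy_encrypt(frag, key=0x5A)
    return placement

data = b"account-records-for-the-cloud"
frags = fragment(data, size=8)
placement = place(frags, nodes=["n1", "n2", "n3", "n4"])
print(len(placement))   # 4 fragments on 4 distinct nodes
```

Because the dictionary maps one node to one encrypted fragment, an attacker who breaches a single node obtains neither the whole file nor any plaintext, which is the security argument the controlled-replication policy rests on.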
Mining the data needed by a given application is a crucial activity in the computerized environment, and mining techniques were introduced for that purpose; this project applies them to mobile apps. Ranking fraud in the mobile-app market refers to fraudulent or deceptive activities whose purpose is to bump apps up the popularity list. Indeed, it is increasingly common for app developers to use shady means, such as inflating their apps' sales or posting phony app ratings, to commit ranking fraud. We first propose to accurately locate ranking fraud by mining the active periods, namely leading sessions, of mobile apps. Furthermore, we investigate three types of evidence (ranking-based, rating-based, and review-based) by modeling apps' ranking, rating, and review behaviors through statistical hypothesis tests. In addition, an optimization-based approach is used to integrate all the evidence for fraud detection, based on the EIRQ (efficient information retrieval for ranked query) algorithm. Finally, we evaluate the proposed system with real-world app data collected from the iOS App Store over a long period. Experiments validate the effectiveness of the proposed system and show the scalability of the detection algorithm, as well as some regularities of ranking-fraud activities.
To secure outsourced data in cloud storage against corruption, adding fault tolerance to cloud storage, together with data-integrity checking and failure repair, becomes critical. Recently, regenerating codes have gained popularity because of their lower repair bandwidth while providing fault tolerance. Existing remote checking methods for regenerating-coded data provide only private auditing, requiring data owners to always stay online and handle auditing as well as repairing, which is sometimes impractical; moreover, all the distributed data are stored in the same functional location, so search and data retrieval take much time, and this delay affects the efficiency of the distributed storage. In this project, a public auditing scheme for regenerating-code-based cloud storage is proposed, together with an Attribute Based Clustering Technique (ABCT) for distributed data. ABCT resolves the time-delay issue and makes the system more efficient. We randomize the encoding coefficients with a pseudorandom function to preserve data privacy. ABCT achieves much faster data searching and retrieval. Extensive security analysis demonstrates that our scheme is provably secure under the random oracle model, and experimental evaluation shows that it is highly efficient and can be feasibly integrated into regenerating-code-based cloud storage.
Identification of plant diseases is the key to preventing losses in the yield and quantity of the agricultural product. Studying pomegranate plant diseases means studying the visually observable patterns seen on the plant. Because it is very difficult to monitor pomegranate plant diseases manually, image processing is used for their detection. Disease detection involves steps such as image acquisition, image pre-processing, image segmentation, statistical feature extraction, and classification. The k-means clustering algorithm is used for segmentation, and a support vector machine is used for classification of the disease.
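The segmentation step can be illustrated with a one-dimensional k-means over grayscale intensities (real segmentation typically runs over colour features per pixel; the intensities below are invented):

```python
def kmeans_1d(values, k=2, iters=20):
    """k-means on scalar pixel intensities: the segmentation step that
    separates dark lesion pixels from bright healthy-leaf pixels before
    statistical features are extracted for the SVM classifier."""
    # Spread the initial centers across the sorted value range.
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Grayscale intensities: dark lesion pixels vs bright healthy pixels.
pixels = [30, 35, 32, 28, 200, 210, 205, 198, 33, 202]
centers, clusters = kmeans_1d(pixels)
print(sorted(round(c) for c in centers))
```

The pixels assigned to the darker cluster form the candidate lesion region from which features such as mean, variance, and texture statistics are then computed.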
Ration distribution is one of the big issues involving corruption and the smuggling of goods, and the only reason this happens is that every task in the ration shop is done manually. Irregularities occur such as incorrect entries in the store's collection records, giving wrong stock information for the products supplied to people; distribution, at times, of lower-quality products than those actually provided by the Government for the public; and misreporting of the stock quantity available in a ration shop as supplied by the Government. In this paper we propose replacing the manual work in the public distribution system with an automated system that can be installed at the ration shop with ease. This would bring transparency to the rationing system, as there will be direct communication between the user and the Government through this system.
Bug triage is the process of fixing bugs; it aims to properly assign a developer to a new bug. Software companies spend much of their budget dealing with these bugs. To reduce the time and cost of bug triage, an automated approach is developed to predict a developer with relevant experience for each newly arriving report. In the proposed approach, data reduction is performed on the bug data set, which reduces the scale of the data while increasing its quality. Instance selection and feature selection are used simultaneously with historical bug data. Previously, text-classification techniques were applied to conduct bug triage; the problem is obtaining quality bug data sets, as they are very large. The proposed system addresses both reducing the size and improving the quality of bug data. First, pre-processing removes unimportant attributes and identifies missing terms. Then instance selection is combined with feature selection, using a dimensionality-reduction technique to simultaneously reduce the data size along the bug dimension and the word dimension. A PSO algorithm determines the reduction order from the fitness value and is used to produce a quality bug data set. The results show that the proposed system can effectively reduce the data size and improve the accuracy of bug triage. The proposed system provides an approach that leverages data-processing techniques to form reduced, high-quality bug data for software development and maintenance.
The agile methods, such as Scrum and Extreme Programming (XP), have been a topic of much discussion in the software community over the last few years. While they have gained importance in industry because of their approach to the issues of human agility and return on investment, usually in the context of small-to-medium-size projects with significant requirements volatility, those who do not support these methods have expressed serious concerns about their effectiveness. Scrum attempts to build the work in short iterations, where each iteration consists of short time boxes. This paper reviews several papers on Scrum and its framework, including its artifacts and the ceremonies involved, and gives any beginner an insight into the Scrum methodology.
Keywords: Agile methods, methodology, Scrum, software process, Sprint, Backlog, Artifacts
In general, every keypad-based authentication system admits password guessing by observing shoulder movements. Shoulder surfing is an attack on password authentication that has traditionally been hard to defeat, and this problem now has a new solution. Devising a user-authentication scheme based on personal identification numbers (PINs) that is both secure and practically usable is a challenging problem. The greatest difficulty lies in the susceptibility of the PIN-entry process to direct observational attacks, such as human shoulder surfing and camera-based recording. The PIN-entry mechanism is widely used for authenticating a user: it is a popular scheme because it nicely balances the usability and security aspects of a system. However, if this scheme is used in a public system, it may suffer from shoulder-surfing attacks, in which an unauthorized user fully or partially observes the login session, or even records the session's activities to recover the actual PIN later. In this paper, we propose an intelligent user interface, known as Color Pass, that resists shoulder surfing so that any genuine user can enter the session PIN without disclosing the actual PIN. Color Pass is based on a partially observable attacker model. The experimental analysis shows that the Color Pass interface is safe and easy to use, even for novice users.
The main objective is to preserve data privacy during communication. In this paper, we show how external aggregators or multiple parties can compute algebraic statistics over their private data without compromising data privacy, assuming all channels and communications are open to eavesdropping attacks. First, we propose several protocols that guarantee data privacy. We then propose advanced protocols that tolerate up to k passive adversaries, which do not attempt to modify the computation.
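The specific protocols are not given in the abstract, but a standard building block for this kind of private aggregation is additive secret sharing, sketched below under that assumption: each party splits its value into random shares so that an aggregator seeing any proper subset of shares learns nothing, yet the shares sum to the true total.

```python
import random

PRIME = 2**61 - 1  # field modulus; any sufficiently large prime works

def share(value, n):
    """Split value into n additive shares mod PRIME; any n-1 shares look random."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def private_sum(values):
    """Sum private inputs without revealing any individual input."""
    n = len(values)
    # Each party i splits its value and sends share j to party j.
    matrix = [share(v, n) for v in values]
    # Each party publishes only the sum of the shares it received.
    partials = [sum(matrix[i][j] for i in range(n)) % PRIME for j in range(n)]
    # The aggregator adds the published partial sums to recover the total.
    return sum(partials) % PRIME

assert private_sum([10, 20, 30]) == 60
```

A passive (honest-but-curious) adversary controlling up to k = n-1 parties still sees only uniformly random shares of the remaining inputs, which matches the k-passive-adversary tolerance the abstract describes.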
Sources such as wind power and solar power are expected to be promising energy sources when connected to the power grid. Wind generators have a significant impact on power quality, the voltage profile, and the power flow for customers and electricity suppliers. The power drawn from these sources varies with environmental conditions; because of the fluctuating nature of the wind, wind power injected into an electric grid affects power quality. The influence of wind sources on the grid concerns power-quality aspects such as reactive power, active power, voltage variation, harmonics, and electrical behaviour during switching operations [1]. A grid-connected wind turbine is considered here, together with the problems arising from such a system. At the point of common coupling, a Static Synchronous Compensator with a Battery Energy Storage System (STATCOM/BESS) can regulate active and reactive power in all four quadrants, which makes it an ideal scheme for solving the problems of wind power generation. Since the power from wind generation varies with time, the battery energy storage is used to maintain constant real power despite the varying wind power, and the power generated by the wind generator can be stored in the batteries during low-demand hours [2-4]. The combination of battery storage with a wind energy generation system synthesizes the output waveform by absorbing or injecting reactive power and enables the real power flow required by the load. The control strategy coordinates the charging and discharging of the batteries with the reactive power compensation of the STATCOM and balances the battery capacity. If required, the amount of energy consumed from or delivered to the grid can be observed through an online smart meter connected in the circuit.
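The sizing logic behind the reactive power compensation mentioned above can be illustrated with the textbook power-factor-correction relation Q = P(tan φ₁ − tan φ₂); the numbers below are illustrative and not taken from the paper:

```python
import math

def compensation_kvar(p_kw, pf_actual, pf_target):
    """Reactive power (kVAr) a compensator such as a STATCOM must inject
    to raise the power factor of a load from pf_actual to pf_target.
    Uses Q = P * (tan(acos(pf1)) - tan(acos(pf2)))."""
    phi1 = math.acos(pf_actual)
    phi2 = math.acos(pf_target)
    return p_kw * (math.tan(phi1) - math.tan(phi2))

# Example: a 500 kW load at 0.80 lagging corrected to 0.95
q = compensation_kvar(500.0, 0.80, 0.95)  # roughly 211 kVAr
```

The same relation, applied continuously as wind output fluctuates, is what a four-quadrant STATCOM/BESS controller automates: it injects or absorbs Q to hold the point of common coupling near the target power factor while the battery smooths the real power P.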
The research work in this paper aims to develop an unmanned aerial vehicle equipped with modern technologies for various civil and military applications. It is an automatic system. The shrinking size and increasing capabilities of microelectronic devices in recent years have opened the doors to more capable autopilots and pushed for more real-time UAV applications. The Unmanned Aerial Vehicle (UAV) market is expected to grow dramatically by 2020 as military, civil, and commercial applications continue to develop. Potential changes in air traffic management include the creation of an information management system to exchange information among Air Traffic Management users and providers, the introduction of navigation, and the development of alternative separation procedures. The impact of each scenario on future air traffic and surveillance is summarized, and the associated issues are identified. The paper concludes by describing the need for a UAV roadmap to the future. This paper aims to provide a simple and low-cost autonomous aerial surveyor that can perform aerial surveillance, recognize and track various objects, and build a simple 3D map.
In the modern world, growth is a factor every company seeks, and growth is determined by the amount of revenue generated, the number of satisfied consumers, and similar measures. For product-based companies, achieving a greater number of sales is the key objective. Sales are achieved by understanding customers' requirements, employing various marketing strategies, carrying out relevant analysis, advertising, and so on; all of these factors pose a great challenge for enterprises around the world, but foremost, understanding customers' interests is the greatest challenge of all. Every time we step into a mall or an exhibition, it is the natural tendency of a person to spend more time with the objects that meet their interest, yet until now the interaction between consumers and products has not been tracked with respect to time. In our project, with the help of wireless communication, ubiquitous sensors such as RFID, and a proper GUI, we are developing a consumer-interest tracking device capable of gathering valuable information about the time spent by an individual at various stores or products in an exhibition or shopping mall. Based on the information collected, we determine the consumer's interest, which in turn helps the company manufacture better products, make smarter decisions, and ensure a safer future for the enterprise. The gathered information is made available for real-time monitoring, or can be stored for future analysis.
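The core computation such a system performs is turning a stream of RFID read events into per-product dwell times. A minimal sketch, assuming time-ordered `(timestamp, tag_id, zone_id)` tuples (the event format and the function name `dwell_times` are illustrative, not from the paper):

```python
from collections import defaultdict

def dwell_times(reads):
    """reads: time-ordered (timestamp_s, tag_id, zone_id) RFID read events.
    Returns seconds each tag spent in each zone, accumulating the interval
    between consecutive reads only while the tag stays in the same zone."""
    totals = defaultdict(float)
    last = {}  # tag -> (timestamp, zone) of the previous read
    for t, tag, zone in reads:
        if tag in last:
            prev_t, prev_zone = last[tag]
            if prev_zone == zone:
                totals[(tag, zone)] += t - prev_t
        last[tag] = (t, zone)
    return dict(totals)

events = [(0, "tagA", "storeX"), (30, "tagA", "storeX"),
          (60, "tagA", "storeY"), (90, "tagA", "storeY")]
result = dwell_times(events)  # tagA: 30 s in storeX, 30 s in storeY
```

Ranking zones or products by accumulated dwell time per visitor is then a straightforward sort over this dictionary, which is the interest signal the abstract describes feeding into real-time monitoring or stored analysis.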
Classical fuzzy system modeling methods implicitly assume that the data are generated from a single task; when the data are acquired from the perspective of multiple tasks, such modeling has an intrinsic inconsistency. In this project, a multitask fuzzy system modeling method that mines the common hidden structure shared among tasks is proposed to overcome this weakness of classical TSK-based fuzzy modeling for multitask learning. When classical fuzzy modeling methods are applied to multitask datasets, they usually focus on task-independence information and ignore the correlations between different tasks. Here we mine the common hidden structure among multiple tasks to realize multitask TSK fuzzy system learning; this makes good use of the independence information of each task as well as the correlation information captured by the common hidden structure among all tasks. Thus, the proposed learning algorithm can effectively improve both the generalization and fitting performance of the learned fuzzy system for each task. Our experimental results demonstrate that the proposed MTCS-TSK-FS has better modeling performance and adaptability than existing TSK-based fuzzy modeling methods on multitask datasets. Learning multiple tasks across different datasets is a challenging problem, since the feature space may not be the same for different tasks, and the datasets can be of any type, such as text datasets. Index Terms—Common hidden structure, fuzzy modeling, multitask learning, Takagi-Sugeno-Kang (TSK) fuzzy systems.
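For readers unfamiliar with the TSK model underlying this work, a minimal sketch of zero-/first-order TSK inference on a scalar input may help: each rule pairs a Gaussian antecedent membership with a linear consequent, and the output is the firing-strength-weighted average of the consequents. The rule parameters below are illustrative, not learned by the paper's method:

```python
import math

def gauss(x, center, sigma):
    """Gaussian membership degree of x in a fuzzy set."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def tsk_infer(x, rules):
    """rules: list of (center, sigma, a, b); each consequent is y = a*x + b.
    TSK output = sum(w_i * y_i) / sum(w_i), w_i being the firing strength."""
    weights = [gauss(x, c, s) for c, s, _, _ in rules]
    outputs = [a * x + b for _, _, a, b in rules]
    return sum(w * y for w, y in zip(weights, outputs)) / sum(weights)

# Two illustrative rules: "x is LOW -> y = 0" and "x is HIGH -> y = x".
rules = [(0.0, 1.0, 0.0, 0.0), (5.0, 1.0, 1.0, 0.0)]
```

Multitask variants such as the MTCS-TSK-FS described above keep this inference form per task but tie the rule parameters together through a shared hidden structure, so each task's antecedents and consequents are learned jointly rather than independently.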