IJCT Editor - Guru Nanak Dev University Amritsar
Papers by IJCT Editor
Load balancing is one of the essential factors in enhancing the working performance of a cloud service provider. Cloud computing is an emerging computing paradigm that aims to share data, calculations, and services transparently over a scalable network of nodes, storing data and distributed resources in an open environment. Because the cloud inherits the characteristics of distributed computing and virtualization, there is a possibility of machines remaining unused. Hence, in this paper, different load balancing algorithms have been studied, different kinds of job types have been discussed, and their problems have been reviewed. In cloud storage, load balancing is a key issue: maintaining load information is costly, since the system is too large to disperse load in a timely manner. Load balancing is one of the main challenges in cloud computing; it is required to distribute the dynamic workload across multiple nodes so that no single node is overwhelmed.
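To make the idea concrete, here is a minimal, hedged sketch of one such strategy, a least-connections dispatcher, in Python; the Node class and VM names are illustrative inventions, not part of any surveyed algorithm.

```python
# Minimal sketch of a least-connections dispatcher, one family of load
# balancing strategies surveyed here. Node names and counters are
# hypothetical; a real cloud broker would track live VM metrics.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    active_jobs: int = 0

class LeastConnectionsBalancer:
    def __init__(self, nodes):
        self.nodes = nodes

    def dispatch(self, job_id):
        # Pick the node with the fewest active jobs so that no single
        # node is overwhelmed by the dynamic workload.
        target = min(self.nodes, key=lambda n: n.active_jobs)
        target.active_jobs += 1
        return target.name

    def complete(self, name):
        for n in self.nodes:
            if n.name == name:
                n.active_jobs = max(0, n.active_jobs - 1)

nodes = [Node("vm-1"), Node("vm-2"), Node("vm-3")]
lb = LeastConnectionsBalancer(nodes)
for job in range(6):
    print(job, "->", lb.dispatch(job))
```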
Cloud computing is the Internet-based development and use of computer technology. It is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure "in the cloud" that supports them. Cloud computing is a hot topic all over the world nowadays, through which customers can access information and computing power via a web browser. As the adoption and deployment of cloud computing increase, it is critical to evaluate the performance of cloud environments. Modeling and simulation technology has therefore become a useful and powerful tool in the cloud computing research community: cloud simulators are required for cloud system testing, to decrease complexity and separate quality concerns. Cloud computing means saving and accessing data over the internet instead of on local storage. In this paper, we provide a short review of the types, models, and architecture of the cloud environment.
Cloud computing is being used widely all over the world by many IT companies, as it provides various benefits to users such as cost saving and ease of use. However, with the growing demands of users for computing services, cloud providers are encouraged to deploy large datacenters, which consume a very high amount of energy and contribute to the increase of carbon dioxide emissions in the environment. Therefore, we need to develop techniques that lead to more environmentally friendly computing, i.e., green cloud computing. Cloud computing is an increasingly popular paradigm for accessing computing resources. This paper discusses some of the research challenges for cloud computing from an enterprise or organizational perspective and puts them in context by reviewing the existing body of literature on cloud computing. Research challenges relating to the following topics are discussed: the organizational changes brought about by cloud computing; the economic and organizational implications of its utility billing model; and the security, legal, and privacy issues that cloud computing raises. It is important to highlight these research challenges because cloud computing is not simply a technological improvement of data centers but a fundamental change in how IT is provisioned and used. This type of research has the potential to influence wider adoption of cloud computing in the enterprise, and in the consumer market too.
EEG (electroencephalography) energy is an important evaluation indicator in brain death determination based on EEG analysis. In related works, a static EEG energy value can be obtained using EMD (empirical mode decomposition), MEMD (multivariate empirical mode decomposition), and 2T-EMD (turning tangent empirical mode decomposition) for EEG-based coma and quasi-brain-death analysis. However, such methods are not time-varying and are thus impractical. In this paper, we first propose the Dynamic 2T-EMD algorithm to evaluate patients' dynamic EEG energy variation by means of a time-window and time-step method. As the time window slides along the time axis in steps, the EEG energy of each corresponding step is computed and stored. The proposed algorithm is applied to analyze 19 cases of coma patients' EEG and 17 cases of quasi-brain-death patients' EEG. Two typical patients, one in a coma state and one in a quasi-brain-death state, and one special case who passed from coma to quasi-brain-death, are taken as examples to illustrate the algorithm's performance. Results show that EEG energy in the coma state is obviously higher than that in the quasi-brain-death state. The method also presents the EEG energy trend of every case, which can prevent loss of information and wrong analysis results caused by noise interference, and provides a scientific basis for doctors to evaluate patients' consciousness levels in brain death determination. The proposed algorithm will be very helpful for developing a real-time brain death diagnostic system.
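As a rough illustration of the time-window/time-step idea (not the authors' implementation, and omitting the 2T-EMD decomposition itself), a windowed energy computation might look like the following sketch; the sampling rate and window settings are assumed values.

```python
# Sketch of the sliding time-window / time-step energy computation that
# Dynamic 2T-EMD relies on. window_s and step_s are illustrative, not
# the paper's settings, and the input below is a toy signal.
import numpy as np

def windowed_energy(eeg, fs, window_s=10.0, step_s=1.0):
    """Slide a window along a 1-D EEG signal and store the energy
    (sum of squared amplitudes) of each step."""
    win = int(window_s * fs)
    step = int(step_s * fs)
    energies = []
    for start in range(0, len(eeg) - win + 1, step):
        segment = eeg[start:start + win]
        energies.append(np.sum(segment ** 2))
    return np.asarray(energies)

fs = 256                                   # assumed sampling rate in Hz
t = np.arange(0, 60 * fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) * np.exp(-t / 40)  # decaying toy signal
print(windowed_energy(eeg, fs)[:5])        # energy trend over time
```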
In speaker diarization, speech/voice activity detection is performed to separate speech, non-speech, and silent frames. The zero crossing rate and root mean square value of frames of audio clips have been used to select training data for the silent, speech, and non-speech models. The trained models are used by two classifiers, a Gaussian mixture model (GMM) and an artificial neural network (ANN), to classify the speech and non-speech frames of an audio clip. The results of the ANN and GMM classifiers are compared by Receiver Operating Characteristic (ROC) curves and Detection Error Tradeoff (DET) graphs. It is concluded that the neural network based SAD performs comparatively better than the Gaussian mixture model based SAD.
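A minimal sketch of the two frame-level features used for training-data selection, zero crossing rate and RMS, is given below; the frame length and the silence threshold are illustrative assumptions, not the paper's settings.

```python
# Sketch of the two frame-level features used to pick training data for
# the silent / speech / non-speech models: zero crossing rate and RMS.
import numpy as np

def frame_features(signal, frame_len=512):
    n = len(signal) // frame_len
    frames = signal[:n * frame_len].reshape(n, frame_len)
    # Zero crossing rate: fraction of sign changes per frame.
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    # Root mean square energy per frame.
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    return zcr, rms

rng = np.random.default_rng(0)
clip = rng.standard_normal(16000) * 0.1   # stand-in for an audio clip
zcr, rms = frame_features(clip)
silent = rms < 0.05                       # assumed silence threshold
print(f"{silent.sum()} candidate silent frames out of {len(rms)}")
```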
Over the last twenty years face recognition has made immense progress based on statistical learning and subspace discriminant analysis. This paper investigates a technique to reduce the features necessary for face recognition based on the local binary pattern (LBP), constructed by applying a wavelet transform to the local binary pattern. The approach is evaluated in two ways: the wavelet transform applied to the LBP features, and the wavelet transform applied twice, on the original image and on the LBP features. The resultant data are compared to the results obtained without applying the wavelet transform, revealing that the wavelet-based reduction achieves the same or sometimes improved accuracy. The proposed algorithm is evaluated on the Cambridge ORL face database.
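As an illustrative sketch of the first evaluation path (LBP features followed by a wavelet transform), the following fragment uses PyWavelets and a simplified 8-neighbour LBP; it is not the paper's exact operator or pipeline.

```python
# Sketch of one evaluation path: compute LBP codes, then apply a 2-D
# wavelet transform to reduce the features. The 8-neighbour LBP below
# is a simplified variant; the face image is a random ORL-sized stand-in.
import numpy as np
import pywt

def simple_lbp(img):
    c = img[1:-1, 1:-1]
    code = np.zeros_like(c, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << np.uint8(bit)
    return code

rng = np.random.default_rng(1)
face = rng.integers(0, 256, (92, 112), dtype=np.uint8)
lbp = simple_lbp(face)
# One level of the 2-D DWT; keeping only the approximation sub-band
# quarters the feature count.
approx, _ = pywt.dwt2(lbp.astype(float), "haar")
print(lbp.shape, "->", approx.shape)
```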
Nowadays surveillance systems are widely deployed in various places and generate a massive amount of video data every day. This raises threats of unauthorized access and potential privacy leakage, as the recorded videos usually contain rich identifiable information such as facial biometrics. To mitigate these threats, many existing methods perform symmetric encryption on the entire frames of the videos. Unfortunately, these methods introduce additional computation cost and storage. Moreover, as surveillance systems can be part of a distributed system, key management is critical and challenging. In this paper, we propose a novel method which incorporates a background subtraction technique and the RSA encryption algorithm. Rather than encrypting the entire frames of the videos, the proposed method detects the regions around moving objects in the frames and then performs RSA encryption on the detected regions only. RSA encryption also has advantages in key distribution and management. Our experimental results show that the proposed method involves only moderate computation cost and storage.
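A hedged sketch of such a pipeline is shown below using OpenCV's MOG2 background subtractor. Since RSA can only encrypt a few hundred bytes directly, the sketch wraps a symmetric (Fernet/AES) key with RSA, a hybrid variant that a practical implementation of the pure-RSA description would likely need; the frame here is random stand-in data.

```python
# Background subtraction finds moving regions; only those pixels are
# encrypted. RSA protects the symmetric key rather than the raw pixels.
import cv2
import numpy as np
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
frame_key = Fernet.generate_key()
wrapped_key = rsa_key.public_key().encrypt(       # RSA wraps the key
    frame_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))

subtractor = cv2.createBackgroundSubtractorMOG2()
fernet = Fernet(frame_key)

def encrypt_moving_regions(frame):
    mask = subtractor.apply(frame)                # foreground mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)          # region around motion
        region = frame[y:y + h, x:x + w].tobytes()
        blobs.append(((x, y, w, h), fernet.encrypt(region)))
    return blobs

frame = np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8)
print(len(encrypt_moving_regions(frame)), "regions encrypted")
```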
Content Based Image Retrieval (CBIR) techniques are becoming an essential requirement in multimedia systems with the widespread use of the internet, the declining cost of storage devices, and the exponential growth of un-annotated digital image information in recent years. Multi-query systems have therefore been used rather than a single query in order to bridge the semantic gap and to understand the user's requirements. Moreover, a query replacement algorithm has been used in previous works, in which the user provides multiple images, referred to as representative images, as the query image set. Feature vectors are extracted for each image in the representative image set and for every image in the database. The centroid Crep of the representative images is obtained by computing the mean of their feature vectors. Then every image in the representative image set is replaced with the same candidate image from the dataset, one by one, and a new centroid is calculated for every replacement. The distance between each centroid resulting from a replacement and the representative image centroid Crep is calculated using Euclidean distance. The cumulative sum of these distances determines the similarity of the candidate image to the representative image set and is used for ranking the images: the smaller the distance, the more similar the image is to the representative image set. However, this approach has some research gaps: extracting features of each and every image in the database and comparing the query image against all database images takes a lot of time, and complexity as well as cost increases. So in our proposed work, the KNN algorithm is applied to classify the images in the database image set using the query images, and the candidate images are reduced to the images returned after the classification step, which decreases the execution time and reduces the number of iterations. Hence, due to the hybrid model of multi-query and KNN, the effectiveness of image retrieval in the CBIR system increases. The language used in this work is C/C++ with OpenCV libraries, and the IDE is Visual Studio 2015. The experimental results show that our method is more effective at improving the performance of image retrieval.
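The centroid-perturbation ranking described above can be sketched in a few lines of NumPy; the feature vectors below are random stand-ins for the real descriptors, and the function name is our own.

```python
# Sketch of query replacement ranking: the candidate that perturbs the
# representative centroid least (summed over each replacement) ranks
# highest. Lower score = more similar.
import numpy as np

def query_replacement_score(reps, candidate):
    """reps: (k, d) feature vectors of the representative images;
    candidate: (d,) feature vector of one database image."""
    c_rep = reps.mean(axis=0)
    score = 0.0
    for i in range(len(reps)):
        swapped = reps.copy()
        swapped[i] = candidate           # replace one representative
        score += np.linalg.norm(swapped.mean(axis=0) - c_rep)
    return score

rng = np.random.default_rng(42)
reps = rng.random((4, 64))               # 4 representative images
database = rng.random((100, 64))         # 100 candidate images
scores = [query_replacement_score(reps, img) for img in database]
ranking = np.argsort(scores)             # smallest distance first
print("top-5 candidates:", ranking[:5])
```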
Mobile Ad-hoc Network (MANET) is a kind of wireless network that has the most challenging network infrastructure. It is formed from mobile nodes without any centralized administration, and from the security perspective it is a self-configuring, fast-emerging wireless technology in which each node acts as a router that forwards packets. The dynamic nature of this network makes routing protocols play a prominent role in setting up efficient routes between pairs of nodes. Dynamic Source Routing (DSR) and Ad-hoc On-Demand Distance Vector (AODV) are reactive MANET routing protocols. Most of the attacks on MANETs are routing protocol attacks; attacks on routing protocols, especially internal attacks, cause damage to MANETs. Sinkhole and black hole attacks are types of internal attack which operate by attempting to draw all network traffic to malicious nodes that fake routing updates, degrading the performance of the network. Black hole nodes should be detected in the network as early as possible via a detection mechanism that also guarantees a high detection rate and a low cross-over error rate. In this paper, we study the characteristics of the black hole attack and how it affects the performance of an on-demand distance vector routing protocol such as AODV, recognizing the presence of a black hole node from packet flow information between nodes and isolating it from the network. We have evaluated the performance of the system using the widely used simulator NS2; the results prove the effectiveness of our prevention and detection method.
The advent of digital systems for the production and transmission of information decisively influences human progress and represents the future in any field of social life. In order to survive, organizations must align their objectives with the new trend of the information-based society, deeply marked by globalization. In recent years new computational paradigms have been proposed and adopted, including cloud computing. Together with the stabilization of technologies related to cloud computing, SQL databases have become more attractive due to native support for scalability and distributed architecture, and the fact that many of them can be offered as services. The paper presents a few important aspects of cloud computing and proposes a new database designed to be implemented in the cloud. We offer a new model and an example of its implementation in Romanian medicine.
Code switching is a widely observed but less studied phenomenon, especially in multilingual and multicultural communities. The present study investigated the status of grammatical code switching among Iranian EFL university students, and the role of the teacher in managing code switching was also investigated. Two classes comprising 96 participants from two different universities were observed carefully and the required data were collected. Analyses of the data revealed the varying nature of code switching in both settings. Moreover, the obtained frequencies revealed that, among the four types of 'trigger words', only 'proper nouns' and the discourse marker 'OK' were markedly responsible for code switching in one setting, while 'lexical transfer' and the discourse marker 'OK' led to code switching in the other. Meanwhile, four functions for code switching, reflecting the teacher's role in dealing with it, were determined: providing equivalents for key words, showing humor, inspiring learners, and explaining the required assignments.
Keywords: Code Switching, EFL Classroom, Interference and Interaction
Computer memory is expensive, and recording the data captured by a webcam needs memory. In order to minimize the memory used in recording human motion captured from the webcam, this algorithm uses motion detection, applied as a process to measure the change in speed or vector of an object in the field of view. The application records only when motion is detected, automatically saving the captured image to its designated folder.
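A minimal sketch of such motion-triggered capture with OpenCV frame differencing is given below; the camera index, thresholds, and file naming are assumptions.

```python
# Frames are compared by simple differencing; an image is written only
# when enough pixels change, which keeps the memory footprint small.
import cv2

cap = cv2.VideoCapture(0)                    # assumed default webcam
ok, prev = cap.read()
if not ok:
    raise SystemExit("webcam not available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
saved = 0

while saved < 10:                            # stop after 10 captures
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:         # enough changed pixels
        cv2.imwrite(f"motion_{saved:03d}.png", frame)
        saved += 1
    prev_gray = gray

cap.release()
```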
Owing to the conception of big data and massive data processing, there are increasing woes related to the temporal aspects of data processing. To address these issues, continuous progress in data collection and storage technologies and in the design and implementation of large-scale parallel algorithms for data mining is emerging at a rapid pace. In this regard, Apriori algorithms have a great impact on finding frequent item sets using candidate generation. This paper presents highlights of a parallel algorithm for mining association rules that uses MPI for message passing in a master-slave structural model.
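A toy mpi4py sketch of this master-slave structure follows: the master broadcasts candidate itemsets, each worker counts support over its partition of the transactions, and the counts are gathered back; the transactions are illustrative, not a benchmark.

```python
# Run with e.g. `mpiexec -n 4 python apriori_mpi.py`.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    transactions = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}]
    candidates = [frozenset({"a", "b"}), frozenset({"b", "c"})]
    chunks = [transactions[i::size] for i in range(size)]
else:
    candidates, chunks = None, None

candidates = comm.bcast(candidates, root=0)   # same candidates everywhere
my_txns = comm.scatter(chunks, root=0)        # each rank gets a partition

# Local support count: how many of my transactions contain each candidate.
local = {c: sum(c <= t for t in my_txns) for c in candidates}

partials = comm.gather(local, root=0)         # master sums the counts
if rank == 0:
    totals = {c: sum(p[c] for p in partials) for c in candidates}
    print("supports:", totals)
```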
Cloud computing provides different types of services, such as SaaS, PaaS, and IaaS. Each of them has its own security challenges, but IaaS faces all types of challenges, viz., network attacks, behaviour-based attacks, request-based attacks (i.e., handling requests from untrusted users), XSS (cross-site scripting), DDoS, and many more. These attacks are independent of each other, and consequently the QoS provided by the cloud is compromised. This paper proposes a history-aware behaviour-based intrusion detection system (BIDS). BIDS provides detection of untrusted users and false requests that may lead to spoofing, XSS, or DoS attacks, among many others, as well as certain cases where a user's login or password is compromised. History-aware BIDS can be helpful in detecting such attacks and maintaining the QoS provided to the user in cloud IaaS (Infrastructure as a Service).
This paper presents a review of digital image filtering techniques. The main emphasis is on median filtering and its extended versions, such as hybrid median filtering and relaxed median filtering. It is found that median filtering still demands some enhancement, as it is best suited to salt-and-pepper noise only. Through a survey, suitable gaps are identified in the existing literature. At the end, a comparison table is drawn among the existing techniques.
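For reference, the following sketch contrasts plain median filtering with a simplified hybrid (cross/diagonal) median on synthetic salt-and-pepper noise; it illustrates the techniques reviewed rather than reproducing any surveyed implementation.

```python
# Plain vs. simplified hybrid median filtering on salt-and-pepper noise,
# the case the review identifies as median filtering's strength.
import numpy as np
from scipy.ndimage import median_filter

def hybrid_median(img):
    # Median of: plus-shaped neighbourhood, X-shaped neighbourhood,
    # and the centre pixel itself.
    plus = median_filter(img, footprint=np.array([[0, 1, 0],
                                                  [1, 1, 1],
                                                  [0, 1, 0]], bool))
    cross = median_filter(img, footprint=np.array([[1, 0, 1],
                                                   [0, 1, 0],
                                                   [1, 0, 1]], bool))
    return np.median(np.stack([plus, cross, img]), axis=0)

rng = np.random.default_rng(7)
img = np.full((64, 64), 128.0)               # flat test image
noisy = img.copy()
noisy[rng.random(img.shape) < 0.05] = 255.0  # salt
noisy[rng.random(img.shape) < 0.05] = 0.0    # pepper

for name, out in [("median", median_filter(noisy, size=3)),
                  ("hybrid", hybrid_median(noisy))]:
    print(name, "MAE:", np.mean(np.abs(out - img)))
```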
Gaussian Mixture Models (GMMs) have been proposed for off-line signature verification. The individual Gaussian components are shown to represent global features such as skewness, kurtosis, etc. that characterize various aspects of a signature and are effective for modeling its specificity. The learning phase uses the GMM technique to build a reference model for each signature sample of a particular user. The verification phase uses three layers of statistical techniques: the first layer computes a GMM-based log-likelihood probability match score; the second layer maps this score into soft boundary ranges of acceptance or rejection through z-score analysis and a normalization function; and the third layer applies a threshold to arrive at the final decision of accepting or rejecting a given signature sample. The focus of this work is on faster detection of authentic signatures, as no vector analysis is done in the GMM. The experimental results show the new features to be more robust than related features used in earlier systems: the FAR (False Acceptance Rate) and FRR (False Rejection Rate) for the genuine samples are 0.15 and 0.19 respectively.
Keywords: Gaussian Mixture Model (GMM), Z-score analysis, False Acceptance Rate (FAR), False Rejection Rate (FRR)
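The three verification layers can be sketched on toy data as follows, with scikit-learn's GaussianMixture standing in for the paper's GMM; the features, z-score threshold, and data are assumptions.

```python
# Layer 1: GMM log-likelihood score; layer 2: z-score normalisation
# against the reference scores; layer 3: a hard accept/reject threshold.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
genuine = rng.normal(0.0, 1.0, (40, 6))    # toy global feature vectors
forgery = rng.normal(1.5, 1.0, (10, 6))

gmm = GaussianMixture(n_components=2, random_state=0).fit(genuine)

ref_scores = gmm.score_samples(genuine)    # layer 1 on references
mu, sigma = ref_scores.mean(), ref_scores.std()

def verify(sample, z_threshold=-2.0):
    score = gmm.score_samples(sample[None, :])[0]   # layer 1
    z = (score - mu) / sigma                        # layer 2: z-score
    return z >= z_threshold                         # layer 3: threshold

accepts = sum(verify(s) for s in forgery)
print(f"forgeries accepted: {accepts}/{len(forgery)}")
```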
The Campus Local Area Network (CLAN) of an academic institution interconnects computers ranging in number from one hundred to about twenty-five hundred, located in academic buildings, hostel buildings, faculty quarters, the students' amenities centre, etc., all around the campus. The students, faculty, and supporting staff members use the network primarily for internet access, at both personal and professional levels, and secondarily for the available services and resources. Various web-based services, viz. web services, mail services, DNS, and FTP services, are generally made available on the campus LAN, along with various intranet-based services for LAN users. Campus LAN users in the hostels change very frequently and sometimes become targets (we call them soft targets) of attackers or zombies, because of either inadequate knowledge of how to protect their own computers or laptops, which are also legitimate nodes of the campus LAN, or their enthusiasm for experimentation. The interconnectivity of these legitimate nodes of the campus LAN with attackers on the World Wide Web makes the computers connected to the LAN (nodes) an easy target for malicious users who attempt to exhaust resources by launching Distributed Denial-of-Service (DDoS) attacks. In this paper we present a technique to mitigate distributed denial of service attacks in a campus-wide LAN by limiting the bandwidth of the affected computers (soft targets) of the virtual LAN from a unified threat management (UTM) firewall. The technique is supported by bandwidth utilization reports of the campus LAN, with and without implementation of the bandwidth-limiting rule, obtained from the UTM network traffic analyzer. The graphical analyzer report on bandwidth utilization, with transmitted and received bits of the campus LAN after implementation of our bandwidth-limiting rule, is also given.
In this paper we propose a GUI-based prototype for user-centered environments such as a classroom, library hall, laboratory, meeting hall, coffee shop, kitchen, living room, or bedroom, which recommends useful services based on the user's context. Service recommendation is mainly based on parameters such as user, location, time, day, and mood. In addition, whenever a conflict arises among different users, it is resolved using conflict-resolving algorithms. The motivation behind the proposed work is to improve the user satisfaction level and the social relationship between users and devices. The prototype contains simulated sensors which capture raw context information; this information is then described with a meaningful English sentence, and services are recommended based on the user's situation. The proposed conflict-resolving algorithms are a rule-based algorithm, a Bayesian probability based algorithm, and a rough set theory based algorithm, and the number of conflicts resolved by each of these algorithms is also analyzed at the end.
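As a toy illustration of the rule-based resolver (the simplest of the three), consider the sketch below; the rule table, contexts, and service names are invented for illustration and do not come from the prototype.

```python
# When two users in the same room request conflicting services, a
# priority rule decides; otherwise fall back to first-come-first-served.
RULES = {
    # (context, service_a, service_b) -> winning service
    ("classroom", "play_music", "start_lecture"): "start_lecture",
    ("living_room", "watch_tv", "dim_lights"): "dim_lights",
}

def resolve(context, request_a, request_b):
    key = (context, request_a, request_b)
    if key in RULES:
        return RULES[key]
    return request_a  # no rule applies: first request wins

print(resolve("classroom", "play_music", "start_lecture"))  # start_lecture
print(resolve("kitchen", "brew_coffee", "preheat_oven"))    # brew_coffee
```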
Designing heterogeneous distributed systems requires the use of tools that facilitate deployment and interaction between platforms. In this paper we propose using the Simple Object Access Protocol (SOAP) and REpresentational State Transfer (REST), two main approaches for creating applications based on distributed services, for distributed computation. Our aim is to demonstrate how they can be used to develop evolutionary computation systems on heterogeneous platforms, taking advantage of their ability to deal with heterogeneous infrastructures and environments, and giving support for parallel implementations with high platform flexibility. The two approaches differ and present some advantages and disadvantages for interfacing to web services: SOAP is conceptually more difficult (it has a steeper learning curve) and more "heavyweight" than REST, which in turn lacks standards support for security. The results obtained in different experiments show that both SOAP and REST can be used as communication protocols for distributed evolutionary computation. The results obtained are comparable; however, for large amounts of data (big messages), REST communications take longer than SOAP communications.
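To give a flavour of the REST side of this comparison, the following hedged sketch farms out fitness evaluations over HTTP with the `requests` library; the worker URLs and JSON schema are invented for illustration, and a SOAP client would wrap an equivalent call in a WSDL-described envelope instead.

```python
# A client distributing fitness evaluations to REST workers.
import requests

WORKERS = ["http://worker1:8000/evaluate",   # hypothetical endpoints
           "http://worker2:8000/evaluate"]

def distributed_fitness(population):
    fitnesses = []
    for i, individual in enumerate(population):
        url = WORKERS[i % len(WORKERS)]      # simple round-robin
        resp = requests.post(url, json={"genome": individual}, timeout=10)
        resp.raise_for_status()
        fitnesses.append(resp.json()["fitness"])
    return fitnesses

# Example (requires the worker services to be running):
# print(distributed_fitness([[0, 1, 1], [1, 0, 1]]))
```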
Reliable operation of power transformers is necessary for effective transmission and distribution of power. During normal operation of a power transformer, distinct types of faults occur due to insulation failure, oil aging products, overheating of windings, etc., affecting the continuity of the power supply and leading to serious economic losses. To avoid interruptions in the power supply, various software fault diagnosis approaches have been developed to detect faults in the power transformer and eliminate their impacts. SVM and SVM-SMO are the software fault diagnosis techniques developed in this paper for continuous monitoring and analysis of faults in the power transformer. The SVM algorithm is fast, conceptually simple, and easy to implement, with better scaling properties for few training samples; its performance on large training samples is complex, subtle, and difficult to realize. In order to obtain better fault diagnosis on large training data, the SVM is optimized with the SMO technique to achieve high interpretation accuracy in fault analysis of the power transformer. The proposed methods use a Dissolved Gas-in-oil Analysis (DGA) data set obtained from 500 kV main transformers of the Pingguo Substation of the South China Electric Power Company. DGA is an important tool for the diagnosis and detection of incipient faults in power transformers; the gas chromatograph (GC), one of the traditional methods of DGA, is used to choose the most appropriate gas signatures dissolved in transformer oil to detect the types of faults in the transformer. The simulations are carried out in MATLAB on a PC with an Intel Core i3 processor at 3 GHz and 2 GB of RAM. The results obtained by the optimized SVM and SVM-SMO are compared with existing SVM classification techniques. The test results indicate that the SVM-SMO approach significantly improves the classification accuracy and computational time for power transformer fault classification.
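A hedged sketch of an SVM fault classifier on DGA-style inputs is given below; scikit-learn's SVC (whose solver is SMO-based) stands in for the paper's SVM-SMO, and the gas features, fault labels, and data are synthetic, not the Pingguo Substation set.

```python
# SVM fault classification on synthetic dissolved-gas features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
# Five dissolved-gas features (e.g. H2, CH4, C2H6, C2H4, C2H2) and
# three assumed fault classes: normal, thermal, discharge.
X = np.vstack([rng.normal(m, 0.5, (60, 5)) for m in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], 60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)
scaler = StandardScaler().fit(X_tr)               # scale gas readings
clf = SVC(kernel="rbf", C=10.0).fit(scaler.transform(X_tr), y_tr)
print("test accuracy:", clf.score(scaler.transform(X_te), y_te))
```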