SHAIK SUBHANI - Academia.edu
Papers by SHAIK SUBHANI
International Journal of Mathematical Sciences and Computing
These days there are many reasons for job resignations, so different kinds of measurements must be taken to predict attrition. Employees have various reasons for not performing well and for feeling pressure, and many end their service suddenly without explanation. Machine learning techniques have grown in popularity among researchers in recent years and can offer answers to a broad range of problems. With the help of machine learning, predictions about staff attrition can be produced. For our machine learning model we use a real-time TCS employee attrition dataset for training. The aim of this study is to present a comparison of different machine learning algorithms for predicting which employees are likely to leave their organization. We propose two methods to split the dataset into training and test sets.
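The abstract's closing claim, two ways of splitting a dataset into training and test sets, can be sketched in plain Python. This is an illustration only, not the paper's code; the placeholder records stand in for the actual TCS attrition rows.

```python
import random

def holdout_split(records, test_ratio=0.2, seed=42):
    """Method 1: a single shuffled hold-out split."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

def kfold_splits(records, k=5):
    """Method 2: k-fold splitting; each fold serves once as the test set."""
    folds = [records[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [r for j, f in enumerate(folds) if j != i for r in f]
        yield train, test

# Placeholder records standing in for employee rows (age, tenure, left?).
data = [(25 + i, i % 10, i % 3 == 0) for i in range(100)]
train, test = holdout_split(data)
print(len(train), len(test))  # 80 20
```

The hold-out split gives one estimate of generalization; k-fold splitting gives k estimates at the cost of k training runs, which is how such algorithm comparisons are usually averaged.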
Pattern recognition of speech signals is the ability to translate a spoken word into text format. This paper presents a speech recognition system based on curvelet transform and artificial neural network techniques to enhance the recognition rate. The research comprises two distinct phases: a feature extractor and a recognizer. In the feature extraction phase, the curvelet transform extracts features from the input speech signal, and the detail components of these signals help achieve a higher recognition rate. For feature matching, artificial neural networks are used as classifiers. Performance is evaluated in terms of accurate recognition rate, interfering sounds, hit rates, false alarms and miss rates. The rate of accurate classification was about 95.3% for the sample speech signals.
International Journal of Innovative Technology and Exploring Engineering, 2020
Brain-Computer Interface (BCI) technology provides a direct exchange of information between the human brain and external devices. In BCI design, electroencephalography (EEG) is the principal means of recording brain activity. Our proposed method extracts and analyzes the characteristics of the EEG signal so that signals prepared for BCI can discriminate and convey human emotions. The method covers EEG information retrieval, feature extraction and classification. These signals have distinct frequency bands: delta, theta, alpha and beta waves. The combination of the curvelet transform (CT) and principal component analysis (PCA) performs dimensionality reduction and optimal feature extraction, and an artificial neural network (ANN) carries out the classification of the EEG signals. This paper also provides a comparison between the two proposed tools, PCA and CT, ...
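The dimensionality-reduction step described above can be sketched with NumPy; this is generic PCA via SVD, not the paper's CT+PCA pipeline, and the random matrix merely stands in for real EEG feature vectors.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project the rows of X onto the top principal components.

    X: (n_samples, n_features) matrix of feature vectors.
    Returns the (n_samples, n_components) reduced representation.
    """
    Xc = X - X.mean(axis=0)                     # center each feature
    # SVD of centered data: rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                  # placeholder EEG feature vectors
Z = pca_reduce(X, 8)
print(Z.shape)  # (200, 8)
```

The reduced matrix Z would then be fed to the ANN classifier; components are ordered so the first column retains the most variance.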
Online social networks support user relationships and increase information and resource sharing among a large number of users. Communication through social networks has overtaken email, which poses enormous security challenges on this platform. Numerous privacy leakages arise when unauthorized users can access information anywhere and at any time. This paper presents the Usage Decision Online Social Network (UDOSN) model to protect private data from illegal access before and during usage in online social networks. The authentication function controls access requests before usage, and right requests during usage are handled by the authorization function. Presented scenarios show the process of both functions. Moreover, the UDOSN model is reusable and environment-independent, and it is defined for social networks with formal descriptions.
Many real-time databases contain uncertain data. To overcome the uncertainty, the data must be integrated, which can be accomplished using probabilistic approaches. Uncertain data is common in many real-world applications and can be handled with many statistical and soft-computing techniques. In this paper we introduce a probabilistic technique for calculating conditional probabilities: a naive Bayesian classifier for handling uncertain data using the log-normal distribution. Our main objective is to determine the uncertainty of multiple attributes using the proposed approach (UNBC). The results show that the proposed method classifies the uncertain data with high accuracy.
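The core of a naive Bayes classifier with log-normal likelihoods can be sketched as follows. This is a minimal illustration of the distributional idea, not the UNBC algorithm itself; the training values are invented toy data.

```python
import math

def lognormal_pdf(x, mu, sigma):
    """Density of the log-normal distribution with log-mean mu, log-std sigma."""
    if x <= 0:
        return 0.0
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) \
        / (x * sigma * math.sqrt(2 * math.pi))

def fit_lognormal(values):
    """Estimate mu and sigma from the logs of positive training values."""
    logs = [math.log(v) for v in values]
    mu = sum(logs) / len(logs)
    var = sum((l - mu) ** 2 for l in logs) / len(logs)
    return mu, math.sqrt(var)

def classify(x, params, priors):
    """Pick the class maximising prior * log-normal likelihood of x."""
    return max(priors, key=lambda c: priors[c] * lognormal_pdf(x, *params[c]))

# Toy classes with small vs large positive attribute values.
params = {"a": fit_lognormal([1.0, 1.2, 0.9, 1.1]),
          "b": fit_lognormal([10.0, 12.0, 9.0, 11.0])}
priors = {"a": 0.5, "b": 0.5}
print(classify(1.05, params, priors), classify(10.5, params, priors))  # a b
```

For several attributes, the per-attribute likelihoods would be multiplied together under the usual naive independence assumption.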
IJCER, Jan 2, 2015
Data hiding is a technique that conceals data in a carrier for conveying secret messages confidentially. This paper proposes a new data-hiding method based on pixel pair matching (PPM). The basic idea of PPM is to use the values of a pixel pair as a reference coordinate and to search a coordinate in the neighbourhood set of this pixel pair according to a given message digit; the pixel pair is then replaced by the searched coordinate to conceal the digit. Exploiting modification direction (EMD) and diamond encoding (DE) are two recently proposed data-hiding methods based on PPM. The maximum capacity of EMD is 1.161 bpp, and DE extends the payload of EMD by embedding digits in a larger notational system. The proposed method offers lower distortion than DE by providing more compact neighbourhood sets and allowing embedded digits in any notational system. Compared with the optimal pixel adjustment process (OPAP) method, the proposed method always has lower distortion for various payloads. Experimental results reveal that the proposed method not only performs better than OPAP and DE but is also secure against detection by some well-known steganalysis techniques.
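The EMD scheme mentioned above (a pixel pair carries one base-5 digit, changing at most one pixel by plus or minus one) is easy to sketch. This is the classic EMD extraction function f(x, y) = (x + 2y) mod 5, shown for context; it is not the paper's proposed PPM method.

```python
def emd_extract(x, y):
    """EMD extraction function: one base-5 digit per pixel pair."""
    return (x + 2 * y) % 5

def emd_embed(x, y, digit):
    """Embed a base-5 digit by changing at most one pixel value by +/-1."""
    diff = (digit - emd_extract(x, y)) % 5
    if diff == 0:
        return x, y          # digit already encoded
    if diff == 1:
        return x + 1, y      # f increases by 1 when x increases
    if diff == 2:
        return x, y + 1      # f increases by 2 when y increases
    if diff == 3:
        return x, y - 1      # f decreases by 2 when y decreases
    return x - 1, y          # diff == 4: f decreases by 1

pair = emd_embed(100, 100, 3)
print(pair, emd_extract(*pair))  # (100, 99) 3
```

One base-5 digit per two pixels is log2(5)/2, which is where the 1.161 bpp capacity figure quoted in the abstract comes from.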
ICT and Critical Infrastructure: Proceedings of the 48th Annual Convention of Computer Society of India- Vol II, 2014
Identifying abnormal behavior in the chosen dataset is essential for improving the quality of the given dataset and decreasing the impact of abnormal values/patterns in the knowledge discovery process. Outlier detection can be established in many data mining techniques. In this paper, regression analysis has been used to detect the outliers, with the Partial Least Squares approach mainly used in the regression analysis. A laser dataset has been used to find the outliers. The main objective is constructing predictive models. The Mahalanobis distance, jackknife distance and T2 distance were calculated to find the outliers.
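The Mahalanobis distance mentioned above can be computed with NumPy as shown below; the random two-column matrix with one injected outlier is a made-up stand-in for the laser dataset.

```python
import numpy as np

def mahalanobis_distances(X):
    """Mahalanobis distance of every row of X from the sample mean."""
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    inv_cov = np.linalg.inv(cov)
    diff = X - mean
    # d_i = sqrt( (x_i - mean)^T  S^-1  (x_i - mean) )
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, inv_cov, diff))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
X[-1] = [8.0, -8.0]            # one injected abnormal observation
d = mahalanobis_distances(X)
print(d.argmax())              # index of the injected outlier
```

Rows whose distance exceeds a chosen cutoff (often a chi-square quantile for the number of variables) are flagged as outliers before the predictive model is fitted.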
International Journal of Advanced Research in Computer Science, Apr 30, 2017
Searching is widely used in many applications. Elasticsearch operates on documents and text within local or global databases. Elasticsearch is very fast compared with normal searching processes because it searches the index directly rather than the text. Searching huge volumes of data makes it harder to obtain accurate results. Elasticsearch is developed in Java using the Lucene library; it is open source and highly efficient as a full-text search and analytics engine. In this paper, the Improved Elastic Search (IES) is introduced, which integrates efficient duplicate detection implemented with ensemble big-data algorithms. Detecting and removing duplicate data improves the performance of Elasticsearch. For improved accuracy, big-data algorithms with Hadoop are adopted. The dataset used for the experimental results is synthetic document-related data.
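The reason index-based search beats scanning the text, the point the abstract makes about Elasticsearch, can be shown with a toy inverted index. This illustrates the general idea only; it is not the IES algorithm, and the documents are invented.

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """AND-query: ids of documents containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = set(index.get(terms[0], set()))
    for term in terms[1:]:
        result &= index.get(term, set())   # intersect posting lists
    return result

docs = {
    1: "elastic search is fast",
    2: "search the index not the text",
    3: "duplicate detection improves search",
}
index = build_index(docs)
print(sorted(search(index, "search index")))  # [2]
```

A query touches only the posting lists of its terms, so its cost scales with the number of matching documents rather than the total text size.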
International Journal of Scientific Research in Computer Science, Engineering and Information Technology, 2019
Intrusion detection is the process of monitoring the events occurring in a computer system and analyzing them for signs of possible incidents, which are violations or imminent threats of violation of computer security policies, acceptable-use policies, or standard security practices. An intrusion detection system (IDS) is software that automates the intrusion detection process. An intrusion prevention system (IPS) is software that has all the capabilities of an intrusion detection system and can additionally attempt to stop possible incidents. IDS and IPS technologies offer a large number of comparable capabilities, and administrators can usually disable the prevention features in IPS products, making them function as IDSs. Accordingly, for brevity, the term intrusion detection and prevention system is used throughout the remainder of this section to refer to both IDS and IPS ...
International Journal of Scientific Research in Computer Science, Engineering and Information Technology, 2019
Vehicle Number Identification using Raspberry Pi 3 is an image-conversion technology that captures the license plate of a vehicle. The main aim is to build an effective and accurate license number-plate identification system. The system operates in areas with traffic signals, where a camera mounted at the signal is connected to the Raspberry Pi and sends signals to a server; it can also be used in apartments or residences to capture the numbers of all vehicles entering the building. The system first detects the vehicle license plate and then captures it, converting the image into text. The text of the license plate is displayed on the screen using the image conversion. OpenCV and OCR are the two software packages used for image capture and for conversion into text format, respectively. The resulting data is displayed on the screen and saved into a folder. The whole system is developed on the Raspberry Pi desktop and ...
In this paper we survey, consolidate, and present the state of the art in database systems, and we try to identify the most important research opportunities in databases. The heart of our analysis is a survey of basic to advanced concepts and their relationships, concluding with how better performance can be expected by integrating those aspects. In addition, we describe the advantages of parallelism and distribution, including the data-warehousing approach. The main aim of the paper is to show how the basic aspects of database systems lead to advanced mechanisms such as parallelism, distribution, and the handling of metadata-based architectures.
Most real databases contain data whose correctness is uncertain. In order to work with such data, there is a need to quantify the integrity of the data; this is achieved using probabilistic databases. Uncertain data is common in real-world applications, and the uncertainty can be controlled very prudently. In this paper, we apply probabilistic models to uncertain data and develop a novel method to calculate conditional probabilities for uncertain numerical attributes. Based on that, we propose a Naive Bayesian classifier algorithm for uncertain data (NBCU) using the exponential distribution. The ultimate aim is to determine the uncertainty of multiple attributes using our proposed approach (NBCU). The experimental results show that the proposed method classifies uncertain data with potentially higher accuracy.
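The exponential-distribution variant of the likelihood can be sketched in the same spirit. Again this is only an illustration of the distributional assumption, not the NBCU algorithm; the inter-arrival-style training values are invented.

```python
import math

def exponential_pdf(x, lam):
    """Density of the exponential distribution with rate lam."""
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def fit_rate(values):
    """Maximum-likelihood rate: the reciprocal of the sample mean."""
    return len(values) / sum(values)

# Toy classes: attributes with short vs long positive values.
params = {"short": fit_rate([0.5, 0.4, 0.6]), "long": fit_rate([5.0, 4.0, 6.0])}
priors = {"short": 0.5, "long": 0.5}

def classify(x):
    """Pick the class maximising prior * exponential likelihood of x."""
    return max(priors, key=lambda c: priors[c] * exponential_pdf(x, params[c]))

print(classify(0.5), classify(5.0))  # short long
```

Swapping `exponential_pdf` for the log-normal density recovers the UNBC variant described earlier in this list, which is exactly the point of comparing the two distributions.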
An EEG-based brain-computer interface is a computer-based system that provides effective communication and control channels between the human brain and a computer to carry out a desired action. However, classifying single-trial EEG signals and controlling a device continuously during motor imagery is a difficult task. In this paper, we propose a feature extraction method for single-trial online motor imagery using the curvelet transform. The curvelet coefficients were used to extract characteristics from the motor imagery EEG and to classify the pattern of left- and right-hand movement imagery by Bayesian analysis with a Gaussian model. The performance of motor imagery was tested on the eye dataset from BCI Competition 2003. The results show a highest classification accuracy of 96% and a superior information transfer rate.
International Research Journal of Engineering and Technology, 2021
In the banking system, bank marketing has become an essential survival weapon and is fundamentally dynamic in the real world. The rise of bank marketing has renewed business relationships, and the most successful banks are those that truly strengthen their relationships with customers. Knowledge modernization and ferocious rivalry among current banks have transformed a large collection of banking services. Technology is altering the relationship between banks and their internal and external customers. The dataset gives clear targets for marketing depending upon the customers' ages, salaries, the duration of calls, etc. The objective of this paper is to assess the value of data in defining marketing strategies and marketing management. The technological advances of recent years offer many opportunities to marketing practitioners and researchers.
International Journal of Engineering Research and Technology, 2018
Electrocardiogram (ECG or EKG) is an analytical tool used to assess the electrical activity of the heart over a period of time using electrodes placed on the skin. This paper deals with the analysis of ECG signals using a powerful technique called the curvelet transform. The proposed work is carried out in two steps: in the first phase, an attempt was made to generate ECG curve forms using a MATLAB simulator, and in the second phase, the ECG signal was de-noised by removing the corresponding curvelet coefficients at higher scales using the curvelet transform. A Wiener filter is used to detect the positions of the heartbeats. Parameters such as the mean difference between heartbeats and the heart rate are computed for the ECG signal. Finally, the statistical results show that our method significantly improves accuracy and error reduction and has the potential to become an effective method for ECG signal analysis.
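The de-noising step described above (transform, discard small coefficients, invert) can be illustrated with NumPy. A plain FFT stands in for the curvelet transform here, since the thresholding idea is the same; the sine wave plus noise is a made-up stand-in for an ECG trace.

```python
import numpy as np

def threshold_denoise(signal, keep_ratio=0.05):
    """Zero all but the largest transform coefficients, then invert."""
    coeffs = np.fft.fft(signal)
    cutoff = np.quantile(np.abs(coeffs), 1 - keep_ratio)
    coeffs[np.abs(coeffs) < cutoff] = 0      # drop small (noise) coefficients
    return np.fft.ifft(coeffs).real

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)            # stand-in for a clean waveform
noisy = clean + 0.3 * rng.normal(size=t.size)
denoised = threshold_denoise(noisy)
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

The structured signal concentrates its energy in a few large coefficients while white noise spreads thinly across all of them, which is why keeping only the largest coefficients reduces the error against the clean signal.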
Frequent Pattern Mining (FPM) is a focused research area whose goal is identifying the patterns that appear most frequently in a dataset. Due to the huge increase in data volume and the large search space, it is necessary to study parallel computing algorithms for mining frequent patterns. In the last two decades, many sequential algorithms have been implemented for solving the FPM problem, yet no efficient algorithm has been built for today's large data volumes, known as Big Data. In this paper, we present a scalable parallel algorithm for frequent pattern mining on big data. Three key challenges are identified in the parallel algorithmic design: load balancing, work partitioning and memory scalability. Experiments are carried out using datasets such as chess, census, mushroom, Kosarak, pumsb and connect, and a comparison is made with existing parallel approaches. The experimental results show scalable performance and significant gains over different machines.
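The FPM task itself, counting itemsets that reach a minimum support, can be shown with a brute-force sketch; this is an illustration of the problem, not the paper's scalable parallel algorithm, and the transactions are invented.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support, max_size=2):
    """Count itemsets of size 1..max_size occurring in >= min_support transactions."""
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for size in range(1, max_size + 1):
            for combo in combinations(items, size):
                counts[combo] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

transactions = [
    {"a", "b", "c"},
    {"a", "b"},
    {"a", "c"},
    {"b", "c"},
]
print(frequent_itemsets(transactions, min_support=3))
```

Enumerating every candidate like this explodes combinatorially, which is precisely why Apriori-style pruning and, at Big Data scale, the load balancing and work partitioning discussed in the abstract become necessary.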
Localization and weak-signal detection are challenging tasks in radar systems. Radar performance can be enhanced by raising the receiver output signal-to-noise ratio (SNR). In this paper, a novel approach based on the curvelet transform for detecting weak radar signals is proposed. We apply the curvelet transform and modulus maxima to detect the locations of the laser radar echo signal; localizing the received signal plays a key role during the detection of a signal in noise. Parameters such as the SNR, RMSE and curvelet coefficients are evaluated, and the simulation on real radar signals is verified in MATLAB. The statistical results show that the proposed method effectively improves the signal-to-noise ratio of the weak signal. Keywords: curvelet, weak signal, radar system, signal-to-noise ratio (SNR), MATLAB.
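The SNR figure the abstract evaluates is simply the power ratio on a log scale; a minimal sketch, with placeholder power values, makes the decibel arithmetic concrete.

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels: 10 * log10(Ps / Pn)."""
    return 10 * math.log10(signal_power / noise_power)

# A method that halves the noise power gains about 3 dB of SNR.
before = snr_db(1.0, 0.10)
after = snr_db(1.0, 0.05)
print(round(before, 1), round(after, 1))  # 10.0 13.0
```

Reporting the before/after difference in dB is the standard way such denoising improvements are quantified.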