Gonzalo Pajares - Academia.edu
Papers by Gonzalo Pajares
One important objective in Precision Agriculture is to minimize the volume of herbicides applied to the fields through the use of site-specific weed management systems. In order to reach this goal, two major factors need to be considered: 1) the similar spectral signature, shape and texture of weeds and crops; 2) the irregular distribution of the weeds within the crop field. This paper outlines an automatic computer vision system for the detection and differential spraying of Avena sterilis, a noxious weed growing in cereal crops. The proposed system involves two processes: image segmentation and decision making. Image segmentation combines suitable basic image processing techniques to extract cells from the image as the low-level units. Each cell is described by two area-based attributes measuring the relations between the crops and the weeds. From these attributes, a hybrid decision-making approach determines whether a cell must be sprayed or not. The hybrid ...
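The cell-based spray decision described above can be sketched roughly as follows; the grid size, the two coverage attributes, and the simple ratio threshold are illustrative assumptions, not the paper's actual hybrid decision maker:

```python
import numpy as np

def cell_attributes(weed_mask, crop_mask, grid=(4, 4)):
    """Split binary weed/crop masks into grid cells and compute, per cell,
    two area-based attributes: weed coverage and crop coverage
    (fractions of the cell area). The grid size is an assumption."""
    h, w = weed_mask.shape
    gh, gw = h // grid[0], w // grid[1]
    cells = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            cw = weed_mask[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            cc = crop_mask[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            cells.append((float(cw.mean()), float(cc.mean())))
    return cells

def spray_decision(weed_cov, crop_cov, threshold=0.1):
    """Toy stand-in for the hybrid decision maker: spray a cell when weed
    coverage relative to total vegetation exceeds a threshold (assumed rule)."""
    veg = weed_cov + crop_cov
    return veg > 0 and weed_cov / veg > threshold

# synthetic 40x40 masks: weeds in the top-left cell, crops in the lower rows
weed_mask = np.zeros((40, 40)); weed_mask[:10, :10] = 1
crop_mask = np.zeros((40, 40)); crop_mask[10:, :] = 1
cells = cell_attributes(weed_mask, crop_mask)
to_spray = [k for k, (wc, cc) in enumerate(cells) if spray_decision(wc, cc)]
```

Only the weed-infested cell is flagged for spraying, which is the point of site-specific treatment: herbicide volume scales with infested area rather than field area.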
Expert Systems with Applications, 2015
Image segmentation plays an important role in image processing and computer vision. It is often used to classify an image into separate regions, which ideally correspond to different real-world objects. Several segmentation methods have been proposed in the literature, with thresholding techniques being the most popular. In such techniques, a set of threshold values is selected to optimize a given functional criterion, so that each pixel is assigned to a class according to its corresponding threshold points. One interesting functional criterion is the Tsallis entropy, which gives excellent results in bi-level thresholding. However, when it is applied to multilevel thresholding, its evaluation becomes computationally expensive, since each threshold point adds restrictions, multimodality and complexity to its functional formulation. Therefore, in the process of finding the appropriate threshold values, it is desirable to limit the number of evaluations of the objective function (the Tsallis entropy). Under such circumstances, most optimization algorithms do not seem suited to such problems, as they usually require many evaluations before delivering an acceptable result. On the other hand, the Electromagnetism-Like algorithm is an evolutionary optimization approach that emulates the attraction-repulsion mechanism among charges to evolve the individuals of a population. This technique exhibits interesting search capabilities while maintaining a low number of function evaluations. In this paper, a new algorithm for multilevel segmentation based on the Electromagnetism-Like algorithm is proposed. In the approach, the optimization algorithm based on electromagnetism theory is used to find the optimal threshold values by maximizing the Tsallis entropy.
Experimental results over several images demonstrate that the proposed approach improves convergence velocity compared with similar methods such as Cuckoo Search and Particle Swarm Optimization.
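The bi-level case of the Tsallis criterion can be sketched as follows. This is a minimal exhaustive search for illustration only; the paper's contribution is precisely to replace such exhaustive evaluation with the Electromagnetism-Like algorithm in the multilevel case. The toy histogram and the entropic index q = 0.8 are assumptions:

```python
import numpy as np

def tsallis_entropy(p, q=0.8):
    """Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1) of a distribution p."""
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def bilevel_tsallis(hist, q=0.8):
    """Exhaustively find the threshold t maximizing the bi-level criterion
    S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B), where A and B are the
    normalized class distributions below and above t."""
    p = hist / hist.sum()
    best_t, best_val = 0, -np.inf
    for t in range(1, len(p)):
        wa, wb = p[:t].sum(), p[t:].sum()
        if wa == 0 or wb == 0:
            continue
        sa = tsallis_entropy(p[:t] / wa, q)
        sb = tsallis_entropy(p[t:] / wb, q)
        val = sa + sb + (1 - q) * sa * sb
        if val > best_val:
            best_t, best_val = t, val
    return best_t

# bimodal toy histogram: one mode around bin 40, another around bin 200
hist = np.zeros(256)
hist[30:50] = 10
hist[190:210] = 10
t = bilevel_tsallis(hist)   # lands in the gap between the two modes
```

Each added threshold multiplies the number of candidate combinations, which is why the evolutionary search pays off at the multilevel stage.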
The Scientific World Journal, 2014
Computer-based sensors and actuators such as global positioning systems, machine vision, and laser-based sensors have progressively been incorporated into mobile robots with the aim of configuring autonomous systems capable of shifting operator activities in agricultural tasks. However, the incorporation of many electronic systems into a robot impairs its reliability and increases its cost. Hardware minimization, as well as software minimization and ease of integration, is essential to obtain feasible robotic systems. A step forward in the application of automatic equipment in agriculture is the use of fleets of robots, in which a number of specialized robots collaborate to accomplish one or several agricultural tasks. This paper strives to develop a system architecture for both individual robots and robots working in fleets to improve reliability, decrease complexity and costs, and permit the integration of software from different developers. Several solutions are studied, from a f...
Pattern Recognition, 1998
This paper outlines a relaxation approach using the Hopfield neural network for solving the global stereovision matching problem. The primitives used are edge segments. The similarity, smoothness and uniqueness constraints are transformed into the form of an ...
Pattern Recognition Letters, 1995
Classical stereo matching techniques use features representing objects in both images and compute the minimum difference of attribute values. No knowledge of the environment is taken into account. This paper proposes an image understanding stereo matching method using a supervised network: the perceptron.
Applied Intelligence, 2014
Template matching (TM) plays an important role in several image-processing applications such as feature tracking, object recognition, stereo matching, and remote sensing. The TM approach seeks the best possible resemblance between a subimage, known as the template, and its coincident region within a source image. TM involves two critical aspects: similarity measurement and search strategy. The simplest available TM method aims for the best possible coincidence between the images through an exhaustive computation of the normalized cross-correlation (NCC) values (similarity measurement) for all elements of the source image (search strategy). Recently, several TM algorithms based on evolutionary approaches have been proposed to reduce the number of NCC operations by calculating only a
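The exhaustive NCC baseline described above can be sketched as a minimal example; the image and template here are synthetic, and real implementations use optimized correlation routines rather than this brute-force double loop:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between a patch and a template
    of equal size (the similarity measurement)."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def exhaustive_match(image, template):
    """Exhaustive search strategy: evaluate the NCC at every position
    where the template fits inside the image."""
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = ncc(image[y:y + th, x:x + tw], template)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

rng = np.random.default_rng(0)
image = rng.random((30, 30))
template = image[10:15, 12:17].copy()   # template cut from a known location
pos, score = exhaustive_match(image, template)
```

Every grid position costs one NCC evaluation, which is exactly the operation count the evolutionary variants mentioned above try to reduce.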
Deforestation Around the World, 2012
Pattern Recognition, 2004
The objective of image fusion is to combine information from multiple images of the same scene. The result of image fusion is a new image which is more suitable for human and machine perception or further image-processing tasks such as segmentation, feature extraction ...
Neural Networks, 2010
In this paper we propose a new method for combining simple classifiers through the analogue Hopfield Neural Network (HNN) optimization paradigm for classifying natural textures in images. The base classifiers are the fuzzy clustering (FC) and the parametric Bayesian estimator (BP). An initial unsupervised training phase determines the number of clusters and estimates the parameters for both FC and BP. Then a decision phase is carried out, in which we build as many Hopfield Neural Networks as the available number of clusters. The number of nodes in each network is the number of pixels in the image to be classified. Each node in each network is initially loaded with a state value, which is the membership degree (provided by FC) with which the node (pixel) belongs to the cluster associated with the network. Each state is then iteratively updated during the HNN optimization process, taking into account the previous states and two types of external influences exerted by other nodes in its neighborhood. The external influences are mapped as consistencies. One is embedded in an energy term which considers the states of the node to be updated and the states of its neighbors. The other is mapped as the interconnection weights between the nodes. From BP, we obtain the probabilities that the nodes (pixels) belong to a cluster (network). We define these weights as a relation between states and probabilities of the nodes in the neighborhood of the node being updated. This classifier combination constitutes the main finding of this paper. The proposed combined strategy based on the HNN outperforms the simple classifiers and also classical combination strategies.
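A much-simplified sketch of the idea for a single network: states are initialized from FC membership degrees and iteratively relaxed under neighbor influence weighted by BP probabilities. The 4-neighborhood averaging and the linear update rule are illustrative assumptions, not the paper's actual energy formulation:

```python
import numpy as np

def hnn_relax(membership, prob, iters=20, lr=0.1):
    """Toy relaxation for one network: each pixel's state starts at its FC
    membership degree and is pulled toward the average state of its
    4-neighborhood, with neighbor states weighted by the BP probabilities.
    (Simplified sketch, not the paper's energy function.)"""
    s = membership.astype(float).copy()
    for _ in range(iters):
        pad = np.pad(s * prob, 1, mode='edge')
        neigh = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                 pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
        s = np.clip(s + lr * (neigh - s), 0.0, 1.0)
    return s

# a noisy low-membership pixel inside a high-membership region
membership = np.full((12, 12), 0.1)
membership[3:9, 3:9] = 0.9
membership[5, 5] = 0.1
prob = np.ones((12, 12))          # uniform BP probabilities for simplicity
relaxed = hnn_relax(membership, prob)
```

After relaxation the isolated noisy pixel is pulled up toward its neighbors' states, while pixels far from the region stay low, which is the neighborhood-consistency effect the combination exploits.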
IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), 2002
IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), 1999
This paper presents an approach to the local stereo matching problem using edge segments as features with several attributes. We have verified that the differences in attributes for the true matches cluster in a cloud around a center. The correspondence is established on the basis of the minimum distance criterion, computing the Mahalanobis distance between the difference of the attributes for a current pair of features and the cluster center (similarity constraint). We introduce a learning strategy based on Hebbian learning to obtain the best cluster center. A comparative analysis of methods without learning and with other learning strategies is presented.
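A minimal sketch of the similarity constraint, assuming hypothetical three-dimensional attribute differences (e.g. length, orientation, contrast) and a center estimated from a handful of known true matches (the paper instead learns the center via Hebbian learning):

```python
import numpy as np

def mahalanobis(diff, center, cov):
    """Mahalanobis distance between an attribute-difference vector
    and the cluster center of the true matches."""
    d = diff - center
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# hypothetical attribute differences for a few known true matches
true_diffs = np.array([[0.1, 0.2, 0.0],
                       [0.0, 0.1, 0.1],
                       [0.2, 0.0, 0.1]])
center = true_diffs.mean(axis=0)
cov = np.cov(true_diffs.T) + 1e-3 * np.eye(3)   # regularized sample covariance

good_pair = np.array([0.1, 0.1, 0.05])   # small differences: plausible match
bad_pair = np.array([2.0, 3.0, 1.5])     # large differences: unlikely match
d_good = mahalanobis(good_pair, center, cov)
d_bad = mahalanobis(bad_pair, center, cov)
```

The candidate whose attribute-difference vector lies closest to the cluster center (smallest Mahalanobis distance) is accepted as the correspondence.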
IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), 2004
where he is currently a Professor. His research interests include robotics vision systems, fusion sensors, and applications of automatic control to robotics and flight control.
IEEE Transactions on Image Processing, 2008
Fusion Engineering and Design, 2006
Databases in fusion experiments are made up of thousands of signals. For this reason, data analysis must be simplified by developing automatic mechanisms for fast search and retrieval of specific data in the waveform database. In particular, a method for finding similar waveforms would be very helpful. The term ‘similar’ implies the use of proximity measurements in order to quantify
Fusion Engineering and Design, 2006
Since fusion plasma experiments generate hundreds of signals, it is important for their analysis to have automatic mechanisms for searching for similarities and retrieving specific data from the signal database. This paper describes a technique for searching in the TJ-II database that combines support vector machines and similarity query methods. Firstly, plasma signals are pre-processed by wavelet transform or discrete
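A minimal sketch of the similarity-query side, assuming simple block averaging as a crude stand-in for the wavelet-based preprocessing (the actual system also involves support vector machines, omitted here):

```python
import numpy as np

def waveform_features(signal, n=16):
    """Reduce a raw waveform to a short feature vector by block averaging
    (an assumed stand-in for the wavelet preprocessing)."""
    blocks = np.array_split(np.asarray(signal, dtype=float), n)
    return np.array([b.mean() for b in blocks])

def most_similar(query, database):
    """Similarity query: index of the database waveform whose feature
    vector is closest (Euclidean proximity) to the query's."""
    qf = waveform_features(query)
    dists = [np.linalg.norm(qf - waveform_features(s)) for s in database]
    return int(np.argmin(dists))

t = np.linspace(0, 1, 256)
database = [np.sin(2 * np.pi * 3 * t),   # 3 Hz sine
            np.sin(2 * np.pi * 7 * t),   # 7 Hz sine
            np.exp(-5 * t)]              # decaying transient
query = database[0] + 0.05 * np.random.default_rng(0).normal(size=256)
idx = most_similar(query, database)      # retrieves the noiseless 3 Hz sine
```

Comparing short feature vectors instead of full-resolution waveforms is what makes searching thousands of stored signals tractable.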
Expert Systems with Applications, 2013
Studying the communities of microbial species is highly important since many natural and artificial processes are mediated by groups of microbes rather than by single entities. One way of studying them is the search for common metabolic characteristics among microbial species: these are not only a potential measure for the differentiation and classification of closely related organisms, but their study also allows the discovery of common functional properties that may describe the way of life of entire organisms or species. In this work we propose an expert system (ES), which constitutes the main contribution, to cluster a complex data set of 365 prokaryotic species by 114 metabolic features, information which may be incomplete for some species. Inspired by human expert reasoning and based on hierarchical clustering strategies, our proposed ES estimates the optimal number of clusters into which to divide the dataset and then starts an iterative clustering process, based on the Self-Organizing Maps (SOM) approach, in which it finds relevant clusters at different steps by means of a new validity index inspired by the well-known Davies-Bouldin (DB) index. In order to monitor the process and assess the behavior of the ES, the partition obtained at each step is validated with the DB validity index. The resulting clusters show that the use of metabolic features combined with the ES can handle a complex dataset, help extract underlying information that may relate metabolism to phenotypic, environmental or evolutionary characteristics in prokaryotic species, and gain an advantage over other existing approaches.
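The Davies-Bouldin index used above to validate each partition can be sketched as follows; the two-blob data set is synthetic and for illustration only:

```python
import numpy as np

def davies_bouldin(X, labels):
    """Davies-Bouldin index: mean over clusters of the worst-case ratio
    (scatter_i + scatter_j) / distance(center_i, center_j).
    Lower values indicate tighter, better-separated clusters."""
    ks = np.unique(labels)
    centers = np.array([X[labels == k].mean(axis=0) for k in ks])
    scatter = np.array([np.mean(np.linalg.norm(X[labels == k] - c, axis=1))
                        for k, c in zip(ks, centers)])
    db = 0.0
    for i in range(len(ks)):
        ratios = [(scatter[i] + scatter[j]) /
                  np.linalg.norm(centers[i] - centers[j])
                  for j in range(len(ks)) if j != i]
        db += max(ratios)
    return db / len(ks)

rng = np.random.default_rng(1)
# two well-separated blobs, labeled correctly vs. by an arbitrary split
X = np.vstack([rng.normal(0, 0.2, (20, 2)), rng.normal(5, 0.2, (20, 2))])
good = np.repeat([0, 1], 20)   # labels match the true blobs
bad = np.tile([0, 1], 20)      # alternating labels mix the blobs
db_good = davies_bouldin(X, good)
db_bad = davies_bouldin(X, bad)
```

A partition that respects the real structure scores far lower than one that mixes the groups, which is why the ES can use the index to decide when a step's clustering is worth keeping.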