Evolutionary Optimization of Least-Squares Support Vector Machines
Related papers
Lecture Notes in Computer Science, 2004
This paper addresses the problem of tuning hyperparameters in support vector machine modeling. A Direct Simplex Search (DSS) method, which seeks to evolve hyperparameter values using an empirical error estimate as steering criterion, is proposed and experimentally evaluated on real-world datasets. DSS is a robust hill-climbing scheme and a popular derivative-free optimization method, suitable for low-dimensional optimization problems in which computing derivatives is impossible or difficult. Our experiments show that DSS attains performance levels equivalent to those of grid search (GS) while dividing computational cost by a factor of at least 4.
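The simplex search described above can be sketched with SciPy's Nelder-Mead implementation. The validation-error surface below is a smooth stand-in chosen for illustration; a real DSS run would train an SVM and return a cross-validated error estimate at each simplex vertex:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical validation-error surface over (log C, log gamma).
# In a real DSS run this function would evaluate cross-validated SVM
# error; here a smooth stand-in keeps the sketch self-contained.
def val_error(theta):
    log_c, log_gamma = theta
    return (log_c - 1.0) ** 2 + 0.5 * (log_gamma + 2.0) ** 2 + 0.1

result = minimize(val_error, x0=np.zeros(2), method="Nelder-Mead",
                  options={"xatol": 1e-6, "fatol": 1e-6})
print(result.x)  # close to the optimum at (1.0, -2.0)
```

No derivatives of `val_error` are ever requested, which is exactly why the simplex method suits noisy, expensive error estimates.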
Optimizing Hyperparameters of Support Vector Machines by Genetic Algorithms
International Conference on Artificial Intelligence, 2005
In this paper, a combination of genetic algorithms and support vector machines (SVMs) is proposed. SVMs are used for solving classification tasks, whereas genetic algorithms are optimization heuristics combining direct and stochastic search within a solution space. Here, the solution space is formed by combinations of different SVM kernel functions and kernel parameters. We investigate the classification performance of evolutionary
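A search over such a mixed solution space (a discrete kernel choice plus a continuous parameter) can be sketched with a small hand-rolled GA. The surrogate fitness below, with its optimum at ("rbf", gamma ≈ 0.5), is a construction for illustration; a real run would evaluate cross-validated SVM accuracy instead:

```python
import random
random.seed(0)

KERNELS = ["linear", "poly", "rbf"]

# Surrogate fitness standing in for cross-validated accuracy; the
# optimum at ("rbf", 0.5) is fixed by construction for this sketch.
def fitness(kernel, param):
    base = {"linear": 0.80, "poly": 0.85, "rbf": 0.90}[kernel]
    return base - 0.1 * (param - 0.5) ** 2

def random_individual():
    return [random.choice(KERNELS), random.uniform(0.0, 2.0)]

def mutate(ind):
    kernel, param = ind
    if random.random() < 0.2:               # occasionally swap the kernel
        kernel = random.choice(KERNELS)
    param = min(2.0, max(0.0, param + random.gauss(0.0, 0.1)))
    return [kernel, param]

pop = [random_individual() for _ in range(20)]
for _ in range(40):
    pop.sort(key=lambda ind: fitness(*ind), reverse=True)
    parents = pop[:10]                      # truncation selection + elitism
    pop = parents + [mutate(random.choice(parents)) for _ in range(10)]

best = max(pop, key=lambda ind: fitness(*ind))
print(best)
```

The genome mixes a categorical gene (kernel type) with a real-valued gene (kernel parameter), which is the encoding this line of work relies on.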
Evolutionary Support Vector Machine for Parameters Optimization Applied to Medical Diagnostic
Proceedings of the International Conference on Computer Vision Theory and Applications, 2011
Parameter selection is very important for successful modelling of the input-output relationship in a function classification model. In this study, a support vector machine (SVM) has been used as a function classification tool for accurate segregation, and a genetic algorithm (GA) has been utilised to optimise the parameters of the SVM model. With only five selected features as input, parameter optimisation for the SVM is applied. The five selected features are the mean of contrast, mean of homogeneity, mean of sum average, mean of sum variance and range of autocorrelation. The performance of the proposed model has been compared with a statistical approach. Although the grid algorithm requires less processing time, it does not appear to be efficient. Testing results show that the proposed GA-SVM model outperforms the statistical approach in terms of accuracy and computational efficiency.
Applied Intelligence
Support Vector Machines (SVMs) deliver state-of-the-art performance in real-world applications and are now established as one of the standard tools for machine learning and data mining. A key problem of these methods is how to choose an optimal kernel and how to optimise its parameters. Real-world applications have also emphasised the need to consider a combination of kernels, a multiple kernel, in order to boost the classification accuracy by adapting the kernel to the characteristics of heterogeneous data. This combination could be linear or non-linear, weighted or un-weighted. Several approaches have already been proposed to find a linear weighted kernel combination and to optimise its parameters together with the SVM parameters, but no approach has tried to optimise a non-linear weighted combination. Therefore, our goal is to automatically generate and adapt a kernel combination (linear or non-linear, weighted or un-weighted, according to the data) and to optimise both the kernel parameters and the SVM parameters by evolutionary means in a unified framework. We denote our combination a kernel of kernels (KoK). Numerical experiments show that the SVM algorithm involving the evolutionary kernel of kernels (eKoK) we propose performs better than well-known classic kernels whose parameters were optimised, as well as a state-of-the-art convex linear kernel combination and an evolutionary linear kernel combination. These results emphasise that the SVM algorithm can require a non-linear weighted combination of kernels.
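The closure properties that make such kernel combinations well defined can be checked numerically: both a weighted sum and an element-wise product of valid kernels yield positive semi-definite Gram matrices. The specific kernels, weights, and toy data below are illustrative choices, not those of eKoK:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))        # 8 toy points in R^3

def rbf(X, gamma=0.5):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def poly(X, degree=2, c=1.0):
    return (X @ X.T + c) ** degree

# A weighted sum and an element-wise (Hadamard) product of valid
# kernels are themselves valid kernels, so their Gram matrices stay
# positive semi-definite: the closure property a kernel of kernels
# exploits when composing linear and non-linear combinations.
K_lin_comb = 0.7 * rbf(X) + 0.3 * poly(X)
K_nonlin_comb = rbf(X) * poly(X)

for K in (K_lin_comb, K_nonlin_comb):
    print(np.linalg.eigvalsh(K).min() >= -1e-9)  # True: PSD preserved
```

The product case is the Schur product theorem in action; it is what licenses non-linear (multiplicative) combinations in a KoK.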
Tuning and evolution of support vector kernels
Evolutionary Intelligence, 2012
Kernel-based methods like Support Vector Machines (SVM) have been established as powerful techniques in machine learning. The idea of SVM is to perform a mapping φ from the input space to a higher-dimensional feature space using a kernel function k, so that a linear learning algorithm can be employed. However, the burden of choosing the appropriate kernel function is usually left to the user. It can easily be shown that the accuracy of the learned model depends strongly on the chosen kernel function and its parameters, especially for complex tasks. In order to obtain a good classification or regression model, an appropriate kernel function must be used.
SVM Modeling via a Hybrid Genetic Strategy. A Health Care Application
Studies in health technology and informatics, 2005
This paper addresses the model selection problem for Support Vector Machines. A hybrid genetic algorithm guided by Direct Simplex Search evolves hyperparameter values using an empirical error estimate as a steering criterion. This approach is specifically tailored to, and experimentally evaluated on, a health care problem which involves discriminating 11% nosocomially infected patients from 89% non-infected patients. The combination of Direct Simplex Search with GAs is shown to improve the performance of GAs in terms of solution quality and computational efficiency. Unlike most other hyperparameter tuning techniques, our hybrid approach does not require supplementary effort such as the computation of derivatives, making it well suited for practical purposes. The method produces encouraging results: it exhibits high performance and good convergence properties.
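The hybrid pattern, global evolutionary exploration followed by local simplex refinement, can be sketched as follows. The quadratic error surface is a stand-in for the empirical error estimate the abstract mentions:

```python
import random
import numpy as np
from scipy.optimize import minimize

random.seed(1)

# Stand-in validation error over (log C, log gamma); a real hybrid run
# would evaluate cross-validated SVM error instead.
def val_error(theta):
    return (theta[0] - 1.0) ** 2 + (theta[1] + 2.0) ** 2

# Coarse global phase: a tiny evolutionary population with Gaussian mutation.
pop = [np.array([random.uniform(-5, 5), random.uniform(-5, 5)])
       for _ in range(12)]
for _ in range(15):
    pop.sort(key=val_error)
    pop = pop[:6] + [p + np.random.normal(0, 0.5, 2) for p in pop[:6]]

# Local phase: refine the best individual with the simplex method.
best = min(pop, key=val_error)
refined = minimize(val_error, best, method="Nelder-Mead")
print(refined.x)  # near (1.0, -2.0)
```

The GA supplies a good basin; Nelder-Mead then polishes within it, which is where the reported gains in solution quality come from.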
Evolutionary Support Vector Regression Machines
2006 Eighth International Symposium on Symbolic and Numeric Algorithms for Scientific Computing
Evolutionary support vector machines (ESVMs) are a novel technique that assimilates the learning engine of the state-of-the-art support vector machines (SVMs) but evolves the coefficients of the decision function by means of evolutionary algorithms (EAs). The new method has accomplished the purpose for which it was initially developed: to provide a simpler alternative to the canonical SVM approach for solving the optimization component of training. ESVMs, like SVMs, are natural tools for primary application to classification. However, since the latter have been further extended to also handle regression, it is the scope of this paper to present the corresponding evolutionary paradigm. In particular, we consider the hybridization with the classical ε-support vector regression (ε-SVR) introduced by Vapnik and the subsequent evolution of the coefficients of the regression hyperplane. ε-evolutionary support vector regression (ε-ESVR) is validated on the Boston housing benchmark problem, and the obtained results demonstrate the promise of ESVMs for regression as well.
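Evolving the hyperplane coefficients under the ε-insensitive loss can be illustrated with a minimal (1+λ) evolution strategy on synthetic 1-D data. This is a sketch of the idea, not the paper's exact algorithm; the data, tube width, and schedule are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic linear data y = 2x + 1 + noise; the "hyperplane" to recover.
X = rng.uniform(-1, 1, size=50)
y = 2.0 * X + 1.0 + rng.normal(0, 0.05, size=50)

EPS = 0.1  # epsilon-insensitive tube width

def eps_loss(w, b):
    r = np.abs(y - (w * X + b))
    return np.maximum(r - EPS, 0.0).sum()   # errors inside the tube are free

# (1+10) evolution strategy over (w, b): a minimal stand-in for the EA
# that replaces quadratic programming in ESVR.
w, b, sigma = 0.0, 0.0, 0.5
for _ in range(300):
    cands = [(w + rng.normal(0, sigma), b + rng.normal(0, sigma))
             for _ in range(10)]
    cw, cb = min(cands, key=lambda c: eps_loss(*c))
    if eps_loss(cw, cb) <= eps_loss(w, b):   # elitist acceptance
        w, b = cw, cb
    sigma *= 0.99                            # shrink the mutation step
print(w, b)  # close to (2.0, 1.0)
```

Only loss evaluations are needed, so the same loop works unchanged for losses that a QP solver could not handle.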
Evolutionary Feature and Parameter Selection in Support Vector Regression
2007
A genetic approach is presented in this article to deal with two problems: a) feature selection and b) the determination of parameters in Support Vector Regression (SVR). We consider a kind of genetic algorithm (GA) in which the probabilities of mutation and crossover are determined in the evolutionary process. Some empirical experiments are made to measure the efficiency of this algorithm against two frequently used approaches.
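The two problems can be attacked in one genome: a binary feature mask plus a continuous SVR parameter, with the mutation probability adapted over the run. The surrogate error function below (informative features {0, 2}, regularization optimum at 1.0) is a construction for this sketch; a real run would train an SVR on the masked feature set:

```python
import random
random.seed(3)

N_FEATURES = 6
INFORMATIVE = {0, 2}   # ground truth known only to the surrogate

# Surrogate for cross-validated SVR error: rewards selecting exactly
# the informative features and a regularization value near 1.0.
def error(mask, log_c):
    good = sum(1 for i in INFORMATIVE if mask[i])
    noise = sum(mask) - good
    return (len(INFORMATIVE) - good) + 0.3 * noise + 0.2 * (log_c - 1.0) ** 2

def mutate(ind, p_mut):
    mask, log_c = ind
    mask = [bit ^ (random.random() < p_mut) for bit in mask]  # flip bits
    return (mask, log_c + random.gauss(0.0, 0.2))

pop = [([random.randint(0, 1) for _ in range(N_FEATURES)],
        random.uniform(-2, 3)) for _ in range(20)]
for gen in range(60):
    pop.sort(key=lambda ind: error(*ind))
    p_mut = 0.25 * (1 - gen / 60)   # mutation probability adapted over the run
    pop = pop[:10] + [mutate(random.choice(pop[:10]), p_mut) for _ in range(10)]

best_mask, best_log_c = min(pop, key=lambda ind: error(*ind))
print(best_mask, round(best_log_c, 2))
```

The linear decay of `p_mut` is one simple adaptation schedule; the article's GA determines the mutation and crossover probabilities within the evolutionary process itself.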
Support vector machine learning with an evolutionary engine
Journal of the Operational Research Society, 2009
The paper presents a novel evolutionary technique constructed as an alternative to the standard support vector machine architecture. The approach adopts the learning strategy of the latter but aims to simplify and generalize its training by offering a transparent substitute for the initial black box. Contrary to the canonical technique, the evolutionary approach can at all times explicitly acquire the coefficients of the decision function, without any further constraints. Moreover, in order to converge, the evolutionary method does not require positive (semi-)definiteness of the kernels used within nonlinear learning. Several potential structures, enhancements and additions are proposed, tested and confirmed using available benchmark test problems. Computational results show the validity of the new approach in terms of runtime, prediction accuracy and flexibility.
Support Vector Machine: Applications and Improvements Using Evolutionary Algorithms
Algorithms for Intelligent Systems, 2019
A description of the theory and the mathematical basis of support vector machines, together with a survey of their applications, is first presented in this chapter. Then, a method for obtaining a nonlinear kernel for support vector machines is proposed. The proposed method uses the gray wolf optimizer for solving the corresponding nonlinear optimization problem. A sensitivity analysis is also performed on the parameter of the model to tune the resulting classifier. The method has been applied to a set of experimental data for diabetes mellitus diagnosis. Results show that the method leads to a classifier which distinguishes healthy and patient cases with 87.5% accuracy.
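The gray wolf optimizer's position-update rule can be sketched on a stand-in objective; a real run would minimize the SVM training or validation error over the kernel parameters instead:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in objective for the SVM error surface the gray wolf optimizer
# would minimize when tuning kernel parameters; optimum at (1.0, -2.0).
def f(x):
    return ((x - np.array([1.0, -2.0])) ** 2).sum()

DIM, WOLVES, ITERS = 2, 10, 100
X = rng.uniform(-5, 5, size=(WOLVES, DIM))   # initial pack positions

for t in range(ITERS):
    fitness = np.apply_along_axis(f, 1, X)
    order = np.argsort(fitness)
    # The three best wolves (alpha, beta, delta) lead the pack.
    alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
    a = 2.0 * (1 - t / ITERS)                # exploration weight decays 2 -> 0
    new_X = np.empty_like(X)
    for i in range(WOLVES):
        pulls = []
        for leader in (alpha, beta, delta):
            A = a * (2 * rng.random(DIM) - 1)     # A = 2a*r1 - a
            C = 2 * rng.random(DIM)               # C = 2*r2
            D = np.abs(C * leader - X[i])
            pulls.append(leader - A * D)
        new_X[i] = np.mean(pulls, axis=0)         # average of the three pulls
    X = new_X

best = min(X, key=f)
print(best)  # near (1.0, -2.0)
```

Early on, |A| > 1 lets wolves overshoot the leaders and explore; as `a` decays, the pack contracts onto the best-known region.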