Least Squares Support Vector Machine Classifiers
Abstract
In this letter we discuss a least squares version of support vector machine (SVM) classifiers. Owing to the equality-type constraints in the formulation, the solution follows from solving a set of linear equations rather than the quadratic programming problem required for classical SVMs. The approach is illustrated on a two-spiral benchmark classification problem.
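As a rough illustration of the "linear system instead of quadratic programming" point, the sketch below builds and solves the LS-SVM dual system for a binary classifier with an RBF kernel. It assumes the standard LS-SVM dual formulation; the hyperparameters gamma (regularization) and sigma (kernel width) and the function names are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM linear system (no QP solver needed):
    #   [ 0     y^T              ] [ b     ]   [ 0 ]
    #   [ y     Omega + I/gamma  ] [ alpha ] = [ 1 ]
    # with Omega[k, l] = y_k * y_l * K(x_k, x_l) and y in {-1, +1}^N.
    N = X.shape[0]
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], np.ones(N)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Decision rule: y(x) = sign( sum_k alpha_k y_k K(x, x_k) + b )
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

The system is of size (N+1) x (N+1), so training reduces to one direct linear solve over all N training points, which is the contrast with the inequality-constrained QP of the classical SVM.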
Author information
Authors and Affiliations
- Department of Electrical Engineering (ESAT-SISTA), Katholieke Universiteit Leuven, Kardinaal Mercierlaan 94, B-3001 Leuven (Heverlee), Belgium
J.A.K. Suykens & J. Vandewalle
About this article
Cite this article
Suykens, J., Vandewalle, J. Least Squares Support Vector Machine Classifiers. Neural Processing Letters 9, 293–300 (1999). https://doi.org/10.1023/A:1018628609742
- Issue Date: June 1999
- DOI: https://doi.org/10.1023/A:1018628609742