Least Squares Support Vector Machine Classifiers

Abstract

In this letter we discuss a least squares version of support vector machine (SVM) classifiers. Because the formulation uses equality constraints rather than the inequality constraints of classical SVMs, the solution follows from a set of linear equations instead of a quadratic programming problem. The approach is illustrated on the two-spiral benchmark classification problem.
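To make the reduction to linear algebra concrete, the sketch below implements an LS-SVM classifier in NumPy by solving the standard LS-SVM dual system. This is an illustrative sketch, not code from the letter: the RBF kernel choice, the hyperparameters gamma and sigma, and all function names are assumptions made here for demonstration.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||A[i]-B[j]||^2 / (2 sigma^2)).
    The RBF kernel is an illustrative choice; any positive-definite kernel works."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train an LS-SVM classifier by solving one (N+1) x (N+1) linear system.

    Equality constraints reduce training to:
        [ 0      y^T          ] [b]       [0]
        [ y   Omega + I/gamma ] [alpha] = [1]
    with Omega[k, l] = y_k * y_l * K(x_k, x_l), labels y_k in {-1, +1}.
    gamma and sigma are hypothetical example values, not prescribed settings.
    """
    N = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], np.ones(N)))
    sol = np.linalg.solve(A, rhs)       # linear solve replaces the SVM QP
    return sol[0], sol[1:]              # bias b, support values alpha

def lssvm_predict(X_train, y_train, b, alpha, X_test, sigma=1.0):
    """Evaluate the classifier y(x) = sign(sum_k alpha_k y_k K(x, x_k) + b)."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

Under these assumptions, calling lssvm_train on labeled data and lssvm_predict on new points yields the kind of nonlinear decision boundary the letter demonstrates on the two-spiral problem; only the size of the linear system, not the optimization machinery, grows with the training set.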

Author information

Authors and Affiliations

  1. Department of Electrical Engineering, Katholieke Universiteit Leuven, ESAT-SISTA, Kardinaal Mercierlaan 94, B-3001 Leuven (Heverlee), Belgium
    J.A.K. Suykens & J. Vandewalle

Cite this article

Suykens, J., Vandewalle, J. Least Squares Support Vector Machine Classifiers. Neural Processing Letters 9, 293–300 (1999). https://doi.org/10.1023/A:1018628609742
