LIBSVM: A library for support vector machines
Article No. 27, Pages 1–27
Published: 06 May 2011
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
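To illustrate the optimization problem the abstract refers to: LIBSVM itself solves the dual formulation with an SMO-type decomposition method, but the soft-margin objective it optimizes can be sketched much more simply. The toy code below (not LIBSVM code; an illustrative pure-Python sketch) minimizes the primal objective (1/2)||w||² + C·Σ max(0, 1 − yᵢ(w·xᵢ + b)) by subgradient descent on a small separable 2-D data set; the function names `svm_train` and `svm_predict` are chosen here for illustration only.

```python
# Illustrative only: LIBSVM solves the dual problem with an SMO-type
# decomposition method. This sketch instead minimizes the primal
# soft-margin objective (1/2)||w||^2 + C * sum of hinge losses with
# plain subgradient descent, to show what an SVM fit computes.

def svm_train(X, y, C=1.0, lr=0.01, epochs=200):
    """Subgradient descent on (1/2)||w||^2 + C * sum_i hinge(x_i, y_i)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:
                # Point violates the margin: hinge-loss subgradient is active.
                w = [wj - lr * (wj - C * yi * xj) for wj, xj in zip(w, xi)]
                b += lr * C * yi
            else:
                # Only the regularizer (1/2)||w||^2 contributes.
                w = [wj - lr * wj for wj in w]
    return w, b

def svm_predict(w, b, x):
    """Sign of the decision function w.x + b."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Linearly separable toy data with labels in {+1, -1}.
X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]]
y = [1, 1, -1, -1]
w, b = svm_train(X, y)
print([svm_predict(w, b, xi) for xi in X])  # → [1, 1, -1, -1]
```

In practice one would of course call LIBSVM (or a wrapper around it) rather than hand-rolling the solver; the point of the sketch is only the objective being minimized.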
Published In
ACM Transactions on Intelligent Systems and Technology Volume 2, Issue 3
April 2011
259 pages
Copyright © 2011 ACM.
Publisher
Association for Computing Machinery
New York, NY, United States
Publication History
Received: 01 January 2011
Accepted: 01 February 2011
Affiliations
Chih-Chung Chang, National Taiwan University, Taipei, Taiwan
Chih-Jen Lin, National Taiwan University, Taipei, Taiwan