Nature Inspiration for Support Vector Machines

Abstract

In this paper we propose a new kernel for Support Vector Machine learning that is inspired by the biological world. The kernel is based on Gabor filters, which are a good model of the response of cells in the primary visual cortex and have been shown to be very effective in processing natural images. Furthermore, we build a link between energy efficiency, a driving force in biological processing systems, and the good generalization ability of learning machines. This connection can serve as a starting point for developing new kernel-based learning algorithms.
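The abstract does not reproduce the kernel's actual definition, so the sketch below is an illustration only: a common Gabor-style kernel (a Gaussian envelope modulated by a cosine term) written as a callable that returns a Gram matrix, the form scikit-learn's SVC accepts for custom kernels. The function name `gabor_kernel` and the parameters `gamma` and `omega` are hypothetical choices, not the authors' formulation.

```python
import numpy as np

def gabor_kernel(X, Y, gamma=0.5, omega=1.0):
    """Illustrative Gabor-style kernel: an RBF (Gaussian) envelope
    modulated by a cosine of the summed coordinate differences.
    NOTE: gamma and omega are assumed parameters; the paper's exact
    kernel is not reproduced here."""
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    # Pairwise differences between all rows of X and Y: shape (n, m, d).
    d = X[:, None, :] - Y[None, :, :]
    # Squared Euclidean distances for the Gaussian envelope.
    sq = (d ** 2).sum(axis=-1)
    # Summed differences act as the oscillation argument of the cosine.
    diff = d.sum(axis=-1)
    return np.exp(-gamma * sq) * np.cos(omega * diff)
```

Such a callable could be passed directly to a kernel machine, e.g. `sklearn.svm.SVC(kernel=gabor_kernel)`, which calls it with the training data to obtain the Gram matrix.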


Author information

Authors and Affiliations

  1. Dept. of Biophysical and Electronic Engineering, University of Genoa, 16145, Genoa, Italy
    Davide Anguita & Dario Sterpi

Authors

  1. Davide Anguita
  2. Dario Sterpi

Editor information

Editors and Affiliations

  1. School of Design, Engineering and Computing, Bournemouth University, UK
    Bogdan Gabrys
  2. Centre for SMART Systems, School of Environment and Technology, University of Brighton, BN2 4GJ, Brighton, UK
    Robert J. Howlett
  3. School of Electrical and Information Engineering, Knowledge Based Intelligent Engineering Systems Centre, University of South Australia, SA, 5095, Mawson Lakes, Australia
    Lakhmi C. Jain

Rights and permissions

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Anguita, D., Sterpi, D. (2006). Nature Inspiration for Support Vector Machines. In: Gabrys, B., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2006. Lecture Notes in Computer Science, vol 4252. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893004_57

