RSGALS-SVM: Random Subspace Method Applied to a LS-SVM Ensemble Optimized by Genetic Algorithm

  • Conference paper
Intelligent Data Engineering and Automated Learning - IDEAL 2012

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7435)

Abstract

Support Vector Machines (SVMs) have received great attention in pattern classification due to their good generalization ability. The Least Squares formulation of the SVM (LS-SVM) finds its solution by solving a set of linear equations instead of a quadratic programming problem. Both SVMs and LS-SVMs have free parameters that must be tuned to reflect the requirements of the given task. Despite their high performance, many techniques have been developed to improve them further, chiefly new classification methods and the use of ensembles. In this paper, we propose to combine ensemble methods with a genetic algorithm to enhance LS-SVM classification. First, we randomly divide the feature space into subspaces to generate diversity among the classifiers of the ensemble; we then apply a genetic algorithm to optimize the classification of this ensemble of LS-SVMs, evaluating the approach on several benchmark data sets.
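
The abstract's key computational point, that LS-SVM training reduces to a set of linear equations, is easy to make concrete. Below is a minimal Python/NumPy sketch of a binary LS-SVM following the standard Suykens-Vandewalle formulation; the RBF width sigma and regularization gamma are the free parameters mentioned above. It is an illustrative sketch, not the authors' code.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

class LSSVM:
    """Binary LS-SVM classifier. Training is a single linear solve of the
    KKT system  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    with Omega_ij = y_i * y_j * K(x_i, x_j); no quadratic programming."""

    def __init__(self, gamma=1.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma

    def fit(self, X, y):                       # labels y must be in {-1, +1}
        n = len(y)
        Omega = np.outer(y, y) * rbf_kernel(X, X, self.sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:], A[1:, 0] = y, y
        A[1:, 1:] = Omega + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.r_[0.0, np.ones(n)])
        self.b, self.alpha, self.X, self.y = sol[0], sol[1:], X, y
        return self

    def decision(self, Xt):                    # signed decision values
        return rbf_kernel(Xt, self.X, self.sigma) @ (self.alpha * self.y) + self.b
```

The ensemble construction can be sketched in the same spirit: each member is trained on a random subset of the features (the random subspace method), and a simple real-coded genetic algorithm then searches for combination weights that maximize the validation accuracy of the weighted vote. Treat the fitness function, the operators, and the choice to optimize combination weights as assumptions here; the paper's own GA setup may differ.

```python
def random_subspace_ensemble(X, y, n_members=15, frac=0.5, rng=None):
    """Train one LS-SVM per random feature subspace to create diversity."""
    rng = np.random.default_rng(0) if rng is None else rng
    d = X.shape[1]
    members = []
    for _ in range(n_members):
        feats = rng.choice(d, size=max(1, int(frac * d)), replace=False)
        members.append((feats, LSSVM().fit(X[:, feats], y)))
    return members

def ga_combination_weights(members, Xv, yv, pop=30, gens=40, rng=None):
    """Toy real-coded GA over nonnegative member weights. Fitness is the
    validation accuracy of sign(D @ w), where column j of D holds the
    decision values of member j on the validation set (Xv, yv)."""
    rng = np.random.default_rng(1) if rng is None else rng
    D = np.column_stack([m.decision(Xv[:, f]) for f, m in members])

    def fitness(w):
        return np.mean(np.sign(D @ w) == yv)

    P = rng.random((pop, D.shape[1]))                  # initial population
    for _ in range(gens):
        scores = np.array([fitness(w) for w in P])
        parents = P[np.argsort(scores)[-(pop // 2):]]  # truncation selection
        kids = []
        while len(kids) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(D.shape[1]) < 0.5, a, b)  # uniform crossover
            mask = rng.random(D.shape[1]) < 0.1                   # sparse mutation
            child = child + mask * rng.normal(0.0, 0.1, D.shape[1])
            kids.append(np.clip(child, 0.0, None))     # keep weights nonnegative
        P = np.vstack([parents, np.array(kids)])
    return P[np.argmax([fitness(w) for w in P])]
```

Predictions on held-out data would then be the sign of the weighted sum of member decision values, for example np.sign(np.column_stack([m.decision(Xt[:, f]) for f, m in members]) @ w).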


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Padilha, C., Neto, A.D.D., Melo, J.D. (2012). RSGALS-SVM: Random Subspace Method Applied to a LS-SVM Ensemble Optimized by Genetic Algorithm. In: Yin, H., Costa, J.A.F., Barreto, G. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2012. IDEAL 2012. Lecture Notes in Computer Science, vol 7435. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32639-4_31

  • DOI: https://doi.org/10.1007/978-3-642-32639-4_31

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32638-7

  • Online ISBN: 978-3-642-32639-4

  • eBook Packages: Computer Science, Computer Science (R0)
