
Random Subspace Method and Genetic Algorithm Applied to a LS-SVM Ensemble

  • Carlos Padilha
  • Adrião Dória Neto
  • Jorge Melo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7553)

Abstract

The Least Squares formulation of the SVM (LS-SVM) finds its solution by solving a set of linear equations instead of the quadratic programming problem solved in the standard SVM. The LS-SVM has free parameters that must be chosen correctly for it to perform well. Many tools have been developed to improve its performance, chiefly new classification methods and the use of ensembles. In this paper, we propose to use both ensemble theory and a genetic algorithm to enhance LS-SVM classification. First, we randomly divide the problem into subspaces to generate diversity among the classifiers of the ensemble. Then, we apply a genetic algorithm to find the values of the LS-SVM parameters and the weights of the linear combination of the ensemble members, which is used to make the final decision.
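The pipeline described above has three moving parts: an LS-SVM trained by solving a linear (KKT) system rather than a QP, random feature subspaces that diversify the ensemble members, and a weighted linear combination of member outputs. The Python sketch below illustrates those three parts only; it is a minimal illustration, not the authors' implementation. The Gaussian-kernel choice, the names (`LSSVM`, `train_subspace_ensemble`, `predict_ensemble`), and all default parameter values are assumptions, and the genetic-algorithm search itself is omitted.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T)
    return np.exp(-sq / (2.0 * sigma**2))

class LSSVM:
    """Binary LS-SVM (labels in {-1, +1}): training solves the KKT linear
    system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y] instead of a QP."""
    def __init__(self, gamma=1.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma

    def fit(self, X, y):
        n = len(y)
        K = rbf_kernel(X, X, self.sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0                      # top row:  [0, 1^T]
        A[1:, 0] = 1.0                      # left col: [1, ...]
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def decision(self, Xtest):
        """Real-valued output f(x) = sum_i alpha_i K(x_i, x) + b."""
        return rbf_kernel(Xtest, self.X, self.sigma) @ self.alpha + self.b

def train_subspace_ensemble(X, y, n_members=5, frac=0.5, seed=0):
    """Random Subspace Method: each member is trained on a random subset
    of the features, which creates diversity among the classifiers."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    members = []
    for _ in range(n_members):
        feats = rng.choice(d, size=max(1, int(frac * d)), replace=False)
        members.append((feats, LSSVM().fit(X[:, feats], y)))
    return members

def predict_ensemble(members, weights, Xtest):
    """Weighted linear combination of member outputs; in the paper a GA
    tunes these weights together with each member's LS-SVM parameters."""
    score = sum(w * m.decision(Xtest[:, f])
                for w, (f, m) in zip(weights, members))
    return np.sign(score)
```

In the scheme described in the abstract, a GA chromosome would encode each member's free parameters (here, `gamma` and `sigma`) together with the combination weights, scored by some fitness measure such as validation accuracy; that fitness choice is an assumption, not stated in the abstract.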

Keywords

Pattern Classification · LS-SVM · Ensembles · Genetic Algorithm · Random Subspace Method



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Carlos Padilha¹
  • Adrião Dória Neto¹
  • Jorge Melo¹

  1. Department of Computer Engineering and Automation, Federal University of Rio Grande do Norte, Natal, Brazil
