Multi-Objective Genetic Algorithms for Sparse Least Square Support Vector Machines

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2014 (IDEAL 2014)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8669)

Abstract

This paper introduces a new approach to building sparse least squares support vector machines (LSSVM) for classification tasks, based on multi-objective genetic algorithms (GAs). LSSVM classifiers are an attractive alternative to standard SVM classifiers because their training requires only the solution of a linear system of equations rather than a quadratic programming optimization problem. However, a significant drawback of these classifiers is the loss of sparseness in the vector of Lagrange multipliers (i.e., the solution). To overcome this lack of sparseness, we propose a multi-objective GA approach that leaves some support vectors out of the solution without degrading the classifier's accuracy, and in some cases even improves it. The main idea is to discard outliers, non-relevant patterns, and patterns possibly corrupted by noise, which prevent classifiers from reaching higher accuracy with a reduced set of support vectors. We show that the resulting sparse LSSVM classifiers achieve performance equivalent (and in some cases superior) to that of standard full-set LSSVM classifiers on real data sets. Unlike previous works, genetic algorithms are used here to obtain sparseness, not to find optimal values of the LSSVM hyper-parameters.
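To make the two ideas in the abstract concrete, the sketch below shows (i) how an LSSVM classifier is trained by solving a single linear system (the standard LSSVM dual formulation), and (ii) how a binary chromosome that selects a subset of training patterns as support vectors can be scored against the two objectives implied above: classification error and number of support vectors. This is only an illustrative sketch, not the authors' implementation; the RBF kernel, hyper-parameter values, synthetic data, and the random population used in place of a full multi-objective GA loop are all assumptions made for illustration.

```python
# Minimal sketch (assumptions: RBF kernel, gamma/sigma values, synthetic data,
# random chromosomes instead of an evolved multi-objective GA population).
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=1.0, sigma=1.0):
    # LSSVM training = one linear system:
    #   [ 0      y^T           ] [b    ]   [0]
    #   [ y   Omega + I/gamma  ] [alpha] = [1]
    # with Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def predict_lssvm(X_sv, y_sv, alpha, b, X_new, sigma=1.0):
    # Decision function: f(x) = sign( sum_i alpha_i y_i K(x, x_i) + b ).
    K = rbf_kernel(X_new, X_sv, sigma)
    return np.sign(K @ (alpha * y_sv) + b)

def objectives(mask, X_tr, y_tr, X_val, y_val):
    # Two objectives to minimise: validation error and number of kept support vectors.
    if mask.sum() < 2:
        return 1.0, int(mask.sum())
    alpha, b = train_lssvm(X_tr[mask], y_tr[mask])
    err = np.mean(predict_lssvm(X_tr[mask], y_tr[mask], alpha, b, X_val) != y_val)
    return float(err), int(mask.sum())

# Toy usage on synthetic two-class data (assumed setup, illustration only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (60, 2)), rng.normal(1, 1, (60, 2))])
y = np.concatenate([-np.ones(60), np.ones(60)])
X_tr, y_tr, X_val, y_val = X[::2], y[::2], X[1::2], y[1::2]

# Binary chromosomes = candidate support-vector subsets. In the paper these would be
# evolved by a multi-objective GA; here a small random population is only evaluated
# to show the trade-off between the two objectives.
population = rng.random((20, len(y_tr))) < 0.5
scores = [objectives(ind, X_tr, y_tr, X_val, y_val) for ind in population]
for err, n_sv in sorted(scores, key=lambda s: s[1]):
    print(f"#SV = {n_sv:2d}   validation error = {err:.3f}")
```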

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Silva, D.A., Rocha Neto, A.R. (2014). Multi-Objective Genetic Algorithms for Sparse Least Square Support Vector Machines. In: Corchado, E., Lozano, J.A., Quintián, H., Yin, H. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2014. IDEAL 2014. Lecture Notes in Computer Science, vol 8669. Springer, Cham. https://doi.org/10.1007/978-3-319-10840-7_20

  • DOI: https://doi.org/10.1007/978-3-319-10840-7_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-10839-1

  • Online ISBN: 978-3-319-10840-7

  • eBook Packages: Computer Science (R0)
