Abstract
This paper introduces a new approach to building sparse least squares support vector machines (LSSVM) based on multi-objective genetic algorithms (GAs) for classification tasks. LSSVM classifiers are an alternative to SVM classifiers because their training only requires solving a linear equation system rather than a quadratic programming optimization problem. However, the loss of sparseness in the Lagrange multiplier vector (i.e., the solution) is a significant drawback of these classifiers. To overcome this lack of sparseness, we propose a multi-objective GA approach that leaves some support vectors out of the solution without degrading the classifier's accuracy, and in some cases even improving it. The main idea is to discard outliers, non-relevant patterns, and patterns possibly corrupted by noise, which would otherwise prevent the classifier from achieving higher accuracy with a reduced set of support vectors. We show that the resulting sparse LSSVM classifiers achieve performance equivalent (and in some cases superior) to that of standard full-set LSSVM classifiers on real data sets. Unlike previous works, genetic algorithms are used here to obtain sparseness, not to find the optimal values of the LSSVM hyper-parameters.
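The technique the abstract describes can be sketched as follows: train an LSSVM by solving one linear system, then use a GA over binary masks to select which training patterns remain as support vectors. This is a minimal illustration only, not the authors' implementation: it assumes an RBF kernel, uses a scalarized (weighted-sum) fitness as a stand-in for the paper's multi-objective formulation, and all function names, hyper-parameter values, and the GA operators (truncation selection, one-point crossover, bit-flip mutation) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    # LSSVM training reduces to one linear system:
    #   [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]
    # with Omega_ij = y_i y_j K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.zeros(n + 1)
    rhs[1:] = 1.0
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, Lagrange multipliers alpha

def predict(Xsv, ysv, b, alpha, Xte, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b).
    return np.sign(rbf_kernel(Xte, Xsv, sigma) @ (alpha * ysv) + b)

def fitness(mask, X, y, Xval, yval, lam=0.1):
    # Scalarized proxy for the two objectives: maximize validation
    # accuracy, minimize the fraction of retained support vectors.
    idx = np.flatnonzero(mask)
    if len(idx) < 2 or len(np.unique(y[idx])) < 2:
        return -np.inf
    try:
        b, alpha = train_lssvm(X[idx], y[idx])
    except np.linalg.LinAlgError:
        return -np.inf
    acc = (predict(X[idx], y[idx], b, alpha, Xval) == yval).mean()
    return acc - lam * mask.mean()

def ga_select(X, y, Xval, yval, pop_size=20, gens=15, pmut=0.05):
    # Each chromosome is a binary mask over the training patterns.
    n = len(y)
    pop = rng.random((pop_size, n)) < 0.5
    for _ in range(gens):
        fit = np.array([fitness(m, X, y, Xval, yval) for m in pop])
        elite = pop[np.argsort(fit)[::-1][: pop_size // 2]]
        parents = elite[rng.integers(0, len(elite), (pop_size, 2))]
        cut = rng.integers(1, n, pop_size)          # one-point crossover
        child = np.where(np.arange(n) < cut[:, None],
                         parents[:, 0], parents[:, 1])
        child ^= rng.random((pop_size, n)) < pmut   # bit-flip mutation
        pop = child
        pop[0] = elite[0]                           # elitism
    fit = np.array([fitness(m, X, y, Xval, yval) for m in pop])
    return pop[np.argmax(fit)]

# Toy demo on two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (40, 2)), rng.normal(2, 1, (40, 2))])
y = np.array([-1.0] * 40 + [1.0] * 40)
Xval = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
yval = np.array([-1.0] * 20 + [1.0] * 20)
mask = ga_select(X, y, Xval, yval)
idx = np.flatnonzero(mask)
b, alpha = train_lssvm(X[idx], y[idx])
acc = (predict(X[idx], y[idx], b, alpha, Xval) == yval).mean()
```

Because every fitness evaluation only solves a (smaller) linear system, the GA search remains cheap relative to QP-based pruning; a genuinely multi-objective variant would replace the weighted-sum fitness with Pareto ranking over the accuracy and sparsity objectives.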
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Silva, D.A., Rocha Neto, A.R. (2014). Multi-Objective Genetic Algorithms for Sparse Least Square Support Vector Machines. In: Corchado, E., Lozano, J.A., Quintián, H., Yin, H. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2014. IDEAL 2014. Lecture Notes in Computer Science, vol 8669. Springer, Cham. https://doi.org/10.1007/978-3-319-10840-7_20
Print ISBN: 978-3-319-10839-1
Online ISBN: 978-3-319-10840-7