Abstract
Improving classification performance is a crucial step for any machine learning method. To achieve better classification, Support Vector Machines need their parameters tuned and relevant variables selected. Both tasks can be performed simultaneously with an embedded approach: a two-layer algorithm in which an evolutionary method generates candidate solutions and an approximated method evaluates them. The evolutionary search, guided by approximated error measures computed on the kernel matrix, discovers solutions with high classification accuracy. The aim of the paper is to verify whether the proposed method finds reliable solutions that enhance classification performance. The method is applied to three real-world datasets using three kernels, and it is compared against a nested Genetic Algorithms and SVMs approach to demonstrate that the approximated method achieves high classification accuracy in a shorter time.
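The two-layer idea in the abstract can be illustrated with a minimal sketch: a genetic algorithm evolves a feature mask together with a kernel parameter, and each candidate is scored not by training an SVM but by a cheap measure computed directly on the kernel matrix. Here kernel-target alignment is used as that approximated measure, and an RBF kernel with parameter `gamma` is assumed; all function names, the GA operators, and the toy data are illustrative, not the paper's actual implementation.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_target_alignment(K, y):
    # Alignment between K and the ideal kernel y y^T, for labels y in {-1, +1}.
    # Computed on the kernel matrix alone -- no SVM training required.
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

def fitness(X, y, mask, gamma):
    # Approximated evaluation of one candidate (feature subset + kernel parameter).
    if not mask.any():
        return -1.0  # penalize the empty feature subset
    K = rbf_kernel(X[:, mask], gamma)
    return kernel_target_alignment(K, y)

# Toy data: only the first two of six features carry the class signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

# Each individual encodes a boolean feature mask and a gamma value.
pop = [(rng.random(6) < 0.5, 10 ** rng.uniform(-2, 1)) for _ in range(20)]
for gen in range(10):
    # Outer layer: evolutionary search; inner layer: approximated fitness.
    scored = sorted(pop, key=lambda ind: fitness(X, y, *ind), reverse=True)
    parents = scored[:10]
    children = []
    for _ in range(10):
        i, j = rng.choice(len(parents), size=2, replace=False)
        (m1, g1), (m2, g2) = parents[i], parents[j]
        # Uniform crossover on the mask, bit-flip mutation, log-scale jitter on gamma.
        child_mask = np.where(rng.random(6) < 0.5, m1, m2)
        child_mask = child_mask ^ (rng.random(6) < 0.1)
        child_gamma = np.sqrt(g1 * g2) * 10 ** rng.normal(0.0, 0.1)
        children.append((child_mask, child_gamma))
    pop = parents + children

best_mask, best_gamma = max(pop, key=lambda ind: fitness(X, y, *ind))
```

Because each fitness evaluation only builds and scores a Gram matrix, the search avoids the repeated SVM trainings of a nested GA-and-SVM wrapper, which is the source of the speed-up claimed in the abstract.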
Copyright information
© 2011 Springer-Verlag London Limited
Cite this paper
Perolini, A. (2011). A Fast Approximated Evolutionary Approach to Improve SVM Accuracy. In: Bramer, M., Petridis, M., Hopgood, A. (eds) Research and Development in Intelligent Systems XXVII. SGAI 2010. Springer, London. https://doi.org/10.1007/978-0-85729-130-1_14
DOI: https://doi.org/10.1007/978-0-85729-130-1_14
Publisher Name: Springer, London
Print ISBN: 978-0-85729-129-5
Online ISBN: 978-0-85729-130-1
eBook Packages: Computer Science; Computer Science (R0)