Abstract
Support vector machines (SVMs) are among the most powerful classifiers. They are parameterized models built upon the support vectors extracted during the training phase. One of the crucial tasks in modeling an SVM is to select optimal values for its hyper-parameters, because the effectiveness and efficiency of the SVM depend on these parameters. The task of tuning the SVM hyper-parameters is known as the SVM model selection problem. To date, many techniques have been proposed for optimizing SVM hyper-parameters in both static and dynamic environments. A static environment is one in which the knowledge about a problem does not change over time, so fixed optimal values can be assigned to the hyper-parameters. In a dynamic environment, by contrast, the knowledge about a problem changes over time, so the optimization process must be flexible enough to adapt to these changes quickly, and the optimal hyper-parameter values must be re-evaluated. This paper identifies various optimization techniques used for SVM hyper-parameter tuning and examines their pros and cons.
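As a concrete illustration of static-environment model selection, the sketch below tunes the RBF-kernel hyper-parameters C and gamma by exhaustive grid search with cross-validation, one of the baseline techniques surveyed. It uses scikit-learn; the dataset and candidate grid values are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: grid-search tuning of SVM hyper-parameters (C, gamma)
# with cross-validation. Grid values and dataset are illustrative.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Candidate values for the RBF-kernel hyper-parameters.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}

# 5-fold cross-validated exhaustive search over the grid: each (C, gamma)
# pair is trained and scored, and the best-scoring pair is retained.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)   # tuned values of C and gamma
print(search.best_score_)    # mean cross-validated accuracy
```

In a dynamic environment, the same search would have to be re-run (or warm-started) whenever the data distribution drifts, which is what motivates the adaptive techniques discussed in the survey.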
© 2020 Springer Nature Singapore Pte Ltd.
Cite this paper
Kalita, D.J., Singh, V.P., Kumar, V. (2020). A Survey on SVM Hyper-Parameters Optimization Techniques. In: Shukla, R., Agrawal, J., Sharma, S., Chaudhari, N., Shukla, K. (eds) Social Networking and Computational Intelligence. Lecture Notes in Networks and Systems, vol 100. Springer, Singapore. https://doi.org/10.1007/978-981-15-2071-6_20
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-2070-9
Online ISBN: 978-981-15-2071-6
eBook Packages: Intelligent Technologies and Robotics (R0)