
A Survey on SVM Hyper-Parameters Optimization Techniques

  • Conference paper
  • First Online:
Social Networking and Computational Intelligence

Part of the book series: Lecture Notes in Networks and Systems ((LNNS,volume 100))

Abstract

Support vector machines (SVMs) are among the most powerful classifiers. They are parameterized models built upon the support vectors extracted during the training phase. One of the crucial tasks in modeling an SVM is selecting optimal values for its hyper-parameters, because both the effectiveness and the efficiency of the SVM depend on them. This task of tuning the SVM hyper-parameter values is known as the SVM model selection problem. To date, many techniques have been proposed for optimizing the hyper-parameters of SVMs in both static and dynamic environments. A static environment is one in which the knowledge about a problem does not change over time, so fixed optimal values can be assigned to the hyper-parameters. In a dynamic environment, by contrast, the knowledge about a problem changes over time, so the optimization process must be flexible enough to adapt to these changes quickly, and the optimal hyper-parameter values must be re-evaluated. This paper identifies various optimization techniques used for SVM hyper-parameter tuning and examines their pros and cons.
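As a concrete illustration of the static model selection problem described above, the following sketch tunes the C and gamma hyper-parameters of an RBF-kernel SVM by exhaustive grid search with cross-validation, using scikit-learn on a synthetic dataset. The dataset and parameter ranges are illustrative assumptions, not taken from the paper; grid search is only one of the optimization techniques the survey covers.

```python
# Illustrative sketch: static SVM hyper-parameter tuning via grid search.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic binary classification data (stand-in for a real problem).
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Candidate values for the two main RBF-SVM hyper-parameters:
# C controls margin softness, gamma controls the kernel width.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}

# Evaluate every (C, gamma) pair with 5-fold cross-validation and
# keep the combination with the best mean validation accuracy.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)
print(search.best_score_)
```

In a dynamic environment, where the data distribution drifts over time, such a search would have to be re-run (or replaced by an adaptive method such as particle swarm optimization) whenever the previously selected values stop performing well.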



Author information

Corresponding author

Correspondence to Dhruba Jyoti Kalita .


Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Kalita, D.J., Singh, V.P., Kumar, V. (2020). A Survey on SVM Hyper-Parameters Optimization Techniques. In: Shukla, R., Agrawal, J., Sharma, S., Chaudhari, N., Shukla, K. (eds) Social Networking and Computational Intelligence. Lecture Notes in Networks and Systems, vol 100. Springer, Singapore. https://doi.org/10.1007/978-981-15-2071-6_20
