Uncertainty Handling in Model Selection for Support Vector Machines

  • Tobias Glasmachers
  • Christian Igel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5199)


Abstract

We consider evolutionary model selection for support vector machines. Hold-out set-based objective functions are natural model selection criteria, and we introduce a symmetrization of the standard cross-validation approach. We propose the covariance matrix adaptation evolution strategy (CMA-ES) with uncertainty handling for optimizing the new randomized objective function. Our results show that this search strategy avoids premature convergence and results in improved classification accuracy compared to strategies without uncertainty handling.
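To make the setting concrete, here is a minimal, self-contained sketch of the idea described above: hyperparameter selection against a *randomized* hold-out objective (a fresh random split on every evaluation, so the objective value is noisy), optimized by a simple evolution strategy that averages repeated reevaluations when comparisons look unreliable. This is an illustration only, not the authors' method: the toy shrunken-means classifier, the (1+1)-ES, and the reevaluation heuristic are invented stand-ins for an SVM and for the CMA-ES uncertainty-handling machinery.

```python
import math
import random

def make_data(n=200, rng=random.Random(0)):
    # Toy two-class data: class y has one feature drawn from N(2y - 1, 1).
    return [(rng.gauss(2 * y - 1, 1.0), y)
            for y in (rng.randint(0, 1) for _ in range(n))]

def train_shrunken_means(train, lam):
    # Fit per-class means shrunk toward the global mean; lam in [0, 1]
    # plays the role of a regularization hyperparameter (stand-in for C).
    xs = [x for x, _ in train]
    g = sum(xs) / len(xs)
    means = {}
    for c in (0, 1):
        xc = [x for x, y in train if y == c] or [g]
        m = sum(xc) / len(xc)
        means[c] = (1 - lam) * m + lam * g
    return means

def holdout_error(data, lam, rng):
    # Randomized hold-out objective: a fresh random split on every call,
    # so repeated evaluations of the same lam return different values.
    d = data[:]
    rng.shuffle(d)
    k = len(d) // 2
    means = train_shrunken_means(d[:k], lam)
    val = d[k:]
    wrong = sum(1 for x, y in val
                if min((0, 1), key=lambda c: abs(x - means[c])) != y)
    return wrong / len(val)

def optimize(data, iters=60):
    # Simple (1+1)-ES on an unconstrained variable z with lam = sigmoid(z).
    # Averaging each candidate over `reps` independent splits is a crude
    # stand-in for CMA-ES uncertainty handling, which adapts the amount of
    # reevaluation on demand.
    rng = random.Random(1)
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    f = lambda z, reps: sum(holdout_error(data, sigmoid(z), rng)
                            for _ in range(reps)) / reps
    z, step, reps = 0.0, 1.0, 3
    for _ in range(iters):
        cand = z + step * rng.gauss(0, 1)
        fz, fc = f(z, reps), f(cand, reps)
        if fc <= fz:
            z, step = cand, step * 1.2
        else:
            step *= 0.8
        # If the two noisy values nearly tie, the comparison is unreliable:
        # spend more evaluations per candidate from now on.
        if abs(fc - fz) < 0.01:
            reps = min(reps + 1, 10)
    lam = sigmoid(z)
    return lam, f(z, 10)

data = make_data()
lam, err = optimize(data)
print(lam, err)
```

Without the averaging/reevaluation step, the stochastic objective can make the strategy accept lucky splits and stall on a noise-induced optimum, which is the premature-convergence failure mode the uncertainty handling is meant to avoid.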


Keywords

Support Vector Machine, Generalization Error, Search Point, Model Selection Criterion, Evolution Strategy




References

  1. Cortes, C., Vapnik, V.: Support-vector networks. Machine Learning 20, 273–297 (1995)
  2. Evgeniou, T., Pontil, M., Poggio, T.: Regularization networks and support vector machines. Advances in Computational Mathematics 13, 1–50 (2000)
  3. Friedrichs, F., Igel, C.: Evolutionary tuning of multiple SVM parameters. Neurocomputing 64, 107–117 (2005)
  4. Mersch, B., Glasmachers, T., Meinicke, P., Igel, C.: Evolutionary optimization of sequence kernels for detection of bacterial gene starts. International Journal of Neural Systems 17, 369–381 (2007); selected paper of ICANN 2006
  5. Hansen, N., Ostermeier, A.: Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation 9, 159–195 (2001)
  6. Hansen, N., Müller, S.D., Koumoutsakos, P.: Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation 11, 1–18 (2003)
  7. Hansen, N., Niederberger, A.S.P., Guzzella, L., Koumoutsakos, P.: Evolutionary optimization of feedback controllers for thermoacoustic instabilities. In: Morrison, J.F., Birch, D.M., Lavoie, P. (eds.) IUTAM Symposium on Flow Control and MEMS. Springer, Heidelberg (2008)
  8. Hansen, N., Niederberger, A.S.P., Guzzella, L., Koumoutsakos, P.: A method for handling uncertainty in evolutionary optimization with an application to feedback control of combustion. IEEE Transactions on Evolutionary Computation (in press, 2008)
  9. Beyer, H.G.: Evolution strategies. Scholarpedia 2, 1965 (2007)
  10. Jin, Y., Branke, J.: Evolutionary optimization in uncertain environments—a survey. IEEE Transactions on Evolutionary Computation 9, 303–317 (2005)
  11. Arnold, D.V.: Noisy Optimization With Evolution Strategies. Kluwer Academic Publishers, Dordrecht (2002)
  12. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, Heidelberg (2001)
  13. Glasmachers, T., Igel, C.: Gradient-based adaptation of general Gaussian kernels. Neural Computation 17, 2099–2105 (2005)
  14. Rätsch, G., Onoda, T., Müller, K.R.: Soft margins for AdaBoost. Machine Learning 42, 287–320 (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Tobias Glasmachers 1
  • Christian Igel 1

  1. Institut für Neuroinformatik, Ruhr-Universität Bochum, Germany
