Improving Support Vector Machines Performance Using Local Search

  • S. Consoli
  • J. Kustra
  • P. Vos
  • M. Hendriks
  • D. Mavroeidis
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10710)


In this paper, we propose a method for optimizing the parameters of a Support Vector Machine that is more accurate than the commonly applied grid search. The method is based on Iterated Local Search, a classic metaheuristic that performs multiple local searches in different regions of the parameter space. When a local search reaches a local optimum, a perturbation step generates the starting point of the next local search from the previously found optimum. In this way, exploration of the parameter space is balanced against the risk of wasting time in unpromising regions. We present a preliminary evaluation of the method with a radial-basis-function kernel on sample data, showing that it is more accurate than grid search on the same problem. The method is applicable to other kernels, and future work should demonstrate to what extent our Iterated Local Search based method outperforms standard grid search on heterogeneous datasets from different domains.
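The loop described above — run a local search to a local optimum, then perturb that optimum to seed the next local search — can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the coordinate-wise hill climbing, the step and perturbation sizes, and the `surrogate_accuracy` function (standing in for the cross-validated accuracy of an RBF-kernel SVM at `(C, gamma) = (2**x[0], 2**x[1])`) are all assumptions made for the example.

```python
import random

def iterated_local_search(objective, x0, step=0.5, perturb=2.0,
                          n_restarts=20, max_local_iters=50, seed=0):
    """Maximise `objective` over a real-valued parameter vector via ILS."""
    rng = random.Random(seed)

    def local_search(x):
        # Coordinate-wise hill climbing until no neighbour improves.
        x = list(x)
        best = objective(x)
        improved, iters = True, 0
        while improved and iters < max_local_iters:
            improved = False
            iters += 1
            for i in range(len(x)):
                for delta in (-step, step):
                    cand = list(x)
                    cand[i] += delta
                    val = objective(cand)
                    if val > best:
                        best, x, improved = val, cand, True
        return x, best

    best_x, best_val = local_search(x0)
    for _ in range(n_restarts):
        # Perturbation: jump away from the incumbent optimum so the
        # next local search explores a different part of the space.
        start = [xi + rng.uniform(-perturb, perturb) for xi in best_x]
        x, val = local_search(start)
        if val > best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Hypothetical stand-in objective with its maximum (1.0) at x = (1, -2);
# in the paper's setting this would be the SVM's validation accuracy.
def surrogate_accuracy(x):
    return 1.0 - 0.1 * ((x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2)

best, score = iterated_local_search(surrogate_accuracy, [0.0, 0.0])
```

Because each restart is seeded near the best optimum found so far rather than drawn uniformly at random, the search concentrates effort around promising regions while the perturbation radius still allows escapes, which is the balance the method relies on.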



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • S. Consoli
  • J. Kustra
  • P. Vos
  • M. Hendriks
  • D. Mavroeidis
  1. Philips Research, Eindhoven, The Netherlands
