Ensembled Support Vector Machines for Meta-Modeling

  • Yeboon Yun
  • Hirotaka Nakayama
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 660)


In many practical engineering problems, the functions involved cannot be given explicitly in terms of the decision variables; their values can only be evaluated for given decision variables through experiments such as structural analysis or fluid mechanics analysis. Since these experiments are usually expensive, meta-models are constructed from a small number of samples. These meta-models are then improved sequentially, adding a few samples at a time, in order to obtain a good approximate model with as few samples as possible. Support vector machines (SVMs) can be applied effectively to meta-modeling. In practical implementations of SVMs, however, it is important to tune the kernel parameters appropriately. Cross-validation (CV) techniques are usually applied for this purpose, but they require a large number of function evaluations, which is not realistic in many real engineering problems. This paper shows that applying ensembled support vector machines makes it possible to tune the parameters automatically within reasonable computation time.
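The idea of replacing CV-based kernel-parameter search with an ensemble can be sketched as follows: train one support vector regressor per candidate Gaussian-kernel width and aggregate their predictions, so no grid search over a single "best" parameter is needed. This is a minimal illustration of the general approach, not the authors' exact algorithm; the test function, the gamma grid, and the simple averaging aggregation are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))  # a small sample set, as in meta-modeling
y = np.sin(X).ravel()                     # stand-in for an expensive simulation

# One SVR per candidate kernel width; the grid below is illustrative.
gammas = [0.1, 0.5, 1.0, 5.0]
models = [SVR(kernel="rbf", gamma=g, C=10.0).fit(X, y) for g in gammas]

def ensemble_predict(X_new):
    """Average the members' predictions (bagging-style aggregation)."""
    return np.mean([m.predict(X_new) for m in models], axis=0)

X_test = np.linspace(-3.0, 3.0, 50).reshape(-1, 1)
pred = ensemble_predict(X_test)
```

Because the ensemble averages over several kernel widths, a poorly chosen individual gamma is diluted rather than dominating the meta-model, which is what removes the need for an explicit CV loop.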


Support vector machines · Bagging · Boosting · Gauss function · Parameter tuning



Copyright information

© Springer Nature Singapore Pte Ltd. 2016

Authors and Affiliations

  1. Kansai University, Osaka, Japan
  2. Konan University, Kobe, Japan
