
Ensembled Support Vector Machines for Meta-Modeling

  • Conference paper
Knowledge and Systems Sciences (KSS 2016)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 660))


Abstract

In many practical engineering problems, function forms cannot be given explicitly in terms of decision variables; instead, function values for given decision variables can only be evaluated through experiments such as structural analysis or fluid mechanics analysis. Since such experiments are usually expensive, meta-models are constructed from a small number of samples and are then improved sequentially, adding a few samples at a time, so that a good approximate model is obtained with as few samples as possible. Support vector machines (SVMs) can be applied effectively to this kind of meta-modeling. In practice, however, the kernel parameters of an SVM must be tuned appropriately, and cross validation (CV) techniques are usually applied for this purpose. CV requires a large number of function evaluations, which is not realistic in many real engineering problems. This paper shows that applying ensembled support vector machines makes it possible to tune these parameters automatically within reasonable computation time.
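As a rough illustration of the idea, the following sketch averages the predictions of several kernel machines trained with different RBF kernel widths, so that no single kernel parameter has to be selected by cross validation. This is not the authors' implementation: kernel ridge regression stands in for SVR to keep the code self-contained, and the sample data, the gamma grid, and the regularization constant are all illustrative assumptions.

```python
# Illustrative sketch only: kernel ridge regression stands in for SVR,
# and the data, gamma grid, and regularization constant are made up.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))   # a small sample, as in meta-modeling
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2   # stand-in for an expensive experiment

def rbf(A, B, gamma):
    """RBF kernel matrix between row-wise point sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# One weak learner per kernel width: averaging their outputs sidesteps
# choosing a single gamma by cross validation.
gammas = (0.5, 1.0, 2.0, 4.0)
lam = 1e-3                                 # small ridge term for numerical stability
alphas = [np.linalg.solve(rbf(X, X, g) + lam * np.eye(len(X)), y)
          for g in gammas]

def predict(x):
    """Ensemble prediction: mean over the weak learners."""
    x = np.atleast_2d(np.asarray(x, dtype=float))
    return np.mean([rbf(x, X, g) @ a for g, a in zip(gammas, alphas)], axis=0)

print(predict([[0.1, 0.2]]))
```

Four weak learners are used here, consistent with the observation in the notes that taking M between 3 and 6 works well in practice.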


Notes

  1. Depending on the number of data, the number of weak learners (M) at each layer should be decided. In our experience, however, taking M between 3 and 6 works well, and the results of the proposed method are not very sensitive to this choice.

  2. In particular, because Sobol's g-function is strongly anisotropic, it is difficult to achieve good performance with a single SVR [5].
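For reference, Sobol's g-function mentioned in note 2 can be written down in a few lines. The sketch below uses the common choice of coefficients a_i = (i - 1)/2 (following the description at [11]); the coefficients used in the paper's experiments may differ. Small a_i make a variable highly influential, which is the source of the anisotropy.

```python
import numpy as np

def g_function(x, a=None):
    """Sobol's g-function on [0, 1]^d: prod_i (|4 x_i - 2| + a_i) / (1 + a_i)."""
    x = np.asarray(x, dtype=float)
    if a is None:
        a = np.arange(x.size) / 2.0   # common choice a_i = (i - 1)/2, i = 1..d
    return float(np.prod((np.abs(4.0 * x - 2.0) + a) / (1.0 + a)))

print(g_function([0.5, 0.5, 0.5]))    # a_1 = 0 makes the first factor vanish -> 0.0
```

With these coefficients x_1 dominates the response while high-index variables barely matter, which is exactly the anisotropy that a single fixed-width kernel struggles to capture.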

References

  1. Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20, 273–297 (1995)

  2. Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press, New York (2000)

  3. Haykin, S.: Neural Networks: A Comprehensive Foundation, 2nd edn. Prentice Hall, Upper Saddle River (1998)

  4. Kitayama, S., Arakawa, M., Yamazaki, K.: Global optimization by generalized random tunneling algorithm (5th report, approximate optimization using RBF network). Trans. Japan Soc. Mech. Eng. Part C 73(5), 1299–1306 (2007)

  5. Moustapha, M., Sudret, B., Bourinet, J.M., Guillaume, B.: Metamodeling for crashworthiness design: comparative study of kriging and support vector regression. In: Proceedings of the 2nd International Symposium on Uncertainty Quantification and Stochastic Modeling (2014)

  6. Nakayama, H., Arakawa, M., Sasaki, R.: Simulation-based optimization using computational intelligence. Optim. Eng. 3, 201–214 (2002)

  7. Nakayama, H., Yun, Y.B., Uno, Y.: Parameter tuning of large scale support vector machines using ensemble learning with applications to imbalanced data sets. In: IEEE International Conference on Systems, Man, and Cybernetics (SMC 2012), pp. 2815–2820 (2012)

  8. Nakayama, H., Yun, Y.B., Uno, Y.: Combining predetermined models and SVM/RBFN for regression problems. In: The 6th China-Japan-Korea Joint Symposium on Optimization of Structural and Mechanical Systems, Paper No. J-64, 8 pages in CD-ROM (2010)

  9. Nakayama, H., Yun, Y.B., Yoon, M.: Sequential Approximate Multiobjective Optimization Using Computational Intelligence. Springer, Heidelberg (2009)

  10. Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge (2002)

  11. Surjanovic, S., Bingham, D.: Virtual Library of Simulation Experiments: Test Functions and Datasets. http://www.sfu.ca/~ssurjano/gfunc.html. Last updated: January 2015

  12. Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)


Author information


Corresponding author

Correspondence to Yeboon Yun.



Copyright information

© 2016 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Yun, Y., Nakayama, H. (2016). Ensembled Support Vector Machines for Meta-Modeling. In: Chen, J., Nakamori, Y., Yue, W., Tang, X. (eds) Knowledge and Systems Sciences. KSS 2016. Communications in Computer and Information Science, vol 660. Springer, Singapore. https://doi.org/10.1007/978-981-10-2857-1_18


  • DOI: https://doi.org/10.1007/978-981-10-2857-1_18

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-2856-4

  • Online ISBN: 978-981-10-2857-1

  • eBook Packages: Computer Science, Computer Science (R0)
