Abstract
In many practical engineering problems, the objective function cannot be given explicitly in terms of the decision variables; its value can only be evaluated for given decision variables through experiments such as structural analysis or fluid-mechanics analysis. Because these experiments are usually expensive, meta-models are constructed from a small number of samples and then improved sequentially by adding a few samples at a time, so that a good approximate model is obtained with as few samples as possible. Support vector machines (SVMs) can be applied effectively to such meta-modeling. In practical implementations of SVMs, however, it is important to tune the kernel parameters appropriately. Cross-validation (CV) techniques are usually applied for this purpose, but CV requires many function evaluations, which is not realistic in many real engineering problems. This paper shows that applying ensembled support vector machines makes it possible to tune the parameters automatically within reasonable computation time.
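The ensembling idea described in the abstract can be sketched roughly as follows. This is an illustrative stand-in, not the authors' implementation: it uses kernel ridge regression with an RBF kernel in place of SVR (both are kernel machines), and it averages a few weak learners whose kernel widths are spread over a range, so that no single kernel-parameter choice has to be tuned by cross-validation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, gamma, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

# Toy 1-D target standing in for an expensive simulation response.
f = lambda x: np.sin(3 * x).ravel()
X = rng.uniform(0, 2, size=(30, 1))   # small sample, as in meta-modeling
y = f(X)

# Ensemble of M weak learners, each with a different kernel width gamma;
# averaging their outputs reduces sensitivity to any single parameter choice.
M = 4
gammas = np.logspace(-0.5, 1.5, M)    # spread of candidate kernel parameters
alphas = [fit_krr(X, y, g) for g in gammas]

X_test = np.linspace(0, 2, 50).reshape(-1, 1)
preds = [rbf_kernel(X_test, X, g) @ a for g, a in zip(gammas, alphas)]
y_ens = np.mean(preds, axis=0)        # ensemble = average of the weak learners
```

In a sequential meta-modeling loop, one would refit such an ensemble each time a few new samples are evaluated; here the averaging step is the point, since it replaces explicit CV-based selection of a single gamma.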
Notes
- 1.
Depending on the number of data, the number of weak learners (M) at each layer should be decided. In our experience, however, taking M between 3 and 6 works well, and the results of the proposed method are not very sensitive to this number.
- 2.
In particular, because Sobol's g-function is strongly anisotropic, it is difficult to obtain good performance with a single SVR machine [5].
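For reference, Sobol's g-function cited here (see Surjanovic and Bingham) is g(x) = ∏_i (|4x_i − 2| + a_i)/(1 + a_i) on [0, 1]^d; small a_i make a variable highly influential while large a_i suppress it, which is the source of the anisotropy. A minimal sketch, with an illustrative coefficient choice commonly used in the sensitivity-analysis literature:

```python
import numpy as np

def sobol_g(x, a):
    """Sobol's g-function: prod_i (|4*x_i - 2| + a_i) / (1 + a_i).
    Each factor has mean 1 over x_i ~ U[0, 1], and larger a_i make
    dimension i less influential, so widely differing a_i yield a
    strongly anisotropic function."""
    x = np.atleast_2d(x)
    return np.prod((np.abs(4 * x - 2) + a) / (1 + a), axis=1)

a = np.array([0.0, 1.0, 4.5, 9.0])   # illustrative coefficients (a_1 dominates)
x_mid = np.full((1, 4), 0.5)         # at x_i = 0.5, |4x_i - 2| = 0 in every factor
val = sobol_g(x_mid, a)
```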
References
Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20, 273–297 (1995)
Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines and Other Kernel-based Learning Methods. Cambridge University Press, New York (2000)
Haykin, S.: Neural Networks: A Comprehensive Foundation, 2nd edn. Prentice Hall, Upper Saddle River (1998)
Kitayama, S., Arakawa, M., Yamazaki, K.: Global optimization by generalized random tunneling algorithm (5th report, approximate optimization using RBF network). Trans. Japan Soc. Mech. Eng. Part C 73(5), 1299–1306 (2007)
Moustapha, M., Sudret, B., Bourinet, J.M., Guillaume, B.: Metamodeling for crashworthiness design: comparative study of kriging and support vector regression. In: Proceedings of the 2nd International Symposium on Uncertainty Quantification and Stochastic Modeling (2014)
Nakayama, H., Arakawa, M., Sasaki, R.: Simulation-based optimization using computational intelligence. Optim. Eng. 3, 201–214 (2002)
Nakayama, H., Yun, Y.B., Uno, Y.: Parameter tuning of large scale support vector machines using ensemble learning with applications to imbalanced data sets. In: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (SMC 2012), pp. 2815–2820 (2012)
Nakayama, H., Yun, Y.B., Uno, Y.: Combining predetermined models and SVM/RBFN for regression problems. In: The 6th China-Japan-Korea Joint Symposium on Optimization of Structural and Mechanical Systems, Paper No. J-64, 8 pages, in CD-ROM (2010)
Nakayama, H., Yun, Y.B., Yoon, M.: Sequential Approximate Multiobjective Optimization using Computational Intelligence. Springer, Heidelberg (2009)
Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, Cambridge (2002)
Surjanovic, S., Bingham, D.: http://www.sfu.ca/~ssurjano/gfunc.html. Last updated: January 2015
Vapnik, V.: The Nature of Statistical Learning Theory. Springer, New York (1995)
Copyright information
© 2016 Springer Nature Singapore Pte Ltd.
Cite this paper
Yun, Y., Nakayama, H. (2016). Ensembled Support Vector Machines for Meta-Modeling. In: Chen, J., Nakamori, Y., Yue, W., Tang, X. (eds) Knowledge and Systems Sciences. KSS 2016. Communications in Computer and Information Science, vol 660. Springer, Singapore. https://doi.org/10.1007/978-981-10-2857-1_18
DOI: https://doi.org/10.1007/978-981-10-2857-1_18
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-2856-4
Online ISBN: 978-981-10-2857-1
eBook Packages: Computer Science; Computer Science (R0)