Abstract
The paper deals with surrogate modelling, a modern approach to optimizing objective functions that must be evaluated via measurements. The approach leads to a substantial decrease in the time and cost of evaluating the objective function, a property that is particularly attractive in evolutionary optimization. The paper recalls common strategies for using surrogate models in evolutionary optimization and proposes two extensions to those strategies: an extension to boosted surrogate models and an extension to using a set of models. Both are currently being implemented, in connection with surrogate modelling based on feed-forward neural networks, in a software tool for problem-tailored evolutionary optimization of catalytic materials. The paper presents results of experimentally testing the parts already implemented and of comparing boosted surrogate models with models without boosting; the results clearly confirm the usefulness of both proposed extensions.
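The pre-selection strategy summarized above, in which a cheap surrogate screens candidate solutions so that only the most promising ones are evaluated by expensive measurement, can be sketched in a few lines. The following is a minimal, dependency-free illustration under stated assumptions, not the paper's implementation: a 1-nearest-neighbour surrogate stands in for the feed-forward neural networks the paper uses, and all names and parameters (`true_objective`, `n_eval`, the toy search space [0, 1]) are hypothetical.

```python
import random

def true_objective(x):
    """Hypothetical expensive objective (stands in for a real measurement)."""
    return -(x - 0.3) ** 2  # maximum at x = 0.3

def surrogate(archive, x):
    """Cheap surrogate: 1-nearest-neighbour regression over all points
    measured so far.  A trained neural network would be used in the
    paper's setting; this stand-in keeps the sketch self-contained."""
    return min(archive, key=lambda p: abs(p[0] - x))[1]

def surrogate_assisted_ea(generations=20, pop_size=8, n_eval=2, seed=0):
    rng = random.Random(seed)
    # Archive of all points evaluated with the expensive objective so far.
    archive = [(x, true_objective(x)) for x in (0.0, 0.5, 1.0)]
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        # Generate offspring by Gaussian mutation, clipped to [0, 1].
        offspring = [min(1.0, max(0.0, x + rng.gauss(0.0, 0.1))) for x in pop]
        # Pre-selection: rank the offspring by the cheap surrogate ...
        offspring.sort(key=lambda x: surrogate(archive, x), reverse=True)
        # ... and spend expensive measurements only on the top n_eval of them.
        for x in offspring[:n_eval]:
            archive.append((x, true_objective(x)))
        # Survivor selection among parents + offspring, again by surrogate.
        pop = sorted(pop + offspring,
                     key=lambda x: surrogate(archive, x),
                     reverse=True)[:pop_size]
    return archive

archive = surrogate_assisted_ea()
best_x, best_y = max(archive, key=lambda p: p[1])
```

Note that each generation costs only `n_eval` expensive evaluations instead of `pop_size`, which is the source of the time and cost savings the abstract refers to; the boosted-ensemble extension would replace `surrogate` with a weighted committee of models.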
© 2010 Springer-Verlag Berlin Heidelberg
Holeňa, M., Linke, D., Rodemerck, U., Bajer, L. (2010). Neural Networks as Surrogate Models for Measurements in Optimization Algorithms. In: Al-Begain, K., Fiems, D., Knottenbelt, W.J. (eds) Analytical and Stochastic Modeling Techniques and Applications. ASMTA 2010. Lecture Notes in Computer Science, vol 6148. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13568-2_25
Print ISBN: 978-3-642-13567-5
Online ISBN: 978-3-642-13568-2