Abstract
The paper deals with neural-network-based surrogate modelling, a modern approach to the optimization of empirical objective functions. Surrogate modelling substantially decreases the time and cost of evaluating the objective function, a property that is particularly attractive in evolutionary optimization. The paper proposes extending surrogate modelling with regression boosting, which increases the accuracy of surrogate models and thus also the agreement between results obtained with the model and those obtained with the original objective function. The extension is illustrated on a case study in materials science, whose results clearly confirm the usefulness of boosting for neural-network-based surrogate models.
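To make the idea concrete, below is a minimal sketch of regression boosting with small neural networks as weak learners, used as a cheap surrogate to pre-screen candidate solutions in an evolutionary loop. This is an illustration under stated assumptions, not the paper's actual method: the toy objective, network architecture, number of boosting stages, and shrinkage coefficient are all invented for the example.

# Minimal sketch of regression boosting with neural-network weak learners,
# used as a surrogate for an expensive objective in evolutionary optimization.
# Assumptions throughout: the toy objective, the (8,) hidden layer, 10
# boosting stages, and the 0.5 shrinkage are illustrative, not from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_objective(x):
    # Stand-in for a costly empirical evaluation (e.g. a catalysis experiment).
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

class BoostedSurrogate:
    """Least-squares gradient boosting whose weak learners are small MLPs."""
    def __init__(self, n_stages=10, shrinkage=0.5):
        self.n_stages = n_stages
        self.shrinkage = shrinkage
        self.stages = []

    def fit(self, X, y):
        residual = y.astype(float).copy()
        for i in range(self.n_stages):
            net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                               random_state=i)
            net.fit(X, residual)  # each stage fits the current residual
            residual -= self.shrinkage * net.predict(X)
            self.stages.append(net)
        return self

    def predict(self, X):
        return sum(self.shrinkage * net.predict(X) for net in self.stages)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(60, 2))  # individuals already evaluated
y = expensive_objective(X)                # their true (expensive) fitness
surrogate = BoostedSurrogate().fit(X, y)

# Pre-screen a large offspring population with the cheap surrogate; only the
# most promising candidates (lowest predicted value, for minimization) would
# then be evaluated with the true objective function.
offspring = rng.uniform(-1.0, 1.0, size=(200, 2))
promising = offspring[np.argsort(surrogate.predict(offspring))[:10]]

In a full surrogate-assisted evolutionary algorithm, the pre-screened candidates would be evaluated with the true objective and added to the training set, so that the boosted surrogate is refitted as data accumulates.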
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Holeňa, M., Linke, D., Steinfeldt, N. (2009). Boosted Neural Networks in Evolutionary Computation. In: Leung, C.S., Lee, M., Chan, J.H. (eds.) Neural Information Processing. ICONIP 2009. Lecture Notes in Computer Science, vol. 5864. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10684-2_15
Print ISBN: 978-3-642-10682-8
Online ISBN: 978-3-642-10684-2