
Neural Networks as Surrogate Models for Measurements in Optimization Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6148)

Abstract

The paper deals with surrogate modelling, a modern approach to optimizing objective functions that are evaluated via measurements. The approach substantially decreases the time and cost of evaluating the objective function, a property that is particularly attractive in evolutionary optimization. The paper recalls common strategies for using surrogate models in evolutionary optimization and proposes two extensions to those strategies: an extension to boosted surrogate models and an extension to using a set of models. Both extensions are currently being implemented, in connection with surrogate modelling based on feed-forward neural networks, in a software tool for problem-tailored evolutionary optimization of catalytic materials. The paper presents results of experimental tests of the already implemented parts, comparing boosted surrogate models with models without boosting; the results clearly confirm the usefulness of both proposed extensions.
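The surrogate-assisted strategy the abstract recalls — pre-screening candidate solutions with a cheap model and spending expensive measurements only on the most promising candidates — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the inverse-distance-weighted regressor is a simple stand-in for the paper's feed-forward neural network surrogate, and `measure` is a hypothetical proxy for an expensive measurement.

```python
import random

# Hypothetical expensive objective, standing in for a real measurement.
def measure(x):
    return sum((xi - 0.3) ** 2 for xi in x)

# Crude surrogate: inverse-distance-weighted regression over the archive of
# (point, value) pairs -- a stand-in for a trained feed-forward network.
def surrogate(archive, x):
    num = den = 0.0
    for xa, ya in archive:
        d = sum((a - b) ** 2 for a, b in zip(xa, x)) + 1e-12
        num += ya / d
        den += 1.0 / d
    return num / den

def evolve(dim=3, pop_size=20, generations=15, evals_per_gen=4, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    archive = [(x, measure(x)) for x in pop]  # true (expensive) evaluations
    for _ in range(generations):
        # Generate offspring by Gaussian mutation of random parents.
        offspring = [[xi + rng.gauss(0, 0.1) for xi in rng.choice(pop)]
                     for _ in range(pop_size * 3)]
        # Pre-screen with the surrogate; measure only the most promising.
        offspring.sort(key=lambda x: surrogate(archive, x))
        for x in offspring[:evals_per_gen]:
            archive.append((x, measure(x)))
        # Next population: best truly evaluated points found so far.
        archive.sort(key=lambda p: p[1])
        pop = [x for x, _ in archive[:pop_size]]
    return archive[0]  # best (point, value) found

best_x, best_y = evolve()
print(best_y)
```

The key budget saving is visible in the loop: each generation creates `pop_size * 3` candidates but spends only `evals_per_gen` true evaluations, with the surrogate deciding which candidates deserve them.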





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg


Cite this paper

Holeňa, M., Linke, D., Rodemerck, U., Bajer, L. (2010). Neural Networks as Surrogate Models for Measurements in Optimization Algorithms. In: Al-Begain, K., Fiems, D., Knottenbelt, W.J. (eds) Analytical and Stochastic Modeling Techniques and Applications. ASMTA 2010. Lecture Notes in Computer Science, vol 6148. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13568-2_25


  • DOI: https://doi.org/10.1007/978-3-642-13568-2_25

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-13567-5

  • Online ISBN: 978-3-642-13568-2

  • eBook Packages: Computer Science (R0)
