Optimization of Expensive Functions by Surrogates Created from Neural Network Ensembles

  • Conference paper
  • Part of the book Artificial Neural Nets and Genetic Algorithms

Abstract

The goal of this paper is to model hypothesis testing. A “real situation” is given in the form of a response surface, defined by a derivative-free, continuous, expensive objective function. An ideal hypothesis should correspond to a global minimum of this function; hypothesis testing is thus converted into optimization of a response surface. First, the objective function is evaluated at a few points. Then a hypothetical (surrogate) surface landscape is created from an ensemble of approximations of the objective function. The approximations are produced by neural networks trained on the already evaluated samples. The hypothesis landscape, adapted by a merit function, estimates the possibility of obtaining at a given point a value better than the best value achieved so far among the evaluated points. The most promising point (a minimum of the adapted function) is used as the next sample point for the true expensive objective function. Its value is then used to adapt the neural networks, creating a new hypothesis landscape. The results suggest that (1) in order to find a global minimum, it may be useful to have an estimate of the whole response surface, and therefore also to explore points where maxima are predicted, and (2) an assembly of modules predicting the next sample point from the same set of sample points can be more advantageous than a single neural network predictor.

This work was supported by grants # 1/7336/20 and # 1/5229/98 of the Slovak Republic Grant Agency.
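The abstract outlines an iterative surrogate loop: evaluate the expensive objective at a few points, train an ensemble of neural networks on those samples, adapt the resulting hypothesis landscape with a merit function, sample the most promising point, and retrain. The Python sketch below illustrates one possible reading of that loop. It is not the paper's implementation: the toy objective, the network sizes, and the merit function used here (ensemble mean minus a multiple of the ensemble spread, a lower-confidence-bound style criterion) are illustrative assumptions standing in for the merit function described in the full text.

```python
# Minimal sketch of an ensemble-surrogate optimization loop, assuming a
# toy 1-D objective and an LCB-style merit function (both are assumptions,
# not taken from the paper).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_objective(x):
    # Toy response surface with several local minima (stand-in for the
    # "expensive" function of the paper).
    return np.sin(3.0 * x) + 0.5 * (x - 0.5) ** 2

# 1. Evaluate the objective at a few initial points.
X = rng.uniform(-2.0, 2.0, size=(5, 1))
y = expensive_objective(X).ravel()

# Candidate points at which the surrogate landscape is queried.
candidates = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)

for iteration in range(20):
    # 2. Train an ensemble of neural networks on the evaluated samples.
    ensemble = [
        MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                     random_state=seed).fit(X, y)
        for seed in range(5)
    ]
    preds = np.stack([net.predict(candidates) for net in ensemble])

    # 3. Build the hypothesis landscape and adapt it with a merit function:
    #    favour points whose predicted value could beat the current best,
    #    which also rewards regions where the ensemble members disagree.
    mean, spread = preds.mean(axis=0), preds.std(axis=0)
    merit = mean - 2.0 * spread  # assumed LCB-style merit, not the paper's formula

    # 4. The most promising point (minimum of the adapted landscape) becomes
    #    the next sample point for the true expensive objective.
    x_next = candidates[np.argmin(merit)].reshape(1, 1)
    y_next = expensive_objective(x_next).ravel()

    # 5. Add the new evaluation and retrain, creating a new hypothesis landscape.
    X = np.vstack([X, x_next])
    y = np.concatenate([y, y_next])

print("best value found:", y.min(), "at x =", X[np.argmin(y)])
```

Because the spread term rewards regions where the ensemble members disagree, the loop also visits poorly modelled parts of the landscape, which matches the abstract's observation that estimating the whole response surface, including predicted maxima, can help reach the global minimum.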



Copyright information

© 2001 Springer-Verlag Wien

About this paper

Cite this paper

Pospíchal, J. (2001). Optimization of Expensive Functions by Surrogates Created from Neural Network Ensembles. In: Kůrková, V., Neruda, R., Kárný, M., Steele, N.C. (eds) Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6230-9_11


  • DOI: https://doi.org/10.1007/978-3-7091-6230-9_11

  • Publisher Name: Springer, Vienna

  • Print ISBN: 978-3-211-83651-4

  • Online ISBN: 978-3-7091-6230-9

  • eBook Packages: Springer Book Archive
