Optimization of Expensive Functions by Surrogates Created from Neural Network Ensembles

  • Jiří Pospíchal
Conference paper


The goal of this paper is to model hypothesis testing. A “real situation” is given in the form of a response surface, defined by a derivative-free, continuous, expensive objective function. An ideal hypothesis corresponds to a global minimum of this function; hypothesis testing is thus converted into optimization of a response surface. First, the objective function is evaluated at a few points. Then a hypothetical (surrogate) surface landscape is created from an ensemble of approximations of the objective function. The approximations are produced by neural networks trained on the samples evaluated so far. The hypothesis landscape, adapted by a merit function, estimates the chance that a given point yields a better value than the best value achieved among the already evaluated points. The most promising point (a minimum of the adapted function) is used as the next sample point for the true expensive objective function; its value is then used to retrain the neural networks, creating a new hypothesis landscape. The results suggest that (1) in order to find a global minimum, it may be useful to estimate the whole response surface, and therefore also to explore points where maxima are predicted, and (2) an ensemble of modules predicting the next sample point from the same set of samples can be more advantageous than a single neural-network predictor.
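The surrogate loop described in the abstract can be made concrete. The following is a minimal sketch in Python, not the paper's implementation: the merit function (a lower-confidence-bound style criterion, mean prediction minus a weight `kappa` times the ensemble's disagreement), the ensemble of five small networks, and the toy one-dimensional objective are all illustrative assumptions.

```python
# Sketch of surrogate-assisted optimization with a neural-network ensemble.
# The merit function and its weight `kappa` are illustrative assumptions,
# not the paper's exact formulation.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_objective(x):
    # Stand-in for the true expensive function (1-D for clarity).
    return np.sin(3.0 * x) + 0.5 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(5, 1))     # a few initial evaluations
y = expensive_objective(X).ravel()

candidates = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)
kappa = 1.0                                  # exploration weight (assumed)

for it in range(15):
    # Retrain the ensemble on all samples evaluated so far.
    ensemble = [
        MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                     random_state=seed).fit(X, y)
        for seed in range(5)
    ]
    preds = np.stack([m.predict(candidates) for m in ensemble])
    mean, spread = preds.mean(axis=0), preds.std(axis=0)

    # Merit: favour low predicted values, but also points where the
    # ensemble disagrees, i.e. where the surrogate landscape is uncertain.
    merit = mean - kappa * spread
    x_next = candidates[np.argmin(merit)].reshape(1, 1)

    # One new expensive evaluation; append it and repeat.
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next).ravel())

print("best value found:", y.min(), "at x =", X[np.argmin(y)])
```

Because the ensemble's spread enters the merit with a negative sign, the loop occasionally samples regions predicted to be poor but uncertain, which matches the abstract's point (1) about exploring the whole response surface.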


Keywords: Neural Network · Response Surface · Merit Function · Search Point · Neural Network Ensemble





Copyright information

© Springer-Verlag Wien 2001

Authors and Affiliations

  • Jiří Pospíchal
  1. Department of Mathematics, Slovak University of Technology, Bratislava, Slovakia
