
Evolution strategies on noisy functions: How to improve convergence properties

  • Ulrich Hammel
  • Thomas Bäck
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 866)

Abstract

Evolution Strategies are reported to be robust in the presence of noise, which in general hinders the optimization process. In this paper we discuss the influence of some of the strategy parameters and strategy variants on the convergence process, and we propose measures for improving the convergence properties. After taking a broad look at the theory of the dynamics of a (1,λ)-ES on a simple quadratic function, we numerically investigate the influence of the parent population size and of the introduction of recombination. Finally, we compare the effects of multiple sampling of the objective function versus enlargement of the population size, with respect to both convergence precision and convergence reliability, using the multimodal Rastrigin's function as an example.
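The multiple-sampling idea from the abstract can be sketched as follows. This is a minimal, illustrative (1,λ)-ES on a noisy Rastrigin's function, not the authors' implementation: the log-normal step-size rule and all parameter values (dimension, λ, noise level, number of samples m) are assumptions chosen for brevity.

```python
import math
import random

def rastrigin(x):
    # Rastrigin's multimodal test function; global minimum 0 at the origin
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def noisy_eval(x, sigma_eps, m):
    # Average m noisy observations f(x) + N(0, sigma_eps^2); averaging
    # reduces the observation error's standard deviation by sqrt(m)
    return sum(rastrigin(x) + random.gauss(0, sigma_eps) for _ in range(m)) / m

def one_comma_lambda_es(n=5, lam=10, sigma=1.0, sigma_eps=0.1, m=5, generations=200):
    # (1,lambda)-ES: one parent, lam offspring, comma selection
    parent = [random.uniform(-5.12, 5.12) for _ in range(n)]
    tau = 1.0 / math.sqrt(n)  # learning rate for step-size self-adaptation
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            s = sigma * math.exp(tau * random.gauss(0, 1))  # log-normal step-size mutation
            child = [xi + s * random.gauss(0, 1) for xi in parent]
            offspring.append((noisy_eval(child, sigma_eps, m), s, child))
        # comma selection: the parent is always discarded, only offspring survive
        _, sigma, parent = min(offspring)
    return parent, rastrigin(parent)

best, f_best = one_comma_lambda_es()
print(f_best)
```

Increasing m (samples per evaluation) and increasing λ both spend extra function evaluations per generation; the paper's comparison on Rastrigin's function concerns which investment pays off more for precision and reliability.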

Keywords

Evolution Strategy · Strategy Parameter · Observation Error · Strategy Variable · Convergence Process



Copyright information

© Springer-Verlag Berlin Heidelberg 1994

Authors and Affiliations

  • Ulrich Hammel
  • Thomas Bäck
  1. Department of Computer Science, LS XI, University of Dortmund, Dortmund, Germany
