Principled Design of Continuous Stochastic Search: From Theory to Practice

  • Chapter

Theory and Principled Methods for the Design of Metaheuristics

Part of the book series: Natural Computing Series (NCS)

Abstract

We derive a stochastic search procedure for parameter optimization from two first principles: (1) imposing the least prior assumptions, namely by maximum entropy sampling, unbiasedness and invariance; (2) exploiting all available information under the constraints imposed by (1). We additionally require that two of the most basic functions can be solved reasonably fast. Given these principles, two principal heuristics are used: reinforcing good solutions and good steps (increasing their likelihood), and rendering successive steps orthogonal. The resulting search algorithm is the covariance matrix adaptation evolution strategy, CMA-ES, which coincides to a great extent with a natural gradient descent. The invariance properties of the CMA-ES are formalized, as are its maximum likelihood and stationarity properties. A small parameter study is presented for a specific heuristic deduced from the principles of reinforcing good steps and exploiting all information, namely the cumulation of an evolution or search path. Experiments on two noisy functions are provided.
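
To make the abstract's description concrete, the following is a minimal, self-contained NumPy sketch of the (μ/μ_w, λ)-CMA-ES with cumulation of an evolution path, using the default parameter settings from the tutorial [12]. It is our own illustration under those assumptions, not the authors' released implementation (see note 5): it omits restarts, termination heuristics, and numerical safeguards, and all function and variable names (cmaes, hsig, ...) are ours.

```python
import numpy as np

def cmaes(f, x0, sigma0, max_evals=10000, rng=np.random.default_rng(1)):
    """Minimal (mu/mu_w, lambda)-CMA-ES sketch with standard defaults [12]."""
    n = len(x0)
    xmean, sigma = np.asarray(x0, dtype=float), float(sigma0)

    # Selection: weighted recombination of the best mu out of lam samples
    lam = 4 + int(3 * np.log(n))
    mu = lam // 2
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()
    mueff = 1.0 / np.sum(w ** 2)          # variance-effective selection mass

    # Adaptation constants (standard default settings)
    cc = (4 + mueff / n) / (n + 4 + 2 * mueff / n)
    cs = (mueff + 2) / (n + mueff + 5)
    c1 = 2 / ((n + 1.3) ** 2 + mueff)
    cmu = min(1 - c1, 2 * (mueff - 2 + 1 / mueff) / ((n + 2) ** 2 + mueff))
    damps = 1 + 2 * max(0, np.sqrt((mueff - 1) / (n + 1)) - 1) + cs
    chin = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))  # E||N(0,I)||

    pc = np.zeros(n)   # evolution path for the covariance matrix
    ps = np.zeros(n)   # conjugate evolution path for the step-size
    C = np.eye(n)
    evals = 0
    while evals < max_evals:
        d, B = np.linalg.eigh(C)
        sqrtC = B @ np.diag(np.sqrt(d)) @ B.T
        invsqrtC = B @ np.diag(d ** -0.5) @ B.T

        # Sample lam candidates x = m + sigma * C^{1/2} z with z ~ N(0, I)
        z = rng.standard_normal((lam, n))
        y = z @ sqrtC.T
        x = xmean + sigma * y
        fx = np.array([f(xi) for xi in x])
        evals += lam
        idx = np.argsort(fx)[:mu]

        # Move the mean to the weighted average of the mu best steps
        ymean = w @ y[idx]
        xmean = xmean + sigma * ymean

        # Cumulation: update the evolution paths
        ps = (1 - cs) * ps + np.sqrt(cs * (2 - cs) * mueff) * (invsqrtC @ ymean)
        hsig = (np.linalg.norm(ps) / np.sqrt(1 - (1 - cs) ** (2 * evals / lam))
                < (1.4 + 2 / (n + 1)) * chin)
        pc = (1 - cc) * pc + hsig * np.sqrt(cc * (2 - cc) * mueff) * ymean

        # Covariance update: rank-one (path) plus rank-mu (selected steps)
        C = ((1 - c1 - cmu) * C
             + c1 * (np.outer(pc, pc) + (1 - hsig) * cc * (2 - cc) * C)
             + cmu * (y[idx].T * w) @ y[idx])

        # Step-size update from the length of the conjugate path
        sigma *= np.exp((cs / damps) * (np.linalg.norm(ps) / chin - 1))

        if fx[idx[0]] < 1e-10:
            break
    return x[idx[0]], fx[idx[0]]

# Usage: minimize the sphere function in 10-D
xbest, fbest = cmaes(lambda v: float(np.sum(v ** 2)), np.ones(10), 0.5)
```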

Notes

  1. That is, to find a sequence \(\theta_k\), \(k = 1, 2, 3, \ldots\), such that \(\lim_{k\rightarrow\infty} E(f(\boldsymbol{x} \vert \theta_k)) = f^{*}\).

  2. Different learning rates might be related to some parameters in the distribution being orthogonal.

  3. In the (μ, λ)-ES, only the μ best samples are selected for the next iteration. Given μ = 1, a very general optimality condition for λ states that the f-value of the currently second-best solution must resemble the f-value of the previous best solution [24]. Consequently, on any linear function, λ = 2 and λ = 3 are optimal [24, 36]. On the sphere function, Eq. (8.22), λ = 5 is optimal [33]. On the latter, \(\lambda \approx 3.7\mu\) can also be shown to be optimal for μ ≥ 2 and equal recombination weights [9], cf. Eq. (8.12). For λ < 5, the original strategy parameter setting for CMA-ES has been rectified in [10]; however, only mirrored sampling leads to satisfactory performance in this case [10].

  4. The positive symmetric square root satisfies \(\boldsymbol{C}_k^{-\frac{1}{2}}\,\boldsymbol{C}_k^{-\frac{1}{2}} = \boldsymbol{C}_k^{-1}\), has only positive eigenvalues, and is unique (a code sketch of its computation follows these notes).

  5. Source code is available at http://www.lri.fr/~hansen/cmaes_inmatlab.html and will be accessible at http://cma.gforge.inria.fr/ in the future. In our experiment, version 3.40.beta was used with Matlab.

  6. There is a simple way to smooth the landscape: a single evaluation can be replaced by the median (not the mean) of a number of evaluations. Even a few evaluations reduce the dispersion considerably, but about 1,000 evaluations are necessary to render the landscape as smooth as with \(\alpha_N = 0.01\). Together with the \((\mu/\mu_w)\)-CMA-ES, single evaluations, as in Fig. 8.10, require the smallest overall number of function evaluations (including restarts). A code sketch of this median-based smoothing also follows these notes.
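
The factorization in note 4 can be obtained from an eigendecomposition of the symmetric matrix \(\boldsymbol{C}_k = \boldsymbol{B}\,\mathrm{diag}(\boldsymbol{d})\,\boldsymbol{B}^{\top}\), rescaling the eigenvalues. Below is a minimal NumPy sketch; the function name inverse_sqrt and the test matrix are our own illustration, not code from the chapter.

```python
import numpy as np

def inverse_sqrt(C):
    """Unique symmetric positive square root of C^{-1} for SPD C.

    With C = B diag(d) B^T (B orthonormal, d > 0), the square root is
    C^{-1/2} = B diag(d^{-1/2}) B^T; it has only positive eigenvalues.
    """
    d, B = np.linalg.eigh(C)          # eigenvalues d, eigenvectors B
    return B @ np.diag(d ** -0.5) @ B.T

# Check the defining property C^{-1/2} C^{-1/2} = C^{-1} (hypothetical test matrix)
C = np.array([[4.0, 1.0], [1.0, 3.0]])
M = inverse_sqrt(C)
assert np.allclose(M @ M, np.linalg.inv(C))
```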
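
The median-based smoothing of note 6 is easy to state in code. The sketch below is our own illustration: the multiplicative log-normal noise model stands in for the chapter's noisy test functions, and the names noisy_sphere and median_f are assumptions, not the chapter's code. The median is preferred over the mean because it remains robust under heavy-tailed or skewed noise.

```python
import numpy as np

rng = np.random.default_rng(42)

def noisy_sphere(x, alpha=1.0):
    """Sphere function ||x||^2 distorted by multiplicative log-normal
    noise; a stand-in for the chapter's noisy test functions."""
    return float(np.sum(np.asarray(x) ** 2) * np.exp(alpha * rng.standard_normal()))

def median_f(f, x, n_eval=101):
    """Replace a single noisy evaluation by the median of n_eval
    evaluations; already a few evaluations reduce the dispersion."""
    return float(np.median([f(x) for _ in range(n_eval)]))

print(noisy_sphere([1.0, 2.0]))            # heavily dispersed single evaluation
print(median_f(noisy_sphere, [1.0, 2.0]))  # close to the noise-free value 5.0
```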

References

  1. Y. Akimoto, Y. Nagata, I. Ono, S. Kobayashi, Bidirectional relation between CMA evolution strategies and natural evolution strategies, in Parallel Problem Solving from Nature – PPSN XI, Part I, Kraków, ed. by R. Schaefer, C. Cotta, J. Kolodziej, G. Rudolph. Lecture Notes in Computer Science, vol. 6238 (Springer, 2010), pp. 154–163

  2. D. Arnold, Optimal weighted recombination, in Foundations of Genetic Algorithms, FOGA 2005, Aizu-Wakamatsu City. Lecture Notes in Computer Science, vol. 3469 (Springer, 2005), pp. 215–237

  3. D. Arnold, Weighted multirecombination evolution strategies. Theor. Comput. Sci. 361(1), 18–37 (2006)

  4. D.V. Arnold, N. Hansen, Active covariance matrix adaptation for the (1+1)-CMA-ES, in Proceedings of the Genetic and Evolutionary Computation Conference, GECCO 2010, Portland, 2010, pp. 385–392

  5. L. Arnold, A. Auger, N. Hansen, Y. Ollivier, Information-geometric optimization algorithms: a unifying picture via invariance principles. arXiv:1106.3708 (Arxiv preprint) (2011)

  6. A. Auger, N. Hansen, A restart CMA evolution strategy with increasing population size, in 2005 IEEE Congress on Evolutionary Computation (CEC 2005), Edinburgh, ed. by B. McKay et al., vol. 2, 2005, pp. 1769–1776

  7. A. Auger, N. Hansen, Reconsidering the progress rate theory for evolution strategies in finite dimensions, in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, GECCO 2006, Seattle (ACM, 2006), pp. 445–452

  8. A. Auger, N. Hansen, J. Zerpa, R. Ros, M. Schoenauer, Experimental comparisons of derivative free optimization algorithms, in 8th International Symposium on Experimental Algorithms, SEA 2009, Dortmund. Lecture Notes in Computer Science, vol. 5526 (Springer, 2009), pp. 3–15

  9. H.G. Beyer, The Theory of Evolution Strategies. Natural Computing Series (Springer, Heidelberg, 2001)

  10. D. Brockhoff, A. Auger, N. Hansen, D.V. Arnold, T. Hohm, Mirrored sampling and sequential selection for evolution strategies, in Parallel Problem Solving from Nature (PPSN XI), Kraków, ed. by R. Schaefer et al. Lecture Notes in Computer Science, vol. 6238 (Springer, 2010), pp. 11–20

  11. T. Glasmachers, T. Schaul, Y. Sun, D. Wierstra, J. Schmidhuber, Exponential natural evolution strategies, in Proceedings of the Genetic and Evolutionary Computation Conference, GECCO 2010, Portland, ed. by M. Pelikan, J. Branke (ACM, 2010), pp. 393–400

  12. N. Hansen, The CMA evolution strategy: a tutorial, http://www.lri.fr/~hansen/cmatutorial.pdf

  13. N. Hansen, Invariance, self-adaptation and correlated mutations in evolution strategies, in Parallel Problem Solving from Nature, PPSN VI, Paris, ed. by M. Schoenauer, K. Deb, G. Rudolph, X. Yao, E. Lutton, J. Merelo, H.P. Schwefel (Springer, 2000), pp. 355–364

  14. N. Hansen, An analysis of mutative σ-self-adaptation on linear fitness functions. Evol. Comput. 14(3), 255–275 (2006)

  15. N. Hansen, The CMA evolution strategy: a comparing review, in Towards a New Evolutionary Computation. Advances in Estimation of Distribution Algorithms, ed. by J. Lozano, P. Larrañaga, I. Inza, E. Bengoetxea (Springer, 2006), pp. 75–102

  16. N. Hansen, Adaptive encoding for optimization. Research Report RR-6518, INRIA (2008), http://hal.inria.fr/inria-00275983/en/

  17. N. Hansen, Adaptive encoding: how to render search coordinate system invariant, in Parallel Problem Solving from Nature (PPSN X), Dortmund, ed. by G. Rudolph et al. Lecture Notes in Computer Science, 2008, pp. 205–214

  18. N. Hansen, CMA-ES with two-point step-size adaptation. Technical Report RR-6527, INRIA (2008), http://hal.inria.fr/inria-00276854/en/

  19. N. Hansen, Benchmarking a BI-population CMA-ES on the BBOB-2009 function testbed, in Workshop Proceedings of the GECCO Genetic and Evolutionary Computation Conference, Montreal (ACM, 2009), pp. 2389–2395

  20. N. Hansen, Benchmarking a BI-population CMA-ES on the BBOB-2009 noisy testbed, in Workshop Proceedings of the GECCO Genetic and Evolutionary Computation Conference, Montreal (ACM, 2009), pp. 2397–2402

  21. N. Hansen, S. Kern, Evaluating the CMA evolution strategy on multimodal test functions, in Parallel Problem Solving from Nature, PPSN VIII, Birmingham, ed. by X. Yao et al. Lecture Notes in Computer Science, vol. 3242 (Springer, 2004), pp. 282–291

  22. N. Hansen, A. Ostermeier, Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)

  23. N. Hansen, R. Ros, Benchmarking a weighted negative covariance matrix update on the BBOB-2010 noiseless testbed, in Genetic and Evolutionary Computation Conference, GECCO 2010, Companion Material, Portland, 2010, pp. 1673–1680

  24. N. Hansen, A. Gawelczyk, A. Ostermeier, Sizing the population with respect to the local progress in (1, λ)-evolution strategies – a theoretical analysis, in IEEE International Conference on Evolutionary Computation, Perth, vol. 1, 1995, pp. 80–85

  25. N. Hansen, S. Niederberger, L. Guzzella, P. Koumoutsakos, A method for handling uncertainty in evolutionary optimization with an application to feedback control of combustion. IEEE Trans. Evol. Comput. 13(1), 180–197 (2009)

  26. N. Hansen, A. Auger, R. Ros, S. Finck, P. Pošík, Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009, in Workshop Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2010), Portland (ACM, 2010), pp. 1689–1696

  27. J. Jägersküpper, Lower bounds for hit-and-run direct search, in Stochastic Algorithms: Foundations and Applications – SAGA 2007, Zurich, ed. by X. Yao et al. Lecture Notes in Computer Science, vol. 4665 (Springer, Berlin/Heidelberg, 2007), pp. 118–129

  28. J. Jägersküpper, Lower bounds for randomized direct search with isotropic sampling. Oper. Res. Lett. 36(3), 327–332 (2008)

  29. G. Jastrebski, D. Arnold, Improving evolution strategies through active covariance matrix adaptation, in 2006 IEEE Congress on Evolutionary Computation (CEC 2006), Vancouver, 2006, pp. 2814–2821

  30. M. Jebalia, A. Auger, N. Hansen, Log-linear convergence and divergence of the scale-invariant (1+1)-ES in noisy environments. Algorithmica 59(3), 425–460 (2011)

  31. T. Jones, S. Forrest, Fitness distance correlation as a measure of problem difficulty for genetic algorithms, in Proceedings of the 6th International Conference on Genetic Algorithms, ICGA, Pittsburgh, ed. by L.J. Eshelman (Morgan Kaufmann, 1995), pp. 184–192

  32. A. Ostermeier, A. Gawelczyk, N. Hansen, Step-size adaptation based on non-local use of selection information, in Parallel Problem Solving from Nature, PPSN III, Jerusalem, ed. by Y. Davidor et al. Lecture Notes in Computer Science, vol. 866 (Springer, 1994), pp. 189–198

  33. I. Rechenberg, Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution (Frommann-Holzboog, Stuttgart, 1973)

  34. R. Salomon, J.L. van Hemmen, Accelerating backpropagation through dynamic self-adaptation. Neural Netw. 9(4), 589–601 (1996)

  35. M. Schumer, K. Steiglitz, Adaptive step size random search. IEEE Trans. Autom. Control 13(3), 270–276 (1968)

  36. H.P. Schwefel, Numerical Optimization of Computer Models (Wiley, New York, 1981)

  37. T. Suttorp, N. Hansen, C. Igel, Efficient covariance matrix update for variable metric evolution strategies. Mach. Learn. 75(2), 167–197 (2009)

  38. O. Teytaud, H. Fournier, Lower bounds for evolution strategies using VC-dimension, in Parallel Problem Solving from Nature, PPSN X, Dortmund. Lecture Notes in Computer Science, vol. 5199 (Springer, 2008), pp. 102–111

  39. O. Teytaud, S. Gelly, General lower bounds for evolutionary algorithms, in Parallel Problem Solving from Nature, PPSN IX, Reykjavik. Lecture Notes in Computer Science, vol. 4193 (Springer, 2006), pp. 21–31

  40. D. Wierstra, T. Schaul, J. Peters, J. Schmidhuber, Natural evolution strategies, in IEEE Congress on Evolutionary Computation (CEC 2008), Hong Kong (IEEE, 2008), pp. 3381–3387

Acknowledgements

The authors would like to express their gratitude to Marc Schoenauer for his kind and consistent support.

Author information

Corresponding author

Correspondence to Nikolaus Hansen.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Hansen, N., Auger, A. (2014). Principled Design of Continuous Stochastic Search: From Theory to Practice. In: Borenstein, Y., Moraglio, A. (eds) Theory and Principled Methods for the Design of Metaheuristics. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33206-7_8

  • DOI: https://doi.org/10.1007/978-3-642-33206-7_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33205-0

  • Online ISBN: 978-3-642-33206-7

  • eBook Packages: Computer Science, Computer Science (R0)
