
Maximum Likelihood-Based Online Adaptation of Hyper-Parameters in CMA-ES

  • Conference paper
Parallel Problem Solving from Nature – PPSN XIII (PPSN 2014)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8672)


Abstract

The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is widely accepted as a robust derivative-free algorithm for non-linear, non-convex continuous optimization problems. CMA-ES is well known to be almost parameterless: only one hyper-parameter, the population size, is suggested for tuning by the user. In this paper, we propose a principled approach, called self-CMA-ES, for the online adaptation of CMA-ES hyper-parameters in order to improve its overall performance. Experimental results show that for larger-than-default population sizes the default hyper-parameter settings of CMA-ES are far from optimal, and that self-CMA-ES dynamically drives them toward optimal settings.
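The abstract only summarizes the approach. As a rough illustration of what maximum likelihood-based online hyper-parameter adaptation can look like, the self-contained Python sketch below adapts a single hyper-parameter (a rank-mu learning rate, here called c_mu) inside a deliberately simplified, rank-mu-only CMA-ES-style loop: at each iteration, a few candidate values are scored by the log-likelihood that the search distribution they would have produced at the previous iteration assigns to the current elite samples, and the best-scoring value is kept. This is a toy sketch under these assumptions, not the authors' self-CMA-ES; it omits step-size control and evolution paths, and all function names and constants are illustrative.

```python
import numpy as np


def sphere(x):
    """Toy objective: minimize the squared norm."""
    return float(np.dot(x, x))


def gaussian_loglik(points, mean, cov):
    """Sum of Gaussian log-densities of `points` under N(mean, cov)."""
    d = mean.size
    cov_inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    diffs = points - mean
    mahal = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    return float(np.sum(-0.5 * (mahal + logdet + d * np.log(2.0 * np.pi))))


def run(dim=5, popsize=20, mu=10, iterations=60, seed=0):
    rng = np.random.default_rng(seed)
    mean = rng.normal(size=dim)
    cov = np.eye(dim)
    sigma = 1.0                  # step size, kept fixed in this toy
    c_mu = 0.1                   # the single hyper-parameter adapted online
    factors = (0.5, 1.0, 2.0)    # multiplicative candidates tried each iteration
    prev = None                  # state needed to "replay" the previous update

    for _ in range(iterations):
        # Sample, evaluate, and select the mu best of the population.
        samples = rng.multivariate_normal(mean, sigma**2 * cov, size=popsize)
        order = np.argsort([sphere(x) for x in samples])
        elite = samples[order[:mu]]

        # ML-based selection: each candidate c_mu is scored by the log-likelihood
        # that the distribution it would have produced at the *previous*
        # iteration assigns to the *current* elite samples.
        if prev is not None:
            old_cov, old_c_elite, old_new_mean = prev
            best_ll = -np.inf
            for f in factors:
                trial = float(np.clip(c_mu * f, 0.01, 0.9))
                trial_cov = (1.0 - trial) * old_cov + trial * old_c_elite
                ll = gaussian_loglik(elite, old_new_mean, sigma**2 * trial_cov)
                if ll > best_ll:
                    best_ll, c_mu = ll, trial

        # Rank-mu style update of the search distribution with the selected c_mu.
        diffs = (elite - mean) / sigma
        c_elite = diffs.T @ diffs / mu
        new_mean = elite.mean(axis=0)
        prev = (cov, c_elite, new_mean)
        cov = (1.0 - c_mu) * cov + c_mu * c_elite
        mean = new_mean

    return sphere(mean), c_mu


if __name__ == "__main__":
    f_best, c_mu_final = run()
    print(f"f(mean) = {f_best:.3e}, adapted c_mu = {c_mu_final:.3f}")
```

In this sketch, scoring candidates on the next generation's elite samples, rather than on the samples used to fit the update itself, is what keeps the criterion from trivially favouring the largest learning rate.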




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Loshchilov, I., Schoenauer, M., Sebag, M., Hansen, N. (2014). Maximum Likelihood-Based Online Adaptation of Hyper-Parameters in CMA-ES. In: Bartz-Beielstein, T., Branke, J., Filipič, B., Smith, J. (eds) Parallel Problem Solving from Nature – PPSN XIII. PPSN 2014. Lecture Notes in Computer Science, vol 8672. Springer, Cham. https://doi.org/10.1007/978-3-319-10762-2_7


  • DOI: https://doi.org/10.1007/978-3-319-10762-2_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-10761-5

  • Online ISBN: 978-3-319-10762-2

  • eBook Packages: Computer Science, Computer Science (R0)
