Abstract
The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is widely regarded as a robust derivative-free algorithm for non-linear, non-convex continuous optimization problems. CMA-ES is often described as almost parameter-free: the population size is the only hyper-parameter the user is expected to tune. In this paper, we propose a principled approach, called self-CMA-ES, for the online adaptation of CMA-ES hyper-parameters in order to improve its overall performance. Experimental results show that for population sizes larger than the default, the default hyper-parameter settings of CMA-ES are far from optimal, and that self-CMA-ES dynamically approaches optimal settings.
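To give a flavor of the kind of online strategy-parameter adaptation the abstract refers to, the following is a minimal toy sketch: a (μ, λ)-evolution strategy with log-normal step-size self-adaptation on the sphere function. It is not the paper's self-CMA-ES (no covariance matrix is adapted here, and the function names and settings are illustrative assumptions); it only shows how a strategy parameter can be adapted online from the ranked samples themselves.

```python
import numpy as np

def toy_es(dim=5, mu=5, lam=20, iters=300, seed=0):
    """Toy (mu, lambda)-ES with step-size self-adaptation (illustrative only)."""
    rng = np.random.default_rng(seed)
    mean = rng.standard_normal(dim)      # mean of the search distribution
    sigma = 1.0                          # global step size, adapted online
    tau = 1.0 / np.sqrt(2 * dim)         # standard learning rate for sigma
    for _ in range(iters):
        # Each offspring mutates its own step size, then samples a point with it.
        sigmas = sigma * np.exp(tau * rng.standard_normal(lam))
        xs = mean + sigmas[:, None] * rng.standard_normal((lam, dim))
        fitness = np.sum(xs ** 2, axis=1)          # sphere objective f(x) = ||x||^2
        best = np.argsort(fitness)[:mu]            # truncation selection
        mean = xs[best].mean(axis=0)               # recombine the mu best points
        sigma = np.exp(np.log(sigmas[best]).mean())  # geometric mean of selected sigmas
    return mean

final = toy_es()
print(np.sum(final ** 2))  # objective value close to zero after adaptation
```

The key idea, shared with online hyper-parameter adaptation in CMA-ES, is that selection acts on the strategy parameters indirectly: step sizes that produced fitter offspring survive and shape the next generation's step size, so the parameter tracks what currently works without any external tuning.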
© 2014 Springer International Publishing Switzerland
Cite this paper
Loshchilov, I., Schoenauer, M., Sebag, M., Hansen, N. (2014). Maximum Likelihood-Based Online Adaptation of Hyper-Parameters in CMA-ES. In: Bartz-Beielstein, T., Branke, J., Filipič, B., Smith, J. (eds) Parallel Problem Solving from Nature – PPSN XIII. PPSN 2014. Lecture Notes in Computer Science, vol 8672. Springer, Cham. https://doi.org/10.1007/978-3-319-10762-2_7
Print ISBN: 978-3-319-10761-5
Online ISBN: 978-3-319-10762-2