A Simple Modification in CMA-ES Achieving Linear Time and Space Complexity

  • Raymond Ros
  • Nikolaus Hansen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5199)


This paper proposes a simple modification of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) for high-dimensional objective functions that reduces the internal time and space complexity from quadratic to linear. The covariance matrix is constrained to be diagonal, and the resulting algorithm, sep-CMA-ES, samples each coordinate independently. Because the model complexity is reduced, the learning rate for the covariance matrix can be increased. Consequently, on essentially separable functions, sep-CMA-ES significantly outperforms CMA-ES. For dimensions larger than a hundred, even on the non-separable Rosenbrock function, sep-CMA-ES needs fewer function evaluations than CMA-ES.
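The complexity reduction described in the abstract can be illustrated with a minimal NumPy sketch (not the authors' implementation; function names and parameters are illustrative): constraining the covariance matrix to its diagonal lets each coordinate be sampled independently, which avoids the matrix factorization that sampling with a full covariance matrix requires.

```python
import numpy as np

def sample_full(mean, cov, rng):
    # Standard CMA-ES sampling: a full covariance matrix requires a
    # Cholesky (or eigen) factorization -- O(n^2) space, up to O(n^3) time.
    chol = np.linalg.cholesky(cov)
    return mean + chol @ rng.standard_normal(mean.size)

def sample_sep(mean, cov_diag, rng):
    # sep-CMA-ES-style sampling: the covariance is constrained to be
    # diagonal and stored as a length-n vector, so every coordinate is
    # drawn independently -- O(n) time and space per sample.
    return mean + np.sqrt(cov_diag) * rng.standard_normal(mean.size)

n = 1000
rng = np.random.default_rng(42)
mean = np.zeros(n)
cov_diag = np.full(n, 2.0)  # diagonal of C, i.e. per-coordinate variances

x = sample_sep(mean, cov_diag, rng)
```

When the full covariance matrix happens to be diagonal, both routines draw from the same distribution; the diagonal version simply never materializes the n-by-n matrix.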


Keywords: Covariance Matrix · Function Evaluation · Learning Rate · Space Complexity · Target Function




  1. Arnold, D., Salomon, R.: Evolutionary gradient search revisited. IEEE Transactions on Evolutionary Computation 11(4), 480–495 (2007)
  2. Hansen, N.: The CMA evolution strategy: a comparing review. In: Lozano, J., Larrañaga, P., Inza, I., Bengoetxea, E. (eds.) Towards a New Evolutionary Computation. Advances on Estimation of Distribution Algorithms, pp. 75–102. Springer, Heidelberg (2006)
  3. Hansen, N., Müller, S.D., Koumoutsakos, P.: Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation 11(1), 1–18 (2003)
  4. Hansen, N., Ostermeier, A.: Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation 9(2), 159–195 (2001)
  5. Hansen, N., Ostermeier, A., Gawelczyk, A.: On the adaptation of arbitrary normal mutation distributions in evolution strategies: The generating set adaptation. In: Eshelman, L.J. (ed.) Proceedings of the 6th International Conference on Genetic Algorithms, pp. 57–64. Morgan Kaufmann, San Francisco (1995)
  6. Knight, J.N., Lunacek, M.: Reducing the space-time complexity of the CMA-ES. In: GECCO 2007: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, pp. 658–665. ACM Press, New York (2007)
  7. Knight, J.N., Lunacek, M.: Reducing the space-time complexity of the CMA-ES: Addendum. Errata for [6] (March 2008),
  8. Ostermeier, A., Gawelczyk, A., Hansen, N.: Step-size adaptation based on non-local use of selection information. In: Davidor, Y., Männer, R., Schwefel, H.-P. (eds.) PPSN 1994. LNCS, vol. 866, pp. 189–198. Springer, Heidelberg (1994)
  9. Poland, J., Zell, A.: Main vector adaptation: A CMA variant with linear time and space complexity. In: Spector, L., Goodman, E.D., Wu, A., Langdon, W.B., Voigt, H.-M., Gen, M., Sen, S., Dorigo, M., Pezeshk, S., Garzon, M.H., Burke, E. (eds.) Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2001), pp. 7–11. Morgan Kaufmann, San Francisco (2001)
  10. Ros, R., Hansen, N.: A simple modification in CMA-ES achieving linear time and space complexity. Research Report 6498, INRIA (April 2008),

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Raymond Ros¹
  • Nikolaus Hansen²
  1. Univ. Paris-Sud, LRI, UMR 8623 / INRIA Saclay, projet TAO, Orsay, France
  2. Microsoft Research–INRIA Joint Centre, Orsay Cedex, France
