Gaussian Adaptation Revisited – An Entropic View on Covariance Matrix Adaptation

  • Christian L. Müller
  • Ivo F. Sbalzarini
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6024)


We revisit Gaussian Adaptation (GaA), a black-box optimizer for discrete and continuous problems that was developed in the late 1960s. This largely neglected search heuristic shares several interesting features with the well-known Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and with Simulated Annealing (SA). GaA samples single candidate solutions from a multivariate normal distribution and continuously adapts its first and second moments (mean and covariance) so as to maximize the entropy of the search distribution. Sample-point selection is controlled by a monotonically decreasing acceptance threshold, reminiscent of the cooling schedule in SA. We describe the theoretical foundations of GaA and analyze some key features of this algorithm. We empirically show that GaA converges log-linearly on the sphere function and analyze its behavior on selected non-convex test functions.
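The abstract describes a (1+1)-type loop: sample one point from a Gaussian, accept it if its fitness beats a slowly decreasing threshold, and on acceptance expand the step size and adapt mean and covariance. The sketch below illustrates that scheme in Python; the learning rates, the target hitting probability, and the determinant normalization are illustrative choices, not the exact constants from Kjellström's papers.

```python
import numpy as np

def gaussian_adaptation(f, m, max_evals=20000, seed=0):
    """Illustrative (1+1) Gaussian Adaptation loop (constants are assumptions)."""
    rng = np.random.default_rng(seed)
    n = len(m)
    r = 1.0                     # scalar step size
    Q = np.eye(n)               # Cholesky factor of the covariance, C = Q Q^T
    c_T = f(m)                  # acceptance threshold, lowered over time
    p = 0.29                    # target hitting probability (illustrative)
    f_e = 1.0 + (1.0 - p) / (n + 1.0)   # expansion factor on acceptance
    f_c = 1.0 - p / (n + 1.0)           # contraction factor on rejection
    lam_m, lam_C, lam_T = 1.0 / n, 1.0 / (2.0 * n), 1.0 / (5.0 * n)
    best, f_best = m.copy(), f(m)
    for _ in range(max_evals):
        dx = Q @ rng.standard_normal(n)
        x = m + r * dx
        fx = f(x)
        if fx < c_T:            # accepted: expand step, adapt mean, C, threshold
            r *= f_e
            m = (1.0 - lam_m) * m + lam_m * x
            C = (1.0 - lam_C) * (Q @ Q.T) + lam_C * np.outer(dx, dx)
            C /= np.linalg.det(C) ** (1.0 / n)  # keep det(C)=1; r carries scale
            Q = np.linalg.cholesky(C)
            c_T = (1.0 - lam_T) * c_T + lam_T * fx
            if fx < f_best:
                best, f_best = x.copy(), fx
        else:                   # rejected: contract the step size
            r *= f_c
    return best, f_best
```

Because expansion and contraction balance at the target hitting probability, the step size self-regulates, and on the sphere function this produces the log-linear convergence reported in the paper.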


Keywords: Gaussian Adaptation · Entropy · Covariance Matrix Adaptation · Evolution Strategy · Black-Box Optimization





Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Christian L. Müller (1)
  • Ivo F. Sbalzarini (1)
  1. Institute of Theoretical Computer Science and Swiss Institute of Bioinformatics, ETH Zurich, Zurich, Switzerland
