Diversity Loss in General Estimation of Distribution Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4193)

Abstract

A very general class of EDAs is defined, for which universal results on the rate of diversity loss can be derived. This class, denoted SML-EDA, requires two restrictions: 1) in each generation, the new probability model is built using only data sampled from the current probability model; and 2) maximum likelihood is used to set the model parameters. The class is very general; it includes simple forms of many well-known EDAs, such as BOA, MIMIC, FDA, and UMDA. To study diversity loss in SML-EDAs, the trace of the empirical covariance matrix is proposed as the statistic. Two simple results are derived. Let N be the number of data vectors evaluated in each generation. It is shown that on a flat landscape, the expected value of the statistic decreases by a factor of 1 − 1/N in each generation. This result is then used to show that on the Needle problem, the algorithm will, with high probability, never find the optimum unless the population size grows exponentially in the number of search variables.
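
The 1 − 1/N decay on a flat landscape is easy to illustrate numerically. The sketch below is not taken from the paper: it is a minimal simulation, assuming a UMDA-style member of the SML-EDA class (independent Bernoulli marginals refit by maximum likelihood from the model's own samples), that tracks the trace of the empirical covariance matrix and compares it with the predicted (1 − 1/N)^t curve. All names and parameter values (n_vars, pop_size, n_runs, and so on) are illustrative choices.

```python
# Minimal sketch (not the paper's code): a UMDA-style SML-EDA on a flat
# landscape.  All parameter values below are illustrative assumptions.
import numpy as np

def trace_decay(n_vars=20, pop_size=50, n_gens=100, n_runs=200, seed=0):
    """Average, over runs, of the empirical-covariance trace per generation."""
    rng = np.random.default_rng(seed)
    traces = np.zeros(n_gens)
    for _ in range(n_runs):
        p = np.full(n_vars, 0.5)                  # initial Bernoulli marginals
        for t in range(n_gens):
            # sample N vectors from the current product model
            pop = rng.random((pop_size, n_vars)) < p
            # flat landscape: no selection pressure, the whole sample is kept;
            # maximum-likelihood refit of the marginals (SML-EDA restriction 2)
            p = pop.mean(axis=0)
            # trace of the empirical covariance matrix of this sample
            traces[t] += pop.var(axis=0).sum()
    return traces / n_runs

emp = trace_decay()
pred = emp[0] * (1.0 - 1.0 / 50) ** np.arange(emp.size)  # (1 - 1/N)^t, N = pop_size
print("empirical:", np.round(emp[:5], 4))
print("predicted:", np.round(pred[:5], 4))
```

Averaged over enough runs, the measured trace follows the geometric prediction closely; for binary variables the trace is simply the sum of the per-variable sample variances p̂_i(1 − p̂_i), which is why it serves as a natural measure of remaining diversity.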

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Shapiro, J.L. (2006). Diversity Loss in General Estimation of Distribution Algorithms. In: Runarsson, T.P., Beyer, H.-G., Burke, E., Merelo-Guervós, J.J., Whitley, L.D., Yao, X. (eds.) Parallel Problem Solving from Nature - PPSN IX. PPSN 2006. Lecture Notes in Computer Science, vol 4193. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11844297_10

  • DOI: https://doi.org/10.1007/11844297_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38990-3

  • Online ISBN: 978-3-540-38991-0

  • eBook Packages: Computer Science, Computer Science (R0)
