Continuous mimetic evolution

  • Antoine Ducoulombier
  • Michèle Sebag
Genetic Algorithms
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1398)


Biological evolution keeps no memory besides the individuals themselves. Indeed, the biological milieu can change, and a previously unfit action or individual may later become fit; relying on a memory of the past would therefore be most dangerous.

This contrasts with artificial evolution, which most often considers a fixed milieu: generating an unfit individual that has already been explored is merely a waste of time. This paper aims at constructing a memory of evolution and using it to avoid such fruitless explorations. A new evolution scheme, called mimetic evolution, gradually constructs two models along evolution, memorizing respectively the best and the worst individuals of past generations. Standard crossover and mutation are replaced by mimetic mutation: individuals are attracted by the model of the best and repelled by the model of the worst. Mimetic evolution is extended from binary to continuous search spaces, and results of experiments on large-sized problems are detailed and discussed.
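To make the scheme concrete, here is a minimal Python sketch of one generation of mimetic evolution on a continuous search space. The exponential-average model update, the coefficient names (`alpha`, `beta_att`, `beta_rep`), and the Gaussian perturbation are illustrative assumptions, not the paper's exact operators; only the overall idea (two memories, attraction toward the best model and repulsion from the worst) is taken from the abstract.

```python
import random

def mimetic_step(pop, fitness, best_model, worst_model,
                 alpha=0.1, beta_att=0.2, beta_rep=0.05, noise=0.01):
    """One generation of a mimetic-evolution sketch (real-valued individuals).

    best_model / worst_model are running memories of the best and worst
    individuals seen so far; the update rule below (exponential averaging
    with rate alpha) is an assumption made for illustration.
    """
    scored = sorted(pop, key=fitness, reverse=True)
    best, worst = scored[0], scored[-1]
    # Update the two memories toward the current best and worst individuals.
    best_model = [(1 - alpha) * m + alpha * x for m, x in zip(best_model, best)]
    worst_model = [(1 - alpha) * m + alpha * x for m, x in zip(worst_model, worst)]
    # Mimetic mutation replaces crossover/mutation: each individual is
    # attracted toward the best model and repelled from the worst model.
    new_pop = []
    for ind in pop:
        child = [x + beta_att * (bm - x) + beta_rep * (x - wm)
                 + random.gauss(0.0, noise)
                 for x, bm, wm in zip(ind, best_model, worst_model)]
        new_pop.append(child)
    return new_pop, best_model, worst_model
```

On a simple maximization problem (e.g. the sphere function with sign flipped), iterating this step contracts the population around the memory of the best individuals while steering away from the memory of the worst.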





Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Antoine Ducoulombier (1)
  • Michèle Sebag (2, 1)
  1. LRI, CNRS URA 410, Université d'Orsay, Orsay Cedex
  2. LMS, CNRS URA 317, Ecole Polytechnique, Palaiseau Cedex
