
Convergence Rates of Evolutionary Algorithms and Parallel Evolutionary Algorithms

  • Chapter
Theory and Principled Methods for the Design of Metaheuristics

Part of the book series: Natural Computing Series ((NCS))

Abstract

This chapter discusses the advantages (robustness) and drawbacks (slowness) of algorithms that search for the optimum using only comparisons between fitness values. The results are mathematical proofs, but their practical implications are also presented: the speed-up achievable by algorithms run on parallel machines, hints for tuning parallel optimization algorithms, and the feasibility of some specific forms of optimization. Throughout the chapter, \([[a,b]] =\{ a,a + 1,\ldots,b\}\).
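The comparison-based setting of the abstract can be illustrated by a minimal (1+1)-evolution strategy. This is only an informal sketch, not one of the chapter's algorithms: the sphere objective, the step-size constants, and the 1/5th-rule-style adaptation are illustrative assumptions.

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iterations=300, seed=0):
    # The objective f is used only through comparisons between fitness
    # values: the algorithm never reads gradients, and never exploits
    # the numerical values beyond their ordering.
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iterations):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy <= fx:           # the only access to f: a comparison
            x, fx = y, fy
            sigma *= 1.5       # success: enlarge the step (1/5th-rule style)
        else:
            sigma *= 0.9       # failure: shrink the step
    return x

# Illustration on the sphere function in dimension 3.
sphere = lambda x: sum(xi * xi for xi in x)
best = one_plus_one_es(sphere, [5.0, 5.0, 5.0])
```

Because only the ordering of fitness values is used, the same run is obtained for any monotone transformation of f; this invariance is the source of the robustness mentioned above.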


Notes

  1. The optimal speed-up is asymptotically logarithmic in λ, but, as explained in this chapter, a linear speed-up can be reached non-asymptotically (up to λ = Θ(d)).
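The linear-then-logarithmic speed-up can be illustrated by a small simulation (an informal sketch, not the chapter's proof): a (1,λ)-ES on the sphere function with the textbook scale-invariant step size σ = ‖x‖/d. The λ offspring of one generation are exactly what a parallel machine would evaluate simultaneously, so fewer generations means more parallel speed-up. The step-size rule, the dimension, and the target precision are illustrative assumptions.

```python
import math
import random

def generations_to_target(lam, d=10, target=1e-2, seed=1):
    # (1,lambda)-ES on the sphere with the scale-invariant step size
    # sigma = ||x|| / d (an illustrative textbook choice, not tuned).
    rng = random.Random(seed)
    x = [1.0] * d
    gens = 0
    while math.sqrt(sum(xi * xi for xi in x)) > target:
        sigma = math.sqrt(sum(xi * xi for xi in x)) / d
        # Comma selection: the best of the lambda offspring replaces
        # the parent; these lambda evaluations are the parallel step.
        x = min(
            ([xi + sigma * rng.gauss(0.0, 1.0) for xi in x] for _ in range(lam)),
            key=lambda y: sum(yi * yi for yi in y),
        )
        gens += 1
    return gens

# More offspring per generation -> fewer generations, but with
# diminishing returns once lambda grows past Theta(d).
g2, g20, g200 = (generations_to_target(lam) for lam in (2, 20, 200))
```

In this sketch, going from λ = 2 to λ = 20 cuts the generation count far more than going from λ = 20 to λ = 200, consistent with the logarithmic asymptotic regime.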

  2. Importantly, the result relies on the fact that all of these objectives can be conflicting; the results are very different if too many conflicting objectives are forbidden.


Author information


Correspondence to Fabien Teytaud.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Teytaud, F., Teytaud, O. (2014). Convergence Rates of Evolutionary Algorithms and Parallel Evolutionary Algorithms. In: Borenstein, Y., Moraglio, A. (eds) Theory and Principled Methods for the Design of Metaheuristics. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33206-7_2


  • DOI: https://doi.org/10.1007/978-3-642-33206-7_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33205-0

  • Online ISBN: 978-3-642-33206-7

  • eBook Packages: Computer Science, Computer Science (R0)
