Abstract
This chapter discusses the advantages (robustness) and drawbacks (slowness) of algorithms that search for the optimum using only comparisons between fitness values. The results are mathematical proofs, but their practical implications are presented in terms of speed-ups for algorithms run on parallel machines, together with practical hints for tuning parallel optimization algorithms and on the feasibility of some specific forms of optimization. Throughout the chapter, \([[a,b]] = \{a, a+1, \ldots, b\}\).
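What "comparison-based" means in practice can be sketched with a minimal (1,λ) evolution strategy on the sphere function; all names and parameter values below are illustrative, not taken from the chapter. The selection step only compares fitness values, so any strictly increasing transformation of the objective leaves the trajectory unchanged — the robustness the chapter refers to.

```python
import random

def sphere(x):
    # Sphere fitness: sum of squares; only comparisons of it are used below.
    return sum(xi * xi for xi in x)

def one_comma_lambda_es(f, x0, sigma=0.3, lam=8, iters=200, seed=0):
    """A (1,lambda)-ES sketch: selection uses only comparisons of f-values,
    so any monotone transformation of f yields the exact same trajectory."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(iters):
        # Generate lam Gaussian offspring around the current point.
        offspring = [[xi + sigma * rng.gauss(0, 1) for xi in x]
                     for _ in range(lam)]
        # Comparison-based selection: min() only compares fitness values.
        x = min(offspring, key=f)
        sigma *= 0.98  # simple deterministic step-size decay (illustrative)
    return x

best = one_comma_lambda_es(sphere, [1.0] * 5)
```

Because the same random seed produces the same offspring and the comparisons are invariant under monotone transformations, rerunning with `lambda x: sphere(x) ** 0.5` as the objective returns the identical point.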
Notes
1. The optimal speed-up is asymptotically logarithmic, but, as explained in this chapter, a linear speed-up can be reached non-asymptotically (until λ = Θ(d)).
2. Importantly, the result relies on the fact that those objectives can all be conflicting; results are very different if too many conflicting objectives are forbidden.
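Note 1 can be illustrated numerically. The sketch below (a simple Monte Carlo estimate; all names and parameter values are assumptions for illustration) measures the expected one-generation fitness gain of keeping the best of λ Gaussian offspring on the sphere: the gain keeps growing with λ, but each doubling of λ adds less than the previous one, consistent with an asymptotically logarithmic parallel speed-up.

```python
import random

def best_of_lambda_gain(lam, dim=2, sigma=0.1, trials=2000, seed=1):
    """Monte Carlo estimate of the one-generation fitness gain on the sphere
    when keeping the best of lam Gaussian offspring around x = (1, 0, ...)."""
    rng = random.Random(seed)
    x = [1.0] + [0.0] * (dim - 1)
    f0 = sum(xi * xi for xi in x)
    total = 0.0
    for _ in range(trials):
        # Fitness of the best of lam Gaussian perturbations of x.
        best = min(
            sum((xi + sigma * rng.gauss(0, 1)) ** 2 for xi in x)
            for _ in range(lam)
        )
        total += f0 - best
    return total / trials

# Gains for increasing lambda: each doubling adds less and less,
# consistent with a logarithmic speed-up for large lambda.
gains = [best_of_lambda_gain(lam) for lam in (1, 2, 4, 8, 16, 32)]
```

The increments shrink because the expected minimum of λ Gaussian samples grows only like \(\sqrt{2\ln\lambda}\); doubling the number of parallel evaluations buys a roughly constant, not proportional, extra gain.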
Copyright information
© 2014 Springer-Verlag Berlin Heidelberg
Cite this chapter
Teytaud, F., Teytaud, O. (2014). Convergence Rates of Evolutionary Algorithms and Parallel Evolutionary Algorithms. In: Borenstein, Y., Moraglio, A. (eds) Theory and Principled Methods for the Design of Metaheuristics. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33206-7_2
Print ISBN: 978-3-642-33205-0
Online ISBN: 978-3-642-33206-7