The (µ, λ)-ES: Distributed Populations
Analyzing the local performance of the (µ, λ)-ES is much more difficult than analyzing that of the (1+1)-ES. While for the (1+1)-ES the population consists of a single candidate solution, for the (µ, λ)-ES with µ > 1 the population of candidate solutions that emerges in the course of evolution is distributed in search space and needs to be modeled. In the absence of noise, Beyer presented a moment-based analysis of the performance of the (µ, λ)-ES for spherically symmetric fitness functions. The approach was based on expanding the distribution of offspring candidate solutions in terms of derivatives of the normal distribution. Approximations to the lower-order central moments of the distribution, and subsequently to the progress rate, were obtained by imposing “self-consistency conditions” and solving the resulting system of equations. The results are quite accurate even if only the variance and skewness of the distribution are considered. A main result of Beyer’s analysis, which was also stated by Rechenberg, is the observation that on the noise-free sphere the performance of the (µ, λ)-ES with µ > 1 is never superior to that of the (1, λ)-ES, and thus no benefit can be gained from retaining any but the best candidate solution generated. However, Rechenberg also provided empirical evidence that this does not hold in the presence of noise. Simple computer experiments can be used to demonstrate that, for the very same fitness function, significant speed-up factors over the (1, λ)-ES can be achieved by retaining more than just the (seemingly) best candidate solution if noise is present. Nissen and Propach [59, 60] provided empirical evidence that it may be the use of a population that is distributed in
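The computer experiment alluded to above can be sketched as follows. This is an illustrative minimal implementation, not the exact setup used by Rechenberg: a (µ/µ, λ)-ES with intermediate (centroid) recombination and comma selection on the noisy sphere f(x) = ‖x‖², where each fitness evaluation is perturbed by additive Gaussian noise. All parameter values (dimension, mutation strength, noise strength, number of generations) are illustrative assumptions.

```python
import numpy as np

def evolve(mu, lam, dim=10, sigma=0.1, noise=1.0, gens=300, seed=0):
    """Run a simple (mu/mu, lam)-ES with intermediate recombination on the
    noisy sphere and return the true (noise-free) fitness reached.
    Illustrative sketch; all parameter defaults are assumptions."""
    rng = np.random.default_rng(seed)
    parents = rng.normal(size=(mu, dim))
    for _ in range(gens):
        # intermediate recombination: centroid of the mu retained parents
        centroid = parents.mean(axis=0)
        # mutation: isotropic Gaussian steps of fixed strength sigma
        offspring = centroid + sigma * rng.normal(size=(lam, dim))
        # noisy evaluation: true sphere value plus additive Gaussian noise
        fitness = (offspring ** 2).sum(axis=1) + noise * rng.normal(size=lam)
        # comma selection: retain the mu *seemingly* best offspring
        parents = offspring[np.argsort(fitness)[:mu]]
    # report the true fitness of the final parent centroid
    return float((parents.mean(axis=0) ** 2).sum())

# Average over independent runs to smooth out stochastic fluctuations
f1 = np.mean([evolve(1, 10, seed=s) for s in range(20)])
f3 = np.mean([evolve(3, 10, seed=s) for s in range(20)])
print(f"(1,10)-ES residual: {f1:.3f}   (3/3,10)-ES residual: {f3:.3f}")
```

With noise strength large relative to the fitness differences among offspring, selection becomes unreliable, and averaging over µ > 1 retained candidates typically lets the strategy reach a lower residual fitness than the (1, λ)-ES under otherwise identical conditions, in line with the observation in the text. On the noise-free sphere (noise=0.0), the advantage disappears.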