Abstract
The “end-game” of evolutionary optimisation is often largely governed by how efficiently and effectively the algorithm searches regions of the space known to contain high-quality solutions. In a traditional EA this role falls to mutation, which creates a tension with its other role of maintaining diversity. One approach to improving the efficiency of this phase is self-adaptation of the mutation rates: the fitness landscape is left unchanged, but the shape of the probability distribution governing the generation of new solutions is adapted. A different approach is the incorporation of local search, yielding so-called Memetic Algorithms. Depending on the paradigm, this approach either changes the fitness landscape (Baldwinian learning) or causes a mapping to a reduced subset of the previous fitness landscape (Lamarckian learning). This paper explores the interaction between these two mechanisms. Initial results suggest that the reduction in landscape gradients brought about by the Baldwin effect can reduce the effectiveness of self-adaptation. In contrast, Lamarckian learning appears to enhance the process of self-adaptation, with very different behaviours seen on different problems.
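The distinction between the two learning paradigms can be made concrete. The following is a minimal sketch, not the paper's actual experimental setup: it assumes a OneMax fitness function, a single-step greedy hill-climber as the local search, and a log-normal perturbation of an individually encoded mutation rate as the self-adaptation scheme (all hypothetical choices for illustration). Under Baldwinian learning the improved phenotype determines fitness but the unimproved genotype is inherited; under Lamarckian learning the improvement is written back into the genotype.

```python
import math
import random

def onemax(bits):
    """Fitness: number of 1-bits."""
    return sum(bits)

def local_search(bits):
    """Hypothetical minimal hill-climber: one greedy improving step
    (set the first 0-bit, which always improves OneMax)."""
    out = bits[:]
    for i, b in enumerate(out):
        if b == 0:
            out[i] = 1
            return out
    return out

def offspring(parent_bits, parent_rate, mode, tau=0.2):
    """Create one child under self-adaptation plus local search.

    Returns (inherited_genotype, inherited_rate, assigned_fitness).
    """
    # Self-adaptation: perturb the inherited mutation rate log-normally,
    # then use the perturbed rate to mutate the genotype.
    rate = min(0.5, max(1e-4, parent_rate * math.exp(tau * random.gauss(0, 1))))
    bits = [b ^ (random.random() < rate) for b in parent_bits]
    improved = local_search(bits)
    if mode == "lamarckian":
        # Write the improved solution back: genotype and fitness agree,
        # so selection acts on a reduced subset of the original landscape.
        return improved, rate, onemax(improved)
    # Baldwinian: fitness comes from the locally improved phenotype,
    # but the unimproved genotype is what gets inherited, flattening
    # the gradients that self-adaptation relies on.
    return bits, rate, onemax(improved)
```

In the Lamarckian branch the returned fitness is always consistent with the returned genotype, whereas in the Baldwinian branch genotypes within the same basin of attraction receive the same (improved) fitness, which is the gradient-flattening effect the abstract refers to.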
© 2014 Springer International Publishing Switzerland
Smith, J. (2014). The Baldwin Effect Hinders Self-Adaptation. In: Bartz-Beielstein, T., Branke, J., Filipič, B., Smith, J. (eds) Parallel Problem Solving from Nature – PPSN XIII. PPSN 2014. Lecture Notes in Computer Science, vol 8672. Springer, Cham. https://doi.org/10.1007/978-3-319-10762-2_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-10761-5
Online ISBN: 978-3-319-10762-2