Abstract
Using genetic algorithms for solving dynamic optimization problems is an important area of current research. In this work, we investigate the effects of speciation in NeuroEvolution of Augmenting Topologies (NEAT), a well-known method for evolving neural network topologies, on problems with a dynamic fitness function. NEAT uses speciation to maintain diversity in the population and to protect new solutions against competition. We show that NEAT outperforms a non-speciated genetic algorithm (GA) not only on problems with a static fitness function, but also on problems with a gradually moving optimum. We also demonstrate that NEAT fails to achieve better performance on problems where the optimum moves rapidly. We propose a novel method called DynNEAT, which extends NEAT by changing the size of each species based on its historical performance. We demonstrate that DynNEAT outperforms both NEAT and the non-speciated GA on problems with a rapidly moving optimum, while achieving performance similar to NEAT on problems with a static or slowly moving optimum.
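The core idea summarized above, resizing each species according to its historical performance, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual formula: the function name, the five-generation window, and the improvement-based weighting are all assumptions introduced here for illustration.

```python
# Hypothetical sketch of DynNEAT-style offspring allocation.
# Standard NEAT assigns each species offspring slots proportional to
# its adjusted (shared) fitness; the idea described in the abstract
# instead weights species by historical performance. The exact
# weighting below is an illustrative assumption.

def allocate_offspring(species_histories, population_size):
    """species_histories: dict mapping species id -> list of that
    species' best fitness in each past generation."""
    scores = {}
    for sid, history in species_histories.items():
        recent = history[-5:]                       # recent window (assumed)
        improvement = recent[-1] - recent[0]        # net recent progress
        scores[sid] = max(improvement, 0.0) + 1e-6  # keep every species alive
    total = sum(scores.values())
    # Allocate the next generation's slots proportionally to the scores.
    return {sid: round(population_size * s / total)
            for sid, s in scores.items()}

# Example: a recently improving species receives more offspring slots
# than a stagnant one, so search effort follows a moving optimum.
alloc = allocate_offspring(
    {1: [0.2, 0.2, 0.2, 0.2, 0.2],   # stagnant species
     2: [0.1, 0.3, 0.5, 0.7, 0.9]},  # improving species
    population_size=100)
```

Under a rapidly moving optimum, such reallocation lets species that track the new optimum grow quickly, which is the behavior the abstract credits for DynNEAT's advantage over plain NEAT.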
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Krčah, P. (2012). Effects of Speciation on Evolution of Neural Networks in Highly Dynamic Environments. In: Hamadi, Y., Schoenauer, M. (eds) Learning and Intelligent Optimization. LION 2012. Lecture Notes in Computer Science, vol 7219. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34413-8_39
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-34412-1
Online ISBN: 978-3-642-34413-8