Abstract
In developing algorithms that dynamically change the structure and weights of artificial neural networks (ANNs), a proper balance must be struck between network complexity and generalization capability. SEPA addresses this trade-off with an encoding scheme in which network weights and connections are encoded in matrices of real numbers. Network parameters are locally encoded and locally adapted, and fitness evaluation consists mainly of fast feed-forward operations. Experimental results on several well-known classification problems demonstrate SEPA's consistently high classification performance, fast convergence, and good structural optimality.
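The abstract's idea of matrix-based encoding with feed-forward fitness evaluation can be sketched roughly as follows. This is a hypothetical illustration, not the authors' implementation: the single-hidden-layer topology, the 0/1 connection mask, and the connection penalty weight are all assumptions made for the example.

```python
# Hypothetical sketch of a SEPA-style individual (not the paper's code):
# weights and connectivity are held in real-valued matrices, structure is
# evolved via a connection mask, and fitness is one fast feed-forward pass.
import numpy as np

rng = np.random.default_rng(0)

def init_individual(n_in, n_hidden, n_out):
    # Each layer carries a real weight matrix plus a 0/1 connection mask;
    # mutating the mask changes structure, mutating weights adapts parameters.
    return {
        "w1": rng.normal(size=(n_in, n_hidden)),
        "m1": (rng.random((n_in, n_hidden)) < 0.8).astype(float),
        "w2": rng.normal(size=(n_hidden, n_out)),
        "m2": (rng.random((n_hidden, n_out)) < 0.8).astype(float),
    }

def forward(ind, X):
    # Masked weights: absent connections contribute nothing.
    h = np.tanh(X @ (ind["w1"] * ind["m1"]))
    return np.tanh(h @ (ind["w2"] * ind["m2"]))

def fitness(ind, X, y):
    # Fitness cost is dominated by the feed-forward pass: mean squared
    # error plus a small (assumed) penalty on active connections to
    # discourage overly complex networks.
    mse = np.mean((forward(ind, X) - y) ** 2)
    n_active = ind["m1"].sum() + ind["m2"].sum()
    return mse + 1e-3 * n_active

ind = init_individual(4, 6, 1)
X = rng.normal(size=(10, 4))
y = rng.normal(size=(10, 1))
print(fitness(ind, X, y))
```

An evolutionary loop would then select individuals by this fitness and perturb both the weight matrices and the masks, so structure and parameters evolve together.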
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Palmes, P.P., Hayasaka, T., Usui, S. (2003). SEPA: Structure Evolution and Parameter Adaptation in Feed-Forward Neural Networks. In: Cantú-Paz, E., et al. Genetic and Evolutionary Computation — GECCO 2003. GECCO 2003. Lecture Notes in Computer Science, vol 2724. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45110-2_44
DOI: https://doi.org/10.1007/3-540-45110-2_44
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40603-7
Online ISBN: 978-3-540-45110-5
eBook Packages: Springer Book Archive