
A General Framework for Encoding and Evolving Neural Networks

  • Conference paper
KI 2007: Advances in Artificial Intelligence (KI 2007)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4667)

Abstract

In this paper we present a novel general framework for encoding and evolving networks called Common Genetic Encoding (CGE) that can be applied to both direct and indirect encoding methods. The encoding has important properties that make it suitable for evolving neural networks: (1) It is complete, in that it is able to represent all types of valid phenotype networks. (2) It is closed, i.e., every valid genotype represents a valid phenotype. Similarly, the encoding is closed under genetic operators, such as structural mutation and crossover, that act upon the genotype. Moreover, the encoding's genotype can be seen as a composition of several subgenomes, which allows it to inherently support the evolution of modular networks in both direct and indirect encoding cases. To demonstrate our encoding, we present an experiment in which direct encoding is used to learn the dynamic model of a two-link robot arm. We also provide an illustration of how the indirect-encoding features of CGE can be used in the area of artificial embryogeny.
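The closure property described above can be illustrated with a small toy model. The sketch below is a hypothetical illustration, not the paper's actual data structures: a genome is modeled as a composition of subgenomes, each a list of connection genes over a shared set of node ids, and the structural mutation operator is written so that it only ever produces genes referencing existing nodes, keeping every offspring genotype valid.

```python
import random

# Toy model (names like make_subgenome/is_valid are illustrative only).
# A genome is a composition of subgenomes; each subgenome is a list of
# connection genes (source, target, weight) over a shared node set.

def make_subgenome(nodes, n_conns, rng):
    """Build a random subgenome whose genes reference only existing nodes."""
    return [(rng.choice(nodes), rng.choice(nodes), rng.uniform(-1.0, 1.0))
            for _ in range(n_conns)]

def is_valid(genome, nodes):
    """Closure check: every gene in every subgenome maps to a phenotype edge."""
    node_set = set(nodes)
    return all(src in node_set and dst in node_set
               for sub in genome for (src, dst, _) in sub)

def mutate(genome, nodes, rng):
    """Structural mutation that stays inside the valid genotype space:
    it only adds a connection between nodes that already exist."""
    child = [list(sub) for sub in genome]
    target = rng.choice(child)
    target.append((rng.choice(nodes), rng.choice(nodes),
                   rng.uniform(-1.0, 1.0)))
    return child

rng = random.Random(0)
nodes = [0, 1, 2, 3]
genome = [make_subgenome(nodes, 3, rng) for _ in range(2)]  # two subgenomes
child = mutate(genome, nodes, rng)
print(is_valid(genome, nodes), is_valid(child, nodes))
```

Because the mutation operator is constructed to emit only well-formed genes, validity is preserved by construction rather than checked after the fact, which is the essence of an encoding being closed under its genetic operators.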





Editor information

Joachim Hertzberg, Michael Beetz, Roman Englert


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kassahun, Y., Metzen, J.H., de Gea, J., Edgington, M., Kirchner, F. (2007). A General Framework for Encoding and Evolving Neural Networks. In: Hertzberg, J., Beetz, M., Englert, R. (eds) KI 2007: Advances in Artificial Intelligence. KI 2007. Lecture Notes in Computer Science, vol 4667. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74565-5_17


  • DOI: https://doi.org/10.1007/978-3-540-74565-5_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74564-8

  • Online ISBN: 978-3-540-74565-5

