How a Generative Encoding Fares as Problem-Regularity Decreases

  • Conference paper
Parallel Problem Solving from Nature – PPSN X (PPSN 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5199)

Abstract

It has been shown that generative representations, which allow the reuse of code, perform well on problems with high regularity (i.e., where a phenotypic motif must be repeated many times). To date, however, generative representations have not been tested on irregular problems, and it is unknown how they fare on problems with intermediate and low amounts of regularity. This paper compares a generative representation to a direct representation on problems that range from having multiple types of regularity to being completely irregular. As the regularity of the problem decreases, the performance of the generative representation degrades to, and then falls below, that of the direct encoding. The degradation is not linear, but it tends to be consistent across different types of problem regularity. Furthermore, if the regularity of each type is sufficiently high, the generative encoding can exploit multiple types of regularity simultaneously.
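
The contrast the abstract draws, between a direct encoding that stores one gene per phenotypic element and a generative encoding that reuses a small amount of genetic information across the whole phenotype, can be made concrete with a toy example. The sketch below is only an illustration under assumed details and is not the experimental setup used in the paper: the target-vector construction, the two-gene linear generative encoding, and every function name (make_target, develop_direct, develop_generative, best_of) are inventions for this example. It shows one way problem regularity can be dialed from fully regular to fully irregular, and how the two encoding styles differ in the number of free parameters they need.

```python
# A minimal, illustrative sketch only; it is NOT the experimental setup from the
# paper. All names (make_target, develop_direct, develop_generative, best_of) and
# the toy linear generative encoding are assumptions made for this example.
import random

def make_target(length, regularity):
    """Build a target vector in which `regularity` (0..1) is the fraction of
    entries that repeat a single motif value; the rest are random."""
    n_regular = int(round(regularity * length))
    target = [0.5] * n_regular + [random.random() for _ in range(length - n_regular)]
    random.shuffle(target)
    return target

def fitness(phenotype, target):
    """Higher is better: negative sum of absolute errors against the target."""
    return -sum(abs(p - t) for p, t in zip(phenotype, target))

# Direct encoding: one gene per phenotypic value, no reuse.
def random_direct_genome(length):
    return [random.random() for _ in range(length)]

def develop_direct(genome):
    return list(genome)

# Toy generative encoding: two genes (slope, intercept) are reused to compute
# every phenotypic value as a function of its position.
def random_generative_genome():
    return (random.uniform(-1.0, 1.0), random.uniform(0.0, 1.0))

def develop_generative(genome, length):
    slope, intercept = genome
    return [min(1.0, max(0.0, intercept + slope * i / length)) for i in range(length)]

def best_of(budget, sample_genome, develop, target):
    """Tiny random search: best fitness among `budget` random genomes."""
    return max(fitness(develop(sample_genome()), target) for _ in range(budget))

if __name__ == "__main__":
    random.seed(0)
    length, budget = 20, 200
    for regularity in (1.0, 0.5, 0.0):
        target = make_target(length, regularity)
        f_dir = best_of(budget, lambda: random_direct_genome(length), develop_direct, target)
        f_gen = best_of(budget, random_generative_genome,
                        lambda g: develop_generative(g, length), target)
        print(f"regularity={regularity:.1f}  direct={f_dir:6.2f}  generative={f_gen:6.2f}")
```

With a fixed search budget, the reusable two-gene encoding tends to match a highly regular target easily but cannot represent an arbitrary irregular one, loosely mirroring the trend the abstract describes; the paper's actual problems and encodings are not reproduced here.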

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Clune, J., Ofria, C., Pennock, R.T. (2008). How a Generative Encoding Fares as Problem-Regularity Decreases. In: Rudolph, G., Jansen, T., Beume, N., Lucas, S., Poloni, C. (eds) Parallel Problem Solving from Nature – PPSN X. PPSN 2008. Lecture Notes in Computer Science, vol 5199. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87700-4_36

  • DOI: https://doi.org/10.1007/978-3-540-87700-4_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-87699-1

  • Online ISBN: 978-3-540-87700-4

  • eBook Packages: Computer Science, Computer Science (R0)
