Toward Code Evolution by Artificial Economies

  • Eric B. Baum
  • Igor Durdanovic
Part of the Natural Computing Series (NCS)

Abstract

We have begun exploring code evolution by artificial economies. We implemented a reinforcement learning machine called Hayek2 consisting of agents, written in a machine language inspired by Ray’s Tierra, that interact economically. The economic structure of Hayek2 addresses credit assignment at both the agent and meta levels. Hayek2 succeeds in evolving code to solve Blocks World problems, and has been more effective at this than our hillclimbing program and our genetic program (GP). Our hillclimber and our GP also performed well, learning algorithms as strong as a simple search program that incorporates hand-coded domain knowledge. We made efforts to optimize our hillclimbing program and it has features that may be of independent interest. Our GP using crossover performed far better than a version utilizing other macro-mutations or our hillclimber, bearing on a controversy in the genetic programming literature.
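
As a rough, hypothetical illustration of the auction-style credit assignment the abstract describes (and not the authors' Hayek2 implementation, whose agents are programs in a Tierra-inspired machine language), the Python sketch below shows one way such an economy can be wired: each round the highest-bidding agent buys the right to act from the previous owner, so reward earned at the end of a chain of actions flows backward through the payments (agent-level credit assignment), while bankrupt agents are removed and solvent ones spawn mutated offspring (meta-level selection). The Agent class, the numeric "skill" stand-in for an evolved program, the population size, and the bidding rule are all invented for this sketch.

    import random

    class Agent:
        """Toy stand-in for an evolved program that bids for the right to act."""
        def __init__(self, bid, skill):
            self.bid = bid        # price offered for the right to act
            self.skill = skill    # placeholder for evolved code: P(solving the task)
            self.wealth = 1.0     # initial capital

        def act(self):
            # Placeholder for running the agent's evolved program on the task.
            return random.random() < self.skill

    def mutate(parent):
        # Meta level: create a perturbed copy of a solvent agent.
        return Agent(bid=max(0.01, parent.bid + random.gauss(0, 0.05)),
                     skill=min(1.0, max(0.0, parent.skill + random.gauss(0, 0.1))))

    def run_economy(steps=2000, reward=1.0, pop_size=20):
        agents = [Agent(random.uniform(0.01, 0.5), random.random())
                  for _ in range(pop_size)]
        previous_winner = None
        for _ in range(steps):
            # Auction: the highest bidder buys control of the (toy) world ...
            winner = max(agents, key=lambda a: a.bid)
            if previous_winner is not None:
                winner.wealth -= winner.bid           # ... paying the prior owner,
                previous_winner.wealth += winner.bid  # which passes credit backward.
            if winner.act():
                winner.wealth += reward   # the world pays for solving the task
                previous_winner = None    # episode ends, a new chain starts
            else:
                previous_winner = winner  # acted but did not finish; next winner pays it
            # Meta level: bankrupt agents die; the richest agent spawns offspring.
            agents = [a for a in agents if a.wealth > 0.0]
            if not agents:
                break
            if len(agents) < pop_size:
                agents.append(mutate(max(agents, key=lambda a: a.wealth)))
        return agents

    if __name__ == "__main__":
        survivors = run_economy()
        if survivors:
            best = max(a.skill for a in survivors)
            print(f"{len(survivors)} solvent agents; best stand-in skill {best:.2f}")
        else:
            print("economy collapsed")

In this toy version the only "program" an agent has is a success probability, so selection merely favors high-skill, low-bid agents; the point of the sketch is the payment structure, which is what lets useful intermediate computation earn enough to survive while unprofitable agents go bankrupt.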

Keywords

Genetic Program · Code Evolution · Expression Tree · Random Line · Credit Assignment

References

  1. P. Angeline. Subtree Crossover: Building Block Engine or Macromutation. In Koza et al., editors, Genetic Programming 1997: Proceedings of the Second Annual Conference, pages 9–17. Morgan Kaufmann, San Francisco, CA, 1997.
  2. F. Bacchus and F. Kabanza. Using Temporal Logic to Control Search in Planning. In European Workshop on Planning, 1995. Unpublished document available from http://logos.uwaterloo.ca/tlplan/tlplan.html.
  3. W. Banzhaf, P. Nordin, R. E. Keller, and F. D. Francone. Genetic Programming: An Introduction. Morgan Kaufmann, San Francisco, CA, 1998.
  4. E. B. Baum. Toward a Model of Mind as a Laissez-faire Economy of Idiots (extended abstract). In L. Saitta, editor, Proceedings of the 13th International Conference on Machine Learning ’96, pages 28–36. Morgan Kaufmann, San Francisco, CA, 1996.
  5. E. B. Baum. Manifesto for an Evolutionary Economics of Intelligence. In C. M. Bishop, editor, Neural Networks and Machine Learning, NATO ASI Series F: Computer and Systems Sciences, Vol. 168, pages 285–344. Springer-Verlag, Berlin, 1998.
  6. E. B. Baum, D. Boneh, and C. Garrett. On Genetic Algorithms. In COLT ’95: Proceedings of the Eighth Annual Conference on Computational Learning Theory, pages 230–239. Association for Computing Machinery, New York, 1995.
  7. E. B. Baum and I. Durdanovic. Toward Code Evolution by Artificial Economies. Technical Report TR-98-065, NECI, 1998.
  8. E. B. Baum and I. Durdanovic. Evolution of Cooperative Problem Solving in an Artificial Economy. Neural Computation, to appear, 2000.
  9. A. Birk and W. J. Paul. Schemas and Genetic Programming. In 1994 Conference on Integration of Elementary Functions into Complex Behavior, Bielefeld, 1995.
  10. K. E. Drexler and M. S. Miller. Incentive Engineering for Computational Resource Management. In B. A. Huberman, editor, The Ecology of Computation, Studies in Computer Science and Artificial Intelligence 2, pages 231–266. North-Holland, New York, 1988.
  11. G. Hardin. The Tragedy of the Commons. Science, 162:1243–1248, 1968.
  12. J. H. Holland. Adaptation in Natural and Artificial Systems. MIT Press, Cambridge, MA, 1975.
  13. J. H. Holland. Escaping Brittleness: The Possibilities of General Purpose Learning Algorithms Applied to Parallel Rule-Based Systems. In R. S. Michalski, J. G. Carbonell, and T. M. Mitchell, editors, Machine Learning II, pages 593–623. Morgan Kaufmann, Los Altos, CA, 1986.
  14. J. R. Koza. Genetic Programming. MIT Press, Cambridge, MA, 1992.
  15. K. Lang. Hill Climbing Beats Genetic Search on a Boolean Circuit Synthesis Task of Koza’s. In Proceedings of the Twelfth International Conference on Machine Learning, pages 340–343, 1995.
  16. M. S. Miller and K. E. Drexler. Comparative Ecology: A Computational Perspective. In B. A. Huberman, editor, The Ecology of Computation, Studies in Computer Science and Artificial Intelligence 2, pages 51–76. North-Holland, New York, 1988.
  17. D. J. Montana. Strongly Typed Genetic Programming. Evolutionary Computation, 3(2):199–230, 1994.
  18. U. M. O’Reilly and F. Oppacher. Program Search with a Hierarchical Variable Representation: Genetic Programming, Simulated Annealing, and Hill Climbing. In H.-P. Schwefel and R. Männer, editors, Parallel Problem Solving from Nature (PPSN III), Lecture Notes in Computer Science, Vol. 866, pages 397–406. Springer-Verlag, Berlin, 1994.
  19. T. S. Ray. An Approach to the Synthesis of Life. In C. Langton and C. Taylor, editors, Artificial Life II, volume XI, pages 371–408. Addison-Wesley, Redwood City, CA, 1991.
  20. S. D. Whitehead and D. H. Ballard. Learning to Perceive and Act. Machine Learning, 7(1):45–83, 1991.
  21. T. Winograd. Understanding Natural Language. Academic Press, New York, 1972.

Copyright information

© Springer-Verlag Berlin Heidelberg 2002
