Genetic Search for Optimal Representations in Neural Networks

  • Paul W. Munro

Abstract

An approach to learning in feed-forward neural networks is put forward that combines gradual synaptic modification at the output layer with genetic adaptation in the lower layer(s). In this “GA-delta” technique, the alleles are linear threshold units (a set of weights and a threshold); a chromosome is a collection of such units, and hence defines a mapping from the input layer to a hidden layer. The fitness is evaluated by measuring the error after a small number of delta rule iterations on the hidden-output weights. Genetic operators are defined on these chromosomes to facilitate search for a mapping that renders the task solvable by a single layer of weights. The performance of GA-delta is presented on several tasks, and the effects of the various operators are analyzed.
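To make the scheme concrete, the sketch below implements a minimal GA-delta loop in Python: each chromosome is a matrix of linear threshold units (one row of weights plus a threshold per hidden unit), fitness is the error remaining after a few delta-rule steps on the hidden-output weights, and crossover exchanges whole alleles between parents. Everything specific here is an assumption for illustration only — the XOR task, the population size, truncation selection, Gaussian mutation, and all learning parameters; the paper's actual operators and settings are defined in the body of the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic task a single layer of weights cannot solve, so the
# GA must find a hidden representation that linearizes it (assumed task).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN, POP_SIZE, GENERATIONS = 4, 20, 50   # assumed, not from the paper
DELTA_STEPS, LR = 30, 0.5                     # assumed, not from the paper

def hidden_map(chrom, X):
    """Apply the chromosome's linear threshold units: each allele (row)
    holds a weight vector plus a threshold, yielding a binary hidden code."""
    W, theta = chrom[:, :-1], chrom[:, -1]
    return (X @ W.T - theta > 0).astype(float)

def fitness(chrom):
    """Residual error after a few delta-rule steps on the hidden-output
    weights (lower is fitter), per the fitness measure in the abstract."""
    H = hidden_map(chrom, X)
    w, b = np.zeros(H.shape[1]), 0.0
    for _ in range(DELTA_STEPS):
        err = y - (H @ w + b)           # linear output unit
        w += LR * H.T @ err / len(X)    # delta rule (batch gradient step)
        b += LR * err.mean()
    return np.mean((y - (H @ w + b)) ** 2)

def crossover(a, b):
    """Exchange whole alleles (hidden units) between two parents."""
    mask = rng.random(a.shape[0]) < 0.5
    child = a.copy()
    child[mask] = b[mask]
    return child

def mutate(chrom, sigma=0.3):
    """Gaussian perturbation of weights and thresholds (assumed operator)."""
    return chrom + sigma * rng.standard_normal(chrom.shape)

# Evolve a population of input-to-hidden mappings.
pop = [rng.standard_normal((N_HIDDEN, X.shape[1] + 1)) for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    pop.sort(key=fitness)
    if fitness(pop[0]) < 1e-3:          # representation is linearly solvable
        break
    parents = pop[:POP_SIZE // 2]       # truncation selection (an assumption)
    children = []
    while len(parents) + len(children) < POP_SIZE:
        i, j = rng.choice(len(parents), size=2, replace=False)
        children.append(mutate(crossover(parents[i], parents[j])))
    pop = parents + children

print(f"stopped at generation {gen}, best error {fitness(pop[0]):.4f}")
```

Because fitness is measured after only a small number of delta-rule iterations, selection favors hidden codes that a single layer of weights can learn quickly, which is precisely the search objective described above.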

Keywords

Genetic Algorithm, Response Property, Hidden Unit, Output Unit, Input Unit

Copyright information

© Springer-Verlag/Wien 1993

Authors and Affiliations

  • Paul W. Munro
    1. Department of Information Science, University of Pittsburgh, Pittsburgh, USA
