
Optimal Genetic Representation of Complete Strictly-Layered Feedforward Neural Networks

  • Spyros Raptis
  • Spyros Tzafestas
  • Hermione Karagianni
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2085)

Abstract

The automatic evolution of neural networks is both an attractive and a rewarding task. The connectivity matrix is the most common way of directly encoding a neural network for the purpose of genetic optimization. However, this representation has several disadvantages, mostly stemming from its inherent redundancy and its lack of robustness. We propose a novel representation scheme for encoding complete strictly-layered feedforward neural networks and prove that it is optimal in the sense that it uses the minimum possible number of bits. We argue that this scheme has a number of important advantages over the direct encoding of the connectivity matrix: it does not suffer from the curse of dimensionality, it allows only legal networks to be represented, which relieves the genetic algorithm of a number of checks and rejections, and the mapping from genotypes to phenotypes is one-to-one. Additionally, the resulting networks have a simpler structure, which makes the learning phase easier.
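
The encoding itself is not reproduced on this page; the sketch below only illustrates the size argument made in the abstract, under the assumption that a complete strictly-layered feedforward network is fully determined by the number of neurons in each layer. It compares the bit count of a direct connectivity-matrix encoding with that of a genotype storing only the layer sizes; the function names, field widths, and layer sizes are hypothetical, not the authors' scheme.

```python
import math

# Illustrative sketch only -- not the encoding defined in the paper.
# Assumption: a complete strictly-layered feedforward network is fully
# determined by the neuron count of each layer, so a genotype need only
# encode those counts, whereas a direct encoding stores one bit per entry
# of the full connectivity matrix.

def connectivity_matrix_bits(layer_sizes):
    """Bits used by a direct encoding: one bit per entry of the
    n x n connectivity matrix over all n neurons."""
    n = sum(layer_sizes)
    return n * n

def layer_size_encoding_bits(layer_sizes, max_neurons_per_layer=16):
    """Bits used to encode only the layer sizes, with a fixed-width field
    per layer (field width and layer cap are illustrative assumptions)."""
    bits_per_layer = math.ceil(math.log2(max_neurons_per_layer + 1))
    return len(layer_sizes) * bits_per_layer

if __name__ == "__main__":
    sizes = [8, 12, 6, 3]  # hypothetical layer sizes: input, two hidden, output
    print("connectivity-matrix encoding:", connectivity_matrix_bits(sizes), "bits")
    print("layer-size encoding:", layer_size_encoding_bits(sizes), "bits")
```

For these hypothetical sizes the matrix encoding needs 29 x 29 = 841 bits while the layer-size genotype needs 20, and every layer-size genotype corresponds to exactly one legal network, which is the kind of one-to-one, non-redundant mapping the abstract argues for.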

Keywords

Neural Network · Genetic Algorithm · Artificial Intelligence · Representation Scheme · Problem Complexity



Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Spyros Raptis (1)
  • Spyros Tzafestas (1)
  • Hermione Karagianni (1)

  1. Intelligent Robotics and Automation Laboratory, Division of Signals, Control and Robotics, Department of Electrical and Electronic Engineering, National Technical University of Athens, Zographou Campus, Athens, Greece
