A Comparative Study of Neural Network Optimization Techniques

  • T. Ragg
  • H. Braun
  • H. Landsberg


In recent years we developed ENZO, an evolutionary neural network optimizer, which we compare in this study to standard techniques for topology optimization: optimal brain surgeon (OBS), magnitude-based pruning (MbP), and unit-OBS, an improved algorithm derived from OBS. The algorithms are evaluated on several benchmark problems. We conclude that using an evolutionary algorithm as a meta-heuristic, as ENZO does, is currently the best available optimization technique with regard to network size and performance. We show that the time complexity of ENZO is similar to that of magnitude-based pruning and unit-OBS, while ENZO achieves significantly smaller topologies. Standard OBS is outperformed in both size reduction and time complexity.
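Of the techniques compared, magnitude-based pruning is the simplest baseline: weights with the smallest absolute value are assumed to contribute least and are removed first. A minimal sketch of this idea is shown below; the function name, the NumPy weight-matrix representation, and the fixed pruning fraction are illustrative assumptions, not details from the paper.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, fraction: float) -> np.ndarray:
    """Zero out the given fraction of weights with the smallest magnitude.

    Illustrative sketch of magnitude-based pruning (MbP): in practice the
    network is retrained after each pruning step and the process repeats.
    """
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)
    if k == 0:
        return weights.copy()
    # The k-th smallest absolute value serves as the removal threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned
```

OBS and unit-OBS refine this criterion by using second-order (inverse-Hessian) information to estimate the true increase in error caused by removing a weight or a whole unit, at correspondingly higher computational cost.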


Keywords: Time Complexity · Network Size · Classification Error · Search Point · Learning Error





Copyright information

© Springer-Verlag Wien 1998

Authors and Affiliations

  • T. Ragg
  • H. Braun
  • H. Landsberg

Institute of Logic, Complexity and Deduction Systems, University of Karlsruhe, Karlsruhe, Germany
