A Comparative Study of Neural Network Optimization Techniques

  • Conference paper
Artificial Neural Nets and Genetic Algorithms

Abstract

In recent years we developed ENZO, an evolutionary neural network optimizer, which we compare in this study to standard techniques for topology optimization: optimal brain surgeon (OBS), magnitude-based pruning (MbP), and unit-OBS, an improved algorithm derived from OBS. The algorithms are evaluated on several benchmark problems. We conclude that using an evolutionary algorithm as a meta-heuristic, as ENZO does, is currently the best available optimization technique with regard to network size and performance. We show that the time complexity of ENZO is similar to that of magnitude-based pruning and unit-OBS, while it achieves significantly smaller topologies. Standard OBS is outperformed in both size reduction and time complexity.
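The simplest of the compared techniques, magnitude-based pruning, deletes the connections with the smallest absolute weights on the premise that they contribute least to the network's output. A minimal sketch of that idea is shown below; this is not the paper's implementation, and the function name and the one-shot pruning fraction are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(weights, fraction=0.2):
    """Zero out roughly the given fraction of weights with the
    smallest magnitude (ties at the threshold are also pruned)."""
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.5, -0.1],
              [0.05, 2.0]])
print(magnitude_prune(w, fraction=0.5))
```

In practice such pruning is applied iteratively with retraining between steps; OBS and unit-OBS instead use second-order (Hessian) information to decide which weights or whole units to remove.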


References

  1. C.M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, 1995.

  2. H. Braun and T. Ragg. ENZO — Evolution of Neural Networks, User Manual and Implementation Guide, http://illwww.ira.uka.de. Technical Report 21/96, Universität Karlsruhe, 1996.

  3. Y. Le Cun, J.S. Denker, and S.A. Solla. Optimal Brain Damage. In NIPS 2, 1990.

  4. B. Hassibi and D.G. Stork. Second order derivatives for network pruning: Optimal Brain Surgeon. In NIPS 4, 1992.

  5. T. Ragg. Parallelization of an Evolutionary Neural Network Optimizer Based on PVM. In Parallel Virtual Machine — EuroPVM’96, Lecture Notes in Computer Science 1156, 1996.

  6. M. Riedmiller and H. Braun. A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm. In Proceedings of the ICNN, 1993.

  7. J. Schäfer and H. Braun. Optimizing classifiers for handwritten digits by genetic algorithms. In Artificial Neural Networks and Genetic Algorithms, D.W. Pearson, N.C. Steele, R.F. Albrecht (editors), pages 10–13, Wien New York, 1995. Springer-Verlag.

  8. W. Schiffmann, M. Joost, and R. Werner. Application of genetic algorithms to the construction of topologies for multilayer perceptrons. In Artificial Neural Networks and Genetic Algorithms, D.W. Pearson, N.C. Steele, R.F. Albrecht (editors), pages 675–682, Wien New York, 1993. Springer-Verlag.

  9. A. Stahlberger and M. Riedmiller. Fast network pruning and feature extraction by removing complete units. In NIPS 9. MIT Press, 1997.


Copyright information

© 1998 Springer-Verlag Wien

About this paper

Cite this paper

Ragg, T., Braun, H., Landsberg, H. (1998). A Comparative Study of Neural Network Optimization Techniques. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6492-1_75

  • DOI: https://doi.org/10.1007/978-3-7091-6492-1_75

  • Publisher Name: Springer, Vienna

  • Print ISBN: 978-3-211-83087-1

  • Online ISBN: 978-3-7091-6492-1
