Comparison of Pruning Algorithms in Neural Networks

  • Conference paper
  • In: Data Science, Classification, and Related Methods

Summary

To select a right-sized network, many pruning algorithms have been proposed, and one may ask which of them is best in terms of the generalization error of the resulting artificial neural network classifiers. In this paper, we compare the performance of four pruning algorithms in small training sample size situations. A comparative study with artificial and real data suggests that the weight-elimination method proposed by Weigend et al. performs best.
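
For context, the weight-elimination approach of Weigend et al. (1991) does not delete weights outright during training; it adds a penalty to the usual error function so that small weights decay toward zero and can then be pruned. A common formulation of the objective, written here as a sketch (the exact notation used in the paper may differ), is

E(\mathbf{w}) \;=\; \sum_{p}\bigl(t_{p}-o_{p}\bigr)^{2} \;+\; \lambda \sum_{i}\frac{w_{i}^{2}/w_{0}^{2}}{1+w_{i}^{2}/w_{0}^{2}},

where $t_{p}$ and $o_{p}$ are the target and network outputs for pattern $p$, $\lambda$ controls the strength of the penalty, and $w_{0}$ is a scale parameter: weights much smaller than $w_{0}$ incur a cost roughly proportional to $w_{i}^{2}$ and are driven toward zero, while weights much larger than $w_{0}$ contribute an almost constant cost and are left essentially untouched.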


References

  • Gabor, D. (1946): Theory of communication, J. Inst. Elect. Engr., 93, 429–459.

  • Hamamoto, Y. et al. (1996a): On the behavior of artificial neural network classifiers in high-dimensional spaces, IEEE Trans. Pattern Analysis and Machine Intelligence, 18, 5, 571–574.

  • Hamamoto, Y. (1996b): Recognition of handwritten numerals using Gabor features, In Proc. of 13th Int. Conf. Pattern Recognition, Vienna, in press.

  • Jain, A. K. and Chandrasekaran, B. (1982): Dimensionality and sample size considerations in pattern recognition practice, In Handbook of Statistics, Vol. 2, P. R. Krishnaiah and L. N. Kanal, Eds., North-Holland, 835–855.

  • Karnin, E. D. (1990): A simple procedure for pruning back-propagation trained neural networks, IEEE Trans. Neural Networks, 1, 2, 239–242.

  • Kruschke, J. K. (1988): Creating local and distributed bottlenecks in hidden layers of back-propagation networks, In Proc. 1988 Connectionist Models Summer School, 120–126.

  • Le Cun, Y. (1990): Optimal brain damage, In Advances in Neural Information Processing (2), Denver, 598–605.

  • Reed, R. (1993): Pruning algorithms – A survey, IEEE Trans. Neural Networks, 4, 5, 740–747.

  • Rumelhart, D. E. et al. (1986): Learning internal representations by error propagation, In D. E. Rumelhart and J. L. McClelland (Eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations, MIT Press.

  • Van Ness, J. (1980): On the dominance of non-parametric Bayes rule discriminant algorithms in high dimensions, Pattern Recognition, 12, 355–368.

  • Weigend, A. S. et al. (1991): Generalization by weight-elimination with application to forecasting, In Advances in Neural Information Processing (3), 875–882.


Copyright information

© 1998 Springer Japan

About this paper

Cite this paper

Hamamoto, Y., Hase, T., Nakai, S., Tomita, S. (1998). Comparison of Pruning Algorithms in Neural Networks. In: Hayashi, C., Yajima, K., Bock, HH., Ohsumi, N., Tanaka, Y., Baba, Y. (eds) Data Science, Classification, and Related Methods. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Tokyo. https://doi.org/10.1007/978-4-431-65950-1_36

  • DOI: https://doi.org/10.1007/978-4-431-65950-1_36

  • Publisher Name: Springer, Tokyo

  • Print ISBN: 978-4-431-70208-5

  • Online ISBN: 978-4-431-65950-1
