
Equivalent Number of Degrees of Freedom for Neural Networks

Conference paper
In: Advances in Data Analysis

Abstract

The notion of an equivalent number of degrees of freedom (e.d.f.) for neural network modeling from small datasets was introduced in Ingrassia and Morlini (2005). It is much smaller than the total number of parameters and does not depend on the number of input variables. We generalize our previous results and discuss the use of the e.d.f. in the general framework of multivariate nonparametric model selection. Through numerical simulations, we also investigate the behavior of model selection criteria such as AIC, GCV, and BIC/SBC when the e.d.f. is used in place of the total number of adaptive parameters in the model.
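
To make the idea concrete, the following sketch compares the criteria computed with the total number of adaptive parameters against the same criteria computed with an e.d.f.-style count. This is a minimal illustration, not the authors' code: the Gaussian-error forms of AIC, BIC/SBC, and GCV are standard textbook versions, and the e.d.f. value used here is a hypothetical placeholder rather than the estimator derived in the paper.

```python
import numpy as np

def selection_criteria(rss, n, k):
    """Gaussian-error forms of AIC, BIC/SBC and GCV for a model with
    residual sum of squares `rss`, sample size `n` and complexity `k`
    (either the total number of adaptive parameters or the e.d.f.)."""
    aic = n * np.log(rss / n) + 2.0 * k
    bic = n * np.log(rss / n) + k * np.log(n)   # BIC, also known as SBC
    gcv = (rss / n) / (1.0 - k / n) ** 2
    return {"AIC": aic, "BIC/SBC": bic, "GCV": gcv}

# Illustrative numbers: a one-hidden-layer network with p inputs and
# h hidden units has (p + 2) * h + 1 weights and biases.
n, p, h = 100, 10, 5
total_params = (p + 2) * h + 1   # 61 adaptive parameters
edf = 2 * h + 1                  # hypothetical e.d.f. value, << total_params
rss = 12.3                       # residual sum of squares of the fitted network

print(selection_criteria(rss, n, total_params))  # heavy complexity penalty
print(selection_criteria(rss, n, edf))           # much milder penalty
```

With the full parameter count, the complexity penalties dominate and the criteria reject networks that fit small datasets well; with the smaller e.d.f., the penalties shrink accordingly.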


References

  • BARTLETT, P.L. (1998): The Sample Complexity of Pattern Classification With Neural Networks: The Size of the Weights Is More Important Than the Size of the Network. IEEE Transactions on Information Theory, 44, 525–536.

  • HWANG, J.T.G. and DING, A.A. (1997): Prediction Intervals for Artificial Neural Networks. Journal of the American Statistical Association, 92(438), 748–757.

  • INGRASSIA, S. (1999): Geometrical Aspects of Discrimination by Multilayer Perceptrons. Journal of Multivariate Analysis, 68, 226–234.

  • INGRASSIA, S. and MORLINI, I. (2005): Neural Network Modeling for Small Datasets. Technometrics, 47, 297–311.

  • KADANE, J.B. and LAZAR, N.A. (2004): Methods and Criteria for Model Selection. Journal of the American Statistical Association, 99, 279–290.

  • KATZ, R.W. (1981): On Some Criteria for Estimating the Order of a Markov Chain. Technometrics, 23, 243–249.

  • KOEHLER, A.B. and MURPHREE, E.S. (1988): A Comparison of the Akaike and Schwarz Criteria for Selecting Model Order. Applied Statistics, 37, 187–195.

  • RAFTERY, A.E. (1995): Bayesian Model Selection in Social Research. Sociological Methodology, 25, 111–163.


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ingrassia, S., Morlini, I. (2007). Equivalent Number of Degrees of Freedom for Neural Networks. In: Decker, R., Lenz, H.J. (eds) Advances in Data Analysis. Studies in Classification, Data Analysis, and Knowledge Organization. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-70981-7_26
