What Size Needs Testing?

  • Conference paper
Neural Nets WIRN VIETRI-97

Part of the book series: Perspectives in Neural Computing (PERSPECT.NEURAL)

Abstract

Despite the general wisdom that many more examples are needed to test a neural network than to train it, we show the opposite: testing the approximation capability of a neural network generally demands a smaller sample size than training it.

We work within an extended PAC learning framework and use some recent results on the sentry functions of a concept class to prove our claims statistically.
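
The sentry-function argument itself is not reproduced on this page, but the flavor of the claim can be illustrated with two classical results from the learning-theory literature: the distribution-free training-sample bound of Blumer, Ehrenfeucht, Haussler and Warmuth, which grows with the VC dimension of the concept class, and Hoeffding's inequality, which bounds the sample needed to estimate the error of a single fixed hypothesis independently of any capacity measure. The sketch below is an illustration under those classical bounds, not the paper's own derivation, and the parameter values are arbitrary.

    import math

    def pac_training_bound(vc_dim: int, eps: float, delta: float) -> float:
        """Sufficient training-sample size for PAC learning a concept class
        of VC dimension vc_dim to accuracy eps with confidence 1 - delta
        (Blumer et al., 1989). Grows linearly with the VC dimension."""
        return max(
            (4.0 / eps) * math.log2(2.0 / delta),
            (8.0 * vc_dim / eps) * math.log2(13.0 / eps),
        )

    def hoeffding_test_bound(eps: float, delta: float) -> float:
        """Sufficient test-sample size to estimate the error of one fixed,
        already-trained hypothesis to within eps with confidence 1 - delta,
        by Hoeffding's inequality. Independent of the VC dimension."""
        return math.log(2.0 / delta) / (2.0 * eps ** 2)

    # Hypothetical parameter values, chosen only for illustration.
    eps, delta, vc_dim = 0.1, 0.05, 120
    print(f"training examples (Blumer et al.): {pac_training_bound(vc_dim, eps, delta):,.0f}")
    print(f"testing examples (Hoeffding): {hoeffding_test_bound(eps, delta):,.0f}")

With these values the training bound is on the order of tens of thousands of examples while the test bound is a few hundred: for a fixed accuracy, the training requirement scales with the capacity of the network, whereas the testing requirement is capacity-free, which is the sense in which testing can "need less" than training.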

Copyright information

© 1998 Springer-Verlag London Limited

About this paper

Cite this paper

Apolloni, B. (1998). What Size Needs Testing? In: Marinaro, M., Tagliaferri, R. (eds) Neural Nets WIRN VIETRI-97. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-1520-5_5

  • DOI: https://doi.org/10.1007/978-1-4471-1520-5_5

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-1522-9

  • Online ISBN: 978-1-4471-1520-5

  • eBook Packages: Springer Book Archive
