Bounding VC-dimension for neural networks: Progress and prospects

  • Conference paper
  • In: Computational Learning Theory (EuroCOLT 1995)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 904)

Abstract

Techniques from differential topology are used to give polynomial bounds for the VC-dimension of sigmoidal neural networks. The bounds are quadratic in \(w\), the dimension of the space of weights. Similar results are obtained for a wide class of Pfaffian activation functions. The obstruction (in differential topology) to improving the bound to an optimal bound \({\cal O}(w \log w)\) is discussed, and attention is paid to the role of the other parameters involved in the network architecture.
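
For orientation, the two bound shapes named in the abstract can be set side by side. This is only a restatement of the claims above; the constants and the dependence on the other architectural parameters are deliberately left implicit, since the abstract does not specify them:

\[
\mathrm{VCdim} \;=\; {\cal O}(w^{2}) \quad \text{(the polynomial bounds obtained here, quadratic in the weight count } w\text{)}
\]
\[
\mathrm{VCdim} \;=\; {\cal O}(w \log w) \quad \text{(the optimal bound; the topological obstruction to reaching it is discussed)}
\]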

Research partially supported by the DFG Grant KA 673/4-1, and by ESPRIT BR Grants 7097 and ECUS 030.

Research supported in part by a Senior Research Fellowship of the SERC.

Editor information

Paul Vitányi

Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

Cite this paper

Karpinski, M., Macintyre, A. (1995). Bounding VC-dimension for neural networks: Progress and prospects. In: Vitányi, P. (eds) Computational Learning Theory. EuroCOLT 1995. Lecture Notes in Computer Science, vol 904. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-59119-2_189

  • DOI: https://doi.org/10.1007/3-540-59119-2_189

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-59119-1

  • Online ISBN: 978-3-540-49195-8

