
Learnability Models and Vapnik-Chervonenkis Combinatorics


Part of the book series: Progress in Probability ((PRPR,volume 30))

Abstract

This paper surveys several models of learnability proposed and investigated by computational learning theorists during the past few years. Computational learning theory is the study of learning as seen from a computational complexity point of view. In addition to the usual space and time complexity, computational learning theory studies the sample complexity, the number of examples seen by the learner. (In a statistical setting, this is known as the sample size.) This paper will cover those models of learnability where ideas from Vapnik-Chervonenkis combinatorics have had the greatest impact. There are a few short proofs to give a flavor of some of the ideas involved, but most of the proofs are too long to be included here. The focus is on giving an idea of the variety of models and the relationships between them. For more complete surveys of computational learning theory see (1988), (1990), (1991), (1992), or the proceedings of the annual Workshop on Computational Learning Theory published by Morgan Kaufmann. Some attempt has been made to keep the notation consistent within this paper, which means that it will be inconsistent with a large subset of the references.
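The abstract's central combinatorial notion is Vapnik-Chervonenkis shattering: a class of {0,1}-valued hypotheses shatters a set of points if it realizes every possible labeling of them, and the VC dimension is the size of the largest shattered set. As an illustrative sketch (not taken from the paper), the following Python snippet brute-forces this definition for a hypothetical finite stand-in for the class of real intervals, whose VC dimension is 2: any two points can be labeled arbitrarily, but no interval can produce the 1-0-1 labeling on three points.

```python
def labelings(points, hyps):
    """Set of distinct label vectors the hypothesis class induces on the points."""
    return {tuple(h(x) for x in points) for h in hyps}

def is_shattered(points, hyps):
    """True iff every {0,1}-labeling of the points is realized by some hypothesis."""
    return len(labelings(points, hyps)) == 2 ** len(points)

# Hypothetical finite approximation of the class of intervals [a, b],
# with endpoints restricted to a small grid (an assumption for illustration):
grid = [i / 10 for i in range(-10, 21)]
intervals = [lambda x, a=a, b=b: int(a <= x <= b)
             for a in grid for b in grid if a <= b]

print(is_shattered([0.3, 0.7], intervals))       # two points: all 4 labelings occur
print(is_shattered([0.2, 0.5, 0.8], intervals))  # three points: 1-0-1 is unrealizable
```

The brute-force check is exponential in the number of points, which is acceptable here only because the point sets and hypothesis class are tiny; the survey's interest is precisely in what the existence of a finite VC dimension implies for sample complexity without such enumeration.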


References

  • Alexander, K. (1987). Rates of growth and sample moduli for weighted empirical processes indexed by sets. Probability Theory and Related Fields 75, 379–423.

  • Angluin, D. (1987). Queries and concept learning. Machine Learning 2(4), 319–42.

  • Ben-David, S., A. Itai, and E. Kushilevitz. (1990). Learning by distances. Proc. 3rd Annual Workshop on Computational Learning Theory, Santa Cruz, CA. Morgan Kaufmann. 232–45.

  • Blum, M. (1990). Separating PAC and mistake-bound learning models over the Boolean domain. Proc. 31st Ann. Symp. Found. Comp. Sci.

  • Blumer, A., A. Ehrenfeucht, D. Haussler, and M.K. Warmuth. (1987). Occam’s razor. Inf. Proc. Let. 24, 377–80.

  • Blumer, A., A. Ehrenfeucht, D. Haussler, and M.K. Warmuth. (1989). Learnability and the Vapnik-Chervonenkis dimension. JACM 36(4), 929–65.

  • Blumer, A. and N. Littlestone. (1989). Learning faster than promised by the Vapnik-Chervonenkis dimension. Discrete Applied Mathematics 24.

  • Dudley, R.M. (1978). Central limit theorems for empirical measures. Ann. Prob. 6(6), 899–929.

  • Dudley, R.M. (1987). Universal Donsker classes and metric entropy. Ann. Prob. 15(4), 1306–26.

  • Durst, M. and R.M. Dudley. (1980). Empirical processes, Vapnik-Chervonenkis classes and Poisson processes. Probability and Mathematical Statistics 1(2), 109–15.

  • Ehrenfeucht, A., D. Haussler, M. Kearns, and L. Valiant. (1989). A general lower bound on the number of examples needed for learning. Information and Computation 82, 247–61.

  • Giné, E. and J. Zinn. (1984). Some limit theorems for empirical processes. Annals of Probability 12, 929–89.

  • Haussler, D. (1990). Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications. University of California at Santa Cruz Technical Report UCSC-CRL-91-02. Information and Computation, to appear.

  • Haussler, D., M. Kearns, and R. Schapire. (1991). Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension. Proc. 4th Annual Workshop on Computational Learning Theory, Santa Cruz, CA. Morgan Kaufmann.

  • Haussler, D., N. Littlestone, and M.K. Warmuth. (1990). Predicting {0,1}-Functions on Randomly Drawn Points. University of California at Santa Cruz Technical Report UCSC-CRL-90-54. Information and Computation, to appear.

  • Kearns, M. (1990). The Computational Complexity of Machine Learning. MIT Press, Cambridge, MA.

  • Laird, P.D. (1988). Learning From Good and Bad Data. Kluwer, Boston, MA.

  • Littlestone, N. (1989). Learning quickly when irrelevant attributes abound. Machine Learning 2(4), 285–318.

  • Maass, W. and G. Turán. (1992). Lower bound methods and separation results for on-line learning models. Machine Learning, to appear.

  • Massart, P. (1986). Rates of convergence in the central limit theorem for empirical processes. In Geometrical and Statistical Aspects of Probability in Banach Spaces, Lecture Notes in Mathematics 1193, Springer-Verlag. 73–109.

  • Natarajan, B.K. (1987). On learning Boolean functions. Proc. 19th ACM Symp. on Theory of Computing, New York. 296–304.

  • Natarajan, B.K. (1991). Machine Learning: A Theoretical Approach. Morgan Kaufmann, San Mateo, CA.

  • Pollard, D. (1984). Convergence of Stochastic Processes. Springer-Verlag, NY.

  • Pollard, D. (1986). Rates of uniform almost-sure convergence for empirical processes indexed by unbounded classes of functions. Manuscript.

  • Pollard, D. (1990). Empirical Processes: Theory and Applications. Volume 2 of NSF-CBMS Regional Conference Series in Probability and Statistics. Inst. Math. Stat. and Am. Stat. Assoc.

  • Talagrand, M. (1987). The Glivenko-Cantelli problem. Annals of Probability 15, 837–70.

  • Talagrand, M. (1988). Donsker classes of sets. Probability Theory and Related Fields 78, 169–91.

  • Valiant, L.G. (1984). A theory of the learnable. Comm. ACM 27(11), 1134–42.

  • Vapnik, V.N. and A. Ya. Chervonenkis. (1971). On the uniform convergence of relative frequencies of events to their probabilities. Th. Prob. and its Appl. 16(2), 264–80.

  • Vapnik, V.N. (1982). Estimation of Dependences Based on Empirical Data. Springer-Verlag, New York.

  • Vapnik, V.N. (1989). Inductive principles of the search for empirical dependences (methods based on weak convergence of probability measures). Proc. 2nd Annual Workshop on Computational Learning Theory, Santa Cruz, CA. Morgan Kaufmann. 3–21.


Copyright information

© 1992 Springer Science+Business Media New York

About this chapter


Blumer, A. (1992). Learnability Models and Vapnik-Chervonenkis Combinatorics. In: Dudley, R.M., Hahn, M.G., Kuelbs, J. (eds) Probability in Banach Spaces, 8: Proceedings of the Eighth International Conference. Progress in Probability, vol 30. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-1-4612-0367-4_27


  • Print ISBN: 978-1-4612-6728-7

  • Online ISBN: 978-1-4612-0367-4

