Abstract
This paper surveys several models of learnability proposed and investigated by computational learning theorists during the past few years. Computational learning theory is the study of learning as seen from a computational complexity point of view. In addition to the usual space and time complexity, computational learning theory studies the sample complexity, the number of examples seen by the learner. (In a statistical setting, this is known as the sample size.) This paper will cover those models of learnability where ideas from Vapnik-Chervonenkis combinatorics have had the greatest impact. There are a few short proofs to give a flavor of some of the ideas involved, but most of the proofs are too long to be included here. The focus is on giving an idea of the variety of models and the relationships between them. For more complete surveys of computational learning theory see (1988), (1990), (1991), (1992), or the proceedings of the annual Workshop on Computational Learning Theory published by Morgan Kaufmann. Some attempt has been made to keep the notation consistent within this paper, which means that it will be inconsistent with a large subset of the references.
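To make the notion of sample complexity concrete, the following sketch computes the well-known sufficient sample size for PAC learning from Blumer, Ehrenfeucht, Haussler, and Warmuth (1989), stated in terms of the Vapnik-Chervonenkis dimension d, accuracy ε, and confidence δ. This is an illustration of the flavor of such bounds, not part of the surveyed paper itself; the constants are those quoted in the 1989 JACM article.

```python
import math

def behw_sample_bound(vc_dim, epsilon, delta):
    """Sufficient number of random examples for PAC learning a concept
    class of VC dimension `vc_dim` to accuracy `epsilon` with confidence
    1 - `delta`, following the bound of Blumer et al. (1989):

        m >= max( (4/eps) * log2(2/delta),
                  (8*d/eps) * log2(13/eps) ).

    Any consistent hypothesis found on that many examples has error at
    most epsilon with probability at least 1 - delta.
    """
    term_conf = (4.0 / epsilon) * math.log2(2.0 / delta)
    term_dim = (8.0 * vc_dim / epsilon) * math.log2(13.0 / epsilon)
    return math.ceil(max(term_conf, term_dim))

# Example: halfspaces in the plane have VC dimension 3.
m = behw_sample_bound(vc_dim=3, epsilon=0.1, delta=0.05)
```

Note that the bound grows only linearly in d and (up to a log factor) in 1/ε, which is the sense in which VC combinatorics controls sample complexity independently of the size of the domain.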
References
Alexander, K. (1987). Rates of growth and sample moduli for weighted empirical processes indexed by sets. Probability Theory and Related Fields 75, 379–423.
Angluin, D. (1987). Queries and concept learning. Machine Learning 2(4), 319–42.
Ben-David, S., A. Itai, and E. Kushilevitz. (1990). Learning by distances. Proc. 3rd Annual Workshop on Computational Learning Theory. Santa Cruz, CA: Morgan Kaufmann, 232–45.
Blum, M. (1990). Separating PAC and mistake-bound learning models over the Boolean domain. Proc. 31st Ann. Symp. Found. Comp. Sci.
Blumer, A., A. Ehrenfeucht, D. Haussler, and M.K. Warmuth. (1987). Occam’s razor. Inf. Proc. Let. 24, 377–80.
Blumer, A., A. Ehrenfeucht, D. Haussler, and M.K. Warmuth. (1989). Learnability and the Vapnik-Chervonenkis dimension. JACM 36(4), 929–65.
Blumer, A. and N. Littlestone. (1989). Learning faster than promised by the Vapnik-Chervonenkis dimension. Discrete Applied Mathematics 24.
Dudley, R.M. (1978). Central limit theorems for empirical measures. Ann. Prob. 6(6), 899–929.
Dudley, R.M. (1987). Universal Donsker classes and metric entropy. Ann. Prob. 15(4), 1306–26.
Durst, M. and R.M. Dudley. (1980). Empirical processes, Vapnik-Chervonenkis classes and Poisson processes. Probability and Mathematical Statistics 1(2), 109–15.
Ehrenfeucht, A., D. Haussler, M. Kearns, and L. Valiant. (1989). A general lower bound on the number of examples needed for learning. Information and Computation 82, 247–61.
Giné, E. and J. Zinn. (1984). Some limit theorems for empirical processes. Annals of Probability, 12, 929–89.
Haussler, D. (1990). Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications. University of California at Santa Cruz Technical Report UCSC-CRL-91-02. Information and Computation, to appear.
Haussler, D., M. Kearns, and R. Schapire. (1991). Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension. Proc. 4th Annual Workshop on Computational Learning Theory. Santa Cruz, CA: Morgan Kaufmann.
Haussler, D., N. Littlestone, and M.K. Warmuth. (1990). Predicting {0,1}-Functions on Randomly Drawn Points. University of California at Santa Cruz Technical Report UCSC-CRL-90-54. Information and Computation, to appear.
Kearns, M. (1990). The Computational Complexity of Machine Learning. MIT Press, Cambridge, MA.
Laird, P.D. (1988). Learning From Good and Bad Data. Kluwer, Boston, MA.
Littlestone, N. (1989). Learning quickly when irrelevant attributes abound. Machine Learning 2(4), 285–318.
Maass, W. and G. Turán. (1992). Lower bound methods and separation results for on-line learning models. Machine Learning, to appear.
Massart, P. (1986). Rates of convergence in the central limit theorem for empirical processes, in Geometrical and Statistical Aspects of Probability in Banach Spaces, Lecture Notes in Mathematics 1193, Springer-Verlag. 73–109.
Natarajan, B.K. (1987). On learning Boolean functions. Proc. 19th ACM Symp. on Theory of Computing New York. 296–304.
Natarajan, B.K. (1991). Machine Learning: A Theoretical Approach. Morgan Kaufmann, San Mateo, CA.
Pollard, D. (1984). Convergence of Stochastic Processes. Springer-Verlag, NY.
Pollard, D. (1986). Rates of uniform almost-sure convergence for empirical processes indexed by unbounded classes of functions, manuscript.
Pollard, D. (1990). Empirical Processes: Theory and Applications. Volume 2 of NSF-CBMS Regional Conference Series in Probability and Statistics. Inst. Math. Stat. and Am. Stat. Assoc.
Talagrand, M. (1987). The Glivenko-Cantelli problem. Annals of Probability 15, 837–70.
Talagrand, M. (1988). Donsker classes of sets. Probability Theory and Related Fields 78, 169–91.
Valiant, L.G. (1984). A theory of the learnable. Comm. ACM 27(11), 1134–42.
Vapnik, V.N. and A. Ya. Chervonenkis. (1971). On the uniform convergence of relative frequencies of events to their probabilities. Th. Prob. and its Appl. 16(2), 264–80.
Vapnik, V.N. (1982). Estimation of Dependences Based on Empirical Data. Springer-Verlag, New York.
Vapnik, V.N. (1989). Inductive principles of the search for empirical dependences (methods based on weak convergence of probability measures). Proceedings of the 2nd Annual Workshop on Computational Learning Theory, Santa Cruz, CA. Morgan Kaufmann. 3–21.
Copyright information
© 1992 Springer Science+Business Media New York
Cite this chapter
Blumer, A. (1992). Learnability Models and Vapnik-Chervonenkis Combinatorics. In: Dudley, R.M., Hahn, M.G., Kuelbs, J. (eds) Probability in Banach Spaces, 8: Proceedings of the Eighth International Conference. Progress in Probability, vol 30. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-1-4612-0367-4_27
Print ISBN: 978-1-4612-6728-7
Online ISBN: 978-1-4612-0367-4