Abstract
We prove by means of a counterexample that it is not sufficient, for probably approximately correct (PAC) learning under a class of distributions, to have a uniform bound on the metric entropy of the class of concepts to be learned. This settles a conjecture of Benedek and Itai.
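For context, the conjecture can be stated in the standard notation of Benedek and Itai's fixed-distribution learnability framework (the symbols below follow that framework and are not quoted from the chapter itself). A distribution $P$ on the domain $X$ induces a pseudometric on the concept class $\mathcal{C}$, and finiteness of the resulting covering numbers characterizes PAC learnability when $P$ is fixed:

```latex
% Pseudometric induced by a distribution P on the domain X:
d_P(A, B) = P(A \,\triangle\, B), \qquad A, B \in \mathcal{C}.

% Benedek--Itai: for a fixed P, \mathcal{C} is PAC learnable under P iff
% the \varepsilon-covering numbers are finite:
N(\varepsilon, \mathcal{C}, d_P) < \infty \quad \text{for every } \varepsilon > 0.

% Conjectured sufficient condition for learnability under a class
% of distributions \mathcal{P} (the metric entropy bound in question):
\sup_{P \in \mathcal{P}} N(\varepsilon, \mathcal{C}, d_P) < \infty
\quad \text{for every } \varepsilon > 0.
```

The chapter's counterexample shows that the last, uniform condition over $\mathcal{P}$ does not suffice for PAC learnability under the class of distributions, refuting the conjecture.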
Manuscript received Nov. 9, 1992; revised Sept. 22, 1993. The research of R. M. Dudley was partially supported by National Science Foundation grants. The work of S. Kulkarni was supported in part by the Army Research Office under Grant DAAL03-91-G-0320 and by the National Science Foundation under Grant IRI-92-09577. The work of O. Zeitouni was done while visiting the Center for Intelligent Control Systems at M.I.T. under support of the U.S. Army Research Office Grant DAAL03-92-G-0115.
IEEE Log Number 9401814.
References
L. G. Valiant, “A theory of the learnable,” Commun. ACM, vol. 27, no. 11, pp. 1134–1142, 1984.
A. Blumer, A. Ehrenfeucht, D. Haussler, and M. Warmuth, “Learnability and the Vapnik–Chervonenkis dimension,” J. ACM, vol. 36, no. 4, pp. 929–965, 1989.
D. Haussler, “Decision theoretic generalizations of the PAC model for neural net and other learning applications,” Inf. Comput., vol. 100, no. 1, pp. 78–150, 1992.
G. M. Benedek and A. Itai, “Learnability with respect to a fixed distribution,” Theor. Comput. Sci., vol. 86, pp. 377–389, 1991.
V. N. Vapnik, Estimation of Dependences Based on Empirical Data. New York: Springer-Verlag, 1982.
R. M. Dudley, “A course on empirical processes,” in Lecture Notes in Mathematics, vol. 1097. New York: Springer, 1984, pp. 1–142.
V. N. Vapnik and A. Ya. Chervonenkis, “On the uniform convergence of relative frequencies of events to their probabilities,” Theory Probab. Its Appl., vol. 16, no. 2, pp. 264–280, 1971.
V. N. Vapnik and A. Ya. Chervonenkis, “Necessary and sufficient conditions for the uniform convergence of means to their expectations,” Theory Probab. Its Appl., vol. 26, no. 3, pp. 532–553, 1981.
S. R. Kulkarni, “Problems of computational and information complexity in machine vision and learning,” Ph.D. thesis, Dep. Elec. Eng. Comput. Sci., M.I.T., June 1991.
Copyright information
© 2010 Springer Science+Business Media, LLC
Cite this chapter
Dudley, R.M., Kulkarni, S.R., Richardson, T., Zeitouni, O. (2010). A Metric Entropy Bound is Not Sufficient for Learnability. In: Giné, E., Koltchinskii, V., Norvaisa, R. (eds) Selected Works of R.M. Dudley. Selected Works in Probability and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-5821-1_28
DOI: https://doi.org/10.1007/978-1-4419-5821-1_28
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-5820-4
Online ISBN: 978-1-4419-5821-1
eBook Packages: Mathematics and Statistics; Mathematics and Statistics (R0)