Abstract
We present a general information exponential inequality that measures the statistical complexity of certain deterministic and randomized density estimators. Using this inequality, we improve the classical results of Barron and Cover [1] on the convergence of two-part code MDL. Moreover, we derive clean finite-sample convergence bounds that are not attainable with previous approaches.
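For context, a sketch of the two-part code MDL estimator analyzed here, following the standard formulation of Barron and Cover [1] (the notation below is illustrative and not necessarily the paper's): given i.i.d. observations $X_1, \dots, X_n$ and a countable model class $\Gamma$ equipped with codelengths $L(p)$ (in nats) satisfying Kraft's inequality $\sum_{p \in \Gamma} e^{-L(p)} \le 1$, the estimator minimizes the total two-part description length, i.e. the codelength of the model plus the codelength of the data under that model:
$$
\hat{p}_n \;=\; \operatorname*{arg\,min}_{p \in \Gamma} \left[ -\sum_{i=1}^{n} \log p(X_i) \;+\; L(p) \right].
$$
Convergence results of the kind improved in this paper bound the distance (e.g. Hellinger) between $\hat{p}_n$ and the true density in terms of this trade-off between fit and codelength.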
References
Barron, A., Cover, T.: Minimum complexity density estimation. IEEE Transactions on Information Theory 37, 1034–1054 (1991)
Barron, A., Schervish, M.J., Wasserman, L.: The consistency of posterior distributions in nonparametric problems. The Annals of Statistics 27(2), 536–561 (1999)
Le Cam, L.: Convergence of estimates under dimensionality restrictions. The Annals of Statistics 1, 38–53 (1973)
Li, J.Q.: Estimation of Mixture Models. PhD thesis, Department of Statistics, Yale University (1999)
Meir, R., Zhang, T.: Generalization error bounds for Bayesian mixture algorithms. Journal of Machine Learning Research 4, 839–860 (2003)
Rissanen, J.: Stochastic Complexity in Statistical Inquiry. World Scientific, Singapore (1989)
Seeger, M.: PAC-Bayesian generalization error bounds for Gaussian process classification. Journal of Machine Learning Research 3, 233–269 (2002)
van de Geer, S.A.: Empirical Processes in M-Estimation. Cambridge University Press, Cambridge (2000)
Yang, Y., Barron, A.: Information-theoretic determination of minimax rates of convergence. The Annals of Statistics 27, 1564–1599 (1999)
Zhang, T.: Theoretical analysis of a class of randomized regularization methods. In: COLT 1999, pp. 156–163 (1999)
Zhang, T.: Learning bounds for a generalized family of Bayesian posterior distributions. In: NIPS 2003 (2004) (to appear)
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, T. (2004). On the Convergence of MDL Density Estimation. In: Shawe-Taylor, J., Singer, Y. (eds) Learning Theory. COLT 2004. Lecture Notes in Computer Science(), vol 3120. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-27819-1_22
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22282-8
Online ISBN: 978-3-540-27819-1