Abstract
We argue that common features of non-parametric estimation appear in parametric cases as well when there is a deviation from the classical regularity condition. Namely, in many non-parametric estimation problems (as well as in some parametric ones) unbiased finite-variance estimators do not exist; no estimator converges locally uniformly at the optimal rate; there are no estimators that are asymptotically unbiased at the optimal rate; etc.
We argue that these features naturally arise in particular parametric subfamilies of non-parametric classes of distributions. We generalize the notion of regularity of a family of distributions and present a general regularity condition, which leads to the notions of the information index and the information function.
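The regularity condition itself is not stated in this abstract. As a hedged sketch of the kind of condition meant (the symbols $J(\theta)$ and $\nu$ are assumptions here, following standard Hellinger-distance formulations of regularity, not the chapter's own notation):

```latex
% Classical regularity (differentiability in quadratic mean): the squared
% Hellinger distance between neighbouring distributions is quadratic in t,
%   H^2(P_\theta, P_{\theta+t}) = \tfrac{1}{8}\, I(\theta)\, t^2 + o(t^2),
% where I(\theta) is the Fisher information; this yields the rate n^{-1/2}.
%
% A generalized condition replaces the exponent 2 by an information index \nu,
% with an information function J(\theta) in place of I(\theta):
\[
  H^2\!\left(P_\theta,\, P_{\theta+t}\right)
    = J(\theta)\,|t|^{\nu} + o\!\left(|t|^{\nu}\right),
  \qquad \nu > 0 .
\]
% Distinguishing P_\theta from P_{\theta+t} on n i.i.d. observations requires
% H \asymp n^{-1/2}, i.e. |t| \asymp n^{-1/\nu}, so the attainable rate of
% estimation is n^{-1/\nu}; \nu = 2 recovers the parametric rate n^{-1/2}.
```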
We argue that the typical structure of a continuity modulus explains why unbiased finite-variance estimators cannot exist if the information index is larger than two, while in typical non-parametric situations no estimator converges locally uniformly at the optimal rate. We present a new result on the impossibility of locally uniform convergence at the optimal rate.
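The role played by the continuity modulus can be sketched as follows; this is a hedged illustration in the standard two-point lower-bound style from the non-parametric literature, with notation assumed, not the chapter's own statement:

```latex
% For a functional T on a class \mathcal{P} of distributions, the Hellinger
% modulus of continuity is
\[
  \omega(\varepsilon)
    = \sup\bigl\{\, |T(P) - T(Q)| \;:\;
        H(P, Q) \le \varepsilon,\ \ P, Q \in \mathcal{P} \,\bigr\}.
\]
% Two-point arguments bound the minimax risk on n observations from below by
% a quantity of order \omega(n^{-1/2}). If \omega(\varepsilon) \asymp
% \varepsilon^{2/\nu} with information index \nu > 2, the optimal rate
% n^{-1/\nu} is slower than n^{-1/2}, and the local behaviour of \omega is
% what obstructs both unbiased finite-variance estimation and locally
% uniform attainment of the optimal rate.
```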
Acknowledgements
The author is grateful to the anonymous reviewer for helpful comments.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Novak, S.Y. (2018). Non-parametric Lower Bounds and Information Functions. In: Bertail, P., Blanke, D., Cornillon, PA., Matzner-Løber, E. (eds) Nonparametric Statistics. ISNPS 2016. Springer Proceedings in Mathematics & Statistics, vol 250. Springer, Cham. https://doi.org/10.1007/978-3-319-96941-1_5
Print ISBN: 978-3-319-96940-4
Online ISBN: 978-3-319-96941-1
eBook Packages: Mathematics and Statistics (R0)