Abstract
Let H be a linear subspace of a Hilbert space with inner product ⟨·, ·⟩ and norm ‖·‖. For each n ∈ ℕ and h ∈ H, let P_{n,h} be a probability measure on a measurable space (X_n, A_n). Consider the problem of estimating a “parameter” κ_n(h) given an “observation” X_n with law P_{n,h}. The convolution theorem and the minimax theorem give a lower bound on how well κ_n(h) can be estimated asymptotically as n → ∞. Suppose the sequence of statistical experiments (X_n, A_n, P_{n,h} : h ∈ H) is “asymptotically normal” and the sequence of parameters is “regular”. Then the limit distribution of every “regular” estimator sequence is the convolution of a certain Gaussian distribution and a noise factor. Furthermore, the maximum risk of any estimator sequence is bounded below by the “risk” of this Gaussian distribution. These concepts are defined as follows.
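The shape of the two results can be sketched as follows. This is a hedged summary in assumed notation (a central sequence Δ_n, a norming rate r_n, a derivative κ̇ with Riesz representer κ̃, none of which are fixed by the abstract itself); the precise statements are those in the chapter.

```latex
% Asymptotic normality (LAN) of the experiments, in assumed notation:
\log\frac{dP_{n,h}}{dP_{n,0}}
   = \Delta_n h - \tfrac12\|h\|^2 + o_{P_{n,0}}(1),
\qquad \Delta_n h \rightsquigarrow N\bigl(0, \|h\|^2\bigr).

% Regularity of the parameters: for some rate r_n and a continuous,
% linear map \dot\kappa on H with Riesz representer \tilde\kappa,
r_n\bigl(\kappa_n(h) - \kappa_n(0)\bigr)
   \to \dot\kappa(h) = \langle \tilde\kappa, h\rangle .

% Convolution theorem (sketch): every regular estimator sequence T_n
% has a limit law that factors as Gaussian * noise,
r_n\bigl(T_n - \kappa_n(h)\bigr)
   \rightsquigarrow N\bigl(0, \|\tilde\kappa\|^2\bigr) * M .

% Minimax theorem (sketch): for a subconvex loss \ell,
\liminf_{n\to\infty}\, \sup_{h}\,
   \mathrm{E}_{h}\,\ell\bigl(r_n(T_n - \kappa_n(h))\bigr)
   \;\ge\; \mathrm{E}\,\ell(Z),
\qquad Z \sim N\bigl(0, \|\tilde\kappa\|^2\bigr).
```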
Copyright information
© 1996 Springer Science+Business Media New York
Cite this chapter
van der Vaart, A.W., Wellner, J.A. (1996). Convolution and Minimax Theorems. In: Weak Convergence and Empirical Processes. Springer Series in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4757-2545-2_37
Print ISBN: 978-1-4757-2547-6
Online ISBN: 978-1-4757-2545-2