Convolution and Minimax Theorems
Let H be a linear subspace of a Hilbert space with inner product (·, ·) and norm ‖·‖. For each n ∈ N and h ∈ H, let P_{n,h} be a probability measure on a measurable space (X_n, A_n). Consider the problem of estimating a “parameter” k_n(h) given an “observation” X_n with law P_{n,h}. The convolution theorem and the minimax theorem give a lower bound on how well k_n(h) can be estimated asymptotically as n → ∞. Suppose the sequence of statistical experiments (X_n, A_n, P_{n,h} : h ∈ H) is “asymptotically normal” and the sequence of parameters is “regular”. Then the limit distribution of every “regular” estimator sequence is the convolution of a certain Gaussian distribution and a noise factor. Furthermore, the maximum risk of any estimator sequence is bounded below by the “risk” of this Gaussian distribution. These concepts are defined as follows.
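For orientation, the two bounds described above can be sketched in the spirit of the classical Hájek–Le Cam results. The notation below (rate r_n, covariance Σ, loss ℓ) is illustrative and not taken verbatim from the chapter; it is a hedged paraphrase of the standard formulation, not the chapter's exact statements.

```latex
% Convolution theorem (sketch, illustrative notation):
% if T_n is a regular estimator sequence for k_n(h) in an asymptotically
% normal sequence of experiments, then under P_{n,h}
\[
  r_n\bigl(T_n - k_n(h)\bigr) \rightsquigarrow N(0,\Sigma) \ast M ,
\]
% i.e. the limit law is the convolution of a fixed Gaussian distribution
% N(0,\Sigma) with some "noise" distribution M.
%
% Minimax theorem (sketch): for a suitable (e.g. bowl-shaped) loss \ell,
% the maximum risk of ANY estimator sequence is bounded below by the
% Gaussian risk,
\[
  \sup_{h \in H} \, \limsup_{n \to \infty}
    \mathrm{E}_{n,h}\, \ell\bigl(r_n(T_n - k_n(h))\bigr)
  \;\ge\; \int \ell \, dN(0,\Sigma) .
\]
```

The Gaussian factor N(0, Σ) is the same in both statements, which is why the convolution theorem is read as saying that any regular estimator's limit law is at least as dispersed as this Gaussian.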
Keywords: Loss Function · Coordinate Projection · Separable Banach Space · Estimator Sequence · Brownian Bridge