On Bayesian and Decision-Theoretic Approaches to Statistical Prediction
Let Y and Z be two random vectors with joint density f(y, z|θ), where θ∈Θ is an unknown parameter vector, and consider predicting Z based on y, the observed value of Y. We investigate Bayesian and decision-theoretic approaches to this problem, taking into account the loss function and the prior distribution of θ. Exploring connections between statistical prediction and decision theory, we find that a prediction problem can be reduced to a standard decision theory problem if the induced loss function is allowed to depend on the observed data y in addition to the unknown parameter θ and the decision d. In general, the predictive posterior density f(z|y) may not contain all information necessary for obtaining optimum predictions, but the posterior density f(θ|y) is adequate for that purpose. Some admissibility results are also discussed.
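As a minimal sketch of the distinction drawn above (an illustration, not taken from the paper): when the loss is allowed to depend on θ, say L(θ, z, d), the posterior expected loss of a decision d is

```latex
\[
\rho(d \mid y)
  = \int_{\Theta} \int L(\theta, z, d)\, f(z \mid \theta, y)\,
    f(\theta \mid y)\, dz \, d\theta ,
\]
```

which cannot in general be computed from the predictive posterior f(z|y) alone, since it mixes z and θ inside the loss; the posterior f(θ|y) is what is needed. By contrast, under a θ-free loss such as squared error L(z, d) = (z − d)², the minimizer is the predictive mean E[Z|y] = ∫ z f(z|y) dz, which the predictive posterior does determine.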
Keywords and phrases: Admissibility, Bayes risk, loss function, predictive posterior distribution