Abstract
We consider Pitman closeness domination in predictive density estimation problems when the underlying loss metric is \(\alpha\)-divergence, \(\{D(\alpha )\}\), a loss introduced by Csiszàr (Stud Sci Math Hung 2:299–318, 1967). The underlying distributions considered are normal location-scale models, including the distribution of the observables, the distribution of the variable whose density is to be predicted, and the estimated predictive density, which will be taken to be of the plug-in type. The scales may be known or unknown. Chang and Strawderman (J Multivar Anal 128:1–9, 2014) derived a general expression for the \(\alpha\)-divergence loss in this setup and showed that it is a concave monotone function of quadratic loss, and also a function of the variances (predicand and plug-in). We demonstrate \(\{D(\alpha )\}\) Pitman closeness domination of certain plug-in predictive densities over others, simultaneously for the entire class of metrics, whenever modified Pitman closeness domination holds in the related problem of estimating the mean. We also establish \(\{D(\alpha )\}\) Pitman closeness results for certain generalized Bayesian (best invariant) predictive density estimators. The examples of \(\{D(\alpha )\}\) Pitman closeness domination presented relate to estimating the predictive density of the variable with the larger mean. We also consider the case of two ordered normal means with a known covariance matrix.
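For the reader's convenience, we record a standard parametrization of the \(\alpha\)-divergence family between the true density \(p(\cdot \mid \theta )\) and an estimated predictive density \(\hat{p}\) (cf. Corcuera and Giummolè 1999); the exact normalization used in the paper may differ by a constant factor.

```latex
% \alpha-divergence between p(.|theta) and a predictive density \hat{p};
% the limits \alpha = -1 and \alpha = 1 recover the two Kullback-Leibler losses.
D_\alpha\bigl(p(\cdot\mid\theta),\hat{p}\bigr)
  = \int f_\alpha\!\left(\frac{\hat{p}(y)}{p(y\mid\theta)}\right)
      p(y\mid\theta)\,dy,
\qquad
f_\alpha(z) =
\begin{cases}
  \dfrac{4}{1-\alpha^{2}}\bigl(1 - z^{(1+\alpha)/2}\bigr), & |\alpha| < 1,\\[6pt]
  z\log z, & \alpha = 1,\\[2pt]
  -\log z, & \alpha = -1.
\end{cases}
```

In particular, \(\alpha = -1\) yields \(\mathrm{KL}\bigl(p(\cdot \mid \theta ) \,\Vert\, \hat{p}\bigr)\), so Kullback–Leibler loss is a member of the family.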
References
Barlow, R. E., Bartholomew, D. J., Bremner, J. M., & Brunk, H. D. (1972). Statistical inference under order restrictions. New York: Wiley.
Chang, Y.-T., Fukuda, K., & Shinozaki, N. (2017). Estimation of two ordered normal means when a covariance matrix is known. Statistics, 51(5), 1095–1104.
Chang, Y.-T., Oono, Y., & Shinozaki, N. (2012). Improved estimators for the common mean and ordered means of two normal distributions with ordered variances. Journal of Statistical Planning and Inference, 142, 2619–2628.
Chang, Y.-T., & Shinozaki, N. (2015). Estimation of two ordered normal means under modified Pitman nearness criterion. Annals of the Institute of Statistical Mathematics, 67, 863–883.
Chang, Y.-T., & Strawderman, W. E. (2014). Stochastic domination in predictive density estimation for ordered normal means under \(\alpha\)-divergence loss. Journal of Multivariate Analysis, 128, 1–9.
Chou, J.-P., & Strawderman, W. E. (1990). Minimax estimation of means of multivariate normal mixtures. Journal of Multivariate Analysis, 35, 141–150.
Cohen, A., & Sackrowitz, B. (2004). A discussion of some inference issues in order restricted models. Canadian Journal of Statistics, 32(2), 199–205.
Corcuera, J. M., & Giummolè, F. (1999). A generalized Bayes rule for prediction. Scandinavian Journal of Statistics, 26, 265–279.
Csiszàr, I. (1967). Information-type measures of difference of probability distributions and indirect observations. Studia Scientiarum Mathematicarum Hungarica, 2, 299–318.
Fernández, M. A., Rueda, C., & Salvador, B. (2000). Parameter estimation under orthant restrictions. Canadian Journal of Statistics, 28(1), 171–181.
Fourdrinier, D., Marchand, E., Righi, A., & Strawderman, W. E. (2011). On improved predictive density with parametric constraints. Electronic Journal of Statistics, 5, 172–191.
Fourdrinier, D., Strawderman, W. E., & Wells, M. T. (2018). Shrinkage estimation. Springer Series in Statistics. Berlin: Springer.
Graybill, F. A., & Deal, R. B. (1959). Combining unbiased estimators. Biometrics, 15, 543–550.
Gupta, R. D., & Singh, H. (1992). Pitman nearness comparisons of estimates of two ordered normal means. Australian Journal of Statistics, 34(3), 407–414.
Hwang, J. T., & Peddada, S. D. (1994). Confidence interval estimation subject to order restrictions. The Annals of Statistics, 22(1), 67–93.
Keating, J. P., Mason, R. L., & Sen, P. K. (1993). Pitman’s measure of closeness: A comparison of statistical estimators. Philadelphia: SIAM.
Kubokawa, T. (1989). Closer estimation of a common mean in the sense of Pitman. Annals of the Institute of Statistical Mathematics, 41(3), 477–484.
Lee, C. I. C. (1981). The quadratic loss of isotonic regression under normality. The Annals of Statistics, 9(3), 686–688.
Maruyama, Y., & Strawderman, W. E. (2012). Bayesian predictive densities for linear regression models under \(\alpha\)-divergence loss: Some results and open problems. In Contemporary developments in Bayesian analysis and statistical decision theory: A Festschrift for William E. Strawderman (Vol. 8, pp. 42–56). Institute of Mathematical Statistics.
Nayak, T. K. (1990). Estimation of location and scale parameters using generalized Pitman nearness criterion. Journal of Statistical Planning and Inference, 24, 259–268.
Oono, Y., & Shinozaki, N. (2005). Estimation of two order restricted normal means with unknown and possibly unequal variances. Journal of Statistical Planning and Inference, 131(2), 349–363.
Pitman, E. J. G. (1937). The closest estimates of statistical parameters. Proceedings of the Cambridge Philosophical Society, 33, 212–222.
Robertson, T., Wright, F. T., & Dykstra, R. L. (1988). Order restricted statistical inference. New York: Wiley.
Shinozaki, N., & Chang, Y.-T. (1999). A comparison of maximum likelihood and the best unbiased estimators in the estimation of linear combinations of positive normal means. Statistics and Decisions, 17, 125–136.
Silvapulle, M. J., & Sen, P. K. (2004). Constrained statistical inference. New York: Wiley.
van Eeden, C. (2006). Restricted parameter space estimation problems. Lecture notes in statistics 188. Berlin: Springer.
Acknowledgements
We would like to thank the Editor, the Associate Editor, and the anonymous reviewers for thoughtful and constructive comments which led to an improved version of this paper. This work is supported by Grant-in-Aid for Scientific Research (C) nos. 26330047, 18K11196 Japan (to Yuan-Tsung Chang and Nobuo Shinozaki). This work was partially supported by a Grant from the Simons Foundation (#418098 to William Strawderman).
Appendix
Proof of Theorem 3
We make the variable transformation \(Z_1=({\bar{X}}_1-\mu _1)/\tau _1\) and \(Z_2=({\bar{X}}_2-\mu _1)/\tau _1\).
Then, \(Z_1\) and \(Z_2\) are mutually independently distributed as N(0, 1) and \(N(\varDelta /\tau _1, \tau _2^2/\tau _1^2)\), respectively.
We note that \(\hat{\varvec{\mu }}^{{\rm CS}} \ne \hat{\varvec{\mu }}^{{\rm OS}}\) if and only if \({\bar{X}}_1 > {\bar{X}}_2\) and \(s_1^2 > s_2^2\). In this case, putting \(c_i=n_i/(n_1+n_2), i=1,2\), we have
Similarly
where \(\gamma _1=n_1s_2^2/(n_1s_2^2+n_2s_1^2)\) and \(\gamma _2=n_2s_1^2/(n_1s_2^2+n_2s_1^2)\). Since \(Z_1 - Z_2= ({\bar{X}}_1- {\bar{X}}_2)/ \tau _1 >0\) and \(c_1 \ge \gamma _1\), we have \((c_1- \gamma _1)Z_1+(c_2-\gamma _2)Z_2 \ge 0\). Therefore, we see that
Then, the numerator of (14) becomes
We further make the variable transformation \(Y_1=Z_1-Z_2\) and \(Y_2=Z_1+(\tau _1^2/\tau _2^2)Z_2\).
Then, \(Y_1\) and \(Y_2\) are mutually independently distributed as \(N( -\varDelta /\tau _1,\) \((\tau _1^2+\tau _2^2)/\tau _1^2)\) and \(N( \tau _1\varDelta /\tau _2^2, (\tau _1^2+\tau _2^2)/\tau _2^2)\), respectively.
Noting that
we have
if \(n_1 \ge n_2\). Therefore, we see that
implies
Thus, we have
which is larger than \(1/2 P \{ Y_1> 0, s_1^2 > s_2^2\},\) because
This completes the proof.
Proof of Theorem 4
We first show that \(\hat{\mu }_2^{{\rm MLE}}\) is Pitman closer to \(\mu _2\) than \({\bar{X}}_2\).
We see that \(\hat{\mu }_2^{{\rm MLE}} \ne {\bar{X}}_2\) if and only if \({\bar{X}}_1 > {\bar{X}}_2\). In this case \(\hat{\mu }_2^{{\rm MLE}}=c_1 {\bar{X}}_1 + c_2 {\bar{X}}_2\), where \(c_1=n_1\sigma _2^2/(n_1\sigma _2^2+n_2\sigma _1^2)\) and \(c_2=n_2\sigma _1^2/(n_1\sigma _2^2+n_2\sigma _1^2)\).
We have
Next, we make the variable transformation \(W_1={\bar{X}}_1-\mu _2\) and \(W_2={\bar{X}}_2-\mu _2\).
Then, \(W_1\) and \(W_2\) are distributed as \(N(-\varDelta , \tau _1^2)\) and \(N(0 , \tau _2^2)\), respectively, and \(W_1\) and \(W_2\) are mutually independent, where \(\varDelta = \mu _2- \mu _1\) and \(\tau _i^2= \sigma _i^2 /n_i, i=1,2\). Then, (20) becomes
We further make the variable transformation:
Then, \(V_1\) and \(V_2\) are mutually independent and
Noting that
and denoting the p.d.f. of \(V_1\) by \(g(\cdot )\), (21) becomes
since \(c_2\tau _2^2-c_1\tau _1^2= 0\) and \(\varDelta \ge 0\). Thus, we have
with strict inequality for \(\mu _2 > \mu _1\). This completes the proof.
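The Pitman closeness claim of Theorem 4 is easy to check numerically. The following simulation sketch uses illustrative sample sizes, variances, and means of our own choosing (none are from the paper); it computes the order-restricted MLE of \(\mu _2\) exactly as described above, pooling with the precision weights \(c_1, c_2\) only when \({\bar{X}}_1 > {\bar{X}}_2\), and counts ties (where the two estimators agree) with weight one half.

```python
import numpy as np

# Monte Carlo check that the order-restricted MLE of mu_2 is Pitman
# closer to mu_2 than X2-bar when mu_1 <= mu_2 (illustrative parameters).
rng = np.random.default_rng(0)
n1, n2 = 10, 10
sigma1, sigma2 = 1.0, 1.0
mu1, mu2 = 0.0, 0.1          # order restriction: mu1 <= mu2
reps = 200_000

xbar1 = rng.normal(mu1, sigma1 / np.sqrt(n1), reps)
xbar2 = rng.normal(mu2, sigma2 / np.sqrt(n2), reps)

# Precision weights c1, c2 as defined in the proof of Theorem 4.
d = n1 * sigma2**2 + n2 * sigma1**2
c1, c2 = n1 * sigma2**2 / d, n2 * sigma1**2 / d

# MLE of mu_2 under mu_1 <= mu_2: pool only when xbar1 > xbar2.
mle2 = np.where(xbar1 > xbar2, c1 * xbar1 + c2 * xbar2, xbar2)

# Pitman closeness probability; ties (equal estimators) count 1/2.
closer = np.abs(mle2 - mu2) < np.abs(xbar2 - mu2)
tie = mle2 == xbar2
pn = closer.mean() + 0.5 * tie.mean()
print(f"Pitman closeness probability: {pn:.3f}")  # exceeds 1/2
```

With these hypothetical parameters the simulated probability lands comfortably above 1/2, consistent with the theorem; varying \(\mu _2 - \mu _1 \ge 0\), the variances, or the sample sizes leaves the conclusion unchanged.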
The analogous result, that \(\hat{\mu }_1^{{\rm MLE}}\) is Pitman closer to \(\mu _1\) than \({\bar{X}}_1\), follows in the same way.
Chang, YT., Shinozaki, N. & Strawderman, W.E. Pitman closeness domination in predictive density estimation for two-ordered normal means under \(\alpha\)-divergence loss. Jpn J Stat Data Sci 3, 1–21 (2020). https://doi.org/10.1007/s42081-019-00043-1