
Statistical inference based on bridge divergences


Abstract

M-estimators offer simple, robust alternatives to the maximum likelihood estimator. The density power divergence (DPD) and the logarithmic density power divergence (LDPD) measures provide two classes of robust M-estimators that contain the MLE as a special case. In each family, robustness is achieved by down-weighting outlying observations through a power of the model density. Although both families have proved useful in robust inference, the relation and hierarchy between them have yet to be fully established. In this paper, we present a generalized family of divergences that provides a smooth bridge between the DPD and the LDPD measures. Beyond being an important inferential tool in its own right, this family helps clarify and settle several long-standing issues concerning the relation between the DPD and the LDPD.
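To make the bridge concrete, the following is a minimal sketch of the two objectives involved, in the form standard in the literature (the DPD criterion of Basu et al. 1998 and the LDPD criterion of Jones et al. 2001); the notation is illustrative and not necessarily that of the paper. Given observations \(X_1, \dots, X_n\) and a model density \(f_\theta\), the minimum DPD estimator with tuning parameter \(\alpha > 0\) minimizes

\[
H_{n,\alpha}(\theta) = \int f_\theta^{1+\alpha}(x)\,dx - \Bigl(1 + \frac{1}{\alpha}\Bigr)\frac{1}{n}\sum_{i=1}^{n} f_\theta^{\alpha}(X_i),
\]

while the minimum LDPD estimator minimizes the criterion obtained by taking logarithms of the corresponding terms,

\[
L_{n,\alpha}(\theta) = \frac{1}{1+\alpha}\log \int f_\theta^{1+\alpha}(x)\,dx - \frac{1}{\alpha}\log\Bigl(\frac{1}{n}\sum_{i=1}^{n} f_\theta^{\alpha}(X_i)\Bigr).
\]

In both cases the factor \(f_\theta^{\alpha}(X_i)\) down-weights observations at which the fitted density is small, which is the source of the robustness, and letting \(\alpha \to 0\) recovers the negative log-likelihood and hence the MLE. The bridge family introduced in the paper interpolates smoothly between these two objectives through an additional tuning parameter; its exact form is given in the full text.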





Acknowledgements

The authors gratefully acknowledge the comments of two anonymous referees and of the members of the editorial board, which led to a significantly improved version of the paper. The authors are indebted to Srijata Samanta of the University of Florida for her contribution toward Remark 13.

Author information


Correspondence to Ayanendranath Basu.



Cite this article

Kuchibhotla, A.K., Mukherjee, S. & Basu, A. Statistical inference based on bridge divergences. Ann Inst Stat Math 71, 627–656 (2019). https://doi.org/10.1007/s10463-018-0665-x
