Information Measures for Pareto Distributions and Order Statistics

  • Majid Asadi
  • Nader Ebrahimi
  • G. G. Hamedani
  • Ehsan S. Soofi
Chapter
Part of the Statistics for Industry and Technology book series (SIT)

Abstract

This paper consists of three sections. The first section gives an overview of the basic information functions, their interpretations, and dynamic information measures that have been recently developed for lifetime distributions. The second section summarizes the information features of univariate Pareto distributions, tabulates transformations of a Pareto random variable under which information measures of numerous distributions can be obtained, and gives a few characterizations of the generalized Pareto distribution. The final section summarizes information measures for order statistics and tabulates the expressions for Shannon entropies of order statistics for numerous distributions.
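One of the information features summarized in the paper is the Shannon entropy of the univariate Pareto distribution. For a Pareto(α, β) density f(x) = αβ^α/x^(α+1), x ≥ β, the Shannon entropy has the closed form H = ln(β/α) + 1/α + 1. The following sketch (illustrative parameter values, not taken from the chapter) checks this closed form against a Monte Carlo estimate of −E[log f(X)], sampling X by the inverse-CDF transform:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 3.0, 2.0  # hypothetical shape and scale, chosen for illustration

# Closed-form Shannon entropy of Pareto(alpha, beta):
#   H = ln(beta/alpha) + 1/alpha + 1
H_exact = np.log(beta / alpha) + 1.0 / alpha + 1.0

# Monte Carlo check: H = -E[log f(X)].
# Inverse-CDF sampling: if U ~ Uniform(0, 1], then X = beta * U**(-1/alpha)
# follows Pareto(alpha, beta). Using 1 - random() keeps U strictly positive.
u = 1.0 - rng.random(1_000_000)
x = beta * u ** (-1.0 / alpha)

# log f(x) = log(alpha) + alpha*log(beta) - (alpha + 1)*log(x)
log_f = np.log(alpha) + alpha * np.log(beta) - (alpha + 1.0) * np.log(x)
H_mc = -log_f.mean()
```

With these values H_exact = ln(2/3) + 1/3 + 1 ≈ 0.9279, and the Monte Carlo estimate agrees to a few decimal places.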

Keywords and phrases

Characterization, entropy, hazard rate, Kullback-Leibler, reliability, Rényi, residual life, Shannon



Copyright information

© Birkhäuser Boston 2006

Authors and Affiliations

  • Majid Asadi
  • Nader Ebrahimi
  • G. G. Hamedani
  • Ehsan S. Soofi

  1. University of Isfahan, Isfahan, Iran
  2. Northern Illinois University, DeKalb, USA
  3. Marquette University, Milwaukee, USA
  4. University of Wisconsin-Milwaukee, Milwaukee, USA