Information Measures for Pareto Distributions and Order Statistics

Chapter

Part of the book series: Statistics for Industry and Technology (SIT)

Abstract

This paper consists of three sections. The first section gives an overview of the basic information functions, their interpretations, and the dynamic information measures recently developed for lifetime distributions. The second section summarizes the information features of univariate Pareto distributions, tabulates transformations of a Pareto random variable that yield the information measures of numerous other distributions, and gives a few characterizations of the generalized Pareto distribution. The final section summarizes information measures for order statistics and tabulates expressions for the Shannon entropy of order statistics from numerous distributions.
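
The closed-form entropy expressions surveyed in the second and third sections lend themselves to quick numerical verification. Below is a minimal sketch, not code from the chapter: it assumes the Pareto Type I parameterization f(x) = αβ^α / x^(α+1) for x ≥ β, under which the Shannon (differential) entropy is ln(β/α) + 1/α + 1, together with the standard fact that the i-th order statistic of n iid Uniform(0,1) variables is Beta(i, n−i+1) distributed. All function and parameter names are illustrative choices, not the chapter's notation.

```python
# Minimal sketch (illustrative, not from the chapter): closed-form Shannon
# entropies for two objects the chapter surveys, with numerical checks.
# Parameter names (alpha = shape, beta = scale) are assumptions.

import numpy as np
from scipy import stats
from scipy.special import betaln, digamma


def pareto_entropy(alpha, beta):
    """Entropy of Pareto Type I with pdf f(x) = alpha * beta**alpha / x**(alpha+1)
    for x >= beta:  H(X) = ln(beta/alpha) + 1/alpha + 1."""
    return np.log(beta / alpha) + 1.0 / alpha + 1.0


def pareto_entropy_mc(alpha, beta, n=10**6, seed=0):
    """Monte Carlo check via H(X) = -E[log f(X)]."""
    rng = np.random.default_rng(seed)
    # numpy's pareto() samples the Lomax form; shift and scale to Pareto Type I.
    x = beta * (1.0 + rng.pareto(alpha, size=n))
    log_f = np.log(alpha) + alpha * np.log(beta) - (alpha + 1.0) * np.log(x)
    return -log_f.mean()


def uniform_order_stat_entropy(i, n):
    """Entropy of the i-th order statistic of n iid Uniform(0,1) variables,
    i.e., the entropy of a Beta(i, n - i + 1) distribution."""
    a, b = i, n - i + 1
    return (betaln(a, b)
            - (a - 1.0) * (digamma(a) - digamma(a + b))
            - (b - 1.0) * (digamma(b) - digamma(a + b)))


if __name__ == "__main__":
    alpha, beta = 2.5, 1.0
    print("Pareto entropy, closed form:", pareto_entropy(alpha, beta))
    print("Pareto entropy, Monte Carlo:", pareto_entropy_mc(alpha, beta))
    # The median of a Uniform(0,1) sample of size 5 is Beta(3, 3) distributed.
    print("Order-statistic entropy    :", uniform_order_stat_entropy(3, 5))
    print("scipy cross-check          :", stats.beta(3, 3).entropy())
```

For alpha = 2.5 and beta = 1 the closed form gives ln(1/2.5) + 1/2.5 + 1 ≈ 0.484, which the Monte Carlo estimate reproduces; the last two lines agree because scipy computes the Beta entropy directly.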

Copyright information

© 2006 Birkhäuser Boston

About this chapter

Cite this chapter

Asadi, M., Ebrahimi, N., Hamedani, G.G., Soofi, E.S. (2006). Information Measures for Pareto Distributions and Order Statistics. In: Balakrishnan, N., Sarabia, J.M., Castillo, E. (eds) Advances in Distribution Theory, Order Statistics, and Inference. Statistics for Industry and Technology. Birkhäuser Boston. https://doi.org/10.1007/0-8176-4487-3_13
