Generalizations of Entropy and Information Measures

  • Chapter

Abstract

This paper presents and discusses two generalized forms of the Shannon entropy, as well as a generalized information measure. These measures are applied to an exponential-power generalization of the usual normal distribution, which emerges from a generalized form of Fisher's entropy-type information measure and is essential to cryptology. Information divergences between these random variables are also discussed. Moreover, a complexity measure related to the generalized Shannon entropy is presented, extending the known SDL complexity measure.
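The chapter's specific generalized entropy forms are not reproduced in this preview. As general context only, the following sketch computes the classical quantities being generalized: the discrete Shannon entropy, the Rényi entropy (a standard one-parameter generalization that recovers Shannon as the order tends to 1), and the SDL (Shiner–Davison–Landsberg) complexity, which is built from the normalized Shannon entropy. Function names and the choice of natural logarithm are illustrative, not taken from the chapter.

```python
import math

def shannon_entropy(p):
    """Classical Shannon entropy H(p) = -sum p_i * ln(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha: (1/(1-alpha)) * ln(sum p_i^alpha).

    Recovers the Shannon entropy in the limit alpha -> 1.
    """
    if alpha == 1:
        return shannon_entropy(p)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def sdl_complexity(p):
    """SDL complexity C = D * (1 - D), where D = H / H_max is the
    normalized (dis)order, H_max = ln(number of outcomes)."""
    d = shannon_entropy(p) / math.log(len(p))
    return d * (1 - d)

# Uniform distribution on 4 outcomes: maximal entropy (ln 4),
# hence normalized disorder D = 1 and SDL complexity 0.
p = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(p))     # ln 4 ~ 1.386
print(renyi_entropy(p, 2.0))  # also ln 4 for the uniform case
print(sdl_complexity(p))      # 0.0
```

For the uniform distribution all Rényi orders coincide, which makes it a convenient sanity check; the SDL complexity vanishes at both full order (a point mass) and full disorder (uniform), peaking in between.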




Corresponding author

Correspondence to Thomas L. Toulias .



Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Toulias, T.L., Kitsos, C.P. (2015). Generalizations of Entropy and Information Measures. In: Daras, N., Rassias, M. (eds) Computation, Cryptography, and Network Security. Springer, Cham. https://doi.org/10.1007/978-3-319-18275-9_22
