Entropy and Information ★

Simon Širca
Chapter
Part of the Graduate Texts in Physics book series (GTP)

Abstract

Entropy is introduced as a concept that quantifies the amount of information contained in a signal or in its corresponding probability distribution. It is defined for discrete and continuous distributions, along with its relative counterpart, the Kullback-Leibler divergence, which measures the “distance” between two distributions. The principle of maximum entropy is stated, paving the way to the derivation of several discrete maximum-entropy distributions by means of the Lagrange multiplier formalism: the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions. The relation between information entropy and thermodynamic entropy is elucidated. A brief discussion of continuous maximum-entropy distributions is followed by a presentation of the method of maximum-entropy spectral analysis.
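The quantities named above are easy to make concrete. The following Python sketch (not taken from the chapter; the function names, the toy energy levels, and the root-finding bracket are illustrative assumptions) computes the discrete Shannon entropy \(H(p) = -\sum_i p_i \ln p_i\), the Kullback-Leibler divergence \(D(p\|q) = \sum_i p_i \ln(p_i/q_i)\), and a discrete maximum-entropy distribution under a fixed mean energy, whose Lagrange-multiplier solution takes the Boltzmann form \(p_i \propto e^{-\beta E_i}\):

```python
# Minimal sketch: Shannon entropy, Kullback-Leibler divergence, and a
# maximum-entropy (Boltzmann) distribution found via a Lagrange multiplier.
# The energy levels and the root-finding bracket below are illustrative
# assumptions, not values from the chapter.
import numpy as np
from scipy.optimize import brentq

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # convention: 0 ln 0 = 0
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_i p_i ln(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def maxent_boltzmann(energies, mean_energy):
    """Maximum-entropy distribution with fixed <E>: p_i ~ exp(-beta E_i).

    beta is the Lagrange multiplier enforcing the mean-energy constraint;
    it is found numerically so that sum_i p_i E_i equals mean_energy.
    """
    E = np.asarray(energies, float)

    def constraint(beta):
        w = np.exp(-beta * E)
        return np.sum(w * E) / np.sum(w) - mean_energy

    beta = brentq(constraint, -50.0, 50.0)   # bracket is an assumption
    p = np.exp(-beta * E)
    return p / np.sum(p), beta

if __name__ == "__main__":
    E = np.array([0.0, 1.0, 2.0, 3.0])       # toy energy levels
    p, beta = maxent_boltzmann(E, mean_energy=1.2)
    q = np.full_like(p, 1.0 / p.size)         # uniform reference
    print("p =", p, " beta =", beta)
    print("H(p) =", entropy(p), " D(p||uniform) =", kl_divergence(p, q))
```

With only the normalization constraint, the uniform distribution maximizes \(H\); adding the mean-energy constraint tilts it into the exponential form, and \(D(p\|\mathrm{uniform}) = \ln n - H(p)\) quantifies the entropy given up to that constraint.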

Keywords

Power Spectral Density · Maximum Entropy · Lagrange Function · Information Entropy · Multivariate Normal Distribution


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

Faculty of Mathematics and Physics, University of Ljubljana, Ljubljana, Slovenia
