Entropy and Information ★
Entropy is introduced as a concept that quantifies the amount of information contained in a signal or in its corresponding probability distribution. It is defined for discrete and continuous distributions, along with its relative counterpart, the Kullback-Leibler divergence, which measures the “distance” between two distributions. The principle of maximum entropy is stated, paving the way for the derivation of several discrete maximum-entropy distributions by means of the Lagrange-multiplier formalism: the Maxwell-Boltzmann, Bose-Einstein and Fermi-Dirac distributions. The relation between information and thermodynamic entropy is elucidated. A brief discussion of continuous maximum-entropy distributions is followed by a presentation of the method of maximum-entropy spectral analysis.
Keywords: Power Spectral Density · Maximum Entropy · Lagrange Function · Information Entropy · Multivariate Normal Distribution
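For orientation, the quantities the abstract refers to can be summarized in standard notation; the symbols below are generic placeholders, not necessarily those used in the chapter. The sketch shows the discrete (Shannon) entropy, the differential entropy of a continuous density, the Kullback-Leibler divergence, and a Lagrange function whose stationary point gives the maximum-entropy solution under normalization and moment constraints.

```latex
\begin{align*}
  % Shannon entropy of a discrete distribution {p_i}
  H[p] &= -\sum_i p_i \ln p_i ,\\
  % differential entropy of a continuous density p(x)
  h[p] &= -\int p(x)\,\ln p(x)\,\mathrm{d}x ,\\
  % Kullback-Leibler divergence (relative entropy) between p and q
  D_{\mathrm{KL}}(p\,\|\,q) &= \sum_i p_i \ln\frac{p_i}{q_i} \;\ge\; 0 ,\\
  % Lagrange function: maximize H subject to normalization and
  % moment constraints \sum_i p_i f_k(x_i) = F_k  (generic constraints, assumed here)
  \mathcal{L}[p] &= -\sum_i p_i \ln p_i
      - \lambda_0\Bigl(\sum_i p_i - 1\Bigr)
      - \sum_k \lambda_k\Bigl(\sum_i p_i\,f_k(x_i) - F_k\Bigr) ,\\
  % stationarity with respect to each p_i gives the exponential (Boltzmann-type) family
  p_i &\propto \exp\Bigl(-\sum_k \lambda_k f_k(x_i)\Bigr) .
\end{align*}
```

The Maxwell-Boltzmann, Bose-Einstein and Fermi-Dirac distributions mentioned in the abstract arise from this scheme for particular choices of constraints and state counting.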
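Likewise, the maximum-entropy spectral estimate presented at the end of the chapter is, for a stationary process constrained by a finite set of autocorrelation lags, of autoregressive (all-pole) form. The order \(p\), sampling interval \(\Delta t\), coefficients \(a_k\) and prediction-error power \(\sigma_p^2\) below are assumed notation for this sketch, not the chapter's own.

```latex
\begin{align*}
  % maximum-entropy (all-pole / autoregressive) spectral estimate of order p
  \hat{S}_{\mathrm{ME}}(f) &=
     \frac{\sigma_p^{2}\,\Delta t}
          {\Bigl|\,1 + \sum_{k=1}^{p} a_k\,e^{-\mathrm{i}\,2\pi f k \Delta t}\Bigr|^{2}} ,\\
  % a_k and \sigma_p^2 follow from the measured autocorrelations r_0,...,r_p
  % via the augmented Yule-Walker equations (a_0 = 1, m = 0,\dots,p)
  \sum_{k=0}^{p} a_k\, r_{|m-k|} &= \sigma_p^{2}\,\delta_{m,0} .
\end{align*}
```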