Abstract
There are two main applications of statistical information theory: telecommunications and thermodynamics. Both areas are well covered by their own specialized textbooks. We start the information-theoretic examination from the telecommunications end, with Shannon's theory.
References
Markov has shown that for the law of large numbers (7.2.2) to apply, it is sufficient that the graph is connected, i.e. that a transition from one state to any other state is possible in a sufficiently large number of steps (Khinchin, 1957, p. 16).
R.W. Hamming: Error Detecting and Error Correcting Codes (Bell System Tech. J. 29, 1950).
Referring to Denbigh & Denbigh, 1985, p. 104, which in turn refers to M. Tribus: Boelter Anniversary Volume, McGraw-Hill, 1963.
Vienna Academy, No. 39 in “Gesammelte Werke”, p. 121, here cited from the translation (Sommerfeld, 1956, p. 213).
Called “digrams” by Shannon but “bigrams” by his cryptographic reference (Pratt, 1942, p. 260), which gives among other things the approximate frequencies of 20,000 English trigrams.
Translation of a formulation due to Planck (Planck, 1911, p. 86).
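The connectivity condition in the Markov note above can be sketched in code: a finite Markov chain obeys the law of large numbers when its transition graph is strongly connected (every state reaches every other in some number of steps). A minimal sketch, assuming a hypothetical 3-state chain chosen only for illustration; the matrix-power reachability test is a standard device, not taken from the text:

```python
import numpy as np

# Hypothetical 3-state transition matrix (illustrative only):
# rows are current states, columns are next states.
P = np.array([
    [0.0, 1.0, 0.0],
    [0.5, 0.0, 0.5],
    [1.0, 0.0, 0.0],
])

def is_irreducible(P):
    """Connectivity test: the transition graph is strongly connected
    iff (I + P)^(n-1) has no zero entries, since entry (i, j) is
    positive exactly when j is reachable from i in < n steps."""
    n = P.shape[0]
    reach = np.linalg.matrix_power(np.eye(n) + P, n - 1)
    return bool(np.all(reach > 0))

print(is_irreducible(P))  # → True: every state reaches every other
```

A chain with an absorbing state that cannot be left (a disconnected graph) fails the test, which is exactly the situation Markov's sufficient condition rules out.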
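The bigram/trigram frequencies mentioned in the Pratt note can be estimated directly from a text sample. A minimal sketch, in the spirit of the frequency tables Shannon drew on; the sample sentence and function name are illustrative, not from the text:

```python
from collections import Counter

def ngram_frequencies(text, n=2):
    """Relative frequencies of letter n-grams (letters only, uppercased),
    as used in approximate statistical models of English."""
    letters = [c for c in text.upper() if c.isalpha()]
    grams = ["".join(letters[i:i + n]) for i in range(len(letters) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

freqs = ngram_frequencies("the theory of information", n=2)
print(sorted(freqs, key=freqs.get, reverse=True)[:3])  # → ['TH', 'HE', 'OR']
```

Setting n=3 gives trigram frequencies of the kind Pratt tabulated for English.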
Copyright information
© 2002 Springer Science+Business Media New York
Cite this chapter
Kåhre, J. (2002). Statistical Information. In: The Mathematical Theory of Information. The Springer International Series in Engineering and Computer Science, vol 684. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-0975-2_7
Print ISBN: 978-1-4613-5332-4
Online ISBN: 978-1-4615-0975-2