Probability Theory, pp. 447–456

# Information and Entropy

Chapter

## Abstract

Section 14.1 presents the definitions and key properties of information and entropy. Section 14.2 discusses the entropy of a (stationary) finite Markov chain. The Law of Large Numbers is proved for the amount of information contained in a message that is a long sequence of successive states of a Markov chain, and the asymptotic behaviour of the number of the most common states in a sequence of successive values of the chain is established. Applications of this result to coding are discussed.
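As an illustrative sketch (not taken from the chapter), the two central quantities the abstract mentions can be computed directly: the Shannon entropy of a distribution, and the entropy rate of a stationary Markov chain, H = -Σ_i π_i Σ_j P_ij log₂ P_ij, where π is the stationary distribution of the transition matrix P. All function names here are illustrative.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def stationary_distribution(P, iters=1000):
    """Approximate the stationary distribution of transition matrix P
    by power iteration (repeated left-multiplication of a row vector)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """Entropy rate of a stationary Markov chain with transition
    matrix P: the stationary average of the row entropies."""
    pi = stationary_distribution(P)
    return sum(pi[i] * entropy(P[i]) for i in range(len(P)))

# Example: a symmetric two-state chain that flips state with probability 0.1.
P = [[0.9, 0.1], [0.1, 0.9]]
print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy_rate(P))      # below 1 bit, since successive states are dependent
```

The entropy rate here (about 0.47 bits per step) is strictly less than the 1 bit of a fair coin: dependence between successive states reduces the per-symbol information, which is what makes the coding applications mentioned above possible.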

### Keywords

Entropy

## Copyright information

© Springer-Verlag London 2013