
Information and Entropy

  • Alexandr A. Borovkov
Part of the Universitext book series (UTX)

Abstract

Section 14.1 presents the definitions and key properties of information and entropy. Section 14.2 discusses the entropy of a (stationary) finite Markov chain. The Law of Large Numbers is proved for the amount of information contained in a message consisting of a long sequence of successive states of a Markov chain, and the asymptotic behaviour of the number of the most common states in a sequence of successive values of the chain is established. Applications of this result to coding are discussed.
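The two quantities the abstract refers to can be illustrated numerically: the entropy (rate) of a stationary finite Markov chain is H = -Σ_i π_i Σ_j p_ij log₂ p_ij, and the Law of Large Numbers states that -(1/n) log₂ P(X₁,…,Xₙ) converges to H as the message length n grows. The sketch below is illustrative only and not from the book; the function names and the two-state transition matrix are assumptions made for the example.

```python
import numpy as np

def stationary(P):
    """Stationary distribution pi solving pi P = pi, sum(pi) = 1."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.concatenate([np.zeros(n), [1.0]])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi / pi.sum()

def entropy_rate(P):
    """Entropy rate H = -sum_i pi_i sum_j p_ij log2 p_ij (bits per step)."""
    pi = stationary(P)
    # Convention 0 * log 0 = 0: mask zero entries before taking the log.
    safe = np.where(P > 0, P, 1.0)
    terms = np.where(P > 0, P * np.log2(safe), 0.0)
    return -float(pi @ terms.sum(axis=1))

# Illustrative two-state chain (transition probabilities chosen arbitrarily).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
H = entropy_rate(P)

# Law of Large Numbers for information: simulate a long message and check
# that the information per symbol, -(1/n) log2 P(X_1, ..., X_n), is near H.
rng = np.random.default_rng(0)
pi = stationary(P)
n = 50_000
x = [rng.choice(2, p=pi)]
for _ in range(n - 1):
    x.append(rng.choice(2, p=P[x[-1]]))
logp = np.log2(pi[x[0]]) + sum(np.log2(P[a, b]) for a, b in zip(x, x[1:]))
info_per_symbol = -logp / n   # approaches H as n grows
```

For a chain whose rows are all uniform over two states the entropy rate is exactly 1 bit per step, which gives a quick sanity check on the formula.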

Keywords

Markov chain · Binary code · Code method · Conditional entropy · Probable word


Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  • Alexandr A. Borovkov, Sobolev Institute of Mathematics and Novosibirsk State University, Novosibirsk, Russia
