Information Rates I

  • Robert M. Gray


Before proceeding to generalizations of the various measures of information, entropy, and divergence to nondiscrete alphabets, we consider several properties of information and entropy rates of finite alphabet processes. We show that codes that produce similar outputs with high probability yield similar rates and that entropy and information rate, like ordinary entropy and information, are reduced by coding. The discussion introduces a basic tool of ergodic theory, the partition distance, and develops several versions of an early and fundamental result from information theory, Fano's inequality. We obtain an ergodic theorem for information densities of finite alphabet processes as a simple application of the general Shannon-McMillan-Breiman theorem coupled with some definitions. In Chapter 6 these results easily provide L^1 ergodic theorems for information densities for more general processes.
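Fano's inequality, mentioned above, bounds the conditional entropy H(X|Y) by the probability of error of any estimator of X from Y: H(X|Y) <= h_b(Pe) + Pe log2(|X| - 1), where h_b is the binary entropy function. The following sketch checks the bound numerically for a small, purely illustrative joint distribution (the pmf and the MAP estimator below are assumptions for the example, not taken from the chapter):

```python
import math

def h_b(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative joint pmf p(x, y) over X = Y = {0, 1, 2}; rows index x, columns index y.
joint = [
    [0.20, 0.05, 0.05],
    [0.05, 0.25, 0.05],
    [0.02, 0.03, 0.30],
]
M = len(joint)  # alphabet size |X|

# Marginal p(y) and conditional entropy H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y).
p_y = [sum(joint[x][y] for x in range(M)) for y in range(M)]
H_XgY = 0.0
for y in range(M):
    for x in range(M):
        p_xy = joint[x][y]
        if p_xy > 0:
            H_XgY -= p_xy * math.log2(p_xy / p_y[y])

# Error probability of the MAP estimate X_hat(y) = argmax_x p(x, y).
Pe = 1.0 - sum(max(joint[x][y] for x in range(M)) for y in range(M))

# Fano's bound: H(X|Y) <= h_b(Pe) + Pe * log2(M - 1).
fano_bound = h_b(Pe) + Pe * math.log2(M - 1)
print(f"H(X|Y) = {H_XgY:.4f} bits, Pe = {Pe:.4f}, Fano bound = {fano_bound:.4f}")
```

The MAP estimator minimizes Pe, so it gives the tightest form of the bound this sketch can exhibit; any other estimator only increases the right-hand side.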


Keywords: Mutual Information · Information Rate · Stationary Code · Scalar Quantizer · Information Density



Copyright information

© Springer Science+Business Media New York 1990

Authors and Affiliations

  • Robert M. Gray
  1. Information Systems Laboratory, Electrical Engineering Department, Stanford University, Stanford, USA
