Information Rates II

  • Robert M. Gray


In this chapter we develop general definitions of information rate for processes with standard alphabets, and we prove a mean ergodic theorem for information densities. The L¹ results are extensions of the results of Moy [105] and Perez [123] for stationary processes, which in turn extended the Shannon-McMillan theorem from entropies of discrete alphabet processes to information densities. (See also Kieffer [85].) We also relate several different measures of information rate and consider the mutual information between a stationary process and its ergodic component function. In the next chapter we apply the results of Chapter 5 on divergence to the definitions of this chapter for limiting information and entropy rates to obtain a number of results describing the behavior of such rates. In Chapter 8 almost everywhere ergodic theorems for relative entropy and information densities are proved.
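The central objects named above, the information density and its expectation (the average mutual information), can be illustrated for a simple discrete pair. The sketch below uses a hypothetical binary joint distribution chosen only for illustration; the chapter's results concern the far more general setting of processes with standard alphabets.

```python
import math

# Toy joint pmf p(x, y) for binary X, Y (hypothetical numbers, for illustration only).
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

# Marginal pmfs p(x) and p(y).
p_x = {x: sum(v for (xx, _), v in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (_, yy), v in p_xy.items() if yy == y) for y in (0, 1)}

def info_density(x, y):
    """Information density i(x; y) = log p(x, y) / (p(x) p(y)), in nats."""
    return math.log(p_xy[(x, y)] / (p_x[x] * p_y[y]))

# Average mutual information I(X; Y) = E[i(X; Y)].
mutual_info = sum(p * info_density(x, y) for (x, y), p in p_xy.items())
print(round(mutual_info, 4))
```

The mean ergodic theorem of this chapter concerns exactly this kind of average: for stationary processes, sample averages of information densities converge in L¹ to the information rate.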


Keywords: Mutual Information · Relative Entropy · Information Rate · Average Mutual Information · Scalar Quantizer



Copyright information

© Springer Science+Business Media New York 1990

Authors and Affiliations

  • Robert M. Gray, Information Systems Laboratory, Electrical Engineering Department, Stanford University, Stanford, USA
