In this chapter we develop general definitions of information rate for processes with standard alphabets and prove a mean ergodic theorem for information densities. The L¹ results extend the results of Moy and Perez for stationary processes, which in turn extended the Shannon-McMillan theorem from entropies of discrete-alphabet processes to information densities. (See also Kieffer.) We also relate several different measures of information rate and consider the mutual information between a stationary process and its ergodic component function. In the next chapter we apply the results of Chapter 5 on divergence to the definitions given here for limiting information and entropy rates, obtaining a number of results describing the behavior of such rates. In Chapter 8, almost-everywhere ergodic theorems for relative entropy and information densities are proved.
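For orientation, the central objects can be written in a standard form; the following is a sketch of the usual definitions and of the statement of the mean convergence result in the stationary ergodic case, not a quotation from the chapter itself. For a pair process with n-block joint distribution P_{X^nY^n} absolutely continuous with respect to the product of its marginals,

\[
i_{X^nY^n}(x^n,y^n) \;=\; \log \frac{dP_{X^nY^n}}{d\,(P_{X^n}\times P_{Y^n})}(x^n,y^n)
\]

is the information density,

\[
I(X^n;Y^n) \;=\; E\, i_{X^nY^n}(X^n,Y^n), \qquad
\bar I(X;Y) \;=\; \lim_{n\to\infty} \frac{1}{n}\, I(X^n;Y^n)
\]

are the average mutual information of the n-blocks and the mutual information rate, and an L¹ (mean) ergodic theorem for information densities asserts that

\[
\lim_{n\to\infty} E\left|\, \frac{1}{n}\, i_{X^nY^n}(X^n,Y^n) \;-\; \bar I(X;Y) \,\right| \;=\; 0 .
\]

For stationary but nonergodic processes the limit is instead a random variable determined by the ergodic component, which is why the chapter also considers the mutual information between a process and its ergodic component function.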
Keywords: Mutual Information, Relative Entropy, Information Rate, Average Mutual Information, Scalar Quantizer