Before proceeding to generalizations of the various measures of information, entropy, and divergence to nondiscrete alphabets, we consider several properties of information and entropy rates of finite-alphabet processes. We show that codes that produce similar outputs with high probability yield similar rates, and that entropy and information rate, like ordinary entropy and information, are reduced by coding. The discussion introduces a basic tool of ergodic theory, the partition distance, and develops several versions of an early and fundamental result from information theory, Fano's inequality. We obtain an ergodic theorem for information densities of finite-alphabet processes as a simple application of the general Shannon-McMillan-Breiman theorem coupled with some definitions. In Chapter 6 these results easily provide L¹ ergodic theorems for information densities for more general processes.
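As a concrete illustration of the kind of bound the chapter develops, the following sketch evaluates the standard form of Fano's inequality, H(X|Y) ≤ h_b(Pe) + Pe log(|A| − 1), where h_b is the binary entropy function, Pe the error probability of an estimate of X from Y, and |A| the alphabet size. The function name and this particular statement of the bound are illustrative, not taken from this text:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h_b(p) in bits; h_b(0) = h_b(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def fano_bound(p_err: float, alphabet_size: int) -> float:
    """Upper bound on the conditional entropy H(X|Y) in bits given by
    Fano's inequality: H(X|Y) <= h_b(Pe) + Pe * log2(|A| - 1)."""
    return binary_entropy(p_err) + p_err * math.log2(alphabet_size - 1)

# With zero error probability the bound forces H(X|Y) = 0, i.e. X is
# determined by Y; as Pe grows the permitted uncertainty grows too.
print(fano_bound(0.0, 4))   # -> 0.0
print(fano_bound(0.1, 4))   # bound for a 4-letter alphabet, 10% error
```

The converse reading is the one used in coding arguments: if H(X|Y) is large, no estimator of X from Y can have small error probability.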
Keywords: Mutual Information · Information Rate · Stationary Code · Scalar Quantizer · Information Density