A variety of information measures have been introduced for finite alphabet random variables, vectors, and processes: entropy, mutual information, relative entropy, conditional entropy, and conditional mutual information. All of these can be expressed in terms of divergence, and hence the generalization of these definitions to infinite alphabets will follow from a general definition of divergence. Many of the properties of the generalized information measures will then follow from those of generalized divergence.
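The claim that these measures reduce to divergence can be illustrated concretely in the finite alphabet case. The sketch below (a minimal illustration, not taken from the chapter; the function and variable names are my own) computes entropy as the gap between log-alphabet-size and the divergence from the uniform distribution, and mutual information as the divergence of a joint distribution from the product of its marginals:

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) = sum_a p(a) log(p(a)/q(a)), with the 0 log 0 = 0 convention."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Entropy via divergence: H(p) = log n - D(p || uniform) for an n-letter alphabet.
p = np.array([0.5, 0.25, 0.25])
n = len(p)
H = np.log(n) - kl_divergence(p, np.full(n, 1.0 / n))

# Mutual information via divergence: I(X;Y) = D(p_XY || p_X p_Y),
# the divergence of the joint distribution from the product of marginals.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y
I = kl_divergence(p_xy.ravel(), np.outer(p_x, p_y).ravel())
```

Here `H` agrees with the direct definition -sum p log p, and `I` is nonnegative because divergence is, which previews how properties of the generalized measures inherit from those of divergence.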
Keywords: Markov Chain, Mutual Information, Relative Entropy, Chain Rule, Entropy Density