Entropy and Information
Claude Shannon's development of the idea of entropy for random variables and processes provided the beginnings of information theory and of the modern age of ergodic theory. We shall see that entropy and related information measures provide useful descriptions of the long-term behavior of random processes and that this behavior is a key factor in developing the coding theorems of information theory. We now introduce the various notions of entropy for random variables, vectors, processes, and dynamical systems, and develop many of the fundamental properties of entropy.
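As a concrete illustration (not taken from the chapter itself), the quantities named in the keywords below can be computed directly for finite-alphabet distributions: the entropy H(X) = -Σ p(x) log₂ p(x), the relative entropy (Kullback-Leibler divergence) D(p‖q) = Σ p(x) log₂(p(x)/q(x)), and the mutual information I(X;Y) = D(p_XY‖p_X p_Y). The following is a minimal Python sketch under those standard definitions, with base-2 logarithms so all quantities are in bits; the function names are illustrative, not from the text.

```python
# Minimal sketch (illustrative, not from the chapter): entropy, relative
# entropy, and mutual information for finite-alphabet distributions.
import math


def entropy(p):
    """H(X) = -sum_x p(x) log2 p(x); terms with p(x) = 0 contribute 0."""
    return -sum(px * math.log2(px) for px in p if px > 0)


def relative_entropy(p, q):
    """D(p || q) = sum_x p(x) log2(p(x)/q(x)); infinite if q(x)=0 < p(x)."""
    d = 0.0
    for px, qx in zip(p, q):
        if px > 0:
            if qx == 0:
                return math.inf
            d += px * math.log2(px / qx)
    return d


def mutual_information(joint):
    """I(X;Y) = D(p_XY || p_X p_Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    flat = [pxy for row in joint for pxy in row]      # joint, flattened
    prod = [px[i] * py[j] for i in range(len(px))     # product of marginals
            for j in range(len(py))]
    return relative_entropy(flat, prod)


if __name__ == "__main__":
    print(entropy([0.5, 0.5]))                        # 1.0 bit: fair coin
    print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # >= 0 (divergence inequality)
    print(mutual_information([[0.4, 0.1], [0.1, 0.4]]))  # > 0: dependent pair
```

Note that expressing I(X;Y) as a relative entropy makes the divergence inequality D(p‖q) ≥ 0 immediately yield the nonnegativity of mutual information, one of the fundamental properties developed in the chapter.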
Keywords: Divergence Inequality · Mutual Information · Relative Entropy · Average Mutual Information · Conditional Entropy