Abstract

The development of the idea of entropy of random variables and processes by Claude Shannon provided the beginnings of information theory and of the modern age of ergodic theory. We shall see that entropy and related information measures provide useful descriptions of the long-term behavior of random processes and that this behavior is a key factor in developing the coding theorems of information theory. We now introduce the various notions of entropy for random variables, vectors, processes, and dynamical systems, and we develop many of the fundamental properties of entropy.
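For orientation, the most basic of these notions is the Shannon entropy of a discrete random variable X with probability mass function p; the following standard definition (stated here as background, not quoted from the chapter) is the starting point for the generalizations to vectors, processes, and dynamical systems:

H(X) = -\sum_{x} p(x) \log p(x),

with the convention 0 \log 0 = 0. For example, a fair coin flip has entropy H(X) = -(1/2)\log(1/2) - (1/2)\log(1/2) = \log 2, i.e., one bit when the logarithm is taken base 2.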

Keywords

Divergence Inequality, Mutual Information, Relative Entropy, Average Mutual Information, Conditional Entropy

Copyright information

© Springer Science+Business Media New York 1990

Authors and Affiliations

  • Robert M. Gray
    Information Systems Laboratory, Electrical Engineering Department, Stanford University, Stanford, USA
