Abstract
In this chapter we consider the concept, introduced by Shannon, of the amount of mutual information between two random variables or two groups of random variables. This concept is central to information theory, whose development as an independent discipline was initiated by Shannon in 1948. The amount of mutual information is defined as the difference between the a priori and the a posteriori (conditional) entropies.
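To fix notation (a sketch using the conventional symbols, which are not defined on this page itself): writing H(X) for the a priori entropy of X and H(X | Y) for the a posteriori (conditional) entropy of X given Y, the definition in the abstract reads

\[
I(X;Y) \;=\; H(X) - H(X \mid Y) \;=\; H(X) + H(Y) - H(X,Y),
\]

where the second equality follows from the chain rule H(X,Y) = H(Y) + H(X \mid Y); in particular, the amount of mutual information is symmetric in X and Y.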
References
Pinsker, M.S.: The quantity of information about a Gaussian random stationary process, contained in a second process connected with it in a stationary manner. Dokl. Akad. Nauk SSSR 99, 213–216 (1954) (in Russian)
Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)
Shannon, C.E.: A mathematical theory of communication (Russian translation). In: Dobrushin, R.L., Lupanov, O.B. (eds.) Works on Information Theory and Cybernetics. Inostrannaya Literatura, Moscow (1963)
Copyright information
© 2020 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Belavkin, R.V., Pardalos, P.M., Principe, J.C., Stratonovich, R.L. (2020). Information in the presence of noise. Shannon’s amount of information. In: Belavkin, R., Pardalos, P., Principe, J. (eds) Theory of Information and its Value. Springer, Cham. https://doi.org/10.1007/978-3-030-22833-0_6
Print ISBN: 978-3-030-22832-3
Online ISBN: 978-3-030-22833-0