Abstract
Consider two dependent random variables (S, C), and suppose that \(\hat{\chi}\) is the optimal estimate of C when only S is known. I(S; C) measures how much S tells us about C, and \(I(\hat{\chi}; C)\) measures how much our optimal estimate \(\hat{\chi}\) tells us about C. What can we say about \(I(\hat{\chi}; C)\) if we know, for example, that I(S; C) = 3 bits? The optimality of \(\hat{\chi}\) suggests that \(I(\hat{\chi}; C)\) should also be close to 3 bits. This is the question addressed in this problem. Let (S, C) be jointly distributed according to p(s, c), where S takes values in {0, …, N−1} and C takes values in {0, …, M−1}. Let \(\hat{\chi}: \{0, \ldots, N-1\} \to \{0, \ldots, M-1\}\) denote an arbitrary function of the outcomes of S. The problem is to estimate the numbers α(N, M) defined by
\[
\alpha(N, M) = \inf_{p(s,c)\,:\,I(S;C) > 0} \; \max_{\hat{\chi}} \frac{I(\hat{\chi}(S); C)}{I(S; C)}.
\]
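Taking α(N, M) as the worst-case ratio above, a minimal Python sketch can make the quantities concrete. Everything here is our own illustration, not code from the chapter: `mutual_information` computes I(S; C) in bits from a joint distribution stored as an N × M array, and `best_estimator_ratio` brute-forces \(\max_{\hat{\chi}} I(\hat{\chi}(S); C) / I(S; C)\) over all M^N functions, which is feasible only for small N and M.

```python
from itertools import product

import numpy as np

def mutual_information(joint):
    """I(X; Y) in bits; joint[x, y] = p(x, y), a normalized 2-D array."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X, shape (N, 1)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, M)
    mask = joint > 0                        # 0 log 0 = 0 by convention
    return float((joint[mask] * np.log2(joint[mask] / (px @ py)[mask])).sum())

def best_estimator_ratio(joint):
    """max over chi: {0..N-1} -> {0..M-1} of I(chi(S); C) / I(S; C)."""
    n, m = joint.shape
    i_sc = mutual_information(joint)        # caller must ensure I(S; C) > 0
    best = 0.0
    for chi in product(range(m), repeat=n): # chi[s] = estimate when S = s
        # Joint distribution of (chi(S), C): merge the rows of p(s, c)
        # that chi maps to the same estimate.
        merged = np.zeros((m, m))
        for s in range(n):
            merged[chi[s]] += joint[s]
        best = max(best, mutual_information(merged))
    return best / i_sc
```

As a sanity check, for N = M with p(s, c) concentrated on the diagonal (S = C), the identity function achieves ratio 1, matching the data-processing bound.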
Since \(\hat{\chi}\) is a function of S, \(C \to S \to \hat{\chi}(S)\) forms a Markov chain, so the data processing inequality gives \(I(\hat{\chi}; C) \leqslant I(S; C)\) and hence α(N, M) ≤ 1. In fact, α(N, M) can be strictly less than 1, as the following example for α(3, 2) shows.
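The chapter's α(3, 2) example itself sits behind the access notice below, so as a stand-in (our own numerical probe, not the author's construction) one can continue the sketch above with a random search over joint distributions on {0, 1, 2} × {0, 1}. The smallest max-over-\(\hat{\chi}\) ratio found over the sampled distributions is an upper bound on α(3, 2), and it comes out strictly below 1:

```python
# Continues the sketch above; requires mutual_information and
# best_estimator_ratio to be in scope.
rng = np.random.default_rng(0)

smallest = 1.0                              # DPI guarantees every ratio <= 1
for _ in range(5000):
    joint = rng.random((3, 2))              # random positive 3 x 2 table
    joint /= joint.sum()                    # normalize to a joint distribution
    if mutual_information(joint) > 1e-9:    # skip (near-)independent pairs
        smallest = min(smallest, best_estimator_ratio(joint))

# The minimum over the sampled p(s, c) upper-bounds alpha(3, 2).
print(f"smallest ratio found: {smallest:.4f}")
```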
© 1987 Springer-Verlag New York Inc.
Cite this chapter
Abu-Mostafa, Y.S. (1987). Essential Average Mutual Information. In: Cover, T.M., Gopinath, B. (eds) Open Problems in Communication and Computation. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-4808-8_19
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4612-9162-6
Online ISBN: 978-1-4612-4808-8