
Abstract

Consider two dependent random variables (S, C), and suppose that \(\hat{C}\) is the optimal estimate of C when only S is known. I(S; C) is a measure of how much S tells us about C, and \(I(\hat{C}; C)\) is a measure of how much our optimal estimate \(\hat{C}\) tells us about C. What can we say about \(I(\hat{C}; C)\) if we know, for example, that I(S; C) = 3 bits? The optimality of \(\hat{C}\) suggests that \(I(\hat{C}; C)\) should also be close to 3 bits. This is what we address in this problem. Let (S, C) be jointly distributed according to p(s, c), where S takes values in {0, ..., N−1} and C takes values in {0, ..., M−1}. Let \(\hat{C}:\{0, \ldots, N-1\} \to \{0, \ldots, M-1\}\) denote an arbitrary function of the outcome of S. The problem is to estimate the numbers α(N, M) defined by

$$\alpha(N, M) = \inf_{p \,:\, I(S;C) > 0} \; \max_{\hat{C} = \hat{C}(S)} \left( \frac{I(\hat{C}; C)}{I(S; C)} \right)$$

Since \(C \to S \to \hat{C}\) is a Markov chain, the data processing inequality gives \(I(\hat{C}; C) \leqslant I(S; C)\), and hence α(N, M) ≤ 1. In fact, α(N, M) < 1 for some N and M, as shown in the following example for α(3, 2).
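
To make the inner maximization concrete, here is a minimal brute-force sketch in Python. It is not from the chapter: the joint distribution p below is a hypothetical example chosen only for illustration, and the code evaluates only the inner maximum over all M^N estimators \(\hat{C}\) for that fixed p. Computing α(N, M) itself would additionally require the infimum over all joint distributions with I(S; C) > 0.

```python
import itertools
import math

N, M = 3, 2

# Hypothetical joint distribution p[s][c] (rows index s, columns index c),
# chosen only for illustration; it is NOT the example from the chapter.
p = [
    [0.20, 0.10],
    [0.10, 0.20],
    [0.15, 0.25],
]

def mutual_information(joint):
    """I(X; Y) in bits for a joint distribution given as a nested list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

i_sc = mutual_information(p)

best_ratio, best_f = 0.0, None
# Enumerate every estimator f: {0, ..., N-1} -> {0, ..., M-1}.
for f in itertools.product(range(M), repeat=N):
    # Joint distribution of (C_hat, C) induced by C_hat = f(S).
    joint_fc = [[0.0] * M for _ in range(M)]
    for s in range(N):
        for c in range(M):
            joint_fc[f[s]][c] += p[s][c]
    ratio = mutual_information(joint_fc) / i_sc
    if ratio > best_ratio:
        best_ratio, best_f = ratio, f

print(f"I(S;C) = {i_sc:.4f} bits")
print(f"max over f of I(f(S);C)/I(S;C) = {best_ratio:.4f}, attained by f = {best_f}")
```

For N = 3 and M = 2 there are only 2^3 = 8 candidate functions, so exhaustive search is trivial. Note that any single distribution p with I(S; C) > 0 yields an upper bound on α(3, 2), since α is the infimum of this maximized ratio over p.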



Copyright information

© 1987 Springer-Verlag New York Inc.

About this chapter

Cite this chapter

Abu-Mostafa, Y.S. (1987). Essential Average Mutual Information. In: Cover, T.M., Gopinath, B. (eds) Open Problems in Communication and Computation. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-4808-8_19


  • DOI: https://doi.org/10.1007/978-1-4612-4808-8_19

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4612-9162-6

  • Online ISBN: 978-1-4612-4808-8

