
On Entropy, Information Inequalities, and Groups

  • Raymond W. Yeung
Chapter
Part of The Springer International Series in Engineering and Computer Science book series (SECS, volume 712)

Abstract

There has been significant progress in the study of entropy functions and information inequalities in the past 10 years. The set-theoretic structure of Shannon’s information measures has been established, and machine-proving of most information inequalities known to date (Shannon-type inequalities) has become possible. Most importantly, the recent discovery of a few so-called non-Shannon-type inequalities reveals the existence of information inequalities that cannot be proved by the techniques known during the first 50 years of information theory. In this expository paper, the essence of this fundamental subject is explained, a number of applications of the results are given, and their implications in information theory, probability theory, and group theory are discussed.
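To make the last point concrete, the first unconstrained non-Shannon-type inequality, discovered by Zhang and Yeung [22], states that for any four random variables X1, X2, X3, X4,

    2 I(X3; X4) ≤ I(X1; X2) + I(X1; X3, X4) + 3 I(X3; X4 | X1) + I(X3; X4 | X2).

This inequality holds for every joint distribution, yet it cannot be deduced from the nonnegativity of Shannon’s information measures alone.

The machine-proving of Shannon-type inequalities mentioned above reduces to linear programming over the elemental inequalities [20]: a linear inequality is Shannon-type if and only if its minimum over the polyhedral cone cut out by the elemental inequalities is zero. The sketch below illustrates this idea in Python for three random variables; it is a minimal illustration of the method under our own encoding and helper names, not a production prover.

    # Minimal sketch: testing whether a linear inequality is Shannon-type
    # by linear programming over the elemental inequalities (cf. [20]).
    # Illustrative only; n = 3 random variables, 0-indexed as X0, X1, X2.
    from itertools import combinations
    import numpy as np
    from scipy.optimize import linprog

    n = 3
    full = (1 << n) - 1
    idx = {s: k for k, s in enumerate(range(1, 1 << n))}  # one LP variable per h(S), S nonempty

    def vec(*terms):
        # Coefficient vector from (subset-mask, sign) pairs; h(empty set) = 0 is dropped.
        v = np.zeros(len(idx))
        for mask, sign in terms:
            if mask:
                v[idx[mask]] += sign
        return v

    # Elemental inequalities, collected as rows of E with E h >= 0.
    rows = [vec((full, 1), (full ^ (1 << i), -1)) for i in range(n)]  # H(X_i | rest) >= 0
    for i, j in combinations(range(n), 2):                            # I(X_i; X_j | X_K) >= 0
        rest = full ^ (1 << i) ^ (1 << j)
        for K in (0, rest):            # for n = 3, K is empty or the one remaining variable
            a, b = (1 << i) | K, (1 << j) | K
            rows.append(vec((a, 1), (b, 1), (a | b, -1), (K, -1)))
    E = np.array(rows)

    # Candidate inequality c . h >= 0; here subadditivity H(X0) + H(X1) >= H(X0, X1).
    c = vec((0b001, 1), (0b010, 1), (0b011, -1))

    # Minimise c . h over {h : E h >= 0}. An optimum of 0 means the candidate
    # is implied by the elemental inequalities, i.e. it is Shannon-type.
    res = linprog(c, A_ub=-E, b_ub=np.zeros(len(E)), bounds=(None, None))
    print("Shannon-type" if res.status == 0 and abs(res.fun) < 1e-9 else "not Shannon-type")

For the subadditivity candidate the optimum is 0, so the program reports it as Shannon-type. For a genuinely non-Shannon-type inequality, such as the four-variable inequality displayed above, the analogous program with n = 4 is unbounded below, which is exactly what it means for an inequality to be valid yet not provable from the elemental inequalities.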

Keywords

Mutual Information · Conditional Independence · Markov Random Field · Network Security · Entropy Function

References

[1] C. E. Shannon, “Communication theory of secrecy systems,” Bell System Technical Journal, vol. 28, pp. 656–715, Oct. 1949.
[2] F. M. Reza, An Introduction to Information Theory, McGraw-Hill, 1961.
[3] G.-D. Hu, “On the amount of information,” Teor. Veroyatnost. i Primenen., vol. 4, pp. 447–455, 1962 (in Russian).
[4] N. M. Abramson, Information Theory and Coding, McGraw-Hill, 1963.
[5] T. S. Han, “Linear dependence structure of the entropy space,” Inform. Contr., vol. 29, pp. 337–368, 1975.
[6] T. S. Han, “Nonnegative entropy measures of multivariate symmetric correlations,” Inform. Contr., vol. 36, pp. 133–156, 1978.
[7] T. S. Han, “A uniqueness of Shannon’s information distance and related nonnegativity problems,” Journal of Combinatorics, Information & System Sciences, vol. 6, no. 4, pp. 320–331, 1981.
[8] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, New York: Academic Press, and Budapest: Akadémiai Kiadó, 1981.
[9] A. Papoulis, Probability, Random Variables and Stochastic Processes, 2nd ed., McGraw-Hill, 1984.
[10] N. Pippenger, “What are the laws of information theory?” 1986 Special Problems on Communication and Computation Conference, Palo Alto, California, Sept. 3–5, 1986.
[11] T. M. Cover and J. A. Thomas, Elements of Information Theory, New York: Wiley, 1991.
[12] R. W. Yeung, “A new outlook on Shannon’s information measures,” IEEE Trans. Inform. Theory, vol. 37, pp. 466–474, May 1991.
[13] T. Kawabata and R. W. Yeung, “The structure of the I-Measure of a Markov chain,” IEEE Trans. Inform. Theory, vol. 38, pp. 1146–1149, 1992.
[14] R. M. Capocelli, A. De Santis, L. Gargano and U. Vaccaro, “On the size of shares for secret sharing schemes,” J. Cryptology, vol. 6, pp. 157–167, 1993.
[15] D. R. Stinson, “New general lower bounds on the information rate of secret sharing schemes,” Lecture Notes in Computer Science, vol. 740, pp. 168–182, 1993.
[16] F. Matúš, “Probabilistic conditional independence structures and matroid theory: Background,” Int. J. General Systems, vol. 22, pp. 185–196, 1994.
[17] F. Matúš and M. Studený, “Conditional independences among four random variables I,” Combinatorics, Probability and Computing, vol. 4, no. 3, pp. 267–278, 1995.
[18] F. Matúš, “Conditional independences among four random variables II,” Combinatorics, Probability and Computing, vol. 4, pp. 407–417, 1995.
[19] R. W. Yeung, “Multilevel diversity coding with distortion,” IEEE Trans. Inform. Theory, vol. 41, pp. 412–422, Mar. 1995.
[20] R. W. Yeung, “A framework for linear information inequalities,” IEEE Trans. Inform. Theory, vol. 43, pp. 1924–1934, Nov. 1997.
[21] Z. Zhang and R. W. Yeung, “A non-Shannon-type conditional information inequality,” IEEE Trans. Inform. Theory, vol. 43, pp. 1982–1986, Nov. 1997.
[22] Z. Zhang and R. W. Yeung, “On characterization of entropy function via information inequalities,” IEEE Trans. Inform. Theory, vol. 44, pp. 1440–1452, Jul. 1998.
[23] F. Matúš, “Conditional independences among four random variables III: Final conclusion,” Combinatorics, Probability and Computing, vol. 8, pp. 269–276, 1999.
[24] R. W. Yeung and Z. Zhang, “Distributed source coding for satellite communications,” IEEE Trans. Inform. Theory, vol. 45, pp. 1111–1120, May 1999.
[25] R. W. Yeung and Z. Zhang, “A class of non-Shannon-type information inequalities and their applications,” Communications in Information and Systems, vol. 1, no. 1, pp. 87–100, 2001 (http://www.ims.cuhk.edu.hk/~cis).
[26] T. H. Chan and R. W. Yeung, “On a relation between information inequalities and group theory,” to appear in IEEE Trans. Inform. Theory.
[27] R. W. Yeung, A First Course in Information Theory, Kluwer Academic/Plenum Publishers, 2002.
[28] F.-W. Fu and R. W. Yeung, “On the rate-distortion region for multiple descriptions,” to appear in IEEE Trans. Inform. Theory.
[29] R. W. Yeung, T. T. Lee and Z. Ye, “An information-theoretic characterization of Markov random fields and its applications,” to appear in IEEE Trans. Inform. Theory.
[30] I. Sason, “Identification of new enormous classes of non-Shannon type constrained information inequalities and related inequalities for finite groups,” submitted to 2002 IEEE International Symposium on Information Theory.
[31] K. Makarychev, Y. Makarychev, A. Romashchenko, and N. Vereshchagin, “A new class of non-Shannon-type inequalities for entropies,” submitted to Communications in Information and Systems.

Copyright information

© Springer Science+Business Media New York 2003

Authors and Affiliations

  • Raymond W. Yeung
    1. Department of Information Engineering, The Chinese University of Hong Kong, N.T., Hong Kong, China
