On the Structure of a Common Knowledge Created by Correlated Observations and Transmission over Helping Channels

  • Vladimir B. Balakirsky
Chapter

Abstract

Suppose that two individuals, person X and person Y, communicate with each other in such a way that X sends one of M_X messages to Y and, simultaneously, Y sends one of M_Y messages to X. The messages are numbered by the integers 1, ..., M_X and 1, ..., M_Y. Taking these numbers as identifiers for the corresponding messages, we consider the pairs of exchanged messages (i, j) ∈ {1, ..., M_X} × {1, ..., M_Y} as possible common values of X and Y, which describe their common knowledge. Suppose also that there is another person, called the source, who gives the same binary vector x of length n to both individuals. Then X and Y update their knowledge by including this vector, which means that they now have a triple (i, j, x) in common; if 2^n is much greater than M_X M_Y, then the total number of possible common values is also much greater. However, if the source changes the rules in such a way that x is given to X and y is given to Y, where the vectors x and y do not coincide but are correlated, then this updating of the transmitted pair of messages is no longer possible, and the individuals revert to the situation in which they can agree on only M_X M_Y common values. An alternative algorithm can be specified as follows: X and Y compute their messages as deterministic functions of their observations, and each individual, based on the vector given by the source and the message received from the other person, constructs a value belonging to some “virtual” space, which is assumed to be known to both of them and can be formally presented as a finite set Ω. The algorithm should be designed in such a way that the two constructed values coincide, i.e., are also common. We investigate this possibility and demonstrate an example in which one of 20 pairs of messages is exchanged and one of 60 pairs of vectors is given by the source, while X and Y construct one of 50 common values.
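To make the mechanism concrete, the following minimal Python sketch illustrates the general idea under simplifying assumptions of our own choosing (it is not the chapter's 20/60/50 construction): the source gives X and Y binary vectors x and y of length 7 that differ in at most one position, each individual sends as its message the 3-bit syndrome of its vector with respect to the parity-check matrix of the (7, 4) Hamming code, and each individual combines its own vector with the received message to construct the same element ω = x ⊕ y of a virtual space Ω consisting of the 8 possible difference patterns. Agreement is guaranteed because the syndrome H·e determines e whenever e has Hamming weight at most one.

    import numpy as np

    # Parity-check matrix of the (7, 4) Hamming code: the k-th column
    # (k = 1, ..., 7) is the binary representation of k, so every
    # single-position difference pattern has a distinct nonzero syndrome.
    # (The Hamming code is an illustrative assumption, not the chapter's scheme.)
    H = np.array([[(k >> b) & 1 for k in range(1, 8)] for b in range(3)])

    def message(v):
        # Deterministic message function: the 3-bit syndrome H v (mod 2).
        return tuple(H @ v % 2)

    def common_value(own_vector, received_message):
        # Each individual XORs the syndrome of its own vector with the
        # received message; the result equals H (x XOR y), which identifies
        # the difference pattern e = x XOR y of weight <= 1.
        s = [(a + b) % 2 for a, b in zip(message(own_vector), received_message)]
        k = s[0] + 2 * s[1] + 4 * s[2]   # differing position (1-based), 0 if none
        e = np.zeros(7, dtype=int)
        if k > 0:
            e[k - 1] = 1
        return tuple(e)

    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=7)       # vector given to X by the source
    e = np.zeros(7, dtype=int)
    j = rng.integers(-1, 7)              # correlation: y = x, or x flipped in one position
    if j >= 0:
        e[j] = 1
    y = (x + e) % 2                      # vector given to Y by the source

    m_x, m_y = message(x), message(y)    # exchanged messages (8 possible values each)
    omega_x = common_value(x, m_y)       # value constructed by X from (x, m_y)
    omega_y = common_value(y, m_x)       # value constructed by Y from (y, m_x)

    assert omega_x == omega_y == tuple((x + y) % 2)
    print("common value:", omega_x)

In this toy version the exchanged messages range over 8 × 8 pairs and the common value ranges over |Ω| = 8 difference patterns; the chapter's example achieves a different trade-off (20 message pairs, 60 vector pairs, 50 common values) with a more elaborate construction.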

Keywords

Bipartite Graph · Common Knowledge · Side Information · Achievable Rate · Minimal Cardinality

Copyright information

© Springer Science+Business Media New York 2000

Authors and Affiliations

  • Vladimir B. Balakirsky
  1. Electrical Engineering Department, Eindhoven University of Technology, Eindhoven, The Netherlands
  2. Data Security Association “Confident”, St. Petersburg, Russia