
On the Structure of a Common Knowledge Created by Correlated Observations and Transmission over Helping Channels

Chapter in: Numbers, Information and Complexity

Abstract

Suppose that two individuals, person X and person Y, communicate with each other in such a way that X sends one of M_X messages to Y and, simultaneously, Y sends one of M_Y messages to X. The messages are numbered by the integers 1, ..., M_X and 1, ..., M_Y. Taking these numbers as identifiers of the corresponding messages, we consider the pairs of exchanged messages (i, j) ∈ {1, ..., M_X} × {1, ..., M_Y} as possible common values of X and Y, which describe their common knowledge. Suppose also that there is another person, called the source, who gives the same binary vector x of length n to both individuals. Then X and Y update their knowledge by including this vector, which means that they now have a triple (i, j, x) in common; if 2^n is much greater than M_X M_Y, then the total number of possible common values is also much greater. However, if the source changes the rules in such a way that x is given to X and y is given to Y, where the vectors x and y do not coincide but are correlated, then this updating of the transmitted pair of messages is no longer possible, and the individuals may revert to the situation in which they can agree on only M_X M_Y common values. An alternative algorithm can be fixed as follows: X and Y compute their messages as deterministic functions of their observations, and each individual, based on the vector given by the source and the message received from the other person, constructs a value belonging to some “virtual” space, which is assumed to be common to both of them and can be formally presented as a finite set Ω. The algorithm should be designed in such a way that the constructed values are also common. We investigate this possibility and demonstrate an example in which one of 20 pairs of messages is exchanged and one of 60 pairs of vectors is given by the source, while X and Y construct one of 50 common values.
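The abstract describes only the general shape of such a protocol. The sketch below is our own minimal illustration, not the chapter's construction: we assume the source gives X a binary vector x of length 7 and gives Y a vector y that coincides with x or differs from it in exactly one position, and we use the standard Hamming(7,4) syndrome device so that the simultaneously exchanged messages let each side reconstruct the other's vector; the pair (x, y) then plays the role of the common value in Ω. All names and cardinalities here are ours and do not match the 20/60/50 example mentioned above.

    import itertools

    N = 7  # vector length

    # Parity-check matrix of the Hamming(7,4) code: column k is the binary
    # representation of k + 1, so every single-position difference between
    # the two vectors produces a distinct nonzero syndrome.
    H = [[((k + 1) >> b) & 1 for k in range(N)] for b in range(3)]

    def syndrome(v):
        """3-bit syndrome H·v over GF(2), packed into an integer 0..7."""
        bits = [sum(h * x for h, x in zip(row, v)) % 2 for row in H]
        return bits[0] | (bits[1] << 1) | (bits[2] << 2)

    def message_X(x):
        """Message sent by X to Y: one of 8 values, a deterministic function of x."""
        return syndrome(x)

    def message_Y(y):
        """Message sent by Y to X: one of 8 values, a deterministic function of y."""
        return syndrome(y)

    def common_value_X(x, j):
        """X's map into the 'virtual' space: reconstruct y from x and Y's message j."""
        d = syndrome(x) ^ j        # syndrome of the difference vector e = x XOR y
        y = list(x)
        if d:                      # d == 0 means the vectors coincide
            y[d - 1] ^= 1          # syndrome d points at the differing position d - 1
        return tuple(x), tuple(y)  # agreed common value: the pair (x, y)

    def common_value_Y(y, i):
        """Y's map into the 'virtual' space: reconstruct x from y and X's message i."""
        d = syndrome(y) ^ i
        x = list(y)
        if d:
            x[d - 1] ^= 1
        return tuple(x), tuple(y)

    # Agreement check over every admissible pair (x, y): y equals x or differs
    # from x in exactly one position.
    for x in itertools.product([0, 1], repeat=N):
        for flip in range(N + 1):  # flip position 0..N-1, or N for "no difference"
            y = list(x)
            if flip < N:
                y[flip] ^= 1
            i, j = message_X(x), message_Y(y)
            assert common_value_X(x, j) == common_value_Y(y, i)
    print("X and Y agree on a common value for every admissible pair (x, y)")

In this toy instance each side sends one of only 8 messages, yet the agreed common value ranges over the 1024 admissible pairs (x, y), far more than the 64 possible message pairs; this illustrates, under our assumptions, the kind of enlargement of the common-value set that the chapter studies.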

On leave from the Data Security Association “Confident”, 193060 St. Petersburg, Russia.

The work was supported by the University of Bielefeld (Germany) and the Eindhoven University of Technology (The Netherlands). The author is grateful to Professor Rudolf Ahlswede and Professor Imre Csiszár for helpful and stimulating discussions, which essentially influenced this research and the presentation of its results. The help of Dr. Roger Bultitude in preparing the manuscript is also highly appreciated.




Copyright information

© 2000 Springer Science+Business Media New York

About this chapter

Cite this chapter

Balakirsky, V.B. (2000). On the Structure of a Common Knowledge Created by Correlated Observations and Transmission over Helping Channels. In: Althöfer, I., et al. Numbers, Information and Complexity. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-6048-4_28


  • DOI: https://doi.org/10.1007/978-1-4757-6048-4_28

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4419-4967-7

  • Online ISBN: 978-1-4757-6048-4

