
Some Methods in Multi-User Communication: A Tutorial Survey

  • János Körner
Part of the International Centre for Mechanical Sciences book series (CISM, volume 219)

Abstract

Just two years ago, in his survey paper on Shannon theory, Aaron Wyner [1] described multi-user communication as the most dynamic and exciting area in Shannon's information theory. I feel that recent progress in the field shows how right Wyner really was.

Keywords

Rate Region · Side Information · Channel Code · Broadcast Channel · Direct Part


References

  1. Wyner, A.D.: Recent results in the Shannon theory. IEEE Trans. Inform. Theory, vol. 20, no. 1, Jan. 1974, pp. 2–10.
  2. Shannon, C.E.: Two-way communication channels. Proc. 4th Berkeley Symposium on Math. Statistics and Probability, vol. I, pp. 611–644.
  3. Ahlswede, R.: Multi-way communication channels. Transactions of the 2nd International Symposium on Information Theory, Tsahkadsor, 1971, pp. 23–52.
  4. Van der Meulen, E.: The discrete memoryless channel with two senders and one receiver. Presented at the 2nd International Symposium on Information Theory, Tsahkadsor, Armenian SSR, 1971.
  5. Cover, T.: Broadcast channels. IEEE Trans. Inform. Theory, vol. 18, no. 1, Jan. 1972, pp. 2–14.
  6. Slepian, D. and Wolf, J.K.: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory, vol. 19, no. 4, July 1973, pp. 471–480.
  7. Gács, P. and Körner, J.: Common information is far less than mutual information. Problems of Control and Information Theory, vol. 2, no. 2, pp. 149–162, 1973.
  8. Wyner, A.D. and Ziv, J.: A theorem on the entropy of certain binary sequences and applications I–II. IEEE Trans. Inform. Theory, vol. 19, no. 6, Nov. 1973, pp. 769–778.
  9. Körner, J. and Longo, G.: Two-step encoding for finite sources. IEEE Trans. Inform. Theory, vol. 19, no. 6, Nov. 1973, pp. 778–782.
  10. Wolfowitz, J.: Coding Theorems of Information Theory. 2nd edition, Springer-Verlag, Berlin, 1964.
  11. Ahlswede, R., Gács, P. and Körner, J.: Bounds on conditional probabilities with applications in multi-user communication. To appear in Z. Wahrscheinlichkeitstheorie verw. Geb.
  12. Margulis, G.A.: Probabilistic characteristics of graphs with large connectivity (in Russian). Probl. Peredachi Inform., vol. X, no. 2, pp. 101–108.
  13. Ahlswede, R.: On two-way communication channels and a problem by Zarankiewicz. Transactions of the 6th Prague Conference on Information Theory etc., held in 1971. Academia, Prague, 1973.
  14. Wolf, J.K.: The AEP property of random sequences and applications to information theory. This volume, pp. 125–156.
  15. Cover, T.M.: A proof of the data-compression theorem of Slepian and Wolf for ergodic sources. IEEE Trans. Inform. Theory, vol. 21, no. 2, March 1975, pp. 226–228.
  16. Wolfowitz, J.: Approximation with a fidelity criterion. Proc. Fifth Berkeley Symposium on Math. Statistics and Probability, University of California Press, Berkeley, 1967, vol. I, pp. 565–575.
  17. Körner, J. and Marton, K.: A source network problem involving the comparison of two channels I. Submitted to IEEE Trans. Inform. Theory.
  18. Wolf, J.K.: Data reduction for multiple correlated sources. Proceedings of the Fifth Colloquium on Microwave Communication, Budapest, June 24–30, 1974, pp. 287–295.
  19. Ahlswede, R. and Körner, J.: Source coding with side information and a converse for degraded broadcast channels. IEEE Trans. Inform. Theory, vol. 21, no. 6, Nov. 1975.
  20. Wyner, A.D.: On source coding with side information at the decoder. IEEE Trans. Inform. Theory, vol. 21, no. 3, May 1975.
  21. Gray, R.M. and Wyner, A.D.: Source coding for a simple network. Bell System Technical Journal, vol. 53, pp. 1681–1721, Nov. 1974.
  22. Ahlswede, R. and Körner, J.: On common information and related characteristics of correlated information sources. Presented at the 7th Prague Conference on Information Theory, Sept. 1974. To appear in IEEE Trans. Inform. Theory.
  23. Körner, J. and Marton, K.: Paper in preparation.
  24. Gallager, R.G.: Information Theory and Reliable Communication. §6.2, Wiley, New York, 1968.
  25. Körner, J. and Marton, K.: A source network problem involving the comparison of two channels. Presented at the Keszthely Colloquium on Information Theory, August 1975, Hungary. Submitted to IEEE Trans. Inform. Theory.
  26. Csiszár, I.: Oral communication.
  27. Feinstein, A.: A new basic theorem of information theory. IRE Trans. PGIT, Sept. 1954, pp. 2–22.
  28. Ahlswede, R. and Gács, P.: Spreading of sets in product spaces and exponent contraction of the Markov operator. To appear in Ann. Probability.
  29. Witsenhausen, H.S.: On sequences of pairs of dependent random variables. SIAM J. Appl. Math., vol. 28, pp. 100–113, Jan. 1975.
  30. Wyner, A.D.: The common information of two dependent random variables. IEEE Trans. Inform. Theory, vol. 21, pp. 163–179, March 1975.
  31. Ahlswede, R. and Körner, J.: On the connections between the entropies of input and output distributions of discrete memoryless channels. Transactions of the Brasov Conference on Probability Theory, 1974.
  32. Witsenhausen, H.S.: Entropy inequalities for discrete channels. IEEE Trans. Inform. Theory, vol. 20, no. 5, Sept. 1974, pp. 610–616.
  33. Ahlswede, R.: The capacity region of a channel with 2 senders and 2 receivers. Ann. Probability, vol. 2, no. 5, Oct. 1974, pp. 805–814.
  34. Ulrey, M.: A coding theorem for a channel with s senders and r receivers. Inform. and Control, to appear.
  35. Slepian, D. and Wolf, J.K.: A coding theorem for multiple access channels with correlated sources. Bell System Technical Journal, vol. 52, pp. 1037–1076, Sept. 1973.
  36. Arutjunjan, E.A.: Lower bound on error probability for multiple access channels. Probl. Peredachi Informacii, vol. XI, no. 2, pp. 23–37.
  37. Bergmans, P.P.: Degraded Broadcast Channels. Ph.D. dissertation, Stanford, June 1972.
  38. Bergmans, P.P.: Coding theorem for broadcast channels with degraded components. IEEE Trans. Inform. Theory, vol. 19, no. 2, March 1973, pp. 197–207.
  39. Gallager, R.G.: Capacity and coding for degraded broadcast channels. Probl. Peredachi Informacii, vol. 10, no. 3, pp. 3–14, July–Sept. 1974.
  40. Körner, J. and Marton, K.: A source network problem involving the comparison of two channels II. Presented at the Keszthely Colloquium on Information Theory, August 1975, Hungary. Submitted to the Transactions of the Colloquium.
  41. Van der Meulen, E.C.: Random coding theorems for the general discrete memoryless broadcast channel. IEEE Trans. Inform. Theory, vol. 21, no. 2, March 1975, pp. 180–190.
  42. Cover, T.M.: An achievable rate region for the broadcast channel. IEEE Trans. Inform. Theory, vol. 21, no. 4, July 1975, pp. 399–404.
  43. Körner, J. and Marton, K.: General broadcast channels with degraded message sets. Preprint. Submitted to IEEE Trans. Inform. Theory.

Copyright information

© Springer-Verlag Wien 1975

Authors and Affiliations

  • János Körner
  1. Mathematical Institute of the Hungarian Academy of Sciences, Hungary
