
On the optimum rate of transmitting information

  • J. H. B. Kemperman
Conference paper
Part of the Lecture Notes in Mathematics book series (LNM, volume 89)

Keywords

Probability Measure · Additive Noise · Normal Type · Equality Sign · Noisy Channel


References

  [1] U. AUGUSTIN, Gedächtnisfreie Kanäle für diskrete Zeit, Z. Wahrscheinlichkeitstheorie vol. 6 (1966) 10–61.
  [2] D. BLACKWELL, L. BREIMAN AND A. J. THOMASIAN, The capacity of a class of channels, Ann. Math. Stat. vol. 30 (1959) 1229–1241.
  [3] I. CSISZÁR, A note on Jensen's inequality, Studia Scient. Math. Hungarica vol. 1 (1966) 185–188.
  [4] R. M. FANO, Statistical theory of communication, Lecture notes, Massachusetts Inst. Techn., 1952.
  [5] R. M. FANO, Transmission of information, M.I.T. Press and John Wiley and Sons, New York, 1961.
  [6] A. FEINSTEIN, A new basic theorem of information theory, IRE Trans. PGIT vol. 1 (1954) 2–22.
  [7] A. FEINSTEIN, Foundations of information theory, McGraw-Hill, New York, 1958.
  [8] W. FELLER, An introduction to probability theory and its applications, vol. II, John Wiley and Sons, New York, 1966.
  [9] R. G. GALLAGER, A simple derivation of the coding theorem and some applications, IEEE Trans. Inform. Theory vol. IT-11 (1965) 3–18.
  [10] H. JEFFREYS, Theory of probability, second edition, Oxford University Press, Oxford, 1948.
  [11] J. H. B. KEMPERMAN, Upper and lower bounds on the length of the longest code, Abstract, Notices Amer. Math. Soc. vol. 7 (1960) 924.
  [12] J. H. B. KEMPERMAN, Studies in coding theory I, Mimeographed Report, 94 pp., University of Rochester, 1962.
  [13] A. I. KHINCHIN, Mathematical foundations of information theory, Dover Publications, New York, 1957.
  [14] S. KOTZ, Recent results in information theory, Methuen and Co., London, 1966; also published in J. Appl. Prob. vol. 3 (1966) 1–93.
  [15] S. KULLBACK AND R. A. LEIBLER, On information and sufficiency, Ann. Math. Stat. vol. 22 (1951) 79–86.
  [16] H. P. McKEAN, Jr., Speed of approach to equilibrium for Kac's caricature of a Maxwellian gas, Archive Rat. Mech. Anal. vol. 21 (1966) 343–367.
  [17] B. McMILLAN, The basic theorems of information theory, Ann. Math. Stat. vol. 24 (1953) 196–219.
  [18] M. S. PINSKER, Information and information stability of random variables and processes, translated and edited by A. Feinstein, Holden Day, San Francisco, 1964.
  [19] C. E. SHANNON, A mathematical theory of communication, Bell System Tech. J. vol. 27 (1948) 379–423; 623–656.
  [20] C. E. SHANNON, Certain results in coding theory for noisy channels, Inform. and Control vol. 1 (1957) 6–25.
  [21] C. E. SHANNON, R. G. GALLAGER AND E. R. BERLEKAMP, Lower bounds to error probability for coding on discrete memoryless channels I, Inform. and Control vol. 10 (1967) 65–103.
  [22] V. STRASSEN, Asymptotische Abschätzungen in Shannons Informationstheorie, pp. 1–35 of the Transactions Third Prague Conference on Information Theory, Publishing House Czechoslovak Academy of Sciences, Prague, 1964.
  [23] L. WEISS, On the strong converse of the coding theorem for symmetric channels without memory, Quart. Appl. Math. vol. 18 (1960) 209–214.
  [24] J. WOLFOWITZ, The coding of messages subject to chance errors, Illinois J. Math. vol. 1 (1957) 591–606.
  [25] J. WOLFOWITZ, Strong converse of the coding theorem for semicontinuous channels, Illinois J. Math. vol. 3 (1959) 477–489.
  [26] J. WOLFOWITZ, Coding theorems of information theory, Springer-Verlag, New York, 1961.
  [27] J. WOLFOWITZ, Coding theorems of information theory, second edition, Springer-Verlag, New York, 1964.

Copyright information

© Springer-Verlag 1969

Authors and Affiliations

  • J. H. B. Kemperman
  1. University of Rochester, Rochester, USA
