On the optimum rate of transmitting information

  • Conference paper
  • First Online:
Probability and Information Theory

Part of the book series: Lecture Notes in Mathematics (LNM, volume 89)

Author information

J. H. B. Kemperman

Editor information

M. Behara, K. Krickeberg, J. Wolfowitz

Copyright information

© 1969 Springer-Verlag

About this paper

Cite this paper

Kemperman, J.H.B. (1969). On the optimum rate of transmitting information. In: Behara, M., Krickeberg, K., Wolfowitz, J. (eds) Probability and Information Theory. Lecture Notes in Mathematics, vol 89. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0079123

  • DOI: https://doi.org/10.1007/BFb0079123

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-04608-0

  • Online ISBN: 978-3-540-36098-8

  • eBook Packages: Springer Book Archive
