
Fundamentals of Classical Information Theory


Part of the book series: Theoretical and Mathematical Physics ((TMP))

Abstract

In this chapter, we briefly review the basic facts of classical information communication processes. The fundamental aspects of information theory according to Shannon (Bell Syst. Tech. J. 27:379–423, 623–656, 1948) comprise the following concepts: the message; entropy, which describes the amount of information; the communication channel; mutual entropy; coding; and the capacity of the channel. We also discuss several coding theorems, which are among the central results of classical information theory.
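The quantities listed in the abstract (entropy, mutual entropy, channel capacity) can be made concrete in a minimal sketch. This is an illustration of the standard definitions, not code from the chapter; the function names and the binary symmetric channel example are our own assumptions:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i, in bits.
    Terms with p_i = 0 contribute nothing, by convention 0 log 0 = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """Mutual entropy I(X;Y) = H(X) + H(Y) - H(X,Y),
    computed from a joint distribution given as a matrix p(x, y)."""
    px = [sum(row) for row in joint]              # marginal of X
    py = [sum(col) for col in zip(*joint)]        # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

def bsc_capacity(e):
    """Capacity of a binary symmetric channel with crossover probability e:
    C = 1 - H(e), attained by a uniform input distribution."""
    return 1.0 - entropy([e, 1.0 - e])
```

For example, a fair coin has entropy 1 bit, a noiseless binary channel (`e = 0`) has capacity 1 bit per use, and a completely noisy one (`e = 0.5`) has capacity 0.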


References

  1. Arndt, C.: Information Measures. Springer, Berlin (2001)

  2. Ash, R.: Information Theory. Wiley, New York (1965)

  3. Billingsley, P.: Ergodic Theory and Information. Wiley, New York (1968)

  4. Breiman, L.: On achieving channel capacity in finite-memory channels. Ill. J. Math. 4, 246–252 (1960)

  5. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, New York (1991)

  6. Feinstein, A.: Foundations of Information Theory. McGraw-Hill, New York (1958)

  7. Feinstein, A.: On the coding theorem and its converse for finite memory channels. Inf. Control 2, 25–44 (1959)

  8. Gelfand, I.M., Yaglom, A.M.: Calculation of the amount of information about a random function contained in another such function. Am. Math. Soc. Transl. 12, 199–246 (1959)

  9. Guiasu, S.: Information Theory with Applications. McGraw-Hill, New York (1977)

  10. Ihara, S.: Stochastic Process and Entropy. Iwanami, Tokyo (1984) (in Japanese)

  11. Ingarden, R.S.: Simplified axioms for information without probability. Pr. Mat. 9, 273–282 (1965)

  12. Ingarden, R.S., Kossakowski, A., Ohya, M.: Information Dynamics and Open Systems. Kluwer Academic, Dordrecht (1997)

  13. Jumarie, G.: Relative Information. Springer, Berlin (1990)

  14. Khinchin, A.I.: Mathematical Foundations of Information Theory. Dover, New York (1958) (English translation)

  15. Kullback, S., Leibler, R.: On information and sufficiency. Ann. Math. Stat. 22, 79–86 (1951)

  16. Kunisawa, K., Umegaki, H.: Progress of Information Theory. Iwanami, Tokyo (1965) (in Japanese)

  17. Kunisawa, K.: Information Theory, vol. I. Kyoritsu, Tokyo (1983) (in Japanese)

  18. McEliece, R.J.: The Theory of Information and Coding. Addison-Wesley, Reading (1977)

  19. McMillan, B.: The basic theorem of information theory. Ann. Math. Stat. 24, 196–219 (1953)

  20. Ohya, M., Watanabe, N.: A new treatment of communication processes with Gaussian channels. Jpn. J. Appl. Math. 3(1), 197–206 (1986)

  21. Parthasarathy, K.R.: On the integral representation of the rate of transmission of a stationary channel. Ill. J. Math. 5, 299–305 (1961)

  22. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)

  23. Taki, Y.: Information Theory, vol. I. Iwanami, Tokyo (1978) (in Japanese)

  24. Umegaki, H.: A functional method on amount of entropy. Kodai Math. Semin. Rep. 15, 162–175 (1963)

  25. Umegaki, H.: General treatment of alphabet message space and integral representation of entropy. Kodai Math. Semin. Rep. 16, 8–26 (1964)

  26. Umegaki, H., Ohya, M.: Probabilistic Entropy. Kyoritsu, Tokyo (1983) (in Japanese)


Author information

Correspondence to Masanori Ohya.


Copyright information

© 2011 Springer Science+Business Media B.V.

About this chapter

Cite this chapter

Ohya, M., Volovich, I. (2011). Fundamentals of Classical Information Theory. In: Mathematical Foundations of Quantum Information and Computation and Its Applications to Nano- and Bio-systems. Theoretical and Mathematical Physics. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-0171-7_6
