Definition of information and entropy in the absence of noise

Abstract

In modern science, engineering and public life, information and the operations associated with it play a major role: the reception, transmission, processing and storage of information. The significance of information has seemingly outgrown that of the other important factor, energy, which played the dominant role in the previous century.


Notes

  1. 'nat' is short for 'natural digit' and denotes the natural unit of information (logarithms taken to base e).

  2. 'bit' is short for 'binary digit' and denotes the binary unit (sign) of information (logarithms taken to base 2).

  3. Within information theory, Boltzmann's entropy is commonly referred to as 'Shannon's entropy', or simply 'entropy'.
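The relation between the two units in the notes above can be checked with a short computation: the entropy of a discrete distribution measured in nats (base-e logarithm) differs from its value in bits (base-2 logarithm) exactly by a factor of ln 2, so 1 nat ≈ 1.4427 bits. The sketch below (the `entropy` helper is illustrative, not from the chapter) makes this concrete.

```python
import math

def entropy(p, base=2):
    """Shannon (Boltzmann) entropy of a discrete distribution.

    base=2 gives the result in bits; base=math.e gives nats.
    Zero-probability outcomes contribute nothing, by convention.
    """
    return -sum(x * math.log(x, base) for x in p if x > 0)

p = [0.5, 0.25, 0.25]
h_bits = entropy(p, base=2)       # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 bits
h_nats = entropy(p, base=math.e)  # same quantity in natural units

# The two measures differ only by the constant ln 2: 1 bit = ln 2 nats.
assert abs(h_nats - h_bits * math.log(2)) < 1e-12
```

The choice of base is thus purely a choice of unit, not of substance; the chapter's results hold in either convention up to this multiplicative constant.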



Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Belavkin, R.V., Pardalos, P.M., Principe, J.C., Stratonovich, R.L. (2020). Definition of information and entropy in the absence of noise. In: Belavkin, R., Pardalos, P., Principe, J. (eds) Theory of Information and its Value. Springer, Cham. https://doi.org/10.1007/978-3-030-22833-0_1
