Information Theory as the Science of Literal Communication

Abstract

Chapter 4 is devoted to information theory as the science of literal communication. It begins by describing Shannon’s paradigm, which identifies the actors of any communication: (1) the source, which generates a message; (2) the channel, which propagates the message; and (3) the destination, which receives it. Matching these entities to one another requires devices that transform the message by coding, of which there are two main types. Source coding shortens the message that the source delivers. Channel coding protects it against the symbol errors that occur in the channel, which demands lengthening the message. Both are assumed to be exactly reversible. Quantitative measures of information are defined, based on the improbability of symbols and messages. The source entropy measures the average quantity of information borne by each symbol of the message the source delivers. The channel capacity measures the largest quantity of information that the channel can transfer. Two fundamental theorems state that source coding can reduce the message length down to a limit set by the source entropy, and that errorless communication is possible in the presence of symbol errors, but only provided that the source entropy is less than the channel capacity. A normalized version of Shannon’s paradigm assumes that the message is transformed by source coding followed by channel coding, both achieving their theoretical limit. A simple proof of the fundamental source coding theorem is presented, and the Huffman source coding algorithm is described. Comments about source coding help clarify the very concept of information and its relationship with semantics.
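To make the source entropy, the source coding theorem and the Huffman algorithm mentioned above concrete, here is a minimal Python sketch. It is not taken from the chapter: the function names, the example message ‘ABRACADABRA’, and the symbol probabilities estimated from its letter counts are assumptions made only for illustration. The sketch builds a binary Huffman code by repeatedly merging the two least probable subtrees, then compares the mean codeword length with the source entropy.

    import heapq
    from collections import Counter
    from math import log2

    def entropy(probs):
        """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
        return -sum(p * log2(p) for p in probs if p > 0)

    def huffman_code(freqs):
        """Build a binary Huffman code for a {symbol: frequency} mapping."""
        # Heap entries are (weight, tie_breaker, {symbol: codeword so far});
        # the integer tie_breaker keeps tuple comparison away from the dicts.
        heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + cw for s, cw in c1.items()}
            merged.update({s: "1" + cw for s, cw in c2.items()})
            heapq.heappush(heap, (w1 + w2, next_id, merged))
            next_id += 1
        return heap[0][2]

    message = "ABRACADABRA"                  # hypothetical example message
    freqs = Counter(message)
    n = len(message)
    code = huffman_code(freqs)
    H = entropy([f / n for f in freqs.values()])
    mean_len = sum(freqs[s] * len(code[s]) for s in freqs) / n
    print("codewords:", code)
    print(f"source entropy   H = {H:.3f} bits/symbol")
    print(f"mean code length L = {mean_len:.3f} bits/symbol")

For a source coded symbol by symbol, the mean length L obtained in this way satisfies H ≤ L < H + 1 bits per symbol, in agreement with the limit set by the source coding theorem.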

Notes

  1. Such a function is generally referred to as ‘concave’ in the mathematical literature. We prefer the single word ‘convex’, with the shape of its representative curve indicated by ∩ or ∪.
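     For concreteness, the defining inequality (added here, not part of the original note) is: a function f is ∩-convex when

     $$ f\bigl(\lambda x + (1-\lambda)\,y\bigr) \;\ge\; \lambda f(x) + (1-\lambda)\,f(y), \qquad 0 \le \lambda \le 1, $$

     and ∪-convex when the inequality is reversed. The binary entropy function H(p) = −p log₂ p − (1 − p) log₂(1 − p), for instance, is ∩-convex.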

  2. Analog means for copying multidimensional objects exist, but they are only approximate, so they become unreliable when used repeatedly.

  3. It may even be thought of as a 4-dimensional object, since folding the assembled polypeptide chain into a 3-dimensional molecule involves time.

References

  • Battail, G. (1990). Codage de source adaptatif par l’algorithme de Guazzo. Annales des Télécommunications, 45(11–12), 677–693.

  • Battail, G. (1997). Théorie de l’information. Paris: Masson.

  • Battail, G. (2009). Living versus inanimate: the information border. Biosemiotics, 2(3), 321–341. doi:10.1007/s12304-009-9059-z.

  • Brillouin, L. (1956). Science and information theory. New York: Academic Press.

  • Cover, T. M., & Thomas, J. A. (1991). Elements of information theory. New York: Wiley.

  • Gallager, R. G. (1968). Information theory and reliable communication. New York: Wiley.

  • Gallager, R. G. (1978). Variations on a theme by Huffman. IEEE Transactions on Information Theory, IT-24(6), 668–674.

  • Guazzo, M. (1980). A general minimum-redundancy source-coding algorithm. IEEE Transactions on Information Theory, IT-26(1), 15–25.

  • Huffman, D. A. (1952). A method for the construction of minimum redundancy codes. Proceedings of the IRE, 40, 1098–1101.

  • Jaynes, E. T. (1957). Information theory and statistical mechanics I & II. Physical Review, 106/108, 620–630/171–190.

  • Johnson, R. W., & Shore, J. E. (1983). Comments on and correction to ‘Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy’. IEEE Transactions on Information Theory, IT-29(6), 942–943.

  • Khinchin, A. I. (1957). Mathematical foundations of information theory. New York: Dover.

  • Kullback, S. (1959). Information theory and statistics. New York: Wiley.

  • McMillan, B. (1953). The basic theorems of information theory. Annals of Mathematical Statistics, 24, 196–219. (Reprinted in Slepian 1974, pp. 57–80).

  • Moher, M. (1993). Decoding via cross-entropy minimization. Proceedings of GLOBECOM'93 (pp. 809–813), Houston, U.S.A.

  • von Neumann, J. (1966). Theory of self-reproducing automata (edited and completed by A. W. Burks). Urbana and London: University of Illinois Press.

  • Pattee, H. (2005). The physics and metaphysics of biosemiotics. Journal of Biosemiotics, 1(1), 281–301. (Reprinted in Favareau 2010, pp. 524–540).

  • Rissanen, J. J. (1976). Generalized Kraft inequality and arithmetic coding. IBM Journal of Research & Development, 20(3), 198–203.

  • Rissanen, J. J., & Langdon, G. G. Jr. (1979). Arithmetic coding. IBM Journal of Research & Development, 23(2), 149–162.

  • Roubine, E. (1970). Introduction à la théorie de la communication, tome III: théorie de l’information. Paris: Masson.

  • Schrödinger, E. (1943). What is life? and Mind and matter. London: Cambridge University Press (1967).

  • Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–457, 623–656. (Reprinted in Shannon and Weaver 1949, in Sloane and Wyner 1993, pp. 5–83, and in Slepian 1974, pp. 5–29).

  • Shannon, C. E., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.

  • Shore, J. E., & Johnson, R. W. (1980). Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory, IT-26(1), 26–37.

  • Shore, J. E., & Johnson, R. W. (1981). Properties of cross-entropy minimization. IEEE Transactions on Information Theory, IT-27(4), 472–482.

  • Slepian, D. (Ed.). (1974). Key papers in the development of information theory. Piscataway: IEEE Press.

  • Sloane, N. J. A., & Wyner, A. D. (Eds.). (1993). Claude Elwood Shannon, collected papers. Piscataway: IEEE Press.

  • Yockey, H. P. (1992). Information theory and molecular biology. Cambridge: Cambridge University Press.

  • Ziv, J., & Lempel, A. (1978). Compression of individual sequences via variable-rate coding. IEEE Transactions on Information Theory, IT-24(5), 530–536.

Author information

Correspondence to Gérard Battail.

Copyright information

© 2014 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Battail, G. (2014). Information Theory as the Science of Literal Communication. In: Information and Life. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7040-9_4
