In the context of communication theory, the concept of a quantity of information is defined as a function over the elements of a set S of messages. In coding these messages in some alphabet for transmission over a noise-free channel, the greatest average transmission rate is achieved if the length L_i of the i-th message is set equal (within the error of representation by integers) to log_k(1/p_i), where k is the number of letters in the alphabet and p_i is the statistical probability of the i-th message, i.e. the relative frequency of its transmission.¹ For simplicity in theoretical discussion one usually chooses k = 2, thinking of a binary alphabet.




  1. C. E. Shannon, A Mathematical Theory of Communication (Monograph B-1598, Bell Telephone System Technical Publications), New York 1948.
  2. Aristotle, De Anima, 402b.
  3. For a brief development, see D. Hawkins, ‘On Chance and Choice’, Reviews of Modern Physics 36 (1964) 512–516.
  4. Charles Hartshorne and Paul Weiss (eds.), Collected Papers of Charles Peirce, vol. II: Elements of Logic, Cambridge 1932, Ch. V.
  5. See Robert Fano, ‘A Heuristic Discussion of Probabilistic Decoding’, IEEE Transactions on Information Theory, Vol. II (1964), pp. 64–74. For a general discussion of tree searching, see David Hawkins, The Language of Nature, San Francisco 1964, pp. 239–244.
  6. See especially the preface, ‘On Philosophy in General’, in The Critique of Judgement.
  7. Egon Brunswik, The Conceptual Framework of Psychology, Chicago 1952.

Copyright information

© D. Reidel Publishing Company / Dordrecht-Holland 1967

Authors and Affiliations

  • David Hawkins
    1. University of Colorado, USA
