Limited role of entropy in information economics

  • Jacob Marschak
Economic Models
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4)


"Information transmitted" is defined as the amount by which added evidence (or "message received") diminishes "uncertainty". The latter is characterized by some properties intuitively suggested by this word and possessed by conditional entropy, a parameter of the posterior probability distribution. However, conditional entropy shares these properties with all other concave symmetric functions on the probability space.

Moreover, a given transmission channel (or, in the context of statistical inference, a given experiment) yields a higher maximum expected benefit than another, to every user, if and only if the expected values of all concave functions of the posterior probability vector are lower for the former channel (or experiment). Hence one information system (channel, experiment) may be preferable to another for a given user although its transmission rate, in entropy terms, is lower.
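A small numerical illustration of the last sentence (my own construction, not an example from the paper — the payoff matrix and channel parameters are invented): channel B below transmits fewer bits than channel A in the mutual-information sense, yet is worth strictly more to a particular decision maker.

```python
import math

def mutual_information(prior, channel):
    """I(theta; y) in bits, where channel[theta][y] = P(y | theta)."""
    signals = {y for row in channel.values() for y in row}
    info = 0.0
    for y in signals:
        py = sum(prior[t] * channel[t].get(y, 0.0) for t in prior)
        for t in prior:
            pty = prior[t] * channel[t].get(y, 0.0)
            if pty > 0:
                info += pty * math.log2(pty / (prior[t] * py))
    return info

def decision_value(prior, channel, payoff):
    """Maximum expected payoff: observe y, form the posterior, pick the best act."""
    signals = {y for row in channel.values() for y in row}
    value = 0.0
    for y in signals:
        py = sum(prior[t] * channel[t].get(y, 0.0) for t in prior)
        if py == 0:
            continue
        post = {t: prior[t] * channel[t].get(y, 0.0) / py for t in prior}
        value += py * max(sum(post[t] * payoff[a][t] for t in prior)
                          for a in payoff)
    return value

prior = {0: 0.5, 1: 0.5}
# "invest" pays 4 if theta = 1 and -1 if theta = 0; "pass" pays 0.
payoff = {"invest": {0: -1.0, 1: 4.0}, "pass": {0: 0.0, 1: 0.0}}

# Channel A: binary symmetric channel with crossover probability 0.25.
A = {0: {"0": 0.75, "1": 0.25}, 1: {"0": 0.25, "1": 0.75}}
# Channel B: occasionally reveals theta = 0 for certain, otherwise stays silent.
B = {0: {"low": 0.2, "?": 0.8}, 1: {"?": 1.0}}

# B's entropy rate is lower (about 0.108 vs 0.189 bits), yet this user
# gets expected payoff 1.6 from B against only 1.5 from A.
print(mutual_information(prior, A), decision_value(prior, A, payoff))
print(mutual_information(prior, B), decision_value(prior, B, payoff))
```

Channel A's signals never push the posterior below this user's investment threshold, so A is worth no more than no information at all, whereas B's rare but decisive "low" signal is exactly what this payoff structure rewards.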

But only entropy has the economically relevant property of measuring, in the limit, the expected length of efficiently coded messages sent in long sequences. Thus, while irrelevant to the value (maximum expected benefit) of an information system and to the costs of observing, estimating, and deciding, entropy formulas are indeed relevant to the cost of communicating, i.e., of storing, coding and transmitting messages.
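The coding-theoretic role of entropy can be sketched with Huffman codes (used here as a stand-in for any optimal prefix code; this is standard source-coding theory, not material from the paper): the expected length of an optimal code lies within one bit of the entropy, and coding longer blocks of symbols drives the per-symbol length down toward it.

```python
import heapq
import itertools
import math

def entropy(probs):
    """Shannon entropy of a distribution, in bits."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) binary prefix code."""
    heap = [(q, i, [i]) for i, q in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tag = len(probs)  # unique tie-breaker for merged nodes
    while len(heap) > 1:
        q1, _, members1 = heapq.heappop(heap)
        q2, _, members2 = heapq.heappop(heap)
        for i in members1 + members2:
            lengths[i] += 1  # every symbol under the merge gains one bit
        heapq.heappush(heap, (q1 + q2, tag, members1 + members2))
        tag += 1
    return lengths

def expected_length(probs):
    return sum(q, l_ := 0) if not probs else sum(
        q * l for q, l in zip(probs, huffman_lengths(probs)))

p = [0.7, 0.15, 0.1, 0.05]
H = entropy(p)

# Code symbols one at a time, then in blocks of two i.i.d. symbols:
L1 = expected_length(p)
p2 = [a * b for a, b in itertools.product(p, p)]
L2 = expected_length(p2) / 2  # per original symbol

print(H, L1, L2)  # both lie in [H, H + 1); L2 is no farther from H than L1
```

Longer blocks shrink the per-symbol overhead toward zero, which is exactly the limiting sense in which entropy measures the cost of communicating.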







Copyright information

© Springer-Verlag Berlin Heidelberg 1973

Authors and Affiliations

  • Jacob Marschak
    1. University of California, Los Angeles
