Limited role of entropy in information economics
"Information transmitted" is defined as the amount by which added evidence (or "message received") diminishes "uncertainty". The latter is characterized by some properties intuitively suggested by this word and possessed by conditional entropy, a parameter of the posterior probability distribution. However, conditional entropy shares these properties with all other concave symmetric functions on the probability space.
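This non-uniqueness is easy to check numerically. The sketch below (illustrative code, not part of the original paper) compares Shannon entropy with another concave symmetric index, the quadratic 1 − Σpᵢ², and verifies that both satisfy the intuitive "uncertainty" properties: symmetry, zero at certainty, and larger values for more spread-out distributions.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def quadratic_uncertainty(p):
    """Another concave, symmetric uncertainty index: 1 - sum of p_i squared."""
    return 1 - sum(x * x for x in p)

for u in (entropy, quadratic_uncertainty):
    assert u([0.5, 0.5]) > u([0.9, 0.1])               # more spread -> more uncertain
    assert abs(u([1.0, 0.0])) < 1e-12                  # certainty -> zero uncertainty
    assert abs(u([0.3, 0.7]) - u([0.7, 0.3])) < 1e-12  # symmetry
```

Any such function could serve as an "uncertainty" measure; nothing in these properties singles out entropy.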
Moreover, a given transmission channel (or, in the context of statistical inference, a given experiment) yields a higher maximum expected benefit than another, to every user, if and only if every concave function of the posterior probability vector has a lower expected value under the former channel (or experiment). Hence one information system (channel, experiment) may be preferable to another for a given user although its transmission rate, in entropy terms, is lower.
But only entropy has the economically relevant property of measuring, in the limit, the expected length of efficiently coded messages sent in long sequences. Thus, while irrelevant to the value (maximum expected benefit) of an information system and to the costs of observing, estimating, and deciding, entropy formulas are indeed relevant to the cost of communicating, i.e., of storing, coding and transmitting messages.
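The communication-cost role of entropy can be illustrated with a minimal coding sketch (assumed illustration, not from the paper): for a dyadic source distribution, a Huffman code's expected code-word length attains the entropy bound exactly.

```python
import math
import heapq

def entropy(p):
    """Shannon entropy (bits) of a distribution given as {symbol: prob}."""
    return -sum(x * math.log2(x) for x in p.values() if x > 0)

def huffman_lengths(p):
    """Code-word lengths of a binary Huffman code for distribution p."""
    # Heap entries: (weight, unique tiebreaker, {symbol: current depth}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(p.items())]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**c1, **c2}.items()}
        heapq.heappush(heap, (w1 + w2, n, merged))
        n += 1
    return heap[0][2]

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_lengths(p)
avg = sum(p[s] * lengths[s] for s in p)
# Dyadic probabilities: the expected length equals the entropy, 1.75 bits.
assert abs(avg - entropy(p)) < 1e-12
```

For non-dyadic sources the per-symbol gap is below one bit and vanishes when long blocks of symbols are coded jointly, which is the limiting sense in which entropy measures communication cost.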
Keywords: Mixed Strategy · Pure Strategy · Concave Function · Code Word · Entropy Formula