# Maximum Entropy Calculations on a Discrete Probability Space

## Abstract

In a remarkable series of papers, E. T. Jaynes (1957) began a revolution in inductive thinking with his principle of maximum entropy. He defined probability as a degree of plausibility, a much more general and useful definition than the frequentist one as the limit of a relative frequency in some imaginary experiment. He then used Shannon’s definition of entropy and stated that in any situation of incomplete information, the probability assignment which expresses all known information and is maximally non-committal with respect to all unknown information is the unique probability distribution with maximum entropy (ME). It is also a combinatorial theorem that the unique ME probability distribution is the one which can be realized in the greatest number of ways. The ME principle thus provides the fairest description of our state of knowledge. When further information is obtained, if that information is pertinent, a new ME calculation can be performed, with a consequent reduction in entropy and an increase in our total information. It must be emphasized that the ME solution is not necessarily the “correct” solution; it is simply the best that can be done with whatever data are available. There is no single “correct” solution, but an infinity of possible solutions. These ideas will now be made quite concrete and expressed mathematically.
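The kind of calculation the abstract describes can be sketched numerically. The example below is not taken from the paper; it is a minimal illustration using the well-known Brandeis dice problem from Jaynes’ lectures: given only that a die’s mean number of spots is 4.5 (rather than the fair value 3.5), the ME assignment has the exponential form p_i ∝ exp(−λi), with λ chosen so the mean constraint is satisfied. The function name `maxent_die` and the bisection approach are illustrative choices, not anything prescribed by the source.

```python
import math

def maxent_die(target_mean, faces=6):
    """Maximum-entropy probabilities for a die, given only its mean.

    The ME solution subject to a mean constraint is p_i ∝ exp(-lam * i);
    lam is found by bisection, since the mean is strictly decreasing in lam
    (mean -> faces as lam -> -inf, mean -> 1 as lam -> +inf).
    """
    sides = range(1, faces + 1)

    def mean(lam):
        w = [math.exp(-lam * i) for i in sides]
        return sum(i * wi for i, wi in zip(sides, w)) / sum(w)

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid  # mean too large: need a larger lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * i) for i in sides]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)  # weights increase toward the high faces
```

With no constraint beyond the mean 3.5 (which any fair die satisfies), the same calculation returns the uniform distribution 1/6 on each face, recovering the principle of insufficient reason as a special case.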

## Keywords

Maximum Entropy Probability Assignment · Maximum Entropy Principle · Insufficient Reason · Maximum Entropy Formalism


## References

- Cox, R. T. (1946). Probability, Frequency and Reasonable Expectation, Am. J. Phys. 14, 1.
- Cox, R. T. (1961). The Algebra of Probable Inference, Johns Hopkins University Press, Baltimore, MD.
- Czuber, E. (1908). Wahrscheinlichkeitsrechnung.
- Frieden, B. Roy (1985). Dice, Entropy and Likelihood, Proc. IEEE 73, 1764.
- Friedman, K. (1973). Replies to Tribus and Motroni and to Gage and Hestenes, J. Stat. Phys. 2, 265.
- Friedman, K. and A. Shimony (1971). Jaynes Maximum Entropy Prescription and Probability Theory, J. Stat. Phys. 1, 193.
- Gage, D. W. and D. Hestenes (1973). Comments on the paper “Jaynes Maximum Entropy Prescription and Probability Theory”, J. Stat. Phys. 7, 89.
- Hobson, A. (1972). The Interpretation of Inductive Probabilities, J. Stat. Phys. 6, 189.
- Jaynes, E. T. (1957). Information Theory and Statistical Mechanics, Part I, Phys. Rev. 106, 620; Part II, ibid. 108, 171.
- Jaynes, E. T. (1963a). “Brandeis Lectures”, in E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics, R. D. Rosenkrantz, Ed., D. Reidel Publishing Co., Boston, Mass.
- Jaynes, E. T. (1968). “Prior Probabilities”, IEEE Trans. Syst. Sci. Cybern. SSC-4, 227.
- Jaynes, E. T. (1978). “Where Do We Stand on Maximum Entropy?”, in The Maximum Entropy Formalism, R. D. Levine and M. Tribus, Eds., MIT Press, Cambridge, Mass.
- Jaynes, E. T. (1979). “Concentration of Distributions at Entropy Maxima”, in E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics, R. D. Rosenkrantz, Ed., D. Reidel Publishing Co., Boston, Mass.
- Jaynes, E. T. (1982). “On the Rationale of Maximum-Entropy Methods”, Proc. IEEE 70, 939.
- Keynes, J. M. (1952). A Treatise on Probability, Macmillan & Co., London.
- Laplace, Pierre Simon de (1951). A Philosophical Essay on Probabilities, Dover, New York.
- MacQueen, J. and J. Marschak (1975). “Partial Knowledge, Entropy and Estimation”, Proc. Nat. Acad. Sci. 72, 3819–3824.
- Makhoul, J. (1986). “Maximum Confusion Spectral Analysis”, Proc. Third ASSP Workshop on Spectrum Estimation and Modelling, Boston, Mass.
- Rowlinson, J. S. (1970). Probability, Information and Entropy, Nature 225, 1196.
- Shannon, C. E. and W. Weaver (1949). The Mathematical Theory of Communication, University of Illinois Press, Urbana.
- Shimony, A. (1973). Comment on the Interpretation of Inductive Probabilities, J. Stat. Phys. 9, 187.
- Teubner, B. G., Sammlung von Lehrbüchern auf dem Gebiete der mathematischen Wissenschaften, Band IX, p. 149, Berlin.
- Tribus, Myron (1961). Thermostatics and Thermodynamics, D. Van Nostrand Co., Princeton, N.J.
- Tribus, Myron (1969). Rational Descriptions, Decisions and Designs, Pergamon Press, Oxford.
- Tribus, Myron and H. Motroni (1977). Comments on the paper “Jaynes Maximum Entropy Prescription and Probability Theory”, J. Stat. Phys. 4, 227.