Is the Maximum Entropy Principle Operationally Justifiable?

Chapter

Abstract

Let X be a random variable originally believed to have distribution Q. When new information is obtained indicating that the distribution of X actually belongs to a set of distributions Π not containing the original guess Q, that guess should be updated to conform with the new information. Intuitively, a proper update is the element of Π closest to the original guess Q. It remains to specify which measure of distance between distributions should be used to find this closest element.
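One natural choice of distance, central to the maximum entropy literature, is the Kullback–Leibler divergence: the update is then the I-projection of Q onto Π. As a minimal illustrative sketch (not taken from the chapter itself), the following code computes the I-projection of a finite distribution Q onto the set Π of distributions with a prescribed mean, using the standard fact that the minimizer is an exponential tilting of Q whose parameter can be found by bisection; all function names here are hypothetical.

```python
import math

def i_projection(q, f, m, lo=-50.0, hi=50.0, tol=1e-12):
    """Minimize KL(p || q) over distributions p on the same finite support
    subject to the linear constraint sum_x p(x) f(x) = m.

    The minimizer has the exponential-tilting form
        p_lam(x) proportional to q(x) * exp(lam * f(x)),
    and the tilted mean is increasing in lam, so bisection on lam works.
    """
    def tilted(lam):
        # Unnormalized tilted weights, then normalize.
        w = [qi * math.exp(lam * fi) for qi, fi in zip(q, f)]
        z = sum(w)
        return [wi / z for wi in w]

    def tilted_mean(lam):
        return sum(pi * fi for pi, fi in zip(tilted(lam), f))

    # Bisect on lam until the constraint E_p[f] = m is met.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tilted_mean(mid) < m:
            lo = mid
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for finite distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Classic example: a fair-die prior Q, updated to have mean 4.5.
q = [1.0 / 6] * 6
faces = [1, 2, 3, 4, 5, 6]
p = i_projection(q, faces, 4.5)
```

In this example the projection is a geometric-looking distribution that puts increasing weight on the larger faces, which is exactly the maximum entropy distribution on {1, ..., 6} with mean 4.5, since relative entropy to the uniform Q reduces to (negative) entropy plus a constant.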

Keywords

Maximum Entropy, Common Distribution, Close Element, Common Range, Conditional Limit

Copyright information

© Springer-Verlag New York Inc. 1987

Authors and Affiliations

Mathematical Institute of the Hungarian Academy of Sciences, Budapest, Hungary
