Information Contents of Distributions
The entropy, as usually defined, is a measure of our ignorance and, if multiplied by −1, can be considered as a measure of our knowledge of the state of a system [1]. It is a measure of our total knowledge into which the knowledge of the value of any observable enters in the same way (cf. section 3). It is this last circumstance which prompted the considerations leading to the present note. According to quantum mechanical theory, some observables can be measured much more easily than others: the observables which commute with the additive conserved quantities (energy, components of the linear and angular momenta, electric charge) can be measured with microscopic apparatuses; those which do not commute with these quantities need for their measurement macroscopic systems [2]. Hence, the problem of defining a measure of our knowledge with respect to the latter quantities arises. The present note will be restricted to the case in which there is only one conserved additive quantity; this will be denoted by k. The name “skew information” has been proposed [3] for the amount of information which an ensemble described by a state vector or a statistical matrix contains with respect to the not easily measured quantities. This information relates to the transition probabilities into states which lie askew to the characteristic vectors of the additive conserved quantities.
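The quantity sketched above can be made concrete. Assuming the standard Wigner–Yanase definition, the skew information of a statistical matrix ρ with respect to a conserved quantity k is I(ρ, k) = −½ Tr([√ρ, k]²); the following illustrative snippet (not taken from the paper, with a Pauli matrix standing in for the conserved quantity) shows that a pure state lying askew to the eigenvectors of k carries skew information, while the maximally mixed state carries none:

```python
import numpy as np

def skew_information(rho, k):
    """I(rho, k) = -1/2 Tr([sqrt(rho), k]^2) for a density matrix rho
    and a Hermitian conserved quantity k."""
    # Matrix square root via spectral decomposition (rho is positive semidefinite).
    w, v = np.linalg.eigh(rho)
    sqrt_rho = v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.conj().T
    comm = sqrt_rho @ k - k @ sqrt_rho          # the commutator [sqrt(rho), k]
    return float(-0.5 * np.trace(comm @ comm).real)

# Illustrative stand-in for the additive conserved quantity: Pauli sigma_z.
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])

# Pure state (|0> + |1>)/sqrt(2), which lies askew to the eigenvectors of sigma_z.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(skew_information(plus, sigma_z))   # nonzero: maximal for this qubit

# Maximally mixed state: commutes with everything, so no skew information.
mixed = 0.5 * np.eye(2)
print(skew_information(mixed, sigma_z))  # zero
```

The eigenvalue clipping merely guards against tiny negative eigenvalues from floating-point round-off; for an exactly pure state √ρ = ρ, so the commutator can be checked by hand.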
- 1. For a very brief and condensed history of the recognition of this principle, see footnote 1 of W. Weaver's article in The Mathematical Theory of Communication (Urbana: The University of Illinois Press, 1949), p. 45. See also the last few pages of M. v. Smoluchowski's article in Vorträge über die kinetische Theorie der Materie und Elektrizität (Leipzig: B. G. Teubner, 1914).
- 2. Wigner, E. P., Z. Physik, 131, 101 (1952); Araki, H., and M. M. Yanase, Phys. Rev., 120, 622 (1960).
- 3. Wigner, E. P., Physikertagung Wien (Mosbach/Baden: Physik Verlag, 1962), p. 1.
- 4. Landau, L., Z. Physik, 45, 430 (1927); v. Neumann, J., Nachr. Gött., p. 245 (1927).
- 5. Gibbs, J. W., Collected Papers (New York: Longmans, Green and Co., 1928), p. 154; Tolman, R. C., in The Principles of Statistical Mechanics (Oxford University Press, 1938), p. 52.
- 6. Kraus, F., Math. Zeit., 41, 18 (1936); Bendat, J., and S. Sherman, Trans. Amer. Math. Soc., 79, 58 (1955).
- 7. Delbrück, M., and G. Molière, Abh. Preuss. Akad. (1937), p. 1.
- 8. Wigner, E. P., and M. M. Yanase, to appear soon.
- 9. Gibbs, J. W., Collected Papers, p. 159.
- 10. Halmos, P. R., Finite Dimensional Vector Spaces (Princeton University Press, 1942), p. 138.