
Problems of Information Transmission, Volume 41, Issue 2, pp. 134–149

Concentration Theorems for Entropy and Free Energy

  • V. V. V’yugin
  • V. P. Maslov
Large Systems

Abstract

Jaynes’s entropy concentration theorem states that, for most words \(\omega_1 \ldots \omega_N\) of length \(N\) such that \(\sum_{i=1}^{N} f(\omega_i) \approx vN\), the empirical frequencies of the values of a function \(f\) are close to the probabilities that maximize the Shannon entropy given a value \(v\) of the mathematical expectation of \(f\). Using the notion of algorithmic entropy, we define notions of entropy for the Bose and Fermi statistical models of unordered data. New variants of Jaynes’s concentration theorem for these models are proved. We also present some concentration properties for free energy in the case of a nonisolated isothermal system. Exact relations for the algorithmic entropy and free energy at extreme points are obtained. These relations are used to obtain tight bounds on fluctuations of energy levels at equilibrium points.
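For context, the maximum-entropy probabilities referred to above have the standard Gibbs form obtained by Lagrange multipliers. The sketch below is textbook material (cf. Jaynes [1]; Cover and Thomas [2]), not a derivation taken from this paper; the alphabet notation \(a_1, \ldots, a_k\) for the possible values of the \(\omega_i\) is introduced here only for illustration:

\[
\max_{p}\; H(p) = -\sum_{j=1}^{k} p_j \ln p_j
\quad \text{subject to} \quad
\sum_{j=1}^{k} p_j = 1, \qquad \sum_{j=1}^{k} p_j f(a_j) = v.
\]

Stationarity of the Lagrangian \(-\sum_j p_j \ln p_j - \lambda \bigl(\sum_j p_j - 1\bigr) - \beta \bigl(\sum_j p_j f(a_j) - v\bigr)\) in each \(p_j\) gives the Gibbs form

\[
p_j = \frac{e^{-\beta f(a_j)}}{Z(\beta)},
\qquad
Z(\beta) = \sum_{j=1}^{k} e^{-\beta f(a_j)},
\]

with \(\beta\) fixed by the constraint \(-\frac{d}{d\beta} \ln Z(\beta) = v\). It is around this distribution that the empirical frequencies concentrate in Jaynes’s theorem; in the thermodynamic reading, \(\beta\) plays the role of inverse temperature, and \(\ln Z\) is related to the free energy discussed in the paper via \(F = -\beta^{-1} \ln Z\).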

Keywords

Entropy, Free Energy, Energy Level, Statistical Model, System Theory

References

  1. Jaynes, E.T., Papers on Probability, Statistics, and Statistical Physics, Dordrecht: Kluwer, 1989.
  2. Cover, T.M. and Thomas, J.A., Elements of Information Theory, New York: Wiley, 1991.
  3. Li, M. and Vitanyi, P., An Introduction to Kolmogorov Complexity and Its Applications, New York: Springer, 1997, 2nd ed.
  4. Stratonovich, R.L., Teoriya informatsii (Information Theory), Moscow: Sov. Radio, 1975.
  5. Landau, L.D. and Lifshitz, E.M., Statisticheskaya fizika, Part 1, Moscow: Nauka, 1976. Translated under the title Statistical Physics, vol. 1, Oxford, New York: Pergamon, 1980.
  6. Jaynes, E.T., How Should We Use Entropy in Economics? (Some Half-Baked Ideas in Need of Criticism), unpublished manuscript. Available from http://bayes.wustl.edu/et/etj/articles/entropy.in.economics.pdf.
  7. Maslov, V.P., Integral Equations and Phase Transitions in Probability Games. Analogy with Statistical Physics, Teor. Veroyatn. Primen., 2003, vol. 48, no. 2, pp. 403–410 [Theory Probab. Appl. (Engl. Transl.), 2003, vol. 48, no. 2, pp. 359–367].
  8. Kolmogorov, A.N., Three Approaches to the Quantitative Definition of Information, Probl. Peredachi Inf., 1965, vol. 1, no. 1, pp. 3–11 [Probl. Inf. Trans. (Engl. Transl.), 1965, vol. 1, no. 1, pp. 1–7].
  9. Kolmogorov, A.N., The Logical Basis for Information Theory and Probability Theory, IEEE Trans. Inform. Theory, 1968, vol. 14, no. 3, pp. 662–664.
  10. Zurek, W.H., Algorithmic Randomness and Physical Entropy, Phys. Rev. A, 1989, vol. 40, no. 8, pp. 4731–4751.
  11. Rissanen, J., Minimum Description Length Principle, Encyclopaedia of Statistical Sciences, vol. 5, Kotz, S. and Johnson, N.L., Eds., New York: Wiley, 1986, pp. 523–527.
  12. Gacs, P., Tromp, J., and Vitanyi, P., Algorithmic Statistics, IEEE Trans. Inform. Theory, 2001, vol. 47, no. 6, pp. 2443–2463.
  13. V’yugin, V.V. and Maslov, V.P., Extremal Relations between Additive Loss Functions and the Kolmogorov Complexity, Probl. Peredachi Inf., 2003, vol. 39, no. 4, pp. 71–87 [Probl. Inf. Trans. (Engl. Transl.), 2003, vol. 39, no. 4, pp. 380–394].
  14. Bogolyubov, N.N., Energy Levels of the Non-Ideal Bose-Einstein Gas, Vestnik Moskov. Univ., 1947, vol. 7, pp. 43–56.
  15. Uspensky, V.A., Semenov, A.L., and Shen’, A.Kh., Can an Individual Sequence of Zeros and Ones Be Random?, Uspekhi Mat. Nauk, 1990, vol. 45, no. 1, pp. 105–162 [Russian Math. Surveys (Engl. Transl.), 1990, vol. 45, no. 1, pp. 121–189].
  16. Kolmogorov, A.N. and Uspensky, V.A., Algorithms and Randomness, Teor. Veroyatn. Primen., 1987, vol. 32, no. 3, pp. 425–455 [Theory Probab. Appl. (Engl. Transl.), 1987, vol. 32, no. 3, pp. 389–412].
  17. V’yugin, V.V., Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences, Computer J., 1999, vol. 42, no. 4, pp. 294–317.
  18. Kolmogorov, A.N., Combinatorial Foundations of Information Theory and the Calculus of Probabilities, Uspekhi Mat. Nauk, 1983, vol. 38, no. 4, pp. 27–36 [Russian Math. Surveys (Engl. Transl.), 1983, vol. 38, no. 4, pp. 29–40].

Copyright information

© MAIK “Nauka/Interperiodica” 2005

Authors and Affiliations

  • V. V. V’yugin
    1. Institute for Information Transmission Problems, RAS, Moscow
  • V. P. Maslov
    2. Physics Department, M.V. Lomonosov Moscow State University, Russia
