Maximum Entropy Principles

  • Igor Grabec
  • Wolfgang Sachse
Part of the Springer Series in Synergetics book series (SSSYN, volume 68)


The concept of information can be successfully utilized to adapt a probability distribution to empirical data. To formulate the corresponding principle, let us first recall the expression for the empirical probability density in the case when all the samples are distinct:
$$f_e\left( x \right) = \frac{1}{N}\sum\limits_{i = 1}^N {\delta \left( {x - {x_i}} \right)} = \sum\limits_{i = 1}^N {P_i\,\delta \left( {x - {x_i}} \right)},$$
where each distinct sample carries the equal probability $P_i = 1/N$.
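As a minimal numerical sketch of this expression: the delta functions cannot be evaluated on a grid directly, so a common practical stand-in is to replace each $\delta(x - x_i)$ by a normalized Gaussian window of width $\sigma$ (the "window function" of the keywords). The function name and the Gaussian choice below are illustrative assumptions, not the authors' specific method.

```python
import numpy as np

def empirical_density(x, samples, sigma):
    """Smoothed empirical density f_e(x) = (1/N) * sum_i g_sigma(x - x_i),
    where each delta function is approximated by a normalized Gaussian
    window g_sigma of width sigma (an illustrative choice)."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    samples = np.asarray(samples, dtype=float)
    if sigma <= 0:
        raise ValueError("sigma must be positive")
    # diffs[j, i] = x[j] - samples[i]
    diffs = np.subtract.outer(x, samples)
    g = np.exp(-0.5 * (diffs / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    # Averaging over the N samples gives the 1/N weighting P_i = 1/N.
    return g.mean(axis=1)

# Each window integrates to 1, so the estimate integrates to ~1
# on a grid wide enough to contain the tails.
grid = np.linspace(-5.0, 5.0, 2001)
fe = empirical_density(grid, samples=[-1.0, 0.0, 2.0], sigma=0.3)
area = np.sum(fe) * (grid[1] - grid[0])
```

As $\sigma \to 0$ the windows approach the delta functions of the formula above; in practice $\sigma$ controls the trade-off between resolution and smoothness of the estimate.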


Keywords: Sample Point · Sample Space · Vector Quantization · Reference Function · Window Function




References

  1. A. Gersho: "On the Structure of Vector Quantizers", IEEE Trans. Inf. Theory, IT-28, 157–166 (1982)
  2. J. W. Gibbs: Elementary Principles in Statistical Mechanics (Yale University Press, New Haven, Conn. 1902)
  3. I. Grabec: "Self-Organization of Neurons Described by the Maximum-Entropy Principle", Biol. Cyb., 63, 403–409 (1990)
  4. H. Haken: Synergetics, An Introduction, Springer Series in Synergetics, Vol. 1 (Springer, Berlin 1983)
  5. H. Haken: Information and Self-Organization, A Macroscopic Approach to Complex Systems, Springer Series in Synergetics, Vol. 40 (Springer, Berlin 1988)
  6. E. T. Jaynes: "Where do we stand on maximum entropy?" in The Maximum-Entropy Formalism, ed. by R. D. Levine, M. Tribus (MIT Press, Cambridge, MA 1978)
  7. G. Jumarie: Relative Information: Theories and Applications, Springer Series in Synergetics, Vol. 47 (Springer, Berlin 1990)
  8. J. N. Kapur: Maximum-Entropy Models in Science and Engineering (John Wiley and Sons, New York 1989)
  9. J. N. Kapur, H. K. Kesavan: Entropy Optimization Principles with Applications (Academic Press, Boston 1992)
  10. T. Kohonen: "Learning Vector Quantization for Pattern Recognition", Report TKK-F-A601, ISBN 915-753-950-9 (Helsinki University of Technology, Dept. of Technical Physics, SF-02150, Espoo, Finland 1986)
  11. G. A. Korn, T. M. Korn: Mathematical Handbook for Scientists and Engineers, Definitions, Theorems, and Formulas for Reference and Review (McGraw-Hill, New York 1968)
  12. S. Kullback: Information Theory and Statistics (J. Wiley and Sons, New York 1959)
  13. Y. Linde, A. Buzo, R. M. Gray: "An Algorithm for Vector Quantizer Design", IEEE Trans. Com., COM-28, 84–95 (1980)
  14. J. Makhoul, S. Roucos, H. Gish: "Vector Quantization in Speech Coding", Proc. IEEE, 73, 1551–1588 (1985)
  15. N. M. Nasrabadi, R. A. King: "Image Coding Using Vector Quantization, A Review", IEEE Trans. Com., 36, 957–971 (1988)
  16. Maximum-Entropy and Bayesian Methods in Inverse Problems, ed. by C. Ray Smith, W. T. Grandy, Jr. (D. Reidel Publishing Company, Dordrecht 1985)
  17. C. E. Shannon, W. Weaver: The Mathematical Theory of Communication (University of Illinois Press, Urbana, Chicago 1949 and later editions)
  18. R. L. Stratonovitch: Teoriya Informacii (Sov. Radio, Moskva 1975), in Russian

Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Igor Grabec (1)
  • Wolfgang Sachse (2)
  1. Faculty of Mechanical Engineering, University of Ljubljana, Ljubljana, Slovenia
  2. Theoretical and Applied Mechanics, Cornell University, Ithaca, USA
