Information

  • Igor Grabec
  • Wolfgang Sachse
Part of the Springer Series in Synergetics book series (SSSYN, volume 68)

Abstract

The concept of information is intuitively related to the change of our knowledge about natural phenomena caused by observations [1, 2]. An observation can generally be treated as an experiment performed with the help of our sensory perception or a measuring instrument. A less intuitive description of the concept of information can be obtained by relating it to the properties of measuring systems and experimentation. For this purpose we mathematically define a variable by which the change of knowledge, and with it the information acquired, can be quantitatively characterized. We therefore try to incorporate the characteristic properties that are intuitively assigned to information into the mathematical definition of a corresponding variable. However, information is usually also intuitively related to a “meaning” or a “significance”, which is generally specific to a particular observer and which also depends on the entire complex of phenomena preceding the actual acquisition of the information. The “meaning” of information is thus conceptually much more involved than the information itself and cannot be described easily or generally. We therefore shall not define this concept quantitatively.
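One standard way to make such a characterization quantitative, in line with the “Relative Information” keyword below and the works of Kullback [4] and Jumarie [5], is to measure the information acquired from an observation as the relative information (Kullback-Leibler divergence) between the distributions describing the observer's knowledge before and after the experiment. The following Python sketch is a minimal illustration under that assumption; the function name and the example distributions are ours, not the book's.

    import numpy as np

    def relative_information(posterior, prior):
        """Relative information D(posterior || prior) in bits: a common
        measure of the knowledge gained when an observation updates the
        distribution `prior` to `posterior` over one discrete sample space."""
        posterior = np.asarray(posterior, dtype=float)
        prior = np.asarray(prior, dtype=float)
        # Outcomes with zero posterior probability contribute nothing
        # (the usual convention 0 * log 0 = 0).
        mask = posterior > 0
        return float(np.sum(posterior[mask] * np.log2(posterior[mask] / prior[mask])))

    # Example: an observation sharpens a fair-coin prior to near-certainty.
    print(relative_information([0.99, 0.01], [0.5, 0.5]))  # ~0.92 bits gained

When the posterior collapses onto a single outcome, the gain reaches exactly one bit, matching the intuition that the observation answers a single yes/no question about a two-outcome sample space.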

Keywords

Relative information, Sample space, True distribution, Continuous random variable, Combined experiment


References

  1. H. Haken: Synergetics, An Introduction, Springer Series in Synergetics, Vol. 1 (Springer, Berlin 1983)
  2. H. Haken: Information and Self-Organization, A Macroscopic Approach to Complex Systems, Springer Series in Synergetics, Vol. 40 (Springer, Berlin 1988)
  3. J. N. Kapur: Maximum Entropy Models in Science and Engineering (John Wiley and Sons, New York 1989)
  4. S. Kullback: Information Theory and Statistics (John Wiley and Sons, New York 1959)
  5. G. Jumarie: Relative Information, Theories and Applications, Springer Series in Synergetics, Vol. 47 (Springer, Berlin 1990)
  6. C. E. Shannon, W. Weaver: The Mathematical Theory of Communication (University of Illinois Press, Urbana, Chicago 1949 and later editions)
  7. R. L. Stratonovitch: Teoriya Informacii [Information Theory] (Sov. Radio, Moscow 1975), in Russian

Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Igor Grabec (1)
  • Wolfgang Sachse (2)
  1. Faculty of Mechanical Engineering, University of Ljubljana, Ljubljana, Slovenia
  2. Theoretical and Applied Mechanics, Cornell University, Ithaca, USA
