A Critical Review of Shannon Information Theory

  • Guy Jumarie
Part of the Springer Series in Synergetics book series (SSSYN, volume 47)

Abstract

In the preceding chapter, we summarized the basic elements of information theory; we now proceed to examine and analyze the main characteristics of this theory. The term “critical” in the chapter title simply means that we shall review the main features for and against the theory. In support of the theory in its present form, one can cite Shannon's results on channel capacity, the Boltzmann equation, and the fact that the central limit theorem of probability can be proved by using the properties of entropy alone. Against the present form of the theory stand the apparent discrepancy between discrete entropy and continuous entropy, the absence of a concept of negative information to describe information loss, and the fact that the model does not explicitly take syntax and semantics into account. In the present chapter, we shall review these features, and one of our conclusions will be that, contrary to what some scientists are inclined to believe, the continuous entropy is soundly defined; it merely remains to exhibit the differences in physical nature between discrete entropy and continuous entropy.
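The discrepancy between discrete and continuous entropy mentioned above can be seen in a short numerical sketch (an illustration added here, not part of the original chapter): discrete Shannon entropy is always non-negative, whereas the differential entropy of a continuous distribution can be negative, and it is not recovered as a simple limit of discretized entropies.

```python
import math

def discrete_entropy(p):
    """Shannon entropy H = -sum(p_i * log(p_i)) in nats; always >= 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Discrete case: uniform distribution over n symbols gives H = log(n) >= 0.
h_discrete = discrete_entropy([0.25, 0.25, 0.25, 0.25])  # log(4) ≈ 1.386 nats

# Continuous case: the differential entropy of Uniform(0, a) is log(a),
# which is negative whenever a < 1.
a = 0.5
h_differential = math.log(a)  # ≈ -0.693 nats

# Discretizing Uniform(0, a) into bins of width d gives entropy
# log(a / d), which diverges as d -> 0 instead of tending to log(a):
# the two entropies differ by the unbounded term -log(d).
d = 1e-3
h_discretized = math.log(a / d)

print(h_discrete, h_differential, h_discretized)
```

The unbounded gap `-log(d)` between the discretized and differential entropies is one concrete form of the "difference in physical nature" between the two quantities that the chapter sets out to exhibit.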

Keywords

Boltzmann equation · Central limit theorem · Shannon entropy · Transmission error · Negative information

Copyright information

© Springer-Verlag Berlin Heidelberg 1990

Authors and Affiliations

  • Guy Jumarie
  1. Department of Mathematics and Computer Science, University of Québec at Montréal, Montréal, Canada