Measuring Complexity in Terms of Mutual Information

  • Andrew M. Fraser
Part of the NATO ASI Series (NSSB, volume 208)

Abstract

An alternative definition of the KS entropy $h_\mu$ based on mutual information is proposed. The new definition is designed to handle experimental noise more gracefully than the standard definition. An example is used to illustrate the difference.
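The contrast the abstract draws can be sketched with standard information-theoretic identities (following conventions as in Gallager [1]; this is a sketch of the general idea, not a reproduction of the paper's construction). With respect to a generating partition, the standard definition expresses $h_\mu$ as a limiting conditional entropy, which decomposes into a marginal-entropy term minus a mutual-information term:

$$
h_\mu \;=\; \lim_{n \to \infty} H\!\left(X_n \mid X_{n-1}, \dots, X_0\right),
\qquad
H\!\left(X_n \mid X_{n-1}, \dots, X_0\right) \;=\; H(X_n) \;-\; I\!\left(X_n;\, X_{n-1}, \dots, X_0\right).
$$

As the measuring partition is refined, the marginal entropy $H(X_n)$ of noise-contaminated observations grows without bound, since observational noise contributes fresh randomness at every scale, whereas the mutual information $I(X_n;\, X_{n-1}, \dots, X_0)$ between noisy observations stays finite. A definition built on the mutual-information term therefore degrades gracefully as noise enters, which is plausibly the behavior the abstract refers to.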

Keywords

Mutual Information · Strange Attractor · Conditional Entropy · Experimental Noise


References

  1. R. G. Gallager. Information Theory and Reliable Communication. John Wiley and Sons, New York, 1968.
  2. A. N. Kolmogorov. A new metric invariant of transient dynamical systems and automorphisms in Lebesgue spaces. Dokl. Akad. Nauk SSSR, 119:861–864, 1958. English summary in Mathematical Reviews, vol. 21, p. 386, 1960.
  3. Y. Sinai. On the concept of entropy for a dynamic system. Dokl. Akad. Nauk SSSR, 124:768–771, 1959. English summary in Mathematical Reviews, vol. 21, pp. 386–387, 1960.
  4. Y. Sinai. Introduction to Ergodic Theory. Princeton University Press, Princeton, 1976.
  5. R. Shaw. Strange attractors, chaotic behavior, and information flow. Z. Naturforsch., 36a(1):80–112, Jan. 1981.
  6. A. M. Fraser. Information and entropy in strange attractors. IEEE Transactions on Information Theory, 35(2):245–262, March 1989.

Copyright information

© Plenum Press, New York 1989

Authors and Affiliations

  • Andrew M. Fraser
    Center for Nonlinear Dynamics, Department of Physics, University of Texas, Austin, USA
