Optimal Information Measures for Weakly Chaotic Dynamical Systems

  • Chapter
General Theory of Information Transfer and Combinatorics

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4123)

Abstract

The study of weakly chaotic dynamical systems suggests that an important indicator for their classification is the quantity of information needed to describe their orbits. This information can be measured with suitable compression algorithms. Such algorithms are “optimal” for this purpose if they compress zero-entropy strings very efficiently. We discuss a definition of optimality in this sense and show that the set of optimal algorithms is not empty by exhibiting a concrete example. We then prove that algorithms which are optimal according to this definition are suitable for measuring the information needed to describe the orbits of the Manneville maps: in these examples the information content measured by such algorithms has the same asymptotic behavior as the algorithmic information content.
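The setting of the abstract can be made concrete with a small numerical experiment. The sketch below is only illustrative: it assumes the standard Manneville form T(x) = x + x^z (mod 1), a two-symbol partition at the point x* satisfying x* + (x*)^z = 1, and zlib as an off-the-shelf compressor; the helper names are ours, not the chapter's. zlib's LZ77-based coding is a rough stand-in for the information content and is not claimed to be optimal in the chapter's sense, so the compressed size should be read only as an empirical upper bound on the information needed to describe the orbit segment.

```python
import zlib


def manneville_orbit(x0, z, n):
    """Iterate the Manneville map T(x) = x + x**z (mod 1) on [0, 1)."""
    orbit = []
    x = x0
    for _ in range(n):
        orbit.append(x)
        x = x + x ** z
        if x >= 1.0:
            x -= 1.0
    return orbit


def partition_point(z, tol=1e-12):
    """Solve x + x**z = 1 by bisection: [0, x*) is the laminar region near the
    indifferent fixed point at 0, [x*, 1) is mapped back over the whole
    interval (chaotic reinjection)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid + mid ** z < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)


def symbolize(orbit, threshold):
    """Binary coding of the orbit with respect to the two-set partition."""
    return "".join("0" if x < threshold else "1" for x in orbit).encode("ascii")


def compressed_bits(s):
    """Length in bits of the zlib-compressed symbolic string, used here as a
    crude empirical stand-in for its information content (zlib is NOT one of
    the optimal compressors the chapter is about)."""
    return 8 * len(zlib.compress(s, 9))


if __name__ == "__main__":
    z = 2.5        # intermittency exponent; larger z gives "weaker" chaos
    x0 = 0.31831   # arbitrary initial condition
    xstar = partition_point(z)
    for n in (10_000, 40_000, 160_000):
        s = symbolize(manneville_orbit(x0, z, n), xstar)
        bits = compressed_bits(s)
        print(f"n = {n:7d}   compressed size = {bits:7d} bits   "
              f"rate = {bits / n:.4f} bits/symbol")
```

For z > 2 one expects the per-symbol rate to drift toward zero as n grows (the weakly chaotic regime discussed in the abstract), while for 1 < z < 2 it should stabilize at a positive value; how sharply a given compressor tracks the true sublinear growth of the information content is precisely the optimality question the chapter addresses.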

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Benci, V., Galatolo, S. (2006). Optimal Information Measures for Weakly Chaotic Dynamical Systems. In: Ahlswede, R., et al. General Theory of Information Transfer and Combinatorics. Lecture Notes in Computer Science, vol 4123. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11889342_37

  • DOI: https://doi.org/10.1007/11889342_37

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-46244-6

  • Online ISBN: 978-3-540-46245-3

  • eBook Packages: Computer Science (R0)
