Abstract
The study of weakly chaotic dynamical systems suggests that an important indicator for their classification is the quantity of information needed to describe their orbits. This information can be measured with suitable compression algorithms. An algorithm is “optimal” for this purpose if it compresses zero-entropy strings very efficiently. We discuss a definition of optimality in this sense, and we show that the set of optimal algorithms is not empty by exhibiting a concrete example. We prove that algorithms which are optimal according to this definition are suitable for measuring the information needed to describe the orbits of the Manneville maps: in these examples, the information content measured by such algorithms has the same asymptotic behavior as the algorithmic information content.
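As a rough illustration of the idea, one can symbolize an orbit of the Manneville map x ↦ x + x^z (mod 1) and estimate its information content by the compressed length of the resulting string. The sketch below uses zlib as a stand-in compressor (it is not one of the optimal algorithms discussed in the chapter), and the binary symbolization threshold 0.5 and the parameter choices are illustrative assumptions, not taken from the text.

```python
import zlib


def manneville_orbit(x0: float, z: float, n: int) -> str:
    """Iterate the Manneville map x -> x + x**z (mod 1) and return a
    binary symbolic orbit: '0' in the laminar region near the fixed
    point at 0, '1' otherwise (threshold 0.5 is an arbitrary choice)."""
    symbols = []
    x = x0
    for _ in range(n):
        symbols.append("0" if x < 0.5 else "1")
        x = (x + x ** z) % 1.0
    return "".join(symbols)


def compression_ratio(s: str) -> float:
    """Compressed length over raw length; for a low-entropy (weakly
    chaotic) symbolic orbit this ratio should be well below 1."""
    raw = s.encode("ascii")
    return len(zlib.compress(raw, 9)) / len(raw)


if __name__ == "__main__":
    # z > 1 gives intermittent dynamics: long laminar runs of '0'
    # interrupted by chaotic bursts, so the string compresses well.
    orbit = manneville_orbit(0.1, 2.0, 10_000)
    print(f"compression ratio: {compression_ratio(orbit):.3f}")
```

The compressed length grows much more slowly than the string length for such intermittent orbits, which is the qualitative behavior the chapter's asymptotic results make precise.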
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this chapter
Benci, V., Galatolo, S. (2006). Optimal Information Measures for Weakly Chaotic Dynamical Systems. In: Ahlswede, R., et al. General Theory of Information Transfer and Combinatorics. Lecture Notes in Computer Science, vol 4123. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11889342_37
DOI: https://doi.org/10.1007/11889342_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-46244-6
Online ISBN: 978-3-540-46245-3