Abstract
The concept of entropy was first introduced in thermodynamics in a phenomenological context. Later, mainly through the contributions of Boltzmann, a probabilistic interpretation of entropy was developed in order to clarify its deep relation with the microscopic structure underlying macroscopic bodies. In the unrelated field of communication systems, Shannon showed that, with a suitable generalization of the entropy concept, one can establish a mathematically self-consistent information theory. Soon after Shannon's work, Kolmogorov realized the conceptual relevance of information theory and the importance of its ideas for characterizing irregular behavior in dynamical systems. Going beyond the probabilistic point of view, algorithmic complexity (introduced by Chaitin, Kolmogorov, and Solomonoff) formalizes the intuitive notion of the randomness of a sequence. In this chapter we discuss the connections among entropy, chaos, and algorithmic complexity; in addition, we briefly discuss how these concepts and methods can be successfully applied in seemingly unrelated fields: linguistics, bioinformatics, and finance.
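As a concrete illustration of the link the abstract draws between Shannon entropy and algorithmic complexity, here is a minimal Python sketch (not part of the chapter; the function names are ours). It estimates the single-symbol Shannon entropy of a binary sequence and a compression-based upper bound on its algorithmic complexity per symbol. A periodic sequence and a random one have the same single-symbol entropy, but the compressor, which exploits correlations along the sequence, tells them apart.

import math
import random
import zlib
from collections import Counter

def shannon_entropy(seq: str) -> float:
    # Empirical single-symbol Shannon entropy, H = -sum_i p_i log2(p_i), in bits/symbol.
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

def compression_rate(seq: str) -> float:
    # Compressed size in bits per symbol: a computable upper bound on the
    # algorithmic complexity per symbol (Kolmogorov complexity itself is uncomputable).
    return 8 * len(zlib.compress(seq.encode())) / len(seq)

random.seed(0)
periodic = "01" * 5000                                        # perfectly regular
noisy = "".join(random.choice("01") for _ in range(10000))    # incompressible noise

for name, s in (("periodic", periodic), ("random", noisy)):
    print(f"{name:8s}  H = {shannon_entropy(s):.3f}  K/n <= {compression_rate(s):.3f}")

Both strings give H = 1 bit/symbol, since 0 and 1 occur equally often; but the random string cannot be compressed below roughly one bit per symbol, while the periodic one collapses to almost nothing. This is the kind of distinction, between statistical regularity and genuine randomness, that the chapter formalizes.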
© 2003 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
Falcioni, M., Loreto, V., Vulpiani, A. (2003). Kolmogorov’s Legacy about Entropy, Chaos, and Complexity. In: Livi, R., Vulpiani, A. (eds) The Kolmogorov Legacy in Physics. Lecture Notes in Physics, vol 636. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-39668-0_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-20307-0
Online ISBN: 978-3-540-39668-0