Abstract
The notion of entropy was introduced in the nineteenth century, in connection with the analysis of irreversibility phenomena, in the works of the founders of statistical mechanics: R. Clausius, J.C. Maxwell, L. Boltzmann and others. Entropy later became a fundamental concept of information theory, created by C. Shannon in the 1940s to deal with the problems of transmitting information in the presence of noise. Although the formal expression for entropy is the same in both cases, its meaning differs somewhat. A.N. Kolmogorov in his work [Koll] applied the ideas of information theory and the notion of entropy to the analysis of some problems of ergodic theory. This work gave rise to a new branch of ergodic theory with numerous results and applications, the so-called entropy theory of dynamical systems. At present the development of this theory may be considered largely complete. This chapter is devoted to its exposition.
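As a concrete illustration of the "formal expression for entropy" shared by statistical mechanics and information theory, here is a minimal sketch (not part of the original text; the function name and choice of Python are my own) of Shannon's entropy of a discrete probability distribution, H = -Σ pᵢ log pᵢ:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    Zero-probability outcomes are skipped, since p*log(p) -> 0 as p -> 0.
    With base=2 the result is measured in bits.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty per toss:
print(shannon_entropy([0.5, 0.5]))      # → 1.0
# A deterministic outcome carries none:
print(shannon_entropy([1.0]))           # → 0.0
```

The Kolmogorov–Sinai entropy of a dynamical system, the subject of this chapter, is built from this same quantity by applying it to partitions refined under the dynamics and taking a limit per unit time.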
© 1989 Springer-Verlag Berlin Heidelberg
Cite this chapter
Cornfeld, I.P., Sinai, Y.G. (1989). Entropy Theory of Dynamical Systems. In: Sinai, Y.G. (eds) Dynamical Systems II. Encyclopaedia of Mathematical Sciences, vol 2. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-06788-8_3
Print ISBN: 978-3-662-06790-1
Online ISBN: 978-3-662-06788-8