Abstract
It is hardly necessary to point out that information has become a pillar of modern society. The concept of information made its entry into the exact sciences only relatively recently: it was formalized in the years 1945–1948 by Shannon in order to tackle the technical problems of communication. In actual fact, it was already implicitly present in the idea of entropy introduced by Boltzmann at the end of the nineteenth century. As we shall see, like the idea of stationarity, entropy does not characterize a particular realization, but rather the whole set of possible realizations. In contrast, Kolmogorov complexity is defined for each individual realization, allowing us to give an intuitive meaning to the idea of fluctuation or randomness that we would sometimes like to attribute to a series of observations.
Copyright information
© 2004 Springer Science+Business Media New York
About this chapter
Cite this chapter
Réfrégier, P. (2004). Information and Fluctuations. In: Noise Theory and Application to Physics. Advanced Texts in Physics. Springer, New York, NY. https://doi.org/10.1007/978-0-387-22526-5_5
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-1896-3
Online ISBN: 978-0-387-22526-5