Abstract
In the previous chapter, for one particular example (see Sections 3.1 and 3.4), we showed that in calculating the maximum entropy (i.e. the capacity of a noiseless channel) the constraint \(c(y) \leqslant a\) imposed on feasible realizations is equivalent, for a sufficiently long code sequence, to the constraint \(\mathbb{E}[c(y)] \leqslant a\) on the mean value \(\mathbb{E}[c(y)]\). In this chapter we prove (Section 4.3) that, under certain assumptions, this equivalence holds in the general case; this is the assertion of the first asymptotic theorem. In what follows we shall also consider two further asymptotic theorems (Chapters 7 and 11), which are among the most profound results of information theory. All three theorems share a common feature: they state that, for sufficiently large systems, the distinction between discreteness and continuity disappears, so that the characteristics of a large collection of discrete objects can be computed from a continuous functional dependence involving averaged quantities. For the first variational problem this feature is expressed by the fact that the discrete function \(H = \ln M(a)\), where \(M(a)\) is the number of feasible realizations satisfying the constraint \(c(y) \leqslant a\), is asymptotically replaced by a continuous function \(H(a)\) obtained by solving the first variational problem. As far as the proof is concerned, the first asymptotic theorem turns out to be related to the theorem on the stability of the canonical distribution (Section 4.2), which is very important in statistical thermodynamics, where it is in effect proved when the canonical distribution is derived from the microcanonical one. Here we consider it in a more general and abstract form. The relationship between the first asymptotic theorem and the theorem on the canonical distribution underlines once more the intrinsic unity of the mathematical apparatus of information theory and statistical thermodynamics.
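The convergence of the discrete quantity \(\ln M\) to the continuous \(H(a)\) can be illustrated numerically with a minimal sketch. The example below is not taken from the chapter: it assumes binary sequences \(y\) of length \(n\) with cost \(c(y)\) equal to the number of ones, and the constraint \(c(y) \leqslant an\). Here \(M\) counts the feasible sequences, while the solution of the corresponding variational problem under the mean constraint is the binary entropy \(H(a) = -a\ln a - (1-a)\ln(1-a)\) (for \(a < 1/2\)):

```python
import math

def log_M(n, a):
    """ln of the number of binary sequences of length n whose cost
    c(y) = (number of ones) satisfies c(y) <= a*n."""
    M = sum(math.comb(n, k) for k in range(int(a * n) + 1))
    return math.log(M)

def H(a):
    """Maximum entropy per symbol (in nats) under the mean constraint
    E[c(y)]/n <= a, a < 1/2: the binary entropy function, i.e. the
    solution of the first variational problem for this example."""
    return -a * math.log(a) - (1 - a) * math.log(1 - a)

a = 0.3
for n in (10, 100, 1000, 5000):
    # ln M / n approaches the continuous value H(a) as n grows
    print(n, log_M(n, a) / n, H(a))
```

Running the loop shows \(\tfrac{1}{n}\ln M \to H(a)\), i.e. the per-symbol capacity computed from the discrete count agrees asymptotically with the continuous solution, which is precisely the behaviour the first asymptotic theorem formalizes.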
Copyright information
© 2020 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Belavkin, R.V., Pardalos, P.M., Principe, J.C., Stratonovich, R.L. (2020). First asymptotic theorem and related results. In: Belavkin, R., Pardalos, P., Principe, J. (eds) Theory of Information and its Value. Springer, Cham. https://doi.org/10.1007/978-3-030-22833-0_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22832-3
Online ISBN: 978-3-030-22833-0
eBook Packages: Mathematics and Statistics (R0)