Abstract
In this chapter we extend the definitions of Chap. 5 to real information channels that handle sequences of symbols instead of single symbols. This extension is necessary so that the idea of taking a limit over very long sequences, used to define the information rate (Definition 5.5), can now be applied to define the transinformation rate and the channel capacity. This leads to the proof of Shannon’s famous theorem in the next chapter.
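As a minimal illustration of the quantity being extended, the sketch below computes the transinformation (mutual information) of a single channel use for a discrete memoryless channel; for such a channel the transinformation of n independent uses is n times this value, so the transinformation rate equals the single-use value. The function name and the NumPy-based setup are illustrative choices, not notation from the chapter.

```python
import numpy as np

def transinformation(q, W):
    """Mutual information I(X; Y) in bits for input distribution q
    and channel matrix W[x, y] = P(y | x).  For a memoryless channel
    the transinformation of n independent uses is n * I(X; Y), so the
    transinformation rate (the limit of I_n / n) equals I(X; Y)."""
    p_xy = q[:, None] * W            # joint distribution P(x, y)
    p_y = p_xy.sum(axis=0)           # output marginal P(y)
    prod = q[:, None] * p_y          # product of marginals P(x) * P(y)
    mask = p_xy > 0                  # skip zero-probability terms
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

# Example: binary symmetric channel with crossover probability 0.1,
# uniform input distribution.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
q = np.array([0.5, 0.5])
I1 = transinformation(q, W)   # ≈ 0.531 bits per symbol, i.e. 1 - H(0.1)
```

For the binary symmetric channel the uniform input is in fact optimal, so this value is also the capacity of that particular channel.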
Notes
1. In this very general definition, one needs the canonical σ-algebras on \({A}^{\mathbb{N}}\) and \({B}^{\mathbb{N}}\) that are generated by the so-called “cylinder sets” (cf. Bauer 1972; see also Definition 7.2).
2. Of course, a general definition should not impose requirements on the processes involved. Yet Shannon’s theorem relies on strong properties, and it seems adequate to restrict the definition to stationary processes.
3. The maximum is attained because the set of all probability vectors q is compact.
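The maximum over input distributions q that Note 3 refers to can be approximated numerically. The sketch below uses the Blahut–Arimoto iteration, a standard algorithm (not from this chapter) for computing the capacity of a discrete memoryless channel; the function name and tolerances are illustrative assumptions.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10000):
    """Approximate the channel capacity C = max_q I(q; W) of a discrete
    memoryless channel with transition matrix W[x, y] = P(y | x).
    The maximum exists because the probability simplex is compact
    (cf. Note 3).  Returns (capacity in bits, maximizing q)."""
    m, _ = W.shape
    q = np.full(m, 1.0 / m)                  # start from the uniform input
    for _ in range(max_iter):
        p_y = q @ W                          # output distribution under q
        # D[x] = KL divergence of W[x, :] from p_y (in nats)
        ratio = np.where(W > 0, W / p_y, 1.0)
        D = np.sum(np.where(W > 0, W * np.log(ratio), 0.0), axis=1)
        q_new = q * np.exp(D)                # multiplicative update
        q_new /= q_new.sum()
        converged = np.max(np.abs(q_new - q)) < tol
        q = q_new
        if converged:
            break
    # mutual information at the final q, converted from nats to bits
    p_y = q @ W
    ratio = np.where(W > 0, W / p_y, 1.0)
    C = np.sum(q[:, None] * np.where(W > 0, W * np.log(ratio), 0.0)) / np.log(2)
    return float(C), q

# Binary symmetric channel with crossover probability 0.1:
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, q_opt = blahut_arimoto(W)   # C ≈ 0.531 bits, q_opt ≈ (0.5, 0.5)
```

The update multiplies each q(x) by the exponentiated divergence of its output row from the current output distribution, which is guaranteed to increase the transinformation toward the capacity.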
References
Ash, R. B. (1965). Information theory. New York, London, Sydney: Interscience.
Bauer, H. (1972). Probability theory and elements of measure theory. New York: Holt, Rinehart and Winston.
Feinstein, A. (1958). Foundations of information theory. New York: McGraw-Hill Book Company, Inc.
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this chapter
Palm, G. (2012). Channel Capacity. In: Novelty, Information and Surprise. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29075-6_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-29074-9
Online ISBN: 978-3-642-29075-6