Abstract
In Chapter 9, the continuous signal was constructed as the limiting case of a discrete signal as the increments approach zero, and the statistical properties of a continuous signal were defined in terms of a probability density. The same approach can be applied to the measure of information transmitted by a continuous signal, or, as it is called in electronics, an analog signal.
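As a hypothetical illustration of this limiting construction (not taken from the chapter itself), the following sketch quantizes a unit Gaussian density with step Δ and shows numerically that the discrete entropy plus log₂ Δ approaches the closed-form differential entropy ½ log₂(2πe) as Δ becomes small. The function names and parameters are invented for the example.

```python
# Sketch, assuming the standard relation H_discrete + log2(D) -> h(X) as D -> 0,
# where p_i ~ f(x_i) * D for a density f quantized with cell width D.
import math

def discrete_entropy_bits(probs):
    """Shannon entropy -sum p*log2(p) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def quantized_gaussian(step, sigma=1.0, span=8.0):
    """Cell probabilities of a Gaussian density quantized with width `step`."""
    n = int(span / step)
    probs = []
    for i in range(-n, n + 1):
        x = i * step
        f = math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
        probs.append(f * step)              # p_i ~ f(x_i) * step
    total = sum(probs)
    return [p / total for p in probs]       # renormalize away the tail error

step = 0.001
H = discrete_entropy_bits(quantized_gaussian(step))
h_est = H + math.log2(step)                 # estimate of the continuous entropy
h_exact = 0.5 * math.log2(2 * math.pi * math.e)  # exact value for sigma = 1
print(h_est, h_exact)
```

Note that the discrete entropy itself diverges as Δ → 0 (the term −log₂ Δ grows without bound); only the shifted quantity converges, which is why the continuous entropy needs a definition of its own rather than a naive limit.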
Reference
Shannon interprets the discrete entropy as “uncertainty”, but in the sequel he calls the continuous entropy (Section 10.2) “continuous information” (Shannon, 1949, Chapter III).
Copyright information
© 2002 Springer Science+Business Media New York
About this chapter
Cite this chapter
Kåhre, J. (2002). Continuous Information. In: The Mathematical Theory of Information. The Springer International Series in Engineering and Computer Science, vol 684. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-0975-2_10
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4613-5332-4
Online ISBN: 978-1-4615-0975-2
eBook Packages: Springer Book Archive