Abstract
In this chapter we consider the problem of safe transmission of a message over a noiseless channel, i.e. a channel that is not affected by noise. We seek transmission methods that are error-free and as fast as possible. This is a rather special, but important, problem in classical information theory. We rely mainly on two central tools: prefix-free sets and Shannon entropy. Prefix-free sets are the easiest codes to construct, and most interesting problems on codes can be raised for them. Shannon entropy measures the degree of ignorance concerning which possibility holds in an ensemble with a given a priori probability distribution. Later on, we shall contrast the Shannon measure with the information content of an individual (finite) object — viewed as a measure of how difficult it is to specify that object.
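The two tools named above can be illustrated with a small sketch (not taken from the chapter; the codeword set and distribution below are hypothetical examples): a prefix-free set of binary codewords satisfies Kraft's inequality, and Shannon entropy quantifies the uncertainty of an a priori distribution.

```python
import math

# Hypothetical example: a prefix-free set of binary codewords
# (no codeword is a prefix of another).
codewords = ["0", "10", "110", "111"]

# Kraft's inequality: for any prefix-free set over a binary alphabet,
# the sum of 2^(-|w|) over all codewords w is at most 1.
kraft_sum = sum(2 ** -len(w) for w in codewords)
assert kraft_sum <= 1

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution:
    H(p) = -sum_i p_i * log2(p_i), with 0 * log2(0) taken as 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# The uniform distribution over 4 outcomes has maximal entropy log2(4).
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

For this particular set the Kraft sum equals exactly 1, which is characteristic of a complete (maximal) prefix-free code.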
A poem is never finished, only abandoned.
Paul Valéry
History of Results
J. L. Bentley, A. C. Yao. An almost optimal algorithm for unbounded search, Inf. Proc. Lett. 5 (1976), 82–87.
D. E. Knuth. Supernatural numbers, in D. A. Klarner (ed.). The Mathematical Gardner, Prindle, Weber & Schmidt, Wadsworth, Boston, MA, 1981, 310–325.
L. G. Kraft. A Device for Quantizing, Grouping, and Coding Amplitude Modulated Pulses, MS Thesis, MIT, Cambridge, MA, 1949.
C. E. Shannon. A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948), 379–423, 623–656.
J. Berstel, D. Perrin. Theory of Codes, Academic Press, New York, 1985.
I. Csiszár, J. Körner. Information Theory, Academic Press, New York, 1981.
T. M. Cover, J. A. Thomas. Elements of Information Theory, John Wiley & Sons, New York, 1991.
S. Guiasu. Information Theory and Applications, McGraw-Hill, New York, 1977.
D. S. Jones. Elementary Information Theory, Clarendon Press, Oxford, 1979.
H. Jürgensen, J. Duske. Codierungstheorie, BI, Mannheim, 1977.
A. I. Khinchin. Mathematical Foundations of Information Theory, Dover, New York, 1957.
S. K. Leung-Yan-Cheong, T. M. Cover. Some equivalences between Shannon entropy and Kolmogorov complexity, IEEE Trans. Info. Theory 24 (1978), 331–338.
M. R. Titchener. Construction and properties of the augmented and binary-depletion codes, IEE Proc. 132 (1984), 163–169.
© 2002 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
Calude, C.S. (2002). Noiseless Coding. In: Information and Randomness. Texts in Theoretical Computer Science An EATCS Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-04978-5_2
Print ISBN: 978-3-642-07793-7
Online ISBN: 978-3-662-04978-5