Abstract
The previous chapter introduced the coding problem: that of assigning some codewords or bit-patterns C to a set of n symbols that have a probability distribution given by P = [p1, …, pn]. This chapter explores some lines in the sand which cannot be crossed when designing codes. The first is a lower bound on the expected length of a code: Shannon’s entropy limit. The second restriction applies to the lengths of codewords, and is generally referred to as the Kraft inequality. Both of these limits serve to keep us honest when devising new coding schemes. Both limits also provide clues on how to construct codes that come close to reaching them. We can also obtain experimental bounds on compressibility by using human models and experience, and this area is briefly considered in Section 2.3. The final section of this chapter then shows the application of these limits to some simple compression systems.
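The two limits named in the abstract can be sketched numerically. The snippet below is an illustrative aside, not material from the chapter: the distribution P and the codeword lengths are invented for the example, and the functions simply evaluate the standard definitions of Shannon entropy and the Kraft sum.

```python
import math

def entropy(probs):
    """Shannon entropy H(P) = -sum(p * log2(p)): a lower bound on the
    expected codeword length (in bits) of any prefix-free code for P."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft sum for binary codeword lengths: a prefix-free code with
    these lengths exists if and only if the sum is at most 1."""
    return sum(2.0 ** -l for l in lengths)

# Illustrative three-symbol distribution and one possible set of lengths.
P = [0.5, 0.25, 0.25]
lengths = [1, 2, 2]

expected_length = sum(p * l for p, l in zip(P, lengths))

print(entropy(P))          # 1.5 bits per symbol
print(kraft_sum(lengths))  # 1.0, so the lengths are feasible
print(expected_length)     # 1.5, matching the entropy bound exactly
```

Because the probabilities here are negative powers of two, the code with lengths [1, 2, 2] meets the entropy bound exactly; for general distributions the bound is approached but not always attained.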
© 2002 Springer Science+Business Media New York
Cite this chapter
Moffat, A., Turpin, A. (2002). Fundamental Limits. In: Compression and Coding Algorithms. The Springer International Series in Engineering and Computer Science, vol 669. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-0935-6_2
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4613-5312-6
Online ISBN: 978-1-4615-0935-6