The previous chapter introduced the coding problem: that of assigning a set of codewords or bit-patterns C to a set of n symbols whose probability distribution is given by P = [p1, …, pn]. This chapter explores some lines in the sand that cannot be crossed when designing codes. The first is a lower bound on the expected length of a code: Shannon’s entropy limit. The second applies to the lengths of the codewords themselves, and is generally referred to as the Kraft inequality. Both of these limits serve to keep us honest when devising new coding schemes, and both provide clues on how to construct codes that come close to reaching them. We can also obtain experimental bounds on compressibility by using human models and experience, an area briefly considered in Section 2.3. The final section of this chapter then shows how these limits apply to some simple compression systems.
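As a minimal numerical sketch (an illustrative example, not drawn from the chapter itself), the two limits can be checked directly: the entropy of P gives a lower bound on the expected codeword length, and a set of codeword lengths is realizable by a prefix-free code exactly when its Kraft sum does not exceed one. The distribution and lengths below are assumed for illustration.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: a lower bound on expected code length."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths, radix=2):
    """Kraft sum: a prefix-free code with these lengths exists iff this is <= 1."""
    return sum(radix ** -l for l in lengths)

# Hypothetical dyadic distribution over n = 4 symbols, with matching code lengths.
P = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

print(entropy(P))                                  # 1.75 bits per symbol
print(kraft_sum(lengths))                          # 1.0, so the lengths are feasible
print(sum(p * l for p, l in zip(P, lengths)))      # 1.75, meeting the entropy bound
```

Because each probability here is a negative power of two, the expected code length exactly matches the entropy; for general distributions the bound is met only in the limit, for example by arithmetic coding.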
Keywords: Probability Estimator, Compression Scheme, Code Algorithm, Compression System, Adaptive Code