Fundamental Limits

  • Alistair Moffat
  • Andrew Turpin
Part of the Springer International Series in Engineering and Computer Science book series (SECS, volume 669)


The previous chapter introduced the coding problem: that of assigning a set of codewords or bit-patterns C to a set of n symbols that have a probability distribution given by P = [p1, …, pn]. This chapter explores some lines in the sand which cannot be crossed when designing codes. The first is a lower bound on the expected length of a code: Shannon's entropy limit. The second restriction applies to the lengths of codewords, and is generally referred to as the Kraft inequality. Both of these limits serve to keep us honest when devising new coding schemes. Both limits also provide clues on how to construct codes that come close to reaching them. We can also obtain experimental bounds on compressibility by using human models and experience, and this area is briefly considered in Section 2.3. The final section of this chapter then shows the application of these limits to some simple compression systems.
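The two limits named above can be illustrated with a small sketch (not from the chapter itself): the entropy H(P) = -Σ pi log2 pi gives a lower bound on the expected code length, and a prefix-free code with lengths [l1, …, ln] exists exactly when the Kraft sum Σ 2^(-li) is at most 1. The probabilities and lengths below are illustrative values chosen so the code exactly meets both limits.

```python
import math

def entropy(probs):
    """Shannon entropy: lower bound on expected codeword length, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft sum K = sum of 2**(-l); a prefix-free code with these
    codeword lengths exists if and only if K <= 1."""
    return sum(2.0 ** -l for l in lengths)

# Illustrative dyadic distribution and matching codeword lengths
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

H = entropy(probs)                                  # 1.75 bits per symbol
expected = sum(p * l for p, l in zip(probs, lengths))  # also 1.75: meets the bound
K = kraft_sum(lengths)                              # exactly 1.0: a complete code
```

Because each probability here is a power of two, the expected code length equals the entropy and the Kraft sum is exactly 1; for general distributions the expected length of any prefix-free code strictly exceeds H(P) unless the probabilities are dyadic.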




Copyright information

© Springer Science+Business Media New York 2002

Authors and Affiliations

  • Alistair Moffat
    • 1
  • Andrew Turpin
    • 2
  1. The University of Melbourne, Australia
  2. Curtin University of Technology, Australia
