
Arithmetic Coding

Part of the Undergraduate Topics in Computer Science book series (UTICS)

The Huffman algorithm is simple, efficient, and produces the best codes for the individual data symbols. The discussion in Chapter 2, however, shows that the only case where it produces ideal variable-length codes (codes whose average size equals the entropy) is when the symbols have probabilities of occurrence that are negative powers of 2 (i.e., numbers such as 1/2, 1/4, or 1/8). This is because the Huffman method assigns a code with an integral number of bits to each symbol in the alphabet. Information theory tells us that a symbol with probability 0.4 should ideally be assigned a 1.32-bit code, because −log₂ 0.4 ≈ 1.32. The Huffman method, however, normally assigns such a symbol a code of one or two bits.
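
To make the gap concrete, the following is a minimal Python sketch (not from the book; the four-symbol alphabet and its probabilities are illustrative assumptions) that builds a Huffman code with the standard min-heap construction and compares each symbol's integral code length against its ideal length −log₂ p, and the average code length against the entropy:

    import heapq
    from math import log2

    def huffman_code_lengths(probs):
        """Return {symbol: code length in bits} for a {symbol: probability}
        map, using the classic min-heap Huffman construction."""
        # Heap entries are (weight, unique counter, subtree); the counter
        # breaks ties so subtrees are never compared directly.
        heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            p1, _, t1 = heapq.heappop(heap)
            p2, _, t2 = heapq.heappop(heap)
            heapq.heappush(heap, (p1 + p2, counter, (t1, t2)))
            counter += 1
        lengths = {}
        def walk(tree, depth):
            if isinstance(tree, tuple):      # internal node: recurse
                walk(tree[0], depth + 1)
                walk(tree[1], depth + 1)
            else:                            # leaf: record its depth
                lengths[tree] = max(depth, 1)
        walk(heap[0][2], 0)
        return lengths

    # Hypothetical distribution; 0.4 matches the probability used in the text.
    probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
    lengths = huffman_code_lengths(probs)
    avg = sum(probs[s] * lengths[s] for s in probs)
    entropy = -sum(p * log2(p) for p in probs.values())
    for s in sorted(probs):
        print(f"{s}: p={probs[s]}, ideal={-log2(probs[s]):.2f} bits, "
              f"Huffman={lengths[s]} bits")
    print(f"average = {avg:.2f} bits/symbol, entropy = {entropy:.2f} bits/symbol")

For this distribution the symbol with probability 0.4 receives a 1-bit code instead of the ideal 1.32 bits, and the average code length (1.90 bits/symbol) exceeds the entropy (about 1.85 bits/symbol). Arithmetic coding, the subject of this chapter, closes this gap by encoding an entire message as a single number rather than assigning a whole number of bits to each symbol.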

Keywords

Real Number, Cumulative Frequency, Encode Process, Decimal Digit, Arithmetic Code


Copyright information

© Springer-Verlag London Limited 2008
