Abstract
Chapter 5 continues the discussion of Shannon’s information theory, now as regards channel capacity and channel coding. Simple channel models are introduced and their capacity is computed. It is shown that channel coding needs redundancy, and the fundamental theorem of channel coding is stated. Its proof relies on Shannon’s random coding, the principle of which is stated and illustrated. A geometrical picture of a code is proposed as a sparse set of points within the high-dimensional Hamming space that represents sequences. The practical implementation of channel coding uses error-correcting codes, which are briefly defined and illustrated by describing some code families: recursive convolutional codes, turbocodes and low-density parity-check codes. The last two families can be interpreted as approximately implementing random coding by deterministic means. Unlike true random coding, their decoding is of moderate complexity, and both achieve performance close to the theoretical limit. How their decoding is implemented is briefly described. The first and most important step of decoding enables regenerating an encoded sequence. Finally, it is stated that the constraints which endow error-correcting codes with resilience to errors can be of any kind (e.g., physical-chemical or linguistic), and not necessarily mathematical as in communication engineering.
Notes
1. A polynomial of degree μ is said to be primitive if taking the successive powers of one of its roots generates all the \(2^{\mu} - 1\) non-zero elements of the μ-th extension of the binary field.
2. Or more, but then some specific difficulties are met; two-component codes suffice for obtaining results close enough to the theoretical limit for most practical purposes.
3. Failing to do so would increase the magnitude of the computed a posteriori real value without improving its reliability; remember that the magnitude of a log-likelihood ratio is intended to measure the reliability of the corresponding bit.
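Note 1's definition of a primitive polynomial can be checked mechanically for a small case. The sketch below (an illustration, not part of the chapter) takes μ = 3 and the polynomial x³ + x + 1: elements of the field extension GF(2³) are represented as 3-bit integers, multiplication by the root α (= x) reduces x³ to x + 1, and the successive powers of α are seen to generate all 2³ − 1 = 7 non-zero elements.

```python
# Checking Note 1 for p(x) = x^3 + x + 1 over GF(2): elements of GF(2^3)
# are 3-bit integers (bit i = coefficient of x^i); multiplying by the
# root alpha (= x, i.e. 0b010) uses the reduction x^3 = x + 1.

def times_alpha(v):
    """Multiply a GF(2^3) element (3-bit int) by alpha = x, reducing
    the overflow term x^3 to x + 1 (binary 0b011)."""
    v <<= 1
    if v & 0b1000:                  # a degree-3 term appeared: reduce it
        v = (v ^ 0b1000) ^ 0b011
    return v

# Successive powers alpha^1, alpha^2, ..., alpha^7, starting from alpha^0 = 1.
powers = []
v = 1
for _ in range(7):
    v = times_alpha(v)
    powers.append(v)

# The powers of alpha run through all 2^3 - 1 = 7 non-zero elements
# before returning to 1, so x^3 + x + 1 is primitive.
assert sorted(powers) == [1, 2, 3, 4, 5, 6, 7]
assert powers[-1] == 1
```

A non-primitive (though irreducible) polynomial would fail this test: the powers of its root would cycle through a proper subset of the non-zero elements.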
References
Bahl, L. R., Cocke, J., Jelinek, F., & Raviv, J. (1974). Optimal decoding of linear codes for minimizing symbol error rate. IEEE Transactions on Information Theory, IT-20(2), 284–287.
Battail, G. (1987a). Pondération des symboles décodés par l’algorithme de Viterbi [Weighting of the symbols decoded by the Viterbi algorithm]. Annales des Télécommunications, 42(1–2), 31–38.
Battail, G. (1987b). Le décodage pondéré en tant que procédé de réévaluation d’une distribution de probabilité [Weighted decoding as a procedure for re-evaluating a probability distribution]. Annales des Télécommunications, 42(9–10), 499–509.
Battail, G. (1989). Construction explicite de bons codes longs [Explicit construction of good long codes]. Annales des Télécommunications, 44(7–8), 392–404.
Battail, G. (1993). Pseudo-random recursive convolutional coding for near-capacity performance. 2nd International Symposium on Communication Theory and Applications, Ambleside, UK, 12–16 July 1993. (Communications theory and applications II, B. Honary, M. Darnell, & P. Farrell, Eds., HW Communications Ltd., pp. 54–65).
Battail, G. (1996). On random-like codes. Proceedings of the 4th Canadian Workshop on Information Theory, Lac Delage, Québec, 28–31 May 1995. (Information Theory and Applications II, J.-Y. Chouinard, P. Fortier, & T. A. Gulliver, Eds., Lecture Notes in Computer Science No. 1133, pp. 76–94, Springer).
Battail, G. (2000). On Gallager’s low-density parity-check codes. Proceedings of the International Symposium on Information Theory (ISIT 2000), p. 202, Sorrento, Italy, 25–30 June 2000.
Battail, G., Berrou, C., & Glavieux, A. (1993). Pseudo-random recursive convolutional coding for near-capacity performance. Proceedings of GLOBECOM'93, Communication Theory Mini-Conference, Vol. 4, pp. 23–27, Houston, U.S.A.
Battail, G., & Decouvelaere, M. (1976). Décodage par répliques [Replication decoding]. Annales des Télécommunications, 31(11–12), 387–404.
Battail, G., Decouvelaere, M., & Godlewski, P. (1979). Replication decoding. IEEE Transactions on Information Theory, IT-25(3), 332–345.
Coffey, J. T., & Goodman, R. M. (1990). Any code of which we cannot think is good. IEEE Transactions on Information Theory, IT-36(6), 1453–1461.
Gallager, R. G. (1962). Low-density parity-check codes. IRE Transactions on Information Theory, IT-8, 21–28.
Gallager, R. G. (1963). Low-density parity-check codes. Cambridge: MIT Press.
Gallager, R. G. (1965). A simple derivation of the coding theorem and some applications. IEEE Transactions on Information Theory, IT-11(1), 3–18.
Hagenauer, J., & Hoeher, P. (1989). A Viterbi algorithm with soft-decision outputs and its applications. Proceedings of GLOBECOM'89, pp. 47.1.1–47.1.7 (Nov.). Dallas, U.S.A.
Khinchin, A. I. (1957). Mathematical foundations of information theory. New York: Dover.
Kolmogorov, A. N. (1956). On the Shannon theory of information transmission in the case of continuous signals, in (Slepian 1974, pp. 238–244).
Massey, J. L. (1963). Threshold decoding. Cambridge: MIT Press.
Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–457, 623–656. (Reprinted in Shannon and Weaver 1949, in Sloane and Wyner 1993, pp. 5–83, and in Slepian 1974, pp. 5–29).
Shannon, C. E. (1949). Communication in the presence of noise. Proceedings of the IRE, 37, 10–21. (Reprinted in Sloane and Wyner 1993, pp. 160–172 and in Slepian 1974, pp. 30–41).
Shannon, C. E., & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.
Slepian, D. (Ed.). (1974). Key papers in the development of information theory. Piscataway: IEEE Press.
Sloane, N. J. A., & Wyner, A. D. (Eds.). (1993). Claude Elwood Shannon, collected papers. Piscataway: IEEE Press.
Copyright information
© 2014 Springer Science+Business Media Dordrecht
About this chapter
Cite this chapter
Battail, G. (2014). Channel Capacity and Channel Coding. In: Information and Life. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-7040-9_5
Publisher Name: Springer, Dordrecht
Print ISBN: 978-94-007-7039-3
Online ISBN: 978-94-007-7040-9