Turbo Coding, pp. 121–164

Turbo Decoding

  • Chris Heegard
  • Stephen B. Wicker
Part of The Springer International Series in Engineering and Computer Science book series (SECS, volume 476)


The problem of decoding received, encoded data sequences can be formulated and solved in a number of ways. In this chapter we provide a general framework for the decoding problem. The basic concepts of measures and metrics are introduced. It is then shown how metrics can be used in different ways to realize different decoding algorithms, with an emphasis on the BCJR and Viterbi algorithms. These two algorithms are unified within a general framework that we call the “Generalized Viterbi Algorithm”. This is followed by a detailed description of turbo decoding: a low-complexity, suboptimal means for decoding serial and parallel concatenated codes. The chapter concludes with a discussion of mismatched decoding: a problem that arises when the channel statistics are not known or have been inaccurately specified. It is shown that turbo decoders are remarkably robust under such conditions, and can be made more so by including channel estimation within the decoding process.
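To make the metric-based view of decoding concrete, the following is a minimal sketch of the Viterbi algorithm: a hard-decision decoder for the standard rate-1/2, memory-2 convolutional code with generators (7, 5) in octal, using the Hamming distance as its branch metric. The code choice, function names, and API are illustrative assumptions for this example, not the chapter's own implementation.

```python
# Illustrative hard-decision Viterbi decoder for the rate-1/2, memory-2
# convolutional code with octal generators (7, 5). This is a sketch for
# the example only; the chapter develops the algorithm in full generality.

G = (0b111, 0b101)  # generator polynomials g0 = 7, g1 = 5 (octal)

def encode(bits):
    """Encode a bit sequence; appending two zero bits terminates the trellis."""
    state = 0
    out = []
    for b in bits:
        reg = (b << 2) | state  # register holds [input, m1, m2]
        out.append((bin(reg & G[0]).count("1") & 1,
                    bin(reg & G[1]).count("1") & 1))
        state = reg >> 1        # shift: new state is (input, m1)
    return out

def viterbi(received):
    """Return the minimum-Hamming-distance input sequence (ML for a BSC)."""
    INF = float("inf")
    metric = [0] + [INF] * 3           # path metric per state; start in state 0
    paths = [[] for _ in range(4)]     # survivor input sequence per state
    for (r0, r1) in received:
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            for b in (0, 1):           # extend each survivor by both inputs
                reg = (b << 2) | s
                c0 = bin(reg & G[0]).count("1") & 1
                c1 = bin(reg & G[1]).count("1") & 1
                branch = (c0 != r0) + (c1 != r1)   # Hamming branch metric
                ns = reg >> 1
                if metric[s] + branch < new_metric[ns]:
                    new_metric[ns] = metric[s] + branch
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[0]  # trace back to state 0 (assumes a terminated encoder)

msg = [1, 0, 1, 1, 0, 0]            # the last two zeros terminate the trellis
rx = encode(msg)
rx[2] = (rx[2][0] ^ 1, rx[2][1])    # inject one channel error
assert viterbi(rx) == msg           # corrected: free distance 5 handles it
```

Replacing the add-compare-select step's minimum with a sum over paths (in the probability domain) yields the forward recursion of the BCJR algorithm, which is the sense in which the chapter unifies the two under a single generalized framework.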


Keywords: Convolutional Codes · Viterbi Algorithm · Iterative Decoding · Turbo Decoding · Soft Decision




  1. [BCJR74]
    L. R. Bahl, J. Cocke, F. Jelinek, and J. Raviv. Optimal decoding of linear codes for minimizing symbol error rate. IEEE Transactions on Information Theory, IT-20:284–287, March 1974.
  2. [Bal95]
    V. B. Balakirsky. A converse coding theorem for mismatched decoding at the output of binary-input memoryless channels. IEEE Transactions on Information Theory, IT-41(6):1889–1902, November 1995.
  3. [BP66]
    L. E. Baum and T. Petrie. Statistical inference for probabilistic functions of finite state Markov chains. Annals of Mathematical Statistics, 37:1554–1563, 1966.
  4. [BS68]
    L. E. Baum and G. R. Sell. Growth transformations for functions on manifolds. Pacific Journal of Mathematics, 27(2):211–227, 1968.
  5. [BPGW70]
    L. E. Baum, T. Petrie, G. Soules, and N. Weiss. A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains. Annals of Mathematical Statistics, 41:164–171, 1970.
  6. [BGT93]
    C. Berrou, A. Glavieux, and P. Thitimajshima. Near Shannon limit error-correcting coding and decoding: Turbo codes. Proceedings of the 1993 International Conference on Communications, pages 1064–1070, 1993.
  7. [CK81]
    I. Csiszár and J. Körner. Graph decomposition: A new key to coding theorems. IEEE Transactions on Information Theory, IT-27:5–12, 1981.
  8. [CN95]
    I. Csiszár and P. Narayan. Channel capacity for a given decoding metric. IEEE Transactions on Information Theory, IT-41(1):35–43, 1995.
  9. [For72]
    G. D. Forney, Jr. Maximum-likelihood sequence estimation of digital sequences in the presence of intersymbol interference. IEEE Transactions on Information Theory, IT-18:363–378, 1972.
  10. [For73]
    G. D. Forney, Jr. The Viterbi algorithm. Proceedings of the IEEE, 61(3):268–278, March 1973.
  11. [Gop75]
    V. D. Goppa. Nonprobabilistic mutual information without memory. Problems of Control and Information Theory, 4:97–102, 1975.
  12. [HOP96]
    J. Hagenauer, E. Offer, and L. Papke. Iterative decoding of binary block and convolutional codes. IEEE Transactions on Information Theory, IT-42:429–445, 1996.
  13. [Hui83]
    J. Y. N. Hui. Fundamental issues of multiple accessing. Ph.D. dissertation, M.I.T., 1983.
  14. [Lap96]
    A. Lapidoth. Mismatched decoding and the multiple-access channel. IEEE Transactions on Information Theory, IT-42(5):1439–1452, 1996.
  15. [IH77]
    H. Imai and S. Hirakawa. A new multilevel coding method using error-correcting codes. IEEE Transactions on Information Theory, IT-23:371–377, 1977.
  16. [Kim98]
    S. Kim. Probabilistic Reasoning, Parameter Estimation, and Issues in Turbo Decoding. Ph.D. dissertation, Cornell University, 1998.
  17. [KW98a]
    S. Kim and S. B. Wicker. A connection between the Baum-Welch algorithm and turbo decoding. Proceedings of the 1998 Information Theory Workshop, Killarney, Ireland, June 22–26, pages 12–13, 1998.
  18. [KW98b]
    S. Kim and S. B. Wicker. On mismatched and self-matching turbo decoding. Submitted to IEEE Transactions on Information Theory, 1998.
  19. [Kob71]
    H. Kobayashi. Correlative level coding and maximum likelihood decoding. IEEE Transactions on Information Theory, IT-17(5):586–594, 1971.
  20. [McE96]
    R. J. McEliece. On the BCJR trellis for linear block codes. IEEE Transactions on Information Theory, IT-42(4):1072–1092, 1996.
  21. [MKLS94]
    N. Merhav, G. Kaplan, A. Lapidoth, and S. Shamai (Shitz). On information rates for mismatched decoders. IEEE Transactions on Information Theory, IT-40(6):1953–1967, 1994.
  22. [RVH95]
    P. Robertson, E. Villebrun, and P. Hoeher. A comparison of optimal and sub-optimal MAP decoding algorithms operating in the log domain. Proceedings of the IEEE International Conference on Communications, pages 1009–1013, June 1995.
  23. [RW98]
    P. Robertson and T. Wörz. Bandwidth-efficient turbo trellis-coded modulation using punctured component codes. IEEE Journal on Selected Areas in Communications, 1998.
  24. [SHR98]
    M. Shoemake, C. Heegard, and E. Rossin. Turbo codes for high order constellations. Abstract Book, IEEE Information Theory Workshop, Killarney, Ireland, June 1998.
  25. [Ung82]
    G. Ungerboeck. Channel coding with multilevel/phase signals. IEEE Transactions on Information Theory, IT-28:55–67, 1982.
  26. [Ung87]
    G. Ungerboeck. Trellis-coded modulation with redundant signal sets, Part I: Introduction; Part II: State of the art. IEEE Communications Magazine, 5(2):5–21, 1987.
  27. [Vit67]
    A. J. Viterbi. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm. IEEE Transactions on Information Theory, IT-13:260–269, 1967.
  28. [Wei84]
    L.-F. Wei. Rotationally invariant convolutional channel coding with expanded signal space, Parts I and II. IEEE Journal on Selected Areas in Communications, SAC-2:659–686, 1984.
  29. [SW98]
    T. A. Summers and S. G. Wilson. SNR mismatch and online estimation in turbo decoding. IEEE Transactions on Communications, COM-46:421–423, 1998.
  30. [Ziv85]
    J. Ziv. Universal decoding for finite-state channels. IEEE Transactions on Information Theory, IT-31(4):453–460, 1985.

Copyright information

© Springer Science+Business Media New York 1999

Authors and Affiliations

  • Chris Heegard: Alantro Communications, Inc., USA; Cornell University, USA
  • Stephen B. Wicker: Cornell University, USA