In convolutional coding, the message bits enter the encoder serially rather than in large blocks: each incoming bit is combined with the bits stored in the encoder's memory elements to produce the output code bits.
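As a minimal sketch of this serial operation, the following implements a rate-1/2 convolutional encoder with constraint length 3; the generator polynomials 7 and 5 (octal) are an illustrative textbook choice, not taken from this chapter.

```python
# Rate-1/2 convolutional encoder, constraint length K = 3.
# Generators g1 = 111 (7 octal) and g2 = 101 (5 octal) are an
# illustrative assumption, not specified in the text.

def conv_encode(bits, g1=0b111, g2=0b101, K=3):
    """Encode bits serially; each input bit yields two output bits."""
    state = 0  # shift register holding the last K-1 input bits
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state              # new bit plus memory contents
        out.append(bin(reg & g1).count("1") % 2)  # parity under generator g1
        out.append(bin(reg & g2).count("1") % 2)  # parity under generator g2
        state = reg >> 1                          # shift the register forward
    return out

# Encoding the message 1 0 1 1 from the all-zero state:
print(conv_encode([1, 0, 1, 1]))  # → [1, 1, 1, 0, 0, 0, 0, 1]
```

Note that the output depends on the current input bit and the previous K-1 bits held in memory, which is exactly what distinguishes convolutional codes from block codes.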
Keywords: Impulse response · Code word · Convolutional code · Memory element · AWGN channel
© Springer India 2015