
Codes Produced by Permutations: The Link Between Source and Channel Coding

  • Rudolf Ahlswede
Chapter
Part of the Foundations in Signal Processing, Communications and Networking book series (SIGNAL, volume 13)

Abstract

The main result of Ahlswede and Dueck (IEEE Trans. Inf. Theory, IT–28(3):430–443, 1982, [5]) is that good codes, even those meeting the random coding bound, can be produced with relatively few (linear in the block length) permutations applied to a single codeword. This reduction in complexity may be of practical importance. The motivation for looking at such codes came from our covering lemma, which makes it possible to build correlated source codes from channel codes via permutations.
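The construction idea summarized above — a whole codebook arising as permuted copies of one word — can be illustrated with a small sketch. The function name `permuted_codebook` and the use of randomly drawn permutations are illustrative assumptions; Ahlswede and Dueck's result concerns carefully chosen permutations achieving the random coding bound, which this toy example does not attempt to reproduce.

```python
import random

def permuted_codebook(base, num_perms, seed=0):
    """Generate a codebook by applying coordinate permutations
    to a single base codeword.

    Illustrative only: here the permutations are drawn at random,
    whereas [5] shows that on the order of n well-chosen
    permutations of one word suffice for good codes.
    """
    rng = random.Random(seed)
    n = len(base)
    codebook = []
    for _ in range(num_perms):
        perm = list(range(n))       # identity permutation of the n coordinates
        rng.shuffle(perm)           # a random permutation of {0, ..., n-1}
        codebook.append(tuple(base[perm[i]] for i in range(n)))
    return codebook

base = (0, 0, 0, 1, 1, 1, 0, 1)     # a single codeword of block length 8
book = permuted_codebook(base, num_perms=16)
print(len(book))                    # 16 codewords, each a permutation of base
```

Every word produced this way has the same composition (type) as the base word, which is what links the construction to type-based arguments in the covering lemma.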

References

  1. R. Ahlswede, Channel capacities for list codes. J. Appl. Prob. 10, 824–836 (1973)
  2. R. Ahlswede, Elimination of correlation in random codes for arbitrarily varying channels. Z. Wahrscheinlichkeitstheorie verwandte Gebiete 44, 159–175 (1978)
  3. R. Ahlswede, A method of coding and its application to arbitrarily varying channels. J. Comb. Inf. Syst. Sci. 5(1), 10–35 (1980)
  4. R. Ahlswede, Coloring hypergraphs: a new approach to multi-user source coding, Part II. J. Comb. Inf. Syst. Sci. 5(3), 220–268 (1980)
  5. R. Ahlswede, G. Dueck, Good codes can be produced by a few permutations. IEEE Trans. Inf. Theory IT–28(3), 430–443 (1982)
  6. S. Arimoto, On the converse to the coding theorem for discrete memoryless channels. IEEE Trans. Inf. Theory IT–19, 357–359 (1973)
  7. R.E. Blahut, Hypothesis testing and information theory. IEEE Trans. Inf. Theory IT–20, 405–417 (1974)
  8. I. Csiszár, J. Körner, Graph decomposition: a new key to coding theorems. IEEE Trans. Inf. Theory IT–27, 5–12 (1981)
  9. I. Csiszár, J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems (Academic Press, New York, 1981)
  10. I. Csiszár, J. Körner, K. Marton, A new look at the error exponent of a discrete memoryless channel (preprint), in IEEE International Symposium on Information Theory (Ithaca, NY, 1977)
  11. R.L. Dobrushin, S.Z. Stambler, Coding theorems for classes of arbitrarily varying discrete memoryless channels. Probl. Peredach. Inf. 11, 3–22 (1975)
  12. G. Dueck, J. Körner, Reliability function of a discrete memoryless channel at rates above capacity. IEEE Trans. Inf. Theory IT–25, 82–85 (1979)
  13. R.M. Fano, Transmission of Information: A Statistical Theory of Communication (Wiley, New York, 1961)
  14. A. Feinstein, A new basic theorem of information theory. IRE Trans. Inf. Theory 4, 2–22 (1954)
  15. R.G. Gallager, A simple derivation of the coding theorem and some applications. IEEE Trans. Inf. Theory IT–11, 3–18 (1965)
  16. R.G. Gallager, Information Theory and Reliable Communication (Wiley, New York, 1968)
  17. R.G. Gallager, Source coding with side information and universal coding (preprint), in IEEE International Symposium on Information Theory (Ronneby, Sweden, 1976)
  18. V.D. Goppa, Nonprobabilistic mutual information without memory. Prob. Contr. Inf. Theory 4, 97–102 (1975)
  19. A. Haroutunian, Estimates of the error exponent for the semi-continuous memoryless channel. Probl. Peredach. Inf. 4, 37–48 (1968)
  20. V.N. Koselev, On a problem of separate coding of two dependent sources. Probl. Peredach. Inf. 13, 26–32 (1977)
  21. J.K. Omura, A lower bounding method for channel and source coding probabilities. Inform. Contr. 27, 148–177 (1975)
  22. C.E. Shannon, A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)
  23. C.E. Shannon, Certain results in coding theory for noisy channels. Inform. Contr. 1, 6–25 (1957)
  24. C.E. Shannon, R.G. Gallager, E.R. Berlekamp, Lower bounds to error probability for coding on discrete memoryless channels I–II. Inform. Contr. 10, 65–103, 522–552 (1967)
  25. D. Slepian, J.K. Wolf, Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory IT–19, 471–480 (1973)
  26. J. Wolfowitz, The coding of messages subject to chance errors. Illinois J. Math. 1, 591–606 (1957)

Further Readings

  27. R. Ahlswede, Coloring hypergraphs: a new approach to multi-user source coding, Part I. J. Comb. Inf. Syst. Sci. 1, 76–115 (1979)
  28. R.E. Blahut, Composition bounds for channel block codes. IEEE Trans. Inf. Theory IT–23, 656–674 (1977)

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Department of Mathematics, University of Bielefeld, Bielefeld, Germany
