# About a Combinatorial Proof of the Noisy Channel Coding Theorem

• János Körner
Part of the International Centre for Mechanical Sciences book series (CISM, volume 265)

## Abstract

The most famous problem of information theory, that of determining the zero-error capacity of a discrete memoryless channel, is of a combinatorial nature. Originally stated by Shannon [1] in 1956, it has been studied by many combinatorialists. In a recent paper, Lovász [2] developed a sophisticated method for deriving converse results on the zero-error capacity and succeeded in settling an intriguing special case. This is a channel whose five input letters can be arranged cyclically so that two input letters can result in the same output letter with positive probability iff they are adjacent in this cyclic array. This “pentagon” constitutes the simplest case for which Shannon was unable to determine the zero-error capacity in 1956. Unfortunately, the Lovász bounding technique also fails in many important cases, cf. Haemers [3]. It has often been argued that the problem is not intrinsically information-theoretic, since it can be stated without using probabilistic concepts. (This argument was even brought up as an excuse for the information theorists’ inability to solve the problem.) In the last couple of years, however, an increasing number of people have come to believe that in the discrete case all the classical results of information theory can be rederived using combinatorial methods. Moreover, the proofs so obtained are often simpler and more intuitive than the earlier ones. The present tutorial paper is intended to propagate this belief.
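To make the combinatorial object concrete: the pentagon channel’s confusability graph is the 5-cycle C5, and the zero-error capacity of a channel is determined by the largest sets of pairwise non-confusable words, i.e. independent sets in strong powers of this graph. The brute-force sketch below (an illustration added here, not part of the paper; all function names are ad hoc) verifies Shannon’s observations that only 2 single letters, but 5 two-letter words, can be used without any risk of confusion, giving the lower bound of (log₂ 5)/2 bits per letter that Lovász [2] proved to be tight.

```python
from itertools import combinations
from math import log2

# Pentagon channel: five input letters 0..4; two letters can produce
# the same output with positive probability iff they are cyclically
# adjacent. The confusability graph is the 5-cycle C5.
def adj(u, v):
    return (u - v) % 5 in (1, 4)

# Strong product C5 x C5: two distinct two-letter words are confusable
# iff, in each coordinate, the letters are equal or adjacent in C5.
def adj2(p, q):
    return p != q and all(a == b or adj(a, b) for a, b in zip(p, q))

def has_independent_set(vertices, adjacent, k):
    """True iff some k-subset is pairwise non-confusable (a zero-error code)."""
    return any(
        not any(adjacent(u, v) for u, v in combinations(s, 2))
        for s in combinations(vertices, k)
    )

letters = list(range(5))
words = [(a, b) for a in letters for b in letters]

# Single letters: at most 2 are pairwise non-confusable (alpha(C5) = 2).
assert has_independent_set(letters, adj, 2)
assert not has_independent_set(letters, adj, 3)

# Two-letter words: the code {(i, 2i mod 5)} has 5 pairwise
# non-confusable words, and no 6 words will do, so the second strong
# power of C5 has independence number 5.
assert has_independent_set(words, adj2, 5)
assert not has_independent_set(words, adj2, 6)

# Hence the zero-error capacity is at least (log2 5)/2 bits per letter.
print(f"rate of the 5-word code: {log2(5) / 2:.4f} bits/letter")
```

The exhaustive search is feasible only because the graphs are tiny; the difficulty of the general problem lies precisely in the fact that independence numbers of high strong powers are not computable this way.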

## Keywords

Mutual Information, Block Code, Block Length, Conditional Entropy, Stochastic Matrix

## References

1. [1] Shannon, C.E.: The zero-error capacity of a noisy channel, IRE Trans. Inform. Theory, 2, pp. 8–19, 1956.
2. [2] Lovász, L.: On the Shannon capacity of a graph, IEEE Trans. Inform. Theory, 25, pp. 1–7, 1979.
3. [3] Haemers, W.: On some problems of Lovász concerning the Shannon capacity of a graph, IEEE Trans. Inform. Theory, 25, pp. 231–232, 1979.
4. [4] Shannon, C.E.: A mathematical theory of communication, Bell System Techn. Journal, 27, pp. 379–423, 623–656, 1948.
5. [5] Shannon, C.E.: Certain results in coding theory for noisy channels, Information and Control, 1, pp. 6–25, 1957.
6. [6] Wolfowitz, J.: The coding of messages subject to chance errors, Illinois J. Math., 1, pp. 591–606, 1957.
7. [7] Wolfowitz, J.: Coding Theorems of Information Theory, Springer, Berlin-Heidelberg, 1961; 3rd edition, 1978.
8. [8] Dueck, G., Körner, J.: Reliability function of a discrete memoryless channel at rates above capacity, IEEE Trans. Inform. Theory, 25, pp. 82–85, 1979.
9. [9] Csiszár, I., Körner, J.: Graph decomposition: a new key to coding theorems, IEEE Trans. Inform. Theory, to appear.
10. [10] Fano, R.M.: Transmission of Information: A Statistical Theory of Communications, Wiley, New York-London, 1961.
11. [11] Gallager, R.G.: A simple derivation of the coding theorem and some applications, IEEE Trans. Inform. Theory, 11, pp. 3–18, 1965.
12. [12] Csiszár, I., Körner, J., Marton, K.: A new look at the error exponent of a discrete memoryless channel, preprint, presented at the IEEE Int’l. Symposium on Information Theory, Cornell Univ., Ithaca, N.Y., 1977.
13. [13] Arimoto, S.: On the converse to the coding theorem for discrete memoryless channels, IEEE Trans. Inform. Theory, 19, pp. 357–359, 1973.
14. [14] Goppa, V.D.: Nonprobabilistic mutual information without memory, Problems of Control and Information Theory (in Russian), 4, pp. 97–102, 1975.
15. [15] Lovász, L.: On decomposition of graphs, Studia Sci. Math. Hung., 1, pp. 237–238, 1966.
16. [16] Shannon, C.E., Gallager, R.G., Berlekamp, E.R.: Lower bounds to error probability for coding on discrete memoryless channels, I–II, Information and Control, 10, pp. 65–103, 522–552, 1967.
17. [17] Haroutunian, E.A.: Estimates of the error exponent for the semicontinuous memoryless channel (in Russian), Problemi Peredaci Informacii, 4, no. 4, pp. 37–48, 1968.
18. [18] Blahut, R.E.: Hypothesis testing and information theory, IEEE Trans. Inform. Theory, 20, pp. 405–417, 1974.
19. [19] Csiszár, I., Körner, J.: Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, to appear.
20. [20] Fitingof, B.M.: Coding in the case of unknown and changing message statistics (in Russian), Problemi Peredaci Informacii, 2, no. 2, pp. 3–11, 1966.
21. [21] Lynch, T.J.: Sequence time coding for data compression, Proc. IEEE, 54, pp. 1490–1491, 1966.
22. [22] Davisson, L.D.: Comments on “Sequence time coding for data compression”, Proc. IEEE, 54, p. 2010, 1966.
23. [23] Davisson, L.D.: Universal noiseless coding, IEEE Trans. Inform. Theory, 19, pp. 783–796, 1973.
24. [24] Gilbert, E.N.: A comparison of signalling alphabets, Bell System Techn. J., 31, pp. 504–522, 1952.
25. [25] Blahut, R.E.: Composition bounds for channel block codes, IEEE Trans. Inform. Theory, 23, pp. 656–674, 1977.
26. [26] McEliece, R.J., Rodemich, E.R., Rumsey, H., Jr., Welch, L.R.: New upper bounds on the rate of a code via the Delsarte–MacWilliams identities, IEEE Trans. Inform. Theory, 23, pp. 157–166, 1977.
27. [27] van der Meulen, E.C.: A survey of multi-way channels in information theory: 1961–1976, IEEE Trans. Inform. Theory, 23, pp. 1–37, 1977.
28. [28] Csiszár, I., Körner, J.: Towards a general theory of source networks, IEEE Trans. Inform. Theory, 26, 1980.
29. [29] Körner, J., Sgarro, A.: Universally attainable error exponents for broadcast channels with degraded message sets, IEEE Trans. Inform. Theory, to appear.
30. [30] Csiszár, I., Körner, J.: On the capacity of the arbitrarily varying channel for maximum probability of error, submitted to Z. f. Wahrscheinlichkeitstheorie verw. Geb., 1979.
31. [31] Ahlswede, R.: A method of coding and an application to arbitrarily varying channels, preprint, 1979.

## Authors and Affiliations

• János Körner, Mathematical Institute, Hungarian Academy of Sciences, Hungary