Theory of Information and its Value, pp. 53–75

# Encoding in the presence of penalties. First variational problem

## Abstract

The amount of information that can be recorded or transmitted is defined as the logarithm of the number of distinct recording or transmission realizations, respectively. However, calculating this number is not always a simple task: it can be complicated by constraints imposed on the feasible realizations. In many cases, instead of counting realizations directly, it is more practical to compute the maximum value of the recording entropy, maximizing over all distributions compatible with a condition on the expected value of some random cost. This maximum value of entropy is called the *capacity of a channel* without noise. This variational problem is the first in a set of variational problems that play an important role in information theory.
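As a concrete illustration of the constrained maximization described above, the sketch below (an illustrative assumption, not taken from the chapter) computes the entropy-maximizing distribution over symbols with costs $c_i$ subject to a fixed mean cost $\sum_i p_i c_i = a$. The maximizer has the Gibbs form $p_i \propto e^{-\beta c_i}$, and the multiplier $\beta$ can be found by bisection, since the mean cost is monotone in $\beta$; the resulting entropy is the noiseless channel capacity under that cost constraint. The function name and the bisection bounds are choices made for this sketch.

```python
import math

def max_entropy_distribution(costs, mean_cost):
    """Maximize H(p) = -sum p_i log p_i subject to sum p_i c_i = mean_cost.

    The maximizer is a Gibbs distribution p_i ~ exp(-beta * c_i); beta is
    located by bisection, using the fact that the mean cost under the Gibbs
    distribution decreases monotonically as beta increases.
    (Illustrative sketch: bounds [-50, 50] assume mean_cost lies strictly
    between min(costs) and max(costs).)
    """
    def mean_for(beta):
        # Mean cost under the Gibbs distribution with parameter beta.
        w = [math.exp(-beta * c) for c in costs]
        z = sum(w)
        return sum(wi * c for wi, c in zip(w, costs)) / z

    lo, hi = -50.0, 50.0  # mean_for(lo) > mean_cost > mean_for(hi)
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_for(mid) > mean_cost:
            lo = mid  # mean cost still too high: increase beta
        else:
            hi = mid
    beta = (lo + hi) / 2

    w = [math.exp(-beta * c) for c in costs]
    z = sum(w)
    p = [wi / z for wi in w]
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return p, entropy
```

For instance, with costs `[1, 2, 3]` and mean cost `2.0` the constraint is satisfied at `beta = 0`, so the maximizer is the uniform distribution and the capacity equals `log 3`, matching the unconstrained case.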
