The Principle of Conservation of Entropy
Part of the International Centre for Mechanical Sciences book series (CISM, volume 31)
Let us go back to (1.8). If it is (approximately) true for the best possible encoding, then it must hold for an arbitrary uniquely decodable code. After the encoding we obtain a new random sequence of signals, which we may consider as a new information source; let us denote it by y. Further, let H(y) denote the information content of one signal of y (for the moment heuristically). If information is material-like, then the information content H(x) of one signal of the original source x must be distributed over the L code signals which transmit it:

H(x) = L · H(y).

This is called the principle of conservation of entropy, and we shall now formulate it precisely and prove it.
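The relation H(x) = L · H(y) can be checked numerically. The sketch below (an illustration, not from the text) uses a hypothetical three-signal source with dyadic probabilities and an optimal binary prefix code, so the average code length L equals the source entropy exactly and the encoded 0/1 stream carries one bit of information per code signal:

```python
import math
import random

# Hypothetical example source (not from the text): three signals with
# dyadic probabilities, encoded by an optimal uniquely decodable prefix code.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}
code = {"a": "0", "b": "10", "c": "11"}

def entropy(ps):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

H_x = entropy(probs.values())                      # information content per source signal
L = sum(probs[s] * len(code[s]) for s in probs)    # average number of code signals per source signal

# Encode a long random sample and treat the 0/1 stream as the new source y.
random.seed(0)
sample = random.choices(list(probs), weights=list(probs.values()), k=200_000)
stream = "".join(code[s] for s in sample)
p1 = stream.count("1") / len(stream)               # empirical frequency of the signal "1"
H_y = entropy([p1, 1 - p1])                        # per-signal information content of y

print(H_x, L, L * H_y)   # H(x) should closely match L * H(y)
```

For this dyadic example H(x) = 1.5 bits and L = 1.5 code signals, so conservation of entropy predicts H(y) ≈ 1 bit per code signal, which the empirical stream confirms.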
Keywords: Information Content; Information Source; Rational Number; Information Signal; Block Code
© Springer-Verlag Wien 1970