The Principle of Conservation of Entropy

  • Gyula Katona
Part of the International Centre for Mechanical Sciences book series (CISM, volume 31)

Abstract

Let us go back to (1.8). If it holds (approximately) for the best possible encoding, then it must hold for an arbitrary uniquely decodable code. After the encoding we obtain a new random sequence of signals, which we may regard as a new information source; let us denote it by y. Further, let H(y) denote the information content of one signal of y (heuristically, for the moment). If information is material-like, then the information content H(x) of one information signal must be distributed over the L code signals which transmit it:

\[
  H(y) = \frac{H(x)}{L} \tag{2.17}
\]

This is called the principle of conservation of entropy [2]; we shall now formulate it precisely and prove it.
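
One way to see (2.17) heuristically (a minimal sketch, assuming a memoryless source and a fixed-length, uniquely decodable block code in which every source signal is carried by exactly L code signals): a block of n source signals determines, and is determined by, its nL code signals, so their joint entropies coincide,

\[
  H(y_1,\dots,y_{nL}) \;=\; H(x_1,\dots,x_n) \;=\; n\,H(x),
\]

and dividing by the number nL of code signals gives H(y) = nH(x)/(nL) = H(x)/L, which is (2.17).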

Keywords

Information Content · Information Source · Rational Number · Information Signal · Block Code

Copyright information

© Springer-Verlag Wien 1970

Authors and Affiliations

  • Gyula Katona
  1. Mathematical Institute, Hungarian Academy of Sciences, Budapest, Hungary
