
Laws of Information Which Govern Systems

  • Roger C. Conant
Chapter
Part of the International Federation for Systems Research International Series on Systems Science and Engineering book series (IFSR, volume 7)

Abstract

Information theory was created for the purpose of studying the communication of messages from one point to another, and since its appearance [14], its focus has remained on the question, “How can the constraint between the two variables X (message sent) and Y (message received) be measured and maximized?” Although the theory was generalized to N dimensions [10], [2], and its relation to the analysis of variance noted [9], not much use seems to have been made of the result, perhaps in part because the descriptors “N-dimensional Information Theory” or “Uncertainty Analysis” did not adequately represent what can actually be seen as the analysis of constraints in multivariable systems. In any statistically-analyzable system of several variables interacting in a lively way, some variables (or sets of them) exert effects on others. These effects are reflected statistically as non-independence of the variables involved, and it is this deviation from independence which we indicate by the term “constraint.” We prefer this term to the term “dependence” because the latter suggests dependence of X on Y while the former is neutral as to direction. To the extent that the variables are not independent, they are “in communication” with one another, and information theory can be used to analyze the non-independence. In addition, the fluctuation of values taken by any variable can be viewed as a message it sends, a flow of information about itself to all other parts of the system which are “listening.” The view of systems as networks of information transfer leads to quantitative conclusions about system behavior and structure which are somewhat novel and of wide applicability.
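To make the notion of “constraint” concrete, the short sketch below (not part of the chapter itself) computes the transmission T(X:Y) = H(X) + H(Y) − H(X,Y) for a small joint distribution; T(X:Y) is zero exactly when X and Y are statistically independent, so a positive value quantifies their constraint. The Python/NumPy implementation, the function names, and the example probability table are illustrative assumptions only.

```python
# Minimal sketch: measuring the constraint (mutual information) between two
# variables X and Y as T(X:Y) = H(X) + H(Y) - H(X,Y), in bits.
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability array; zero entries are ignored."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def transmission(joint):
    """Constraint between the row variable X and column variable Y of a joint table."""
    px = joint.sum(axis=1)   # marginal distribution of X
    py = joint.sum(axis=0)   # marginal distribution of Y
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Hypothetical joint distribution p(X, Y), used only for illustration.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(transmission(joint))   # ~0.278 bits of constraint; 0.0 would mean independence
```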

Keywords

Transmission Rate · Channel Capacity · Output Rate · Deterministic System · Conditional Entropy


References

  1. W. R. Ashby, Introduction to Cybernetics. London: Chapman and Hall, 1956.
  2. W. R. Ashby, “Measuring the internal information exchange in a system,” Cybernetica, vol. 8, pp. 5–22, 1965.
  3. W. R. Ashby, “Two tables of identities governing information flows within large systems,” Commun. Amer. Soc. Cybern., vol. 1, no. 2, pp. 2–7, 1969.
  4. W. R. Ashby, “Information flows within co-ordinated systems,” in Progress of Cybernetics: Proc. 1st Int. Cong. Cybernetics, J. Rose, Ed. London: Gordon and Breach, 1970, pp. 57–64.
  5. R. C. Conant, “The information transfer required in regulatory processes,” IEEE Trans. Syst. Sci. Cybern., vol. SSC-5, no. 4, pp. 334–338, 1969.
  6. R. C. Conant, “Detecting subsystems of a complex system,” IEEE Trans. Syst., Man, Cybern., vol. SMC-2, no. 4, pp. 550–553, 1972.
  7. R. C. Conant, “Information flows in hierarchical systems,” Int. J. Gen. Syst., vol. 1, no. 1, pp. 9–18, 1974.
  8. J. N. Cronholm, “A general method of obtaining exact sampling probabilities of the Shannon-Wiener measure of information R*,” Psychometrika, vol. 28, no. 4, pp. 405–413, 1963.
  9. W. R. Garner and W. J. McGill, “The relation between information and variance analyses,” Psychometrika, vol. 21, no. 3, pp. 219–228, 1956.
  10. W. J. McGill, “Multivariate information transmission,” Psychometrika, vol. 19, no. 2, pp. 97–116, 1954.
  11. G. A. Miller, “On the bias of information estimates,” in Information Theory in Psychology, H. Quastler, Ed. Glencoe, IL: The Free Press, 1955, pp. 95–100.
  12. M. S. Rogers and B. F. Green, “Moments of sample information when the alternatives are equally likely,” in Information Theory in Psychology, H. Quastler, Ed. Glencoe, IL: The Free Press, 1955, pp. 101–107.
  13. H. Simon, “The architecture of complexity,” Proc. Amer. Phil. Soc., vol. 106, pp. 467–482, 1962. Reprinted in Gen. Syst., vol. 10, pp. 63–76, 1965.
  14. C. E. Shannon and W. Weaver, The Mathematical Theory of Communication. Urbana, IL: Univ. Illinois Press, 1949.
  15. C. E. Shannon, “Prediction and entropy of printed English,” Bell Syst. Tech. J., vol. 30, pp. 50–64, 1951.

Copyright information

© Springer Science+Business Media New York 1991

Authors and Affiliations

  • Roger C. Conant

