The Nature of Information

Chapter
Part of the Computational Biology book series (COBO, volume 21)

Abstract

This chapter is the first of seven covering the mathematical background of bioinformatics, which together constitute Part I of the book. It covers the fundamentals of information, starting with the very basic concept of variety and developing the notion of constraint. Key elements of information theory are introduced, such as form and content, the generation of information by observation and experiment, conditional and unconditional information, and its quantification through the work of Shannon, Kolmogorov and others. The relationship between information and entropy is discussed; different kinds of entropy (relative, cross, etc.) are defined and used to make further useful ideas, such as redundancy, precise. Different kinds of information are introduced and explained, and ideas about the value and quality of information are developed. Going beyond syntax and transmission accuracy, the chapter discusses the meaning of information (semantics), the importance of context in contributing to meaning, and the effect information can have in inducing action.
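As a brief illustration of the quantities named above, the sketch below computes the Shannon entropy of a discrete distribution, the cross and relative (Kullback-Leibler) entropies between two distributions, and the redundancy of a source relative to a maximum-entropy alphabet. This is a minimal sketch of the standard definitions, not code from the chapter; the nucleotide frequencies are arbitrary example values.

```python
import math

def shannon_entropy(p, base=2):
    """H(P) = -sum_i p_i log p_i, in bits by default."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def cross_entropy(p, q, base=2):
    """H(P, Q) = -sum_i p_i log q_i: the cost of coding source P
    with a code optimised for Q."""
    return -sum(pi * math.log(qi, base) for pi, qi in zip(p, q) if pi > 0)

def relative_entropy(p, q, base=2):
    """D(P || Q) = H(P, Q) - H(P) >= 0 (Kullback-Leibler divergence)."""
    return cross_entropy(p, q, base) - shannon_entropy(p, base)

def redundancy(p, base=2):
    """R = 1 - H(P)/H_max, with H_max = log |alphabet| for a uniform source."""
    h_max = math.log(len(p), base)
    return 1.0 - shannon_entropy(p, base) / h_max

# Illustrative (made-up) nucleotide frequencies, in the order A, C, G, T.
p = [0.35, 0.15, 0.15, 0.35]   # biased source P
q = [0.25, 0.25, 0.25, 0.25]   # uniform source Q

print(f"H(P)    = {shannon_entropy(p):.4f} bits/symbol")    # ~1.88, below the 2-bit maximum
print(f"H(P,Q)  = {cross_entropy(p, q):.4f} bits/symbol")   # exactly 2 bits for uniform Q
print(f"D(P||Q) = {relative_entropy(p, q):.4f} bits/symbol")
print(f"R(P)    = {redundancy(p):.4f}")                     # fraction of channel capacity unused
```

For a uniform reference distribution Q, the cross entropy equals the maximum entropy log2(4) = 2 bits, so the relative entropy here directly measures how far P departs from uniformity, and the redundancy expresses the same departure as a fraction.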

Keywords

Markov Process · Shannon Index · Preceding Symbol · Logical Depth · Conditional Readiness

References

  1. Ashby WR (1956) An introduction to cybernetics. Chapman & Hall, London
  2. Ashby WR (1962) Principles of the self-organizing system. In: von Foerster H, Zopf GW (eds) Principles of self-organization. Pergamon Press, Oxford, pp 255–278
  3. Bennett CH (1988) Logical depth and physical complexity. In: Herken R (ed) The Universal Turing Machine—A Half-Century Survey. Oxford University Press, Oxford, pp 227–257
  4. Chernavsky DS (1990) Synergetics and information. Matematika Kibernetika 5:3–42 (in Russian)
  5. Carnap R, Bar-Hillel Y (1952) An outline of a theory of semantic information. MIT Research Laboratory of Electronics, Technical Report No 247
  6. Dewey TG (1996) Algorithmic complexity of a protein. Phys Rev E 54:R39–R41
  7. Dewey TG (1997) Algorithmic complexity and thermodynamics of sequence-structure relationships in proteins. Phys Rev E 56:4545–4552
  8. Fisher RA (1951) The design of experiments, 6th edn. Oliver & Boyd, Edinburgh
  9. von Foerster H (1960) On self-organizing systems and their environments. In: Yovits MC, Cameron S (eds) Self-organizing systems. Pergamon Press, Oxford
  10. Good IJ (1969) Statistics of language. In: Meetham AR (ed) Encyclopaedia of linguistics, information and control. Pergamon Press, Oxford, pp 567–581
  11. Karbowski J (2000) Fisher information and temporal correlations for spiking neurons with stochastic dynamics. Phys Rev E 61:4235–4252
  12. Kullback S, Leibler RA (1951) On information and sufficiency. Ann Math Stat 22:79–86
  13. MacKay DM (1950) Quantal aspects of scientific information. Phil Mag (ser 7) 41:289–311
  14. Markov AA (1913) Statistical analysis of the text of “Eugene Onegin” illustrating the connexion with investigations into chains. Izv Imp Akad Nauk, Ser 6(3):153–162 (in Russian)
  15. Shannon CE (1951) Prediction and entropy of printed English. Bell Syst Tech J 30:50–64
  16. Thomas PJ (2010) An absolute scale for measuring the utility of money. J Phys: Conf Ser 238:012039
  17. Tureck R (1995) Cells, functions, relationships in musical structure and performance. Proc R Inst 67:277–318
  18. Welby V (1911) Significs. In: Encyclopaedia Britannica, 11th edn. Cambridge University Press, Cambridge
  19. Wiener N (1948) Cybernetics, or control and communication in the animal and the machine (Actualités Sci Ind no 1053). Hermann & Cie, Paris
  20. Zurek WH (1989) Thermodynamic cost of computation, algorithmic complexity, and the information metric. Nature (Lond) 341:119–124

Copyright information

© Springer-Verlag London 2015

Authors and Affiliations

  1. The University of Buckingham, Buckingham, UK
