An Algorithmic Approach to Information Theory

  • Roland Heim
Conference paper
Part of the Lecture Notes in Biomathematics book series (LNBM, volume 4)

Abstract

Classical probability theory is based on the well-known axioms of Kolmogoroff. A characteristic difficulty of this measure-theoretic approach is the physical interpretation of probability: we can observe only the frequency of events and the order in which they occur, but not the probability in the axiomatic sense. Attempts to formulate a frequency theory of probability are quite old, but not until the fundamental work of C.P. Schnorr (1971) was there a complete canonical theory of probability and randomness based on the concept of an effective (that is, computable) procedure for detecting possible regularities in a sequence of events.
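As a toy illustration of the abstract's central idea (not part of the paper itself), an effective regularity test can be sketched by comparing description lengths: a sequence that a computable procedure can compress well admits a short description and is therefore not random in the algorithmic sense. The sketch below uses an off-the-shelf compressor as a crude, computable stand-in for program-size complexity; all names and the setup are illustrative assumptions.

```python
import random
import zlib

def compressed_length(bits: str) -> int:
    """Byte length of the zlib-compressed sequence: a crude, computable
    stand-in for description length (program-size complexity)."""
    return len(zlib.compress(bits.encode("ascii"), 9))

# A highly regular binary sequence and a pseudo-random one of equal length.
regular = "01" * 500
random.seed(0)
irregular = "".join(random.choice("01") for _ in range(1000))

# The regular sequence admits a far shorter description than the
# pseudo-random one, so this effective test flags it as non-random.
print(compressed_length(regular) < compressed_length(irregular))
```

This captures only the direction of the idea: true program-size complexity is uncomputable, and any practical compressor merely provides an upper bound on it.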

Keywords

Turing Machine · Initial Segment · Binary Sequence · Growth Function · Code Word


References

  1. Ash, R.: Information theory. J. Wiley & Sons, New York, London, Sydney (1965)
  2. Blum, M.: A machine-independent theory of the complexity of recursive functions, J. Assoc. Comp. Machin. 14, 322–336 (1967)
  3. Bremermann, H.: Complexity of automata, brains and behaviour, this volume
  4. Daley, R.P.: Minimal-program complexity of sequences with restricted resources, Information and Control 23, 301–312 (1973a)
  5. Daley, R.P.: An example of an information and computation resource trade-off, J. Assoc. Comp. Machin. 20, 667–695 (1973b)
  6. Gallager, R.G.: Information theory and reliable communication. John Wiley and Sons Inc., New York, London, Sydney, Toronto (1968)
  7. Hartmanis, J. and Hopcroft, J.E.: An overview of the theory of computational complexity, J. Assoc. Comp. Machin. 18, 444–475 (1971)
  8. Heim, R.: The algorithmic foundation of information theory, to appear
  9. Kolmogoroff, A.: Three approaches for defining the concept of information quantity, Information Transmission 1, 3–11 (1964)
  10. Martin-Löf, P.: The definition of random sequences, Inform. Control 6, 602–619 (1966)
  11. Martin-Löf, P.: Complexity oscillations in binary sequences, Z. Wahrscheinlichkeitstheorie verw. Geb. 19, 225–230 (1971)
  12. Schnorr, C.P.: Zufälligkeit und Wahrscheinlichkeit, Lecture Notes in Mathematics 218, Springer-Verlag, Berlin, Heidelberg, New York (1971)
  13. Schnorr, C.P.: Process complexity and effective random tests, J. of Comp. and Syst. Sc. 4, 376–388 (1973)
  14. Schnorr, C.P. and Stimm, H.: Endliche Automaten und Zufallsfolgen, Acta Informatica 1, 345–359 (1972)

Copyright information

© Springer-Verlag Berlin · Heidelberg 1974

Authors and Affiliations

  • Roland Heim
  1. Institute for Information Sciences, University of Tübingen, Federal Republic of Germany
