Suppose we want to describe a given object by a finite binary string. We do not care whether the object has many descriptions; however, each description should describe but one object. From among all descriptions of an object we can take the length of the shortest description as a measure of the object's complexity. It is natural to call an object "simple" if it has at least one short description, and to call it "complex" if all of its descriptions are long.
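The length of the shortest description is uncomputable in general, but any lossless compressor yields a computable upper bound on it: the compressed form of a string is itself a description from which the string can be recovered exactly. A minimal Python sketch, using `zlib` purely as an illustrative stand-in for a shortest description (the function name is ours, not standard terminology):

```python
import os
import zlib

def description_length_bound(s: bytes) -> int:
    # The zlib-compressed form of s is one valid description of s
    # (s is recovered exactly by zlib.decompress), so its length is
    # an upper bound on the length of the shortest description.
    return len(zlib.compress(s, 9))

regular = b"ab" * 5000        # highly patterned: a short program generates it
random_ = os.urandom(10_000)  # typically admits no short description

print(description_length_bound(regular))  # far below the 10,000-byte input
print(description_length_bound(random_))  # close to (or above) 10,000 bytes
```

The contrast illustrates the informal definition above: the regular string is "simple" because one short description suffices, while the random bytes are almost certainly "complex" because every description, including the compressed one, stays about as long as the string itself.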




History and References

  1. A.K. Zvonkin and L.A. Levin [Russ. Math. Surveys, 25:6(1970), 83–124].
  2. [D.E. Knuth, SIGACT News, 8:2(1976), 18–24].
  3. P.M.B. Vitányi and L. Meertens, SIGACT News, 16:4(1985), 56–59.
  4. D.E. Knuth, Fundamental Algorithms, Addison-Wesley, 1973.
  5. W. Feller, An Introduction to Probability Theory and Its Applications, Wiley, 1968.
  6. A.M. Turing, in an important paper [Proc. London Math. Soc., 42(1936), 230–265; Correction, Ibid.].
  7. [E.L. Post, J. Symb. Logic, 1936].
  8. H. Rogers, Jr., Theory of Recursive Functions and Effective Computability, McGraw-Hill, 1967.
  9. P. Odifreddi, Classical Recursion Theory, North-Holland, 1989.
  10. A.K. Zvonkin and L.A. Levin, Ibid.
  11. M.B. Pour-El and J.I. Richards, Computability in Analysis and Physics, Springer-Verlag, 1989.
  12. J. Hartmanis, Feasible Computations and Provable Complexity Properties, SIAM, 1978.
  13. J.L. Balcázar, J. Díaz, and J. Gabarró, Structural Complexity, Springer-Verlag, 1988.
  14. M.R. Garey and D.S. Johnson, Computers and Intractability, Freeman, 1979.
  15. A.N. Kolmogorov's classic treatment of the set-theoretic axioms of the calculus of probabilities is his slim book Grundbegriffe der Wahrscheinlichkeitsrechnung, Springer-Verlag, 1933.
  16. English translation: Foundations of the Theory of Probability, Chelsea, New York, 1956.
  17. [W. Feller, An Introduction to Probability Theory and Its Applications, Wiley, 1968].
  18. [D.E. Knuth, Seminumerical Algorithms, Addison-Wesley, 1981].
  19. D.E. Knuth, Ibid., 144–145.
  20. [S.S. Skiena, Complex Systems, 1(1987), 361–366].
  21. J. von Neumann (1903–1957) [Various techniques used in connection with random digits, Collected Works, Vol. V, Macmillan, 1963].
  22. R. von Mises's foundation of probability theory based on frequencies is set forth in [Mathemat. Zeitsch., 5(1919), 52–99].
  23. [Correction, Ibid., 6(1920); Probability, Statistics and Truth, Macmillan, 1939].
  24. T.L. Fine, Theories of Probability, Academic Press, 1973.
  25. [J.E. Littlewood, Littlewood's Miscellany, Cambridge Univ. Press, 1986, 71–73].
  26. [Sankhyā, Series A, 25(1963), 369–376].
  27. [Ergebnisse eines Mathematischen Kolloquiums, Vol. 8, 1937, 38–72].
  28. A. Church [Bull. Amer. Math. Soc., 46(1940), 130–135].
  29. J. Ville [Étude Critique de la Notion de Collectif, Gauthier-Villars, 1939].
  30. D.G. Champernowne [J. London Math. Soc., 8(1933), 254–260].
  31. D.E. Knuth [Ibid., 142–169; summary, history, and references: 164–166].
  32. M. van Lambalgen [J. Symb. Logic, 52(1987), 725–755; Random Sequences, Ph.D. Thesis, Universiteit van Amsterdam, 1987].
  33. R. von Mises [Probability, Statistics and Truth, Macmillan, 1939].
  34. R.J. Solomonoff. Lemma 1.10.1, the Chernoff bound, is due to H. Chernoff [Ann. Math. Stat., 23(1952), 493–509].
  35. L.G. Valiant and D. Angluin [J. Comput. System Sci., 18:2(1979), 155–193].
  36. [P. Erdős and J. Spencer, Probabilistic Methods in Combinatorics, Academic Press, 1974, p. 18].
  37. C.E. Shannon's classic paper [Bell System Tech. J., 27(1948), 379–423, 623–656].
  38. R.G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
  39. [T.M. Cover and J.A. Thomas, Elements of Information Theory, Wiley, New York, 1991].
  40. C.E. Shannon [Bell System Tech. J., 27(1948), 379–423, 623–656].
  41. R.M. Fano [Problems Inform. Transmission, 1(1965), 1–7; Russian Math. Surveys, 38:4(1983), 29–40].
  42. A.N. Kolmogorov [Sankhyā, Series A, 25(1963), 369–376].
  43. [R.G. Gallager, Information Theory and Reliable Communication, Wiley, 1968].
  44. L.G. Kraft [A Device for Quantizing, Grouping, and Coding Amplitude Modulated Pulses, M.Sc. Thesis, Dept. Electr. Eng., MIT, Cambridge, Mass., 1949].
  45. C.E. Shannon's Noiseless Coding Theorem 1.11.2, establishing the minimal average code-word length, is from [Bell System Tech. J., 27(1948), 379–423, 623–656].
  46. A.N. Kolmogorov [Problems Inform. Transmission, 1:1(1965), 1–7].
  47. [P. Elias, IEEE Trans. Inform. Theory, IT-21(1975), 194–203].
  48. S.K. Leung-Yan-Cheong and T.M. Cover [IEEE Trans. Inform. Theory, IT-24(1978), 331–339].
  49. J. Rissanen, Ann. Stat., 11(1982), 416–431.
  50. J. Rissanen, Stochastic Complexity in Statistical Inquiry, World Scientific, 1989.
  51. C.E. Shannon [Automata Studies, C.E. Shannon and J. McCarthy (Eds.), Princeton Univ. Press, 1956, 129–153].
  52. R.J. Solomonoff, of Cambridge, Massachusetts, USA; A.N. Kolmogorov, of Moscow, Russia; and G.J. Chaitin, of New York City, USA.
  53. Already in November 1960, R.J. Solomonoff published a Zator Company technical report.
  54. [A Preliminary Report on a General Theory of Inductive Inference, Tech. Rept. ZTB-138, Zator Company, Cambridge, Mass., November 1960].
  55. [pp. 1–18 in: Proc. 2nd European Conf. Comput. Learning Theory, Lect. Notes Artific. Intell., Vol. 904, Springer-Verlag, 1995].
  56. [Logical Foundations of Probability, Univ. Chicago Press, 1950].
  57. [Inform. Contr., 7(1964), 1–22, 224–254].
  58. [T.L. Fine, Ibid.].
  59. M. Minsky referred to Solomonoff's work [Proc. I.R.E., January 1961, 8–30; p. 43 in Proc. Symp. Appl. Math. XIV, Amer. Math. Soc., 1962].
  60. A.N. Kolmogorov, born 25 April 1903 in Tambov, Russia, died 20 October 1987 in Moscow. Many biographical details can be found in the Soviet Union's foremost mathematics journal, Uspekhi Mat. Nauk, translated into English as Russian Math. Surveys.
  61. [B.V. Gnedenko, 28:5(1973), 5–16].
  62. [P.S. Aleksandrov, 38:4(1983), 5–7].
  63. [N.N. Bogolyubov, B.V. Gnedenko, and S.L. Sobolev, 38:4(1983), 9–27].
  64. [A.N. Kolmogorov, 41:6(1986), 225–246].
  65. [The entire memorial issue 43:6(1988), especially 1–39 by V.M. Tikhomirov].
  66. Nauka, Moscow [The Annals of Probability, 17:3(1989)].
  67. T.M. Cover, P. Gács, and R.M. Gray. See also the obituary in [Bull. London Math. Soc., 22:1(1990), 31–100].
  68. [V.A. Uspensky, J. Symb. Logic, 57:2(1992), 385–412].
  69. [Grundbegriffe der Wahrscheinlichkeitsrechnung, Springer-Verlag, 1933].
  70. [Problems Inform. Transmission, 1(1965), 1–7].
  71. [A.N. Shiryaev, Ibid., 921].
  72. Says Kolmogorov: [IEEE Trans. Inform. Theory, IT-14:5(1968), 662–664].
  73. [A.N. Shiryaev, Ibid., 921].
  74. A.N. Kolmogorov and V.A. Uspensky, Uspekhi Mat. Nauk, 13:4(1958), 3–28.
  75. [In Russian; translated in Amer. Math. Soc. Transl. (2), 29(1963), 217–245].
  76. [Inform. Contr., 9(1966), 602–619].
  77. [Z. Wahrsch. Verw. Geb., 19(1971), 225–230].
  78. G.J. Chaitin [J. ACM, 13(1966), 547–569; J. ACM, 16(1969), 145–159].
  79. As Chaitin [Scientific American, 232:5(1975), 47–52].
  80. [G.J. Chaitin, Information-Theoretic Incompleteness, World Scientific, Singapore, 1992].

Copyright information

© Springer Science+Business Media New York 1997

Authors and Affiliations

  • Ming Li
  • Paul Vitányi
  1. Department of Computer Science, University of Waterloo, Waterloo, Canada
  2. Centrum voor Wiskunde en Informatica, Amsterdam, The Netherlands
