Meaningful Information

Extended Abstract

  • Conference paper
Algorithms and Computation (ISAAC 2002)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 2518))

Abstract

The information in an individual finite object (such as a binary string) is commonly measured by its Kolmogorov complexity. This information can be divided into two parts: the information accounting for the useful regularity present in the object, and the information accounting for the remaining accidental information. The regularity can be expressed in several ways (model classes). Kolmogorov proposed the model class of finite sets, later generalized to computable probability mass functions. The resulting theory, known as Algorithmic Statistics, analyzes the algorithmic sufficient statistic when the statistic is restricted to the given model class. Perhaps the most general way to proceed, however, is to express the useful information as a recursive function. The resulting measure has been called the “sophistication” of the object. We develop the theory of the recursive-function statistic: its maximum and minimum values; the existence of absolutely nonstochastic objects (objects of maximal sophistication, in which all the information is meaningful and there is no residual randomness); its relation to the more restricted model classes of finite sets and computable probability distributions, in particular with respect to the algorithmic (Kolmogorov) minimal sufficient statistic; its relation to the halting problem; and further algorithmic properties.
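To fix intuition for the two-part split the abstract describes, the following is a sketch of the standard formulations from algorithmic statistics (not taken from this page; the constant c and the symbols K, S, p, d follow common usage in the literature), writing K(·) for Kolmogorov complexity:

```latex
% Two-part description via the finite-set model class: for a finite
% set S containing x, the model part K(S) accounts for the regularity
% and the index part log|S| for the accidental information, so
\[
  K(x) \le K(S) + \log |S| + O(1), \qquad x \in S .
\]
% S is an algorithmic sufficient statistic for x when this two-part
% description is optimal:
\[
  K(S) + \log |S| = K(x) + O(1).
\]
% In the most general model class, the regularity is a total recursive
% function p producing x from some data string d (p(d) = x).  The
% sophistication of x, at significance level c, is then the complexity
% of the simplest such model within c of optimality:
\[
  \mathrm{soph}_c(x) \;=\; \min \bigl\{\, K(p) \;:\; p \text{ total
  recursive},\ \exists d\; p(d) = x,\ K(p) + |d| \le K(x) + c \,\bigr\}.
\]
```

An absolutely nonstochastic object, in this notation, is one for which soph_c(x) is close to K(x): no simple model absorbs the regularity, so essentially all the information in x is model information.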

First electronic version published in November 2001 on the LANL archive, http://xxx.lanl.gov/abs/cs.CC/0111053.

Partially supported by the EU Fifth Framework project QAIP, IST-1999-11234, the NoE QUIPROCONE IST-1999-29064, the ESF QiT Programme, and the EU Fourth Framework BRA NeuroCOLT II Working Group EP 27150. Also affiliated with the University of Amsterdam.




Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Vitányi, P. (2002). Meaningful Information. In: Bose, P., Morin, P. (eds) Algorithms and Computation. ISAAC 2002. Lecture Notes in Computer Science, vol 2518. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36136-7_51

  • DOI: https://doi.org/10.1007/3-540-36136-7_51

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-00142-3

  • Online ISBN: 978-3-540-36136-7

  • eBook Packages: Springer Book Archive
