Abstract
The information in an individual finite object (like a binary string) is commonly measured by its Kolmogorov complexity. One can divide that information into two parts: the information accounting for the useful regularity present in the object and the information accounting for the remaining accidental information. There can be several ways (model classes) in which the regularity is expressed. Kolmogorov has proposed the model class of finite sets, generalized later to computable probability mass functions. The resulting theory, known as Algorithmic Statistics, analyzes the algorithmic sufficient statistic when the statistic is restricted to the given model class. However, the most general way to proceed is perhaps to express the useful information as a recursive function. The resulting measure has been called the “sophistication” of the object. We develop the theory of the recursive-function statistic: its maximum and minimum value, and the existence of absolutely nonstochastic objects (objects of maximal sophistication, in which all the information is meaningful and there is no residual randomness). We determine its relation to the more restricted model classes of finite sets and computable probability distributions, in particular with respect to the algorithmic (Kolmogorov) minimal sufficient statistic, and its relation to the halting problem and further algorithmic properties.
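The two-part split described above can be made precise in the finite-set model class, and sophistication then generalizes it to total programs. The following is a standard formulation from algorithmic statistics, recalled here for orientation (the constant c and the notation soph_c are the usual ones in the sophistication literature):

```latex
% Two-part description of x via a finite set S containing x:
% K(S) accounts for the regularity (the model), \log_2|S| for the
% index of x within the model (the accidental information).
K(x) \le K(S) + \log_2 |S| + O(1).
% S is an algorithmic sufficient statistic for x when this holds with
% equality up to an additive constant. Sophistication replaces finite
% sets by total (recursive-function) models: it is the minimal
% complexity of a total program p that, with some data d, yields x by
% a near-optimal two-part description,
\operatorname{soph}_c(x) \;=\; \min \bigl\{\, K(p) \;:\; p \text{ total},\
\exists d \,\bigl[\, p(d) = x \;\wedge\; K(p) + |d| \le K(x) + c \,\bigr] \bigr\}.
```

An object of maximal sophistication (absolutely nonstochastic) is one for which no simple total model achieves such a description: essentially all of K(x) must go into the model part.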
First electronic version published in November 2001 on the LANL archives: http://xxx.lanl.gov/abs/cs.CC/0111053.
Partially supported by the EU Fifth Framework project QAIP, IST-1999-11234, the NoE QUIPROCONE IST-1999-29064, the ESF QiT Programme, and the EU Fourth Framework BRA NeuroCOLT II Working Group EP 27150. Also affiliated with the University of Amsterdam.
References
A.R. Barron, J. Rissanen, and B. Yu, The minimum description length principle in coding and modeling, IEEE Trans. Inform. Theory, IT-44:6(1998), 2743–2760.
T.M. Cover, Kolmogorov complexity, data compression, and inference, pp. 23–33 in: The Impact of Processing Techniques on Communications, J.K. Skwirzynski, Ed., Martinus Nijhoff Publishers, 1985.
T.M. Cover and J.A. Thomas, Elements of Information Theory, Wiley, New York, 1991.
R. A. Fisher, On the mathematical foundations of theoretical statistics, Philosophical Transactions of the Royal Society of London, Ser. A, 222(1922), 309–368.
P. Gács, On the symmetry of algorithmic information, Soviet Math. Dokl., 15 (1974) 1477–1480. Correction: ibid., 15 (1974) 1480.
P. Gács, J. Tromp, and P. Vitányi, Algorithmic statistics, IEEE Trans. Inform. Theory, 47:6(2001), 2443–2463.
Q. Gao, M. Li and P.M.B. Vitányi, Applying MDL to learn best model granularity, Artificial Intelligence, 121(2000), 1–29.
M. Gell-Mann, The Quark and the Jaguar, W. H. Freeman and Company, New York, 1994.
A.N. Kolmogorov, Three approaches to the quantitative definition of information, Problems Inform. Transmission 1:1 (1965) 1–7.
A.N. Kolmogorov, On logical foundations of probability theory, Pp. 1–5 in: Probability Theory and Mathematical Statistics, Lect. Notes Math., Vol. 1021, K. Itô and Yu. V. Prokhorov, Eds., Springer-Verlag, Heidelberg, 1983.
A.N. Kolmogorov and V.A. Uspensky, Algorithms and Randomness, SIAM Theory Probab. Appl., 32:3(1988), 389–412.
M. Koppel, Complexity, depth, and sophistication, Complex Systems, 1(1987), 1087–1091.
M. Koppel, Structure, The Universal Turing Machine: A Half-Century Survey, R. Herken (Ed.), Oxford Univ. Press, 1988, pp. 435–452.
M. Li and P.M.B. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, 2nd Edition, Springer-Verlag, New York, 1997.
A.Kh. Shen, The concept of (α, β)-stochasticity in the Kolmogorov sense, and its properties, Soviet Math. Dokl., 28:1(1983), 295–299.
A.Kh. Shen, Discussion on Kolmogorov complexity and statistical analysis, The Computer Journal, 42:4(1999), 340–342.
N.K. Vereshchagin and P.M.B. Vitányi, Kolmogorov’s structure functions and an application to the foundations of model selection, Proc. 47th IEEE Symp. Found. Comput. Sci., 2002. Full version: http://xxx.lanl.gov/abs/cs.CC/0204037
P.M.B. Vitányi and M. Li, Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity, IEEE Trans. Inform. Theory, IT-46:2(2000), 446–464.
V.V. V’yugin, On the defect of randomness of a finite object with respect to measures with given complexity bounds, SIAM Theory Probab. Appl., 32:3(1987), 508–512.
V.V. V’yugin, Algorithmic complexity and stochastic properties of finite binary sequences, The Computer Journal, 42:4(1999), 294–317.
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Vitányi, P. (2002). Meaningful Information. In: Bose, P., Morin, P. (eds) Algorithms and Computation. ISAAC 2002. Lecture Notes in Computer Science, vol 2518. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36136-7_51
DOI: https://doi.org/10.1007/3-540-36136-7_51
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-00142-3
Online ISBN: 978-3-540-36136-7