On Polynomially Time Bounded Symmetry of Information

  • Troy Lee
  • Andrei Romashchenko
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3153)


The information contained in a string x about a string y is defined as the difference between the Kolmogorov complexity of y and the conditional Kolmogorov complexity of y given x, i.e., I(x:y) = C(y) − C(y|x). The well-known Kolmogorov–Levin theorem implies that I(x:y) is symmetric up to a small additive term O(log C(x,y)). We investigate whether this property can hold for several versions of polynomial time bounded Kolmogorov complexity.
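As a rough illustration of the quantity being studied (not part of the paper), one can replace the uncomputable C by the length of a compressed string and observe that the resulting estimate of I(x:y) comes out nearly symmetric for related strings and near zero for unrelated ones. Here zlib is merely a stand-in for a universal compressor, and `C_cond` is only a crude proxy for conditional complexity:

```python
import zlib

def C(s: bytes) -> int:
    """Crude stand-in for Kolmogorov complexity: compressed length."""
    return len(zlib.compress(s, 9))

def C_cond(y: bytes, x: bytes) -> int:
    """Rough proxy for C(y|x): extra compressed bytes needed for y once x is known."""
    return max(C(x + y) - C(x), 0)

def I(x: bytes, y: bytes) -> int:
    """Information in x about y: I(x:y) = C(y) - C(y|x)."""
    return C(y) - C_cond(y, x)

x = b"abracadabra" * 20
y = b"abracadabra" * 20 + b"!"       # almost identical to x
z = bytes(range(256))                # unrelated, incompressible-looking data

print(I(x, y), I(y, x))  # nearly equal: symmetry up to a small additive term
print(I(x, z), I(z, x))  # both small: x carries little information about z
```

This is only an analogy: real symmetry of information concerns C itself, and the paper's question is precisely whether such symmetry survives polynomial time bounds.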

In particular, we study symmetry of information for some variants of distinguishing complexity CD, where CD(x) is the length of a shortest program which accepts x and only x. We exhibit relativized worlds where symmetry of information does not hold for the deterministic and nondeterministic polynomial time distinguishing complexities CD^poly and CND^poly. For nondeterministic polynomial time distinguishing with randomness, CAM^poly, we prove that symmetry of information holds for most pairs of strings in any set in NP. In proving this last statement we extend a recent result of Buhrman et al. [6], which may be of independent interest.
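To make the notion of a distinguishing program concrete, here is a toy finite model (an illustration only, not the paper's construction): take regular expressions as "programs" and define the toy distinguishing complexity of x as the length of a shortest regex that matches x and no other string in a small fixed universe. The universe, alphabet, and length bound below are arbitrary choices for the example:

```python
import re
from itertools import product

# Toy model: programs are regexes over ALPHABET; a program "accepts x and
# only x" if it fullmatches x and no other string in UNIVERSE.
UNIVERSE = ["0000", "0101", "0110", "1111"]
ALPHABET = "01.*"

def cd_toy(x, max_len=6):
    """Brute-force search for a shortest distinguishing program for x."""
    for n in range(1, max_len + 1):
        for prog in product(ALPHABET, repeat=n):
            r = "".join(prog)
            try:
                pat = re.compile(r)
            except re.error:
                continue  # not a syntactically valid program
            if pat.fullmatch(x) and all(
                not pat.fullmatch(y) for y in UNIVERSE if y != x
            ):
                return r  # shortest distinguishing program found
    return None

for x in UNIVERSE:
    print(x, "->", cd_toy(x))
```

Highly structured strings such as "0000" admit very short distinguishing programs (e.g. the two-character regex "0*"), while less regular strings need longer ones; the resource-bounded versions CD^poly, CND^poly, and CAM^poly additionally restrict how long the distinguishing program may run.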






References

  1. Allender, E., Buhrman, H., Koucký, M., van Melkebeek, D., Ronneburger, D.: Power from random strings. In: Proceedings of the 43rd IEEE Symposium on Foundations of Computer Science, pp. 669–678. IEEE, Los Alamitos (2002)
  2. Babai, L.: Trading group theory for randomness. In: Proceedings of the 17th ACM Symposium on the Theory of Computing, pp. 421–429. ACM, New York (1985)
  3. Buhrman, H., Fortnow, L.: Distinguishing complexity and symmetry of information. Technical Report TR-95-11, Department of Computer Science, The University of Chicago (1995)
  4. Buhrman, H., Fortnow, L., Laplante, S.: Resource-bounded Kolmogorov complexity revisited. SIAM Journal on Computing 31(3), 887–905 (2002)
  5. Buhrman, H., Laplante, S., Miltersen, P.B.: New bounds for the language compression problem. In: Proceedings of the 15th IEEE Conference on Computational Complexity, pp. 126–130. IEEE, Los Alamitos (2000)
  6. Buhrman, H., Lee, T., van Melkebeek, D.: Language compression and pseudorandom generators. To appear in: Proceedings of the 19th IEEE Conference on Computational Complexity (2004)
  7. Fürer, M., Goldreich, O., Mansour, Y., Sipser, M., Zachos, S.: On completeness and soundness in interactive proof systems. In: Micali, S. (ed.) Randomness and Computation. Advances in Computing Research, vol. 5, pp. 429–442. JAI Press, Greenwich (1989)
  8. Fortnow, L., Kummer, M.: On resource-bounded instance complexity. Theoretical Computer Science A 161, 123–140 (1996)
  9. Impagliazzo, R., Rudich, S.: Limits on the provable consequences of one-way functions. In: Proceedings of the 21st ACM Symposium on the Theory of Computing, pp. 41–61. ACM, New York (1989)
  10. Jiang, T., Seiferas, J., Vitányi, P.: Two heads are better than two tapes. Journal of the ACM 44(2), 237–256 (1997)
  11. Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Problems of Information Transmission 1(1), 1–7 (1965)
  12. Lee, T., Romashchenko, A.: On polynomially time bounded symmetry of information. Electronic Colloquium on Computational Complexity, Report TR04-031 (April 2004)
  13. Levin, L.A.: Universal search problems. Problems of Information Transmission 9(3), 265–266 (1973)
  14. Li, M., Vitányi, P.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, New York (1997)
  15. Longpré, L., Mocas, S.: Symmetry of information and one-way functions. Information Processing Letters 46(2), 95–100 (1993)
  16. Longpré, L., Watanabe, O.: On symmetry of information and polynomial time invertibility. Information and Computation 121(1), 14–22 (1995)
  17. Nisan, N., Wigderson, A.: Hardness vs. randomness. Journal of Computer and System Sciences 49, 149–167 (1994)
  18. Ronneburger, D.: Personal communication (2004)
  19. Sipser, M.: A complexity theoretic approach to randomness. In: Proceedings of the 15th ACM Symposium on the Theory of Computing, pp. 330–335. ACM, New York (1983)
  20. Trevisan, L.: Construction of extractors using pseudo-random generators. In: Proceedings of the 31st ACM Symposium on the Theory of Computing, pp. 141–148. ACM, New York (1999)
  21. Vereshchagin, N., Vitányi, P.: Kolmogorov's structure function with an application to the foundations of model selection. In: Proceedings of the 43rd IEEE Symposium on Foundations of Computer Science, pp. 751–760. IEEE, Los Alamitos (2002)
  22. Zvonkin, A., Levin, L.: The complexity of finite objects and the algorithmic concepts of information and randomness. Russian Mathematical Surveys 25, 83–124 (1970)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Troy Lee (1)
  • Andrei Romashchenko (2)
  1. CWI and University of Amsterdam
  2. Institute for Information Transmission Problems
