Kolmogorov Complexity and Combinatorial Methods in Communication Complexity
We introduce a method based on Kolmogorov complexity to prove lower bounds on communication complexity. The intuition behind our technique is close to that of information-theoretic methods [1,2]. Our goal is to gain a better understanding of how information-theoretic techniques differ from the family of techniques that follow from Linial and Shraibman's work on factorization norms. The bounds in this family extend to quantum communication, which prevents them from being used to prove a gap with the randomized setting.
We use Kolmogorov complexity for three different purposes: first, to give a general lower bound in terms of Kolmogorov mutual information; second, to prove an alternative to Yao's minimax principle based on Kolmogorov complexity; and finally, to identify worst-case inputs.
We show that our method implies the rectangle and corruption bounds, which are known to be closely related to the subdistribution bound. We apply our method to the hidden matching problem, a relation introduced to prove an exponential gap between quantum and classical communication. We then show that our method generalizes the VC dimension and shatter coefficient lower bounds. Finally, we compare one-way communication and simultaneous communication in the case of distributional communication complexity and improve on the previously known result.
Keywords: Communication Complexity · Kolmogorov Complexity · Auxiliary Input · Universal Turing Machine · Deterministic Protocol
- 2. Jain, R., Klauck, H., Nayak, A.: Direct product theorems for classical communication complexity via subdistribution bounds: extended abstract. In: Proc. of the 40th Annual ACM Symposium on Theory of Computing (STOC), pp. 599–608 (2008)
- 3. Linial, N., Shraibman, A.: Lower bounds in communication complexity based on factorization norms. Random Structures and Algorithms (to appear)
- 7. Bar-Yossef, Z., Jayram, T.S., Kumar, R., Sivakumar, D.: Information theory methods in communication complexity. In: Proc. of the 17th Annual IEEE Conference on Computational Complexity (CCC), pp. 93–102 (2002)
- 8. Yao, A.C.C.: Some complexity questions related to distributive computing (preliminary report). In: Proc. of the 11th Annual ACM Symposium on Theory of Computing (STOC), pp. 209–213. ACM, New York (1979)
- 10. Jayram, T.S., Kumar, R., Sivakumar, D.: Two applications of information complexity. In: Proc. of the 35th Annual ACM Symposium on Theory of Computing (STOC), pp. 673–682. ACM, New York (2003)
- 14. Buhrman, H., Koucký, M., Vereshchagin, N.: Randomised individual communication complexity. In: Proc. of the 23rd Annual IEEE Conference on Computational Complexity (CCC), pp. 321–331 (2008)
- 17. Lee, T., Shraibman, A.: Disjointness is hard in the multi-party number-on-the-forehead model. In: Proc. of the 23rd Annual IEEE Conference on Computational Complexity (CCC), pp. 81–91 (2008)
- 18. Lee, T., Shraibman, A., Špalek, R.: A direct product theorem for discrepancy. In: Proc. of the 23rd Annual IEEE Conference on Computational Complexity (CCC), pp. 71–80 (2008)
- 21. Degorre, J., Kaplan, M., Laplante, S., Roland, J.: The communication complexity of non-signaling distributions. Technical Report quant-ph/0804.4859, arXiv e-Print archive (2008)
- 23. Yao, A.C.C.: Lower bounds by probabilistic arguments (extended abstract). In: Proc. of the 24th Annual IEEE Symposium on Foundations of Computer Science (FOCS), pp. 420–428. IEEE, Los Alamitos (1983)