An Improved Communication-Randomness Tradeoff

  • Martin Fürer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2976)


Two processors receive inputs X and Y, respectively. The communication complexity of a function f is the number of bits (as a function of the input size) that the processors have to exchange to compute f(X,Y) on worst-case inputs X and Y. The List-Non-Disjointness problem (given \(X=(x^{1},\ldots,x^{n})\) and \(Y=(y^{1},\ldots,y^{n})\) with \(x^{j},y^{j}\in {\rm Z}^{n}_{2}\), decide whether \(\exists_{j}\,x^{j}=y^{j}\)) exhibits a maximal discrepancy between deterministic (Θ(n²)) and Las Vegas (Θ(n)) communication complexity. Fleischer, Jung, and Mehlhorn (1995) have shown that if a Las Vegas algorithm is allowed to communicate Ω(n log n) bits in expectation, then this can be done with a small number of coin tosses.
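The predicate itself is easy to state outside the communication model. A minimal sketch in Python (the representation of bit-vectors as tuples and the function name are illustrative, not from the paper):

```python
from typing import Sequence, Tuple

BitVector = Tuple[int, ...]  # an element of Z_2^n, e.g. (0, 1, 1, 0)

def list_non_disjoint(X: Sequence[BitVector], Y: Sequence[BitVector]) -> bool:
    """Decide whether there is an index j with x^j = y^j.

    This evaluates the List-Non-Disjointness predicate locally; the hard
    part in the paper is computing it when X and Y sit on two different
    processors and every compared bit must cross the channel.
    """
    return any(x == y for x, y in zip(X, Y))
```

With both inputs local this takes linear time; the communication-complexity question is how few exchanged bits suffice when the processors hold X and Y separately.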

This result is extended, with improved randomness efficiency, to the (much more interesting) case of efficient algorithms, i.e., those with linear communication complexity. For any R ∈ ℕ, R coin tosses suffice for O(n + n²/2^R) transmitted bits.
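The flavor of such a tradeoff can be conveyed by a toy fingerprint-and-verify protocol. This is not the paper's algorithm, and all names are illustrative: the shared seed models the R public coin tosses, Python's built-in `hash` stands in for a proper universal hash family (cf. Carter-Wegman), and this toy version spends Θ(nR) bits on fingerprints where the paper achieves O(n + n²/2^R) in total. It only illustrates why more random bits mean fewer expensive verifications:

```python
import random

def fingerprint(vec, seed, R):
    # Illustrative R-bit fingerprint: built-in hash salted with the shared
    # seed and truncated to R bits.  A real protocol would use a universal
    # hash family; this stand-in only demonstrates the mechanism.
    return hash((seed, vec)) & ((1 << R) - 1)

def fingerprint_protocol(X, Y, R, rng=None):
    """Toy two-party sketch (NOT the paper's protocol).

    For each index j, one processor sends an R-bit fingerprint of x^j;
    only on a fingerprint collision is the full vector sent and compared.
    Equal vectors always collide, so the answer is never wrong (Las Vegas
    style); unequal vectors collide with probability about 2^-R, so the
    expected verification cost shrinks as R grows.
    Returns (decision, number_of_transmitted_bits).
    """
    rng = rng or random.Random()
    seed = rng.getrandbits(max(R, 1))  # models the R shared coin tosses
    bits = 0
    for x, y in zip(X, Y):
        bits += R  # the R-bit fingerprint of x^j crosses the channel
        if fingerprint(x, seed, R) == fingerprint(y, seed, R):
            bits += len(x)  # candidate match: send x^j in full to verify
            if x == y:
                return True, bits
    return False, bits
```

Because verification always checks the actual vectors, collisions can only cost extra bits, never a wrong answer, mirroring the Las Vegas guarantee discussed above.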




References

  1. Aho, A.V., Ullman, J.D., Yannakakis, M.: On notions of information transfer in VLSI circuits. In: Proceedings of the Fifteenth Annual ACM Symposium on Theory of Computing, pp. 133–139 (1983)
  2. Carter, J.L., Wegman, M.N.: Universal classes of hash functions. Journal of Computer and System Sciences 18(2), 143–154 (1979)
  3. Chernoff, H.: A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations. Annals of Mathematical Statistics 23, 493–509 (1952)
  4. Fleischer, R.: Communication complexity of multi-processor systems. Information Processing Letters 30(2), 57–65 (1989)
  5. Fleischer, R., Jung, H., Mehlhorn, K.: A communication-randomness tradeoff for two-processor systems. Information and Computation 116(2), 155–161 (1995)
  6. Fürer, M.: The power of randomness for communication complexity. In: Proceedings of the Nineteenth Annual ACM Symposium on Theory of Computing, pp. 178–181. ACM Press, New York (1987)
  7. Fürer, M.: Universal hashing in VLSI. In: Reif, J.H. (ed.) AWOC 1988. LNCS, vol. 319, pp. 312–318. Springer, Heidelberg (1988)
  8. Mehlhorn, K., Schmidt, E.M.: Las Vegas is better than determinism in VLSI and distributed computing (extended abstract). In: Proceedings of the Fourteenth Annual ACM Symposium on Theory of Computing, pp. 330–337 (1982)
  9. Motwani, R., Raghavan, P.: Randomized Algorithms. Cambridge University Press, Cambridge (1995)
  10. Papadimitriou, C.H., Sipser, M.: Communication complexity. In: Proceedings of the Fourteenth Annual ACM Symposium on Theory of Computing, pp. 196–200 (1982)
  11. Yao, A.C.-C.: Some complexity questions related to distributive computing (preliminary report). In: Proceedings of the Eleventh Annual ACM Symposium on Theory of Computing, pp. 209–213 (1979)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Martin Fürer
  1. Department of Computer Science and Engineering, Pennsylvania State University, University Park, USA
