Communication Complexity

Lower bounds on information complexity via zero-communication protocols and applications

53rd Annual IEEE Symposium on Foundations of Computer Science (FOCS'12), 2012

We show that almost all known lower bound methods for communication complexity are also lower bounds for information complexity. In particular, we define a relaxed version of the partition bound of Jain and Klauck and prove that it lower-bounds the information complexity of any function. Our relaxed partition bound subsumes all norm-based methods (e.g., the γ_2 method) and rectangle-based methods (e.g., the rectangle/corruption bound, the smooth rectangle bound, and the discrepancy bound), except the partition bound.
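To make the rectangle-based quantities concrete, here is a small Python sketch (the function names are my own, not from the paper) that brute-forces the discrepancy of 2-bit equality under the uniform distribution; the discrepancy method lower-bounds (even randomized) communication by roughly log2(1/disc), and equality illustrates why the bound can be weak.

```python
from itertools import chain, combinations

def subsets(items):
    """All non-empty subsets of items."""
    return chain.from_iterable(combinations(items, r) for r in range(1, len(items) + 1))

def discrepancy(matrix):
    """Discrepancy under the uniform distribution: the maximum, over all
    combinatorial rectangles R = rows x cols, of
    |#{(x,y) in R : f=1} - #{(x,y) in R : f=0}| / (total entries)."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    total = n_rows * n_cols
    best = 0.0
    for rows in subsets(range(n_rows)):
        for cols in subsets(range(n_cols)):
            ones = sum(matrix[r][c] for r in rows for c in cols)
            zeros = len(rows) * len(cols) - ones
            best = max(best, abs(ones - zeros) / total)
    return best

# 2-bit equality EQ(x, y) = 1 iff x == y, as a 4x4 identity matrix.
eq2 = [[int(x == y) for y in range(4)] for x in range(4)]
print(discrepancy(eq2))  # 0.5: equality has large discrepancy, so this bound is weak for it
```

The brute force over all 15 × 15 rectangle choices is only viable for toy matrices, but it shows exactly what quantity the discrepancy bound maximizes.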

Direct Sum Questions in Classical Communication Complexity

2012

In 1988, Karchmer and Wigderson generalized Yao's two-party communication model from functions to relations and showed a remarkable connection of this model to the Boolean circuit model. A few years later, continuing this line of work, Karchmer, Raz, and Wigderson proposed a program to separate NC from P through direct-sum-type inequalities in communication complexity. This spurred the study of a fundamental question in communication complexity: given problems A and B, is it easier to solve A and B together than separately? It seems that we are still far from separating NC from P; however, during the last 20 years of research, our knowledge of the behavior of different communication complexity measures with respect to the direct sum has seen a lot of progress. We survey some of these results and make a new observation about the recent approach to the direct-sum question in the randomized setting.
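The direct-sum question can be played with directly on tiny instances. The sketch below (my own illustration, not taken from the survey) computes exact deterministic communication complexity by recursing over the ways either player can split the current rectangle with one bit, then compares one copy of 1-bit equality against the two-copy "pair" function that must output both answers. The trivial direct-sum upper bound, twice the cost of one copy, is visible; the direct-sum question asks when it can be beaten.

```python
from functools import lru_cache

def det_cc(matrix):
    """Exact deterministic communication complexity: at each step one player
    sends one bit, splitting the current rectangle by rows (Alice) or by
    columns (Bob); cost 0 once the rectangle is monochromatic."""
    @lru_cache(maxsize=None)
    def cost(rows, cols):
        if len({matrix[r][c] for r in rows for c in cols}) == 1:
            return 0  # monochromatic rectangle: answer is determined
        best = float('inf')
        for by_rows in (True, False):
            items = rows if by_rows else cols
            for mask in range(1, (1 << len(items)) - 1):  # proper, non-empty splits
                left = tuple(x for i, x in enumerate(items) if mask >> i & 1)
                right = tuple(x for i, x in enumerate(items) if not mask >> i & 1)
                halves = ((left, cols), (right, cols)) if by_rows else ((rows, left), (rows, right))
                best = min(best, 1 + max(cost(*h) for h in halves))
        return best
    return cost(tuple(range(len(matrix))), tuple(range(len(matrix[0]))))

# One copy of 1-bit equality ...
eq1 = [[1, 0], [0, 1]]
# ... and two independent copies, where the protocol must output BOTH answers.
pair = [[(eq1[x1][y1], eq1[x2][y2]) for y1 in range(2) for y2 in range(2)]
        for x1 in range(2) for x2 in range(2)]

d1, d2 = det_cc(eq1), det_cc(pair)
print(d1, d2)  # solving two copies costs at most twice the cost of one copy
```

This brute force is exponential in the matrix size, so it only serves to make the definitions tangible on toy functions.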

Lower Bounds on the Multiparty Communication Complexity

Journal of Computer and System Sciences, 1998

We derive a general technique for obtaining lower bounds on the multiparty communication complexity of boolean functions. We extend the two-party method based on a crossing-sequence argument introduced by Yao to the multiparty communication model. We use our technique to derive optimal lower and upper bounds for some simple boolean functions. Lower bounds for the multiparty model have been a challenge since (D. Dolev and T. Feder, in "Proceedings, 30th IEEE FOCS, 1989," pp. 428-433), where only an upper bound on the number of bits exchanged by a deterministic algorithm computing a boolean function f(x_1, ..., x_n) was derived, namely of the order (k_0 C_0)(k_1 C_1)^2, up to logarithmic factors, where k_1 and C_1 are the number of processors accessed and the bits exchanged in a nondeterministic algorithm for f, and k_0 and C_0 are the analogous parameters for the complementary function 1 - f. We show that C_0 ≤ n(1 + 2^{C_1}) and D ≤ n(1 + 2^{C_1}), where D is the number of bits exchanged by a deterministic algorithm computing f. We also investigate the power of a restricted multiparty communication model in which the coordinator is allowed to send at most one message to each party.

Communication Complexity of Simultaneous Messages

Siam Journal on Computing, 2003

In the multiparty communication game (CFL game) of Chandra, Furst, and Lipton (Proc. 15th ACM STOC, 1983, pp. 94-99), k players collaboratively evaluate a function f(x_0, ..., x_{k-1}) in which player i knows all inputs except x_i. The players have unlimited computational power. The objective is to minimize communication.
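A hedged sketch of how strong this "number-on-forehead" overlap of information is (names and framing are mine): with one-bit inputs, any function can be computed with just two bits of communication, because one player can announce a bit it sees, after which another player knows the entire input.

```python
from itertools import product

def run_nof(f, x):
    """Number-on-forehead (CFL) model with k players: player i sees every
    input except x[i].  A generic 2-bit protocol for 1-bit inputs:
      round 1: player 0 broadcasts x[1], the bit on player 1's forehead;
      round 2: player 1, who sees x[0], x[2], ..., now knows the whole
               input and broadcasts f(x).
    Returns (answer, bits_communicated)."""
    broadcast = x[1]                                   # player 0's message
    p1_view = [x[j] for j in range(len(x)) if j != 1]  # foreheads player 1 sees
    known = p1_view[:1] + [broadcast] + p1_view[1:]    # reinsert x[1] in place
    return f(known), 2

# Example: parity of all input bits, 3 players, checked on every input.
parity = lambda bits: sum(bits) % 2
assert all(run_nof(parity, list(x)) == (parity(list(x)), 2)
           for x in product([0, 1], repeat=3))
print("every 3-player input computed correctly with 2 bits")
```

With m-bit inputs the same idea costs m + 1 bits, which is why nontrivial lower bounds in this model are about beating such overlap-exploiting protocols.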

A Direct Sum Theorem in Communication Complexity via Message Compression

Computing Research Repository, 2003

We prove lower bounds for the direct sum problem for two-party bounded-error randomised multiple-round communication protocols. Our proofs use the notion of information cost of a protocol, as defined by Chakrabarti et al. and refined in later work. Our main technical result is a 'compression' theorem saying that, for any probability distribution μ over the inputs, a k-round private-coin bounded-error protocol for a function f with information cost C can be converted into a k-round deterministic protocol for f with bounded distributional error and communication cost O(kC).

The communication complexity of the Exact-N Problem revisited

2005

If Alice has x, y, Bob has x, z, and Carol has y, z, can they determine whether x + y + z = N? They can if (say) Alice broadcasts x to Bob and Carol; can they do better? Chandra, Furst, and Lipton studied this problem and showed sublinear upper bounds, together with matching (up to an additive constant) lower bounds. We give an exposition of their result with some attention to what happens for particular values of N.
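The trivial protocol from the abstract is easy to simulate (a sketch with illustrative names): Alice, who holds x and y, broadcasts x; Carol, who holds y and z, then knows all three numbers and announces the answer. This costs about log2 N bits, and the Chandra-Furst-Lipton result is that it can be beaten.

```python
def exact_n_trivial(x, y, z, N):
    """Alice holds (x, y), Bob holds (x, z), Carol holds (y, z).
    Alice broadcasts x; Carol now knows x, y, z and broadcasts the answer.
    Returns (answer, number_of_bits_communicated)."""
    bits = max(1, N.bit_length())  # enough bits to broadcast any x in [0, N]
    carol_knows = (x, y, z)        # Carol's own (y, z) plus Alice's broadcast x
    return sum(carol_knows) == N, bits + 1  # +1 for Carol's answer bit

ans, cost = exact_n_trivial(3, 5, 2, 10)
print(ans, cost)  # True, 5: four bits to broadcast x (N = 10) plus the answer bit
```

The sublinear protocols in the paper replace Alice's full broadcast with a clever coloring-based encoding; this sketch only pins down the baseline they improve on.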

Multiparty communication complexity and very hard functions

Information and Computation, 2004

A boolean function f(x_1, ..., x_n) with x_i ∈ {0,1}^m for each i is hard if its nondeterministic multiparty communication complexity C(f) (introduced in [Proceedings of the 30th IEEE FOCS, 1989, pp. 428-433]) is at least nm. Note that C(f) ≤ nm for every f(x_1, ..., x_n) with x_i ∈ {0,1}^m for each i. A boolean function is very hard if both it and its complementary function are hard. In this paper, we show that a randomly chosen boolean function f(x_1, ..., x_n) with x_i ∈ {0,1}^m for each i is very hard with very high probability (for n ≥ 3 and m large enough). In [Proceedings of the 12th Symposium on Theoretical Aspects of Computer Science, LNCS 900, 1995, pp. 350-360], it was shown that if f(x_1, ..., x_k, ..., x_n) = f_1(x_1, ..., x_k) · f_2(x_{k+1}, ..., x_n), where C(f_1) > 0 and C(f_2) > 0, then C(f) = C(f_1) + C(f_2). We prove here an analogous result: if f(x_1, ..., x_k, ..., x_n) = f_1(x_1, ..., x_k) ⊕ f_2(x_{k+1}, ..., x_n), then DC(f) = DC(f_1) + DC(f_2), where DC(g) denotes the deterministic multiparty communication complexity of the function g and "⊕" denotes the parity function.