Lower bounds on information complexity via zero-communication protocols and applications

2014

We show that almost all known lower bound methods for communication complexity are also lower bounds for information complexity. In particular, we define a relaxed version of the partition bound of Jain and Klauck [JK10] and prove that it lower bounds the information complexity of any function. Our relaxed partition bound subsumes all norm-based methods (e.g. the γ₂ method) and rectangle-based methods (e.g. the rectangle/corruption bound, the smooth rectangle bound, and the discrepancy bound), except the partition bound. Our result uses a new connection between rectangles and zero-communication protocols where the players can either output a value or abort. We prove the following compression lemma: given a protocol for a function f with information complexity I, one can construct a zero-communication protocol that has non-abort probability at least 2^{−O(I)} and that computes f correctly with high probability conditioned on not aborting. Then, we show how such a zero-communication protocol relates to the relaxed partition bound. We use our main theorem to resolve three of the open questions raised by Braverman [Bra12]. First, we show that the information complexity of the Vector in Subspace Problem [KR11] is Ω(n^{1/3}), which, in turn, implies that there exists an exponential separation between quantum communication complexity and classical information complexity. Moreover, we provide an Ω(n) lower bound on the information complexity of the Gap Hamming Distance Problem.

A Direct Sum Theorem in Communication Complexity via Message Compression

Computing Research Repository, 2003

We prove lower bounds for the direct sum problem for two-party bounded error randomised multiple-round communication protocols. Our proofs use the notion of information cost of a protocol, as defined by Chakrabarti et al. and refined in subsequent work. Our main technical result is a 'compression' theorem saying that, for any probability distribution μ over the inputs, a t-round private coin bounded error protocol for a function f with information cost C can be converted into a t-round deterministic protocol for f with bounded distributional error and communication cost O(tC).

On a conditional inequality in Kolmogorov complexity and its applications in communication complexity

2019

Romashchenko and Zimand~\cite{rom-zim:c:mutualinfo} have shown that if we partition the set of pairs (x, y) of n-bit strings into combinatorial rectangles, then I(x:y) ≥ I(x:y | t(x,y)) − O(log n), where I denotes mutual information in the Kolmogorov complexity sense, and t(x,y) is the rectangle containing (x, y). We observe that this inequality can be extended to coverings with rectangles which may overlap. The new inequality essentially states that in case of a covering with combinatorial rectangles, I(x:y) ≥ I(x:y | t(x,y)) − log ρ − O(log n), where t(x,y) is any rectangle containing (x, y) and ρ is the thickness of the covering, that is, the maximum number of rectangles that overlap. We discuss applications to communication complexity of protocols that are nondeterministic, randomized, or Arthur-Merlin, and also to the information complexity of interactive protocols.
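To make the notion of thickness concrete, here is a minimal, illustrative sketch (not taken from the paper) of a covering of the set of pairs of n-bit strings by overlapping combinatorial rectangles, together with a brute-force computation of the thickness ρ. The toy cover and input length n = 2 are assumptions chosen purely for illustration.

```python
from itertools import product

n = 2  # toy input length; we cover all pairs (x, y) of n-bit strings
strings = [''.join(bits) for bits in product('01', repeat=n)]

# A combinatorial rectangle is a product set A x B of n-bit strings.
# Hypothetical toy covering with two rectangles that overlap on x = '01'.
rects = [
    ({'00', '01'}, set(strings)),        # rows starting with 0, all columns
    ({'01', '10', '11'}, set(strings)),  # remaining rows, all columns
]

def thickness(rects, strings):
    """rho: the maximum number of rectangles containing any single pair."""
    return max(
        sum(1 for A, B in rects if x in A and y in B)
        for x in strings for y in strings
    )

# Sanity check: every pair (x, y) lies in at least one rectangle.
assert all(
    any(x in A and y in B for A, B in rects)
    for x in strings for y in strings
)
print(thickness(rects, strings))  # 2: pairs with x == '01' lie in both
```

A partition is exactly the special case ρ = 1, in which the log ρ loss term vanishes and the original Romashchenko-Zimand inequality is recovered.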

Communication Complexity

Here, we discuss a technique for proving a lower bound on a problem in which two parties, say Alice and Bob, are each given an n-bit input and must decide whether there is a position at which both inputs have a 1, i.e., whether the bitwise AND of the inputs is nonzero. This problem is known as the set disjointness problem and is a central object of study in communication complexity.
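For concreteness, a minimal sketch of the function being computed (the function itself, not a communication protocol; the bit-string encoding is an assumption for illustration):

```python
def disjointness(x: str, y: str) -> bool:
    """DISJ(x, y): True iff no index i has x[i] == y[i] == '1',
    i.e. the sets encoded by the two characteristic vectors are disjoint."""
    return not any(a == '1' and b == '1' for a, b in zip(x, y))

# Trivial protocol: Alice sends her entire n-bit input, costing n bits.
# The classical lower bound says Omega(n) bits are in fact necessary,
# even for bounded-error randomized protocols.
print(disjointness('0110', '1001'))  # True: the inputs never share a 1
print(disjointness('0110', '0100'))  # False: position 1 is 1 in both
```

The interest of the problem is precisely that, unlike for some other functions, randomness does not help here: the trivial n-bit protocol is optimal up to constant factors.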

Space-bounded communication complexity

Proceedings of the 4th conference on Innovations in Theoretical Computer Science, 2013

In the past thirty years, Communication Complexity has emerged as a foundational tool for proving lower bounds in many areas of computer science. Its power comes from its generality, but this generality comes at a price: no superlinear communication lower bound is possible, since a player may communicate his entire input. However, what if the players are limited in their ability to recall parts of their interaction? We introduce memory models for 2-party communication complexity. Our general model is as follows: two computationally unrestricted players, Alice and Bob, each have s(n) bits of memory. When a player receives a bit of communication, he "compresses" his state. This compression may be an arbitrary function of his current memory contents, his input, and the bit of communication just received; the only restriction is that the compression must return at most s(n) bits. We obtain memory hierarchy theorems (also comparing this general model with its restricted variants), and show super-linear lower bounds for some explicit (non-boolean) functions. Our main conceptual and technical contribution concerns the following variant. The communication is one-way, from Alice to Bob, where Bob controls two types of memory: (i) a large, oblivious memory, where updates are only a function of the received bit and the current memory content, and (ii) a smaller, non-oblivious/general memory, where updates can be a function of the input given to Bob. We exhibit natural protocols where this semi-obliviousness shows up. For this model we also introduce new techniques through which certain limitations of space-bounded computation are revealed. One of the main motivations of this work is in understanding the difference in the use of space when computing the following functions: Equality (EQ), Inner Product (IP), and connectivity in a directed graph (Reach).
When viewed as communication problems, EQ can be decided using 0 non-oblivious bits (and log₂ n oblivious bits), IP requires exactly 1 non-oblivious bit, whereas for Reach we obtain the same lower bound as for IP and conjecture that the actual bound is Ω(log₂ n). In fact, proving that 1 non-oblivious bit is required becomes technically sophisticated, and the question even for 2 non-oblivious bits for any explicit boolean function remains open.

Communication Complexity of the Secret Key Agreement in Algorithmic Information Theory

ArXiv, 2020

It is known that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, assuming that the parties hold as their inputs x and y respectively. We determine the worst-case communication complexity of this problem for the setting where the parties can use private sources of random bits. We show that for some x, y the communication complexity of the secret key agreement does not decrease even if the parties have to agree on a secret key whose size is much smaller than the mutual information between x and y. On the other hand, we discuss examples of x, y such that the communication complexity of the protocol declines gradually with the size of the derived secret key. The proof of the main result uses spectral properties of appropriate graphs and the expander mixing lemma, as well as information theo...

The power of randomness for communication complexity

Proceedings of the nineteenth annual ACM conference on Theory of computing - STOC '87

Improving a result of Mehlhorn and Schmidt, a function f with deterministic communication complexity n² is shown to have Las Vegas communication complexity O(n). This is the best possible, because the deterministic complexity cannot be more than the square of the Las Vegas communication complexity for any function.
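The power of randomness in this setting is easiest to see on the Equality function. The following is an illustrative fingerprinting sketch, not the Las Vegas protocol from the paper: it is the classical Monte Carlo protocol in which Alice sends an O(log n)-bit fingerprint (a random prime p and her input modulo p) instead of all n bits. The prime range n² and the helper names are assumptions for illustration.

```python
import random

def is_prime(m: int) -> bool:
    """Naive trial-division primality test, fine for small m."""
    if m < 2:
        return False
    return all(m % d for d in range(2, int(m ** 0.5) + 1))

def eq_fingerprint(x: str, y: str) -> bool:
    """Monte Carlo protocol for EQ on n-bit inputs using O(log n) bits.

    Alice sends (p, int(x) mod p) for a random prime p below n^2; Bob
    compares with his own residue. Equal inputs are always accepted;
    unequal inputs are wrongly accepted only when p divides the
    difference, which at most ~n of the available primes can do.
    """
    n = len(x)
    primes = [m for m in range(2, max(n * n, 16)) if is_prime(m)]
    p = random.choice(primes)  # Alice's public coin toss
    return int(x, 2) % p == int(y, 2) % p

print(eq_fingerprint('1' * 32, '1' * 32))  # True: equal inputs never rejected
```

This one-sided-error protocol contrasts with the Las Vegas (zero-error) setting of the paper, where the quadratic gap between deterministic and randomized complexity is the best achievable.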

Two applications of information complexity

2003

(1) In the two-party communication complexity model, we show that the tribes function on n inputs [6] has two-sided error randomized complexity Ω(n), while its nondeterministic complexity and co-nondeterministic complexity are both Θ(√n).

Communication Complexity of Simultaneous Messages

Siam Journal on Computing, 2003

In the multiparty communication game (CFL-game) of Chandra, Furst, and Lipton (Proc. 15th ACM STOC, 1983, 94-99), k players collaboratively evaluate a function f(x_0, …, x_{k−1}) in which player i knows all inputs except x_i. The players have unlimited computational power. The objective is to minimize communication.