Prof. Mark Bun
CAS CS 591 B: Communication Complexity
Lecture Notes 1: Introduction / Deterministic Protocols
Fall 2019
Reading: Rao-Yehudayoff, Introduction and Chapter 1.
Communication is an essential resource for computing in distributed environments. It is also intimately connected to computation itself. In many situations, communication (or the inability to communicate) is the bottleneck for complex computation. In this class, we'll learn a number of techniques for understanding the optimal communication protocols for important problems. We'll also learn how to use this understanding to obtain a broad range of applications in theoretical computer science.
1 Some Examples
The Power of Randomness Suppose Alice holds a string x ∈ {0, 1}^n and Bob holds a string y ∈ {0, 1}^n. How many bits do they need to exchange to determine whether x = y? Deterministically, this requires n + 1 bits of communication. (Why?) We'll actually see several different proofs of this. However, if Alice and Bob have access to a shared n-bit random string s, then Alice can send Bob ⟨x, s⟩ mod 2, and Bob can check whether this is equal to ⟨y, s⟩ mod 2. If x = y, these will be equal with probability 1, and if x ≠ y, they will disagree with probability at least 1/2. The failure probability of this protocol can be decreased to 2^{-k} by repeating it k times. We'll also see how to replace the shared random string s with private randomness in a generic fashion.
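As a concrete illustration, here is a minimal Python sketch of the shared-randomness protocol just described (the function names and the choice of k are ours, not from the notes); the only communication per round is Alice's single bit ⟨x, s⟩ mod 2.

```python
import random

def inner_product_mod2(a, b):
    """<a, b> mod 2 for two equal-length bit tuples."""
    return sum(ai * bi for ai, bi in zip(a, b)) % 2

def randomized_eq(x, y, k=20):
    """Shared-randomness equality test, repeated k times.

    If x == y, every round agrees; if x != y, each round catches the
    difference with probability at least 1/2, so the overall failure
    probability is at most 2**(-k).
    """
    n = len(x)
    for _ in range(k):
        s = tuple(random.randint(0, 1) for _ in range(n))   # shared random string
        if inner_product_mod2(x, s) != inner_product_mod2(y, s):
            return False   # definitely unequal
    return True            # equal, except with probability <= 2**(-k)

# Example usage on 4-bit strings.
x, y = (1, 0, 1, 1), (1, 0, 0, 1)
print(randomized_eq(x, x), randomized_eq(x, y))
```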
In the world of Turing machines, the question of whether P = RP is fascinating, wide open, and believed to be true. But for communication, there's a maximal separation.
Surprising Protocols Let G be a graph on n vertices, and suppose Alice holds a clique C and Bob holds an independent set I. How many bits do they need to exchange to determine whether C ∩ I = ∅? (Or, equivalently, whether they intersect in exactly one vertex.) Describing a clique or independent set requires about n bits, but there's an interactive protocol that does this using O(log^2 n) bits. Each round of the protocol maintains a set V of live vertices. If C ∩ V contains a vertex v with degree at most |V|/2 within V, then Alice sends Bob the name of v using log n bits. Either v ∈ I, or all non-neighbors of v can be removed from V for the next round. On the other hand, if I ∩ V contains a vertex with degree at least |V|/2 within V, Bob can send Alice the name of the vertex and remove all of its neighbors. (If no such vertex exists on either side, then C ∩ I = ∅.) In either case V shrinks by at least half, so the protocol can last for at most log n rounds, giving O(log^2 n) total communication.
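The following Python sketch simulates this protocol in a single process; in the two-party setting, each announced vertex name would cost about log n bits. The adjacency-dictionary representation and the tiny example at the end are our own choices, not from the notes.

```python
def clique_vs_ind_set(adj, clique, ind_set):
    """Return the (at most one) common vertex of clique and ind_set, or None.

    adj maps each vertex to the set of its neighbors.  Each round either
    finds the intersection or at least halves the set V of live vertices,
    so there are at most about log2(n) rounds.
    """
    clique, ind_set = set(clique), set(ind_set)
    V = set(adj)                                   # live vertices
    while True:
        # Alice's move: a clique vertex of degree <= |V|/2 within V.
        v = next((u for u in clique & V if len(adj[u] & V) <= len(V) / 2), None)
        if v is not None:
            if v in ind_set:
                return v
            V = adj[v] & V                         # keep only neighbors of v
            continue
        # Bob's move: an independent-set vertex of degree >= |V|/2 within V.
        v = next((u for u in ind_set & V if len(adj[u] & V) >= len(V) / 2), None)
        if v is not None:
            if v in clique:
                return v
            V = (V - adj[v]) - {v}                 # drop v and all its neighbors
            continue
        return None                                # neither case applies: disjoint

# Tiny example: the path 1-2-3-4 with clique {2, 3} and independent set {1, 3}.
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(clique_vs_ind_set(adj, {2, 3}, {1, 3}))      # -> 3
```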
Applications A major topic in circuit complexity, with applications to understanding neural networks, is to prove lower bounds on the expressive power of circuits comprised of threshold gates.
A linear threshold function computes sgn(w_1 z_1 + ⋯ + w_s z_s) for reals w_1, …, w_s. Circuits of the form THR ∘ MAJ (a threshold gate whose inputs are majority gates) are essentially the most powerful threshold circuits against which we can prove lower bounds using communication complexity!
The idea is to show that if f(x, y) has a small circuit of this form, then it admits a randomized communication protocol that just barely beats random guessing. The protocol is as follows. Suppose the circuit has size s and the top threshold gate is computed by T(z) = sgn(w_1 z_1 + ⋯ + w_s z_s). Observe that if we sample an index i with probability |w_i| / (|w_1| + ⋯ + |w_s|), then sgn(w_i) z_i = T(z) with probability greater than 1/2. So Alice can simply send this random index, together with the sum of her inputs which feed into the MAJ gate at index i, using log s + log n bits; Bob can then finish evaluating that MAJ gate and output sgn(w_i) times its value.
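Here is a small Python sketch (our own, with made-up weights) of the sampling step behind this protocol: drawing the index i with probability proportional to |w_i| gives a single ±1 vote that agrees with the top threshold gate more than half the time.

```python
import random

def sign(t):
    return 1 if t >= 0 else -1

def sampled_vote(w, z):
    """Sample i with probability |w_i| / sum_j |w_j|; return sgn(w_i) * z_i.

    For z in {-1, +1}^s, E[sgn(w_i) z_i] = (sum_j w_j z_j) / sum_j |w_j|,
    which has the same sign as the gate's output sgn(sum_j w_j z_j), so the
    vote agrees with the gate with probability strictly greater than 1/2
    (assuming the weighted sum is nonzero).
    """
    (i,) = random.choices(range(len(w)), weights=[abs(wi) for wi in w])
    return sign(w[i]) * z[i]

# Empirical check of the advantage on one random instance.
s = 9
w = [random.uniform(-1, 1) for _ in range(s)]
z = [random.choice([-1, 1]) for _ in range(s)]
gate = sign(sum(wi * zi for wi, zi in zip(w, z)))
trials = 100_000
agreement = sum(sampled_vote(w, z) == gate for _ in range(trials)) / trials
print(f"agreement with the top gate: {agreement:.3f}")   # typically visibly above 0.5
```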
In 2001, Forster showed that the function IP(x, y) = ⟨x, y⟩ mod 2 has nearly maximal communication complexity Ω(n) in this model. (We'll see this beautiful proof later in the course.) Hence IP requires THR ∘ MAJ circuits of exponential size 2^{Ω(n)}.
2 The Deterministic Model
Let Alice's input come from a set X and let Bob's input come from a set Y. Their goal is to compute a function f : X × Y → {0, 1}. A communication protocol is a rooted binary tree which encodes all sequences of possible messages between Alice and Bob over the course of trying to compute f. Each internal vertex v of the tree is marked by a speaker (A or B) and a function m_v : X → {0, 1} if the speaker is Alice, or a function m_v : Y → {0, 1} if the speaker is Bob. The leaves are marked by 0 or 1, indicating the result of the protocol.
To execute on an input (x, y), the parties start at the root r of the tree. If the speaker at r is Alice, she announces m_r(x); if the result is 0, the parties go on to the left child of r, and if the result is 1, they go to the right child of r. They continue in this way, with Bob announcing m_v(y) at vertices v where he is the speaker, until they reach a leaf.
The protocol Π computes f if Π(x, y) = f(x, y) for every (x, y) ∈ X × Y, where Π(x, y) denotes the label of the leaf reached on input (x, y). The length of the protocol is the depth of the protocol tree, and the number of rounds is the maximum number of alternations between Alice and Bob as speakers. Let P^cc(f) denote the minimum length of a protocol which computes f.
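To make the tree model concrete, here is a minimal Python representation of a protocol tree and its execution (the class names and the toy protocol for one-bit equality are our own illustrations, not from the notes).

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Leaf:
    value: int                        # the output bit at this leaf

@dataclass
class Node:
    speaker: str                      # 'A' for Alice, 'B' for Bob
    message: Callable[[tuple], int]   # m_v: the speaker's input -> one bit
    children: tuple                   # (child for bit 0, child for bit 1)

def run(tree, x, y):
    """Execute the protocol tree on input (x, y); return (output, transcript)."""
    transcript, v = [], tree
    while isinstance(v, Node):
        bit = v.message(x if v.speaker == 'A' else y)
        transcript.append(bit)
        v = v.children[bit]
    return v.value, tuple(transcript)

# A depth-2 protocol for equality of one-bit inputs:
# Alice announces x, then Bob announces whether y matches.
eq1 = Node('A', lambda x: x[0], (
    Node('B', lambda y: 1 - y[0], (Leaf(0), Leaf(1))),   # Alice sent 0
    Node('B', lambda y: y[0],     (Leaf(0), Leaf(1))),   # Alice sent 1
))

for x in [(0,), (1,)]:
    for y in [(0,), (1,)]:
        print(x, y, run(eq1, x, y))
```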
2.1 Counting Lower Bounds
Every function f : {0, 1}^n × {0, 1}^n → {0, 1} can be computed using n + 1 bits of communication (Alice sends her input to Bob, who announces the answer). It turns out that almost all functions require essentially maximal communication. This can be proved by a counting argument. There are 2^{2^{2n}} functions f : {0, 1}^n × {0, 1}^n → {0, 1}. How many functions can be computed by communication protocols of length c?
A protocol tree of depth c has at most 2^c leaves and at most 2^c internal vertices. Each internal vertex has a speaker (2 choices) and a next-message function (2^{2^n} choices). This gives at most (2 · 2^{2^n})^{2^c} ≤ 2^{2^{n+c+1}} protocols. Together with the 2^{2^c} labelings of the leaves, this is at most 2^{2^c + 2^{n+c+1}} functions. Even taking c = n − 2, the fraction of all possible functions this covers is only 2^{2^{n−2} + 2^{2n−1} − 2^{2n}} ≤ 2^{−2^{2n−2}}.
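For readability, here is the same counting bound in display form (the underlying calculation is exactly the one above):

```latex
\#\{\text{protocols of length } c\}
  \;\le\; \bigl(2 \cdot 2^{2^n}\bigr)^{2^c} \cdot 2^{2^c}
  \;\le\; 2^{\,2^c + 2^{n+c+1}},
\qquad\text{so for } c = n-2:\qquad
\frac{2^{\,2^{n-2} + 2^{2n-1}}}{2^{\,2^{2n}}}
  \;\le\; 2^{-2^{2n-2}}.
```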
This lower bound shows that hard functions are ubiquitous in communication complexity. But we are still confronted with the usual problem in complexity theory of finding hay in a haystack: Can we identify explicit functions which are hard? Or better, can we prove that functions we care about require high communication?
2.2 Rectangles
To answer this question, we take a closer look at the combinatorial structure of communication protocols. Every protocol can be thought of as being built from combinatorial rectangles. A rectangle is a set of the form A × B ⊆ X × Y where A ⊆ X and B ⊆ Y. Equivalently, a rectangle is a set R ⊆ X × Y such that for every (x, y), (x′, y′) ∈ R, we have (x, y′) ∈ R and (x′, y) ∈ R.
Lemma 1 Let v be a vertex in a protocol tree Π. Let R_v be the set of inputs (x, y) which cause Π to pass through v. Then R_v is a rectangle.
Proof: We prove this by induction on the structure of the tree. As the base case, every input passes through the root r, and R_r = X × Y is a rectangle. Now let R_v = A × B be a rectangle of inputs passing through vertex v, and let u and w be the children of v. Suppose Alice is the speaker at vertex v, and let A_0 = {x ∈ A : m_v(x) = 0} and A_1 = {x ∈ A : m_v(x) = 1}. Then the sets of inputs which pass through u and w are the rectangles A_0 × B and A_1 × B, respectively. (The case where Bob is the speaker is symmetric.)
This proof actually shows a bit more: namely, the leaves of the protocol tree partition the input space X × Y into disjoint rectangles. Moreover, if the protocol computes f, every such rectangle R is monochromatic, in the sense that f(x, y) = 0 for all (x, y) ∈ R or f(x, y) = 1 for all (x, y) ∈ R. The following fact gives us our first lower bound technique for explicit functions:
Theorem 2 If f : X × Y → {0, 1} is computed by a protocol of cost c, then X × Y can be partitioned into at most 2^c rectangles which are monochromatic with respect to f.
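As a sanity check of Theorem 2, the short Python sketch below simulates a protocol on every input, groups inputs by the leaf they reach (identified by the full transcript), and verifies that each leaf set is a monochromatic rectangle. The cost-3 example protocol for equality of 2-bit strings and the helper names are our own.

```python
from itertools import product

def eq2_protocol(x, y):
    """Cost-3 protocol for EQ_2: Alice sends both bits of x, Bob sends [x == y].
    Returns (output, transcript); the transcript identifies the leaf reached."""
    transcript = (x[0], x[1], int(x == y))
    return transcript[-1], transcript

def check_leaf_rectangles(protocol, f, inputs):
    """Verify that the leaf sets partition the inputs into rectangles that are
    monochromatic with respect to f, and count them."""
    leaves = {}
    for x, y in product(inputs, repeat=2):
        out, t = protocol(x, y)
        leaves.setdefault(t, []).append((x, y, out))
    for cell in leaves.values():
        A = {x for x, _, _ in cell}
        B = {y for _, y, _ in cell}
        assert len(cell) == len(A) * len(B), "leaf set is not a rectangle"
        assert len({f(x, y) for x, y, _ in cell}) == 1, "leaf is not monochromatic"
        assert all(out == f(x, y) for x, y, out in cell), "wrong output at a leaf"
    return len(leaves)

inputs = list(product([0, 1], repeat=2))              # all 2-bit strings
def eq2(x, y): return int(x == y)
print(check_leaf_rectangles(eq2_protocol, eq2, inputs), "monochromatic rectangles")
# Prints 8 = 2^3, matching the bound 2^c of Theorem 2 for a cost-3 protocol.
```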
Intuitively, if a function f only has small monochromatic rectangles, then we need many of these rectangles (and hence a long protocol) to cover the input space. Let's try applying this observation to prove a lower bound for equality:
Example 3 Let EQ_n : {0, 1}^n × {0, 1}^n → {0, 1} denote the equality function EQ_n(x, y) = 1 iff x = y. We claim that P^cc(EQ_n) = n + 1. To see this, observe that every 1-monochromatic rectangle with respect to EQ_n has size 1: if (x, x) and (x′, x′) both lie in a rectangle with x ≠ x′, then so does (x, x′), which is a 0-input. Hence we need at least 2^n rectangles to cover the 1-inputs of EQ_n, plus at least one more rectangle to cover the 0-inputs. By Theorem 2, a protocol of cost c yields at most 2^c monochromatic rectangles, so 2^c ≥ 2^n + 1 and hence c ≥ n + 1. Since n + 1 bits also suffice, the communication complexity is exactly n + 1.
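The key step in Example 3, that every 1-monochromatic rectangle of EQ_n has a single entry, can also be checked by brute force for small n (a throwaway sketch of our own, exponential in 2^n):

```python
from itertools import product, chain, combinations

def nonempty_subsets(items):
    return chain.from_iterable(combinations(items, r) for r in range(1, len(items) + 1))

def check_eq_one_rectangles(n):
    """Check that every rectangle A x B on which EQ_n is identically 1
    satisfies |A| = |B| = 1."""
    strings = list(product([0, 1], repeat=n))
    for A in nonempty_subsets(strings):
        for B in nonempty_subsets(strings):
            if all(x == y for x in A for y in B):    # EQ_n == 1 on all of A x B
                assert len(A) == 1 and len(B) == 1
    return True

print(check_eq_one_rectangles(2))
```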
Unfortunately, arguments based on rectangle size alone aren't enough for some hard problems. Consider, for instance, the Greater-Than function GT_n : {0, 1}^n × {0, 1}^n → {0, 1} defined by GT_n(x, y) = 1 iff x > y in lexicographic order. GT_n has both large 0-monochromatic and large 1-monochromatic rectangles: for example, the rectangle {x : x_1 = 1} × {y : y_1 = 0} is 1-monochromatic of size 2^{2n−2}, and swapping the roles of the two sides gives an equally large 0-monochromatic rectangle.
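The large monochromatic rectangle just mentioned is easy to exhibit explicitly; here is a quick check of our own for n = 4 (the construction generalizes to any n):

```python
from itertools import product

n = 4
strings = list(product([0, 1], repeat=n))

def gt(x, y):
    """GT_n(x, y) = 1 iff x > y in lexicographic order."""
    return int(x > y)

A = [x for x in strings if x[0] == 1]    # first bit of x is 1
B = [y for y in strings if y[0] == 0]    # first bit of y is 0
assert all(gt(x, y) == 1 for x in A for y in B)
print(len(A) * len(B), "entries in a single 1-monochromatic rectangle")  # 2^(2n-2) = 64
# Swapping the two sides ({x : x_1 = 0} x {y : y_1 = 1}) gives a 0-monochromatic
# rectangle of the same size.
```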
Next time, we'll see additional techniques which will let us prove lower bounds for this function.