Information complexity and exact communication bounds


Page 1: Information complexity and exact communication bounds


Information complexity and exact communication bounds

April 26, 2013

Mark Braverman, Princeton University

Based on joint work with Ankit Garg, Denis Pankratov, and Omri Weinstein

Page 2: Information complexity and exact communication bounds

Overview: information complexity

• Information complexity :: communication complexity
as
• Shannon's entropy :: transmission cost

Page 3: Information complexity and exact communication bounds

Background – information theory

• Shannon (1948) introduced information theory as a tool for studying the communication cost of transmission tasks.

[Diagram: Alice and Bob connected by a communication channel.]

Page 4: Information complexity and exact communication bounds

Shannon’s entropy

• Assume a lossless binary channel.
• A message X is distributed according to some prior μ.
• The inherent number of bits it takes to transmit X is given by its entropy H(X) = Σ_x μ(x) log₂(1/μ(x)).
[Diagram: X sent over the communication channel.]

Page 5: Information complexity and exact communication bounds

Shannon’s noiseless coding

• The cost of communicating many copies of X scales as H(X).
• Shannon's source coding theorem: Let C(X^n) be the cost of transmitting n independent copies of X. Then the amortized transmission cost lim_{n→∞} C(X^n)/n = H(X).
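As a quick illustration (a minimal sketch, not from the talk; the prior is made up), the entropy that governs this amortized cost is easy to compute:

    import math

    def entropy(dist):
        """H(X) = sum over x of p(x) * log2(1/p(x))."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # A biased bit: transmitting n i.i.d. copies costs about n * H(X) bits.
    prior = {"0": 0.9, "1": 0.1}
    print(entropy(prior))  # ~0.469 bits per copy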

Page 6: Information complexity and exact communication bounds

Shannon's entropy – cont'd
• Therefore, understanding the cost of transmitting a sequence of X's is equivalent to understanding Shannon's entropy of X.
• What about more complicated scenarios? E.g., the receiver already holds a correlated Y.
[Diagram: X sent over the communication channel to a receiver holding Y.]
• Amortized transmission cost = conditional entropy H(X|Y).

Page 7: Information complexity and exact communication bounds

A simple example
• Alice has a uniformly random X.
• The cost of transmitting X to Bob is H(X).
• Suppose for each X_i Bob is given a correlated, uniformly random Y_i; then the cost of transmitting the X_i's to Bob is H(X|Y).
Easy and complete!
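A minimal computational sketch of this phenomenon, assuming a generic correlated prior (the specific joint distribution below is an assumption for illustration):

    import math

    def entropy(dist):
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def conditional_entropy(joint):
        """H(X|Y) = H(X,Y) - H(Y), from a joint distribution {(x, y): prob}."""
        marginal_y = {}
        for (x, y), p in joint.items():
            marginal_y[y] = marginal_y.get(y, 0.0) + p
        return entropy(joint) - entropy(marginal_y)

    # Correlated bits: Y usually equals X, so little remains to be transmitted.
    joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
    print(conditional_entropy(joint))  # ~0.469 bits per copy, down from H(X) = 1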

Page 8: Information complexity and exact communication bounds

Communication complexity [Yao]
• Focus on the two-party randomized setting.
Meanwhile, in a galaxy far far away…
[Diagram: players A and B hold inputs X and Y, with shared randomness R.]
A & B implement a functionality F(X,Y), e.g. "X = Y?".

Page 9: Information complexity and exact communication bounds

Communication complexity

[Diagram: players A and B hold inputs X and Y, with shared randomness R.]
Goal: implement a functionality F(X,Y). A protocol computing F(X,Y) exchanges messages m1(X,R), m2(Y,m1,R), m3(X,m1,m2,R), …
Communication cost = number of bits exchanged.

Page 10: Information complexity and exact communication bounds

Communication complexity

• Numerous applications/potential applications (streaming, data structures, circuit lower bounds…).
• Considerably more difficult to obtain lower bounds for than for transmission (still much easier than for other models of computation).
• Many lower-bound techniques exist.
• Exact bounds??

Page 11: Information complexity and exact communication bounds

Communication complexity

• (Distributional) communication complexity with input distribution μ and error ε: D_ε^μ(F). Error is measured w.r.t. μ.
• (Randomized/worst-case) communication complexity: R_ε(F). Error ≤ ε on all inputs.
• Yao's minimax: R_ε(F) = max_μ D_ε^μ(F).

Page 12: Information complexity and exact communication bounds

Set disjointness and intersection
Alice and Bob are each given a set: X, Y ⊆ {1, …, n} (the sets can be viewed as vectors in {0,1}^n).
• Intersection: Int_n(X,Y) = X ∩ Y, the vector of coordinate-wise ANDs.
• Disjointness: Disj_n(X,Y) = 1 if X ∩ Y = ∅, and 0 otherwise.
• Int_n is just n 1-bit-ANDs in parallel.
• Disj_n is an OR of n 1-bit-ANDs.
• Need to understand the amortized communication complexity (of the 1-bit-AND).
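A minimal sketch of the two functions on the vector encoding (the names Int/Disj follow the slides; the code itself is an assumed illustration):

    def intersection(x, y):
        """Int(X, Y): all n 1-bit ANDs, one per coordinate."""
        return [xi & yi for xi, yi in zip(x, y)]

    def disjointness(x, y):
        """Disj(X, Y) = 1 iff no coordinate has x_i = y_i = 1."""
        return 0 if any(intersection(x, y)) else 1

    x = [0, 1, 1, 0]
    y = [1, 0, 1, 0]
    print(intersection(x, y))   # [0, 0, 1, 0]
    print(disjointness(x, y))   # 0: the sets share element 2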

Page 13: Information complexity and exact communication bounds

Information complexity

• The smallest amount of information Alice and Bob need to exchange to solve F.
• How is information measured?
• Communication cost of a protocol? – Number of bits exchanged.
• Information cost of a protocol? – Amount of information revealed.

Page 14: Information complexity and exact communication bounds

Basic definition 1: The information cost of a protocol

• Prior distribution: (X, Y) ~ μ.
[Diagram: A and B run protocol π, producing the protocol transcript Π.]
IC(π, μ) = I(Π; Y|X) + I(Π; X|Y)
= what Alice learns about Y + what Bob learns about X.

Page 15: Information complexity and exact communication bounds

Mutual information

• The mutual information of two random variables is the amount of information knowing one reveals about the other: I(A;B) = H(A) + H(B) − H(A,B) = H(A) − H(A|B).
• If A, B are independent, I(A;B) = 0.
• I(A;A) = H(A).
[Venn diagram: I(A;B) is the overlap between H(A) and H(B).]
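A minimal sketch computing I(A;B) from a joint distribution and checking the two facts above:

    import math

    def entropy(dist):
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def mutual_information(joint):
        """I(A;B) = H(A) + H(B) - H(A,B), from a joint distribution {(a, b): prob}."""
        ma, mb = {}, {}
        for (a, b), p in joint.items():
            ma[a] = ma.get(a, 0.0) + p
            mb[b] = mb.get(b, 0.0) + p
        return entropy(ma) + entropy(mb) - entropy(joint)

    independent = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
    identical = {(0, 0): 0.5, (1, 1): 0.5}
    print(mutual_information(independent))  # 0.0: independent variables
    print(mutual_information(identical))    # 1.0 = H(A): A reveals B entirely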


Page 17: Information complexity and exact communication bounds

Example
• X is uniformly random (a long string).
• μ is a distribution where w.p. ½ Y = X and w.p. ½ X, Y are independently random.
[Protocol: Alice sends MD5(X) [128 bits]; Bob replies "X = Y?" [1 bit].]
What Alice learns about Y + what Bob learns about X = 1 + 64.5 = 65.5 bits.

Page 18: Information complexity and exact communication bounds

Information complexity

• Communication complexity: D_ε^μ(F) = min over protocols π computing F with error ≤ ε over μ of the communication cost of π.
• Analogously: IC(F, μ, ε) = inf over protocols π computing F with error ≤ ε over μ of IC(π, μ).

Page 19: Information complexity and exact communication bounds

Prior-free information complexity

• Using minimax one can get rid of the prior.
• For communication, we had: R_ε(F) = max_μ D_ε^μ(F).
• For information: IC(F, ε) = inf_π max_μ IC(π, μ), where the inf ranges over protocols π that compute F with error ≤ ε on every input (note: the inf and the max do not simply commute).

Page 20: Information complexity and exact communication bounds

Connection to privacy

• There is a strong connection between information complexity and (information-theoretic) privacy.

• Alice and Bob want to perform computation without revealing unnecessary information to each other (or to an eavesdropper).

• Negative results can be obtained through information-complexity arguments.

Page 21: Information complexity and exact communication bounds

Information equals amortized communication

• Recall [Shannon]: lim_{n→∞} C(X^n)/n = H(X).
• [BR'11]: lim_{n→∞} D_ε^{μ^n}(F^n)/n = IC(F, μ, ε), for ε > 0.
• For ε = 0: lim_{n→∞} D_{0+}^{μ^n}(F^n)/n = IC(F, μ, 0).
• [Whether the same holds with exactly zero error, i.e. for lim_{n→∞} D_0^{μ^n}(F^n)/n, is an interesting open question.]

Page 22: Information complexity and exact communication bounds

Without priors

• [BR'11] For ε > 0: R_ε(F^n) = n · IC(F, ε) ± o(n).
• [B'12] IC(F, ε) = lim_{n→∞} R_ε(F^n)/n.

Page 23: Information complexity and exact communication bounds

Intersection

• Therefore R_ε(Int_n) = n · IC(AND, ε) ± o(n).
• Need to find the information complexity of the two-bit AND!

Page 24: Information complexity and exact communication bounds

The two-bit AND

• [BGPW'12] IC(AND, 0) ≈ 1.4923 bits.
• Find the value of IC(AND, μ, 0) for all priors μ.
• Find the information-theoretically optimal protocol for computing the AND of two bits.

Page 25: Information complexity and exact communication bounds

The optimal protocol for AND

[Diagram: A with X ∈ {0,1} and B with Y ∈ {0,1}; a counter sweeps from 0 to 1.]
If X=1, A=1; if X=0, A=U[0,1].
If Y=1, B=1; if Y=0, B=U[0,1].
"Raise your hand when your number is reached."
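A minimal simulation sketch of this protocol (the continuous sweep is collapsed to comparing the two numbers, which is enough to check correctness; the discretized, real version appears a few slides below):

    import random

    def and_protocol(x, y):
        """One run of the clock protocol: the first raised hand ends it."""
        a = 1.0 if x == 1 else random.random()  # Alice's number, uniform in [0,1) when X=0
        b = 1.0 if y == 1 else random.random()  # Bob's number
        # Reaching the top of the sweep with no hand raised means X = Y = 1.
        return 1 if min(a, b) >= 1.0 else 0

    for x in (0, 1):
        for y in (0, 1):
            assert and_protocol(x, y) == (x & y)  # always correct
    print("protocol agrees with AND on all four inputs")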


Page 27: Information complexity and exact communication bounds

Analysis
• An additional small step is needed if the prior μ is not symmetric.
• The protocol is clearly always correct.
• How do we prove the optimality of a protocol?
• Consider IC(AND, μ, 0) as a function of the prior μ.

Page 28: Information complexity and exact communication bounds

The analytical view
• A message is just a mapping from the current prior to a distribution of posteriors (new priors). Example: Alice sends her bit. From the prior

          Y=0   Y=1
    X=0   0.4   0.2
    X=1   0.3   0.1

she sends "0" w.p. 0.6, yielding the posterior

          Y=0   Y=1
    X=0   2/3   1/3
    X=1   0     0

and "1" w.p. 0.4, yielding

          Y=0   Y=1
    X=0   0     0
    X=1   0.75  0.25

Page 29: Information complexity and exact communication bounds

The analytical view
• Now suppose Alice sends her bit w.p. ½ and a uniformly random bit w.p. ½. From the prior

          Y=0   Y=1
    X=0   0.4   0.2
    X=1   0.3   0.1

the message "0" is sent w.p. 0.55, yielding the posterior

          Y=0     Y=1
    X=0   0.545   0.273
    X=1   0.136   0.045

and "1" w.p. 0.45, yielding

          Y=0   Y=1
    X=0   2/9   1/9
    X=1   1/2   1/6
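A minimal sketch reproducing these posterior calculations; the signal model (Alice's true bit w.p. ½, a uniformly random bit w.p. ½) is the one described above:

    prior = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.1}

    def posteriors(prior, p_signal_given_x):
        """For each signal value s, return (probability of s, posterior prior)."""
        out = {}
        for s in (0, 1):
            weights = {xy: p * p_signal_given_x[xy[0]][s] for xy, p in prior.items()}
            total = sum(weights.values())
            out[s] = (total, {xy: w / total for xy, w in weights.items()})
        return out

    # P(signal | X): truthful w.p. 1/2, uniformly random w.p. 1/2.
    noisy = {0: {0: 0.75, 1: 0.25}, 1: {0: 0.25, 1: 0.75}}
    for s, (p, post) in posteriors(prior, noisy).items():
        print(f'"{s}" w.p. {p:.2f}:', {xy: round(q, 3) for xy, q in post.items()})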

Page 30: Information complexity and exact communication bounds

Analytical view – cont’d

• Denote f(μ) := IC(AND, μ, 0).
• Each potential (one-bit) message by either party imposes a constraint of the form: f(μ) ≤ (information revealed by the message) + p₀·f(μ₀) + p₁·f(μ₁), where μ₀, μ₁ are the posteriors and p₀, p₁ their probabilities.
• In fact, f is the point-wise largest function satisfying all such constraints (cf. the construction of harmonic functions).

Page 31: Information complexity and exact communication bounds

IC of AND

• We show that for the protocol π described above, IC(π, μ) satisfies all the constraints, and therefore equals the information complexity of AND at all priors.
• Theorem: π is the information-theoretically optimal protocol* for computing the AND of two bits.

Page 32: Information complexity and exact communication bounds

*Not a real protocol

• The "protocol" π is not a real protocol (this is why IC has an inf in its definition).
• The protocol above can be made into a real protocol by discretizing the counter (e.g. into r equal intervals).
• We show that the r-round IC satisfies IC_r(AND, μ, 0) = IC(AND, μ, 0) + Θ(1/r²).

Page 33: Information complexity and exact communication bounds

Previous numerical evidence

• [Ma, Ishwar'09] – numerical calculation results.

Page 34: Information complexity and exact communication bounds

Applications: communication complexity of intersection

• Corollary: R(Int_n, ε) = n · IC(AND, ε) ± o(n) ≈ 1.4923·n.

• Moreover, for r-round protocols the cost is ≈ 1.4923·n + Θ(n/r²).

Page 35: Information complexity and exact communication bounds

Applications 2: set disjointness

• Recall: Disj_n(X,Y) is the negated OR of the n coordinate-wise ANDs.
• Extremely well-studied. [Kalyanasundaram and Schnitger'87, Razborov'92, Bar-Yossef et al.'02]: R(Disj_n) = Θ(n).
• What does a hard distribution for Disj_n look like?

Page 36: Information complexity and exact communication bounds

A hard distribution?

0 0 1 1 0 1 0 0 0 1 0 0 1 1 1 1 0 1 1 0 0
1 0 1 0 0 1 1 1 0 0 1 1 1 0 1 0 1 1 0 0 0

Per-coordinate prior (i.i.d. uniform):

          Y=0   Y=1
    X=0   1/4   1/4
    X=1   1/4   1/4

Very easy! (The sets intersect in some coordinate with overwhelming probability.)

Page 37: Information complexity and exact communication bounds

A hard distribution

0 0 0 1 0 1 0 0 0 1 0 0 1 1 0 1 0 1 1 0 0
1 0 1 0 0 0 1 1 0 0 1 1 1 0 0 0 1 0 0 0 0

Per-coordinate prior:

          Y=0   Y=1
    X=0   1/3   1/3
    X=1   1/3   0

At most one (1,1) location!
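A minimal sampling sketch of this hard prior: each coordinate is drawn i.i.d. uniformly from {(0,0), (0,1), (1,0)}, so the sets never intersect; the optional planting of a single (1,1) coordinate is an assumption about how such a distribution is typically used:

    import random

    def sample_hard_inputs(n, plant_intersection=False):
        """Sample (X, Y) disjoint except for at most one planted coordinate."""
        pairs = [random.choice([(0, 0), (0, 1), (1, 0)]) for _ in range(n)]
        if plant_intersection:
            pairs[random.randrange(n)] = (1, 1)
        x, y = zip(*pairs)
        return list(x), list(y)

    x, y = sample_hard_inputs(20, plant_intersection=True)
    print(x)
    print(y)
    print("disjoint?", not any(a & b for a, b in zip(x, y)))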

Page 38: Information complexity and exact communication bounds

Communication complexity of Disjointness

• Continuing the line of reasoning of Bar-Yossef et al.
• We now know exactly the communication complexity of Disj under any of the "hard" prior distributions. By maximizing over them, we get:
• R(Disj_n, 0+) = C_DISJ · n ± o(n), where C_DISJ ≈ 0.4827.
• With a bit of work this bound is tight.

Page 39: Information complexity and exact communication bounds

Small-set Disjointness

• A variant of set disjointness where we are given sets X, Y ⊆ [n] of size ≤ k.
• A lower bound of Ω(k) is obvious (modulo the Ω(n) lower bound for general disjointness).
• A very elegant matching upper bound was known [Hastad-Wigderson'07]: O(k).

Page 40: Information complexity and exact communication bounds

Using information complexity

• This setting corresponds to the per-coordinate prior distribution

          Y=0      Y=1
    X=0   1-2k/n   k/n
    X=1   k/n      0

• Gives information complexity ≈ (2/ln 2)·(k/n) per coordinate.
• Communication complexity: (2/ln 2)·k ± o(k) ≈ 2.885·k.

Page 41: Information complexity and exact communication bounds

Overview: information complexity

• Information complexity :: communication complexity
as
• Shannon's entropy :: transmission cost
Today: focused on exact bounds using IC.

Page 42: Information complexity and exact communication bounds

Selected open problems 1
• The interactive compression problem.
• For Shannon's entropy we have lim_{n→∞} C(X^n)/n = H(X).
• E.g. by Huffman coding we also know that H(X) ≤ C(X) < H(X) + 1.
• In the interactive setting, lim_{n→∞} D_ε^{μ^n}(F^n)/n = IC(F, μ, ε).
• But is it true that D_ε^μ(F) = O(IC(F, μ, ε))??
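A minimal sketch checking the one-shot Huffman bound numerically (a standard construction, not from the talk):

    import heapq, itertools, math

    def entropy(dist):
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def huffman_lengths(dist):
        """Return {symbol: codeword length} for a Huffman code of the prior."""
        counter = itertools.count()  # tie-breaker so the heap never compares dicts
        heap = [(p, next(counter), {s: 0}) for s, p in dist.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, d1 = heapq.heappop(heap)
            p2, _, d2 = heapq.heappop(heap)
            merged = {s: l + 1 for s, l in {**d1, **d2}.items()}  # one level deeper
            heapq.heappush(heap, (p1 + p2, next(counter), merged))
        return heap[0][2]

    dist = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    lengths = huffman_lengths(dist)
    avg = sum(dist[s] * lengths[s] for s in dist)
    print(entropy(dist), avg)  # H(X) <= avg < H(X) + 1; here both equal 1.75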

Page 43: Information complexity and exact communication bounds

Interactive compression?

• Interactive compression is equivalent to the "direct sum" problem for communication complexity.
• Currently the best general compression scheme [BBCR'10]: a protocol of information cost I and communication cost C can be compressed to ~O(√(I·C)) bits of communication.

Page 44: Information complexity and exact communication bounds

Interactive compression?

• Interactive compression is equivalent to the "direct sum" problem for communication complexity.
• A counterexample would need to separate IC from CC, which would require new lower bound techniques [Kerenidis, Laplante, Lerays, Roland, Xiao'12].

Page 45: Information complexity and exact communication bounds

Selected open problems 2

• Given a truth table for F, a prior μ, and an ε, can we compute IC(F, μ, ε)?
• An uncountable number of constraints; we need to understand the structure better.
• Specific F's with inputs in larger domains.
• Going beyond two players.

Page 46: Information complexity and exact communication bounds

External information cost

(X, Y) ~ μ.
[Diagram: A and B run protocol π with transcript Π; an external observer C watches.]
IC^ext(π, μ) = I(Π; XY) ≥ I(Π; Y|X) + I(Π; X|Y)
= what Charlie (the external observer C) learns about (X, Y).

Page 47: Information complexity and exact communication bounds

External information complexity

• IC^ext(F, μ, ε) = inf over protocols π of IC^ext(π, μ) ≥ IC(F, μ, ε).
• Conjecture: Zero-error communication scales like external information: lim_{n→∞} D_0^{μ^n}(F^n)/n = IC^ext(F, μ, 0).
• Example: for AND this value is log₂ 3 ≈ 1.585 bits.

Page 48: Information complexity and exact communication bounds


Thank You!