
Interactive Channel Capacity

[Shannon 48]: A Mathematical Theory of Communication. An exact formula for the channel capacity of any noisy channel.

ε-noisy channel: each bit is flipped with probability ε.

Alice wants to send n bits to Bob. They only have access to an ε-noisy channel. How many bits does Alice need to send, so that Bob can retrieve the original n bits with probability close to 1?

[Figure: the ε-noisy (binary symmetric) channel. Each transmitted bit (0 or 1) arrives intact with probability 1-ε and is flipped with probability ε.]
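A minimal sketch of this channel model (my own illustration, not from the talk); the function name and parameters are assumptions made for the example.

```python
import random

def noisy_channel(bits, eps, rng=random):
    """Transmit a list of bits over an eps-noisy (binary symmetric) channel:
    each bit is flipped independently with probability eps."""
    return [b ^ (rng.random() < eps) for b in bits]

# Example: send 10 bits over a 0.1-noisy channel.
sent = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
received = noisy_channel(sent, eps=0.1)
print(sent, received)
```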

Channel Capacity [Shannon 48]: 1) roughly n/C(ε) bits are sufficient, and 2) roughly n/C(ε) bits are needed, where the channel capacity of the ε-noisy channel is C(ε) = 1 - H(ε) (H is the binary entropy function).
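A quick numeric illustration of Shannon's formula above, using the standard binary entropy function; the helper names are mine.

```python
import math

def binary_entropy(eps):
    """H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps), with H(0) = H(1) = 0."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def shannon_capacity(eps):
    """Capacity of the eps-noisy (binary symmetric) channel: C(eps) = 1 - H(eps)."""
    return 1.0 - binary_entropy(eps)

# Roughly n / C(eps) channel uses are needed (and sufficient) to send n bits.
n, eps = 10_000, 0.05
print(shannon_capacity(eps))                 # ~0.714
print(math.ceil(n / shannon_capacity(eps)))  # ~14000 channel uses
```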

Communication Complexity [Yao 79]: Player A gets x and Player B gets y. They need to compute f(x, y) (f is publicly known). How many bits do they need to communicate?

CC(f): the probabilistic CC of f (with negligible error for every input), with a shared random string.
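To make the shared-random-string model concrete, here is a textbook-style example (not part of the talk): a randomized protocol for the Equality function in which Alice sends only a few inner-product bits.

```python
import random

def equality_protocol(x, y, repetitions=20, seed=None):
    """Randomized protocol for f(x, y) = [x == y] with a shared random string.
    Alice sends one inner-product bit per repetition; Bob compares it with his own.
    If x != y, each round catches the difference with probability 1/2,
    so the error probability is at most 2**-repetitions."""
    shared = random.Random(seed)      # models the shared random string
    n = len(x)
    for _ in range(repetitions):
        r = [shared.randint(0, 1) for _ in range(n)]
        alice_bit = sum(a * b for a, b in zip(x, r)) % 2   # Alice -> Bob: 1 bit
        bob_bit = sum(a * b for a, b in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False              # definitely unequal
    return True                       # equal, except with tiny error probability

x = [1, 0, 1, 1]; y = [1, 0, 1, 1]
print(equality_protocol(x, y, seed=7))  # True
```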

CC over the ε-noisy channel: how many communication bits are needed to compute f over the ε-noisy channel?

CC_ε(f): the CC of f over the ε-noisy channel (with negligible error for every input), with a shared random string.
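As a naive baseline (my own illustration, not part of the talk), any protocol bit can be pushed through the ε-noisy channel by repetition and majority decoding; the repetition count needed for negligible error grows with the length of the protocol, and this blowup is the kind of overhead that CC_ε(f) captures.

```python
import random

def send_bit_with_repetition(bit, eps, repetitions, rng):
    """Send one protocol bit over the eps-noisy channel `repetitions` times
    and decode by majority vote."""
    received = [bit ^ (rng.random() < eps) for _ in range(repetitions)]
    return int(sum(received) * 2 > repetitions)

rng = random.Random(0)
eps, repetitions = 0.1, 11
errors = sum(send_bit_with_repetition(1, eps, repetitions, rng) != 1
             for _ in range(10_000))
print(errors / 10_000)  # per-bit error probability drops exponentially in `repetitions`
```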

Interactive Channel Capacity: the best achievable asymptotic ratio between the probabilistic CC of f and the CC of f over the ε-noisy channel, as defined by the formula below (note: n is not the input size).

[Schulman 92]: Any CC protocol can be simulated over the ε-noisy channel with only a constant-factor blowup in communication. Hence, C(ε) = Ω(1).

[Sch, BR, B, GMS, BK, BN]: Simulation of any CC protocol in the presence of adversarial noise.

[Shannon 48]: the (one-way) channel capacity is 1 - H(ε). [Schulman 92]: the interactive channel capacity is Ω(1). Question: is the interactive channel capacity also 1 - Θ(H(ε))?

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

[KR 13]: Upper Bound: C(ε) ≤ 1 - Ω(√H(ε)). In particular, for small enough ε, C(ε) < 1 - H(ε) (with strict inequality).

Order of Communication Model: the order of communication in the protocol is pre-determined (i.e., non-adaptive); otherwise, both players may try to send bits at the same time.

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

[KR 13]: Upper Bound: In particular, for small enough , (with strict inequality)

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

The main ideas seem to be valid for other communication models, but without the logarithmic factor in the bound.

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

Order of Communication Models:
1) Pre-determined: at each time step, exactly one player sends a bit.
2) Alternating: the players alternate in sending bits.
3) Adaptive: if both players send bits at the same time, these bits are lost.
4) Two channels: each player sends a bit whenever she wants.

[KR 13]: Lower Bound: C(ε) ≥ 1 - O(√H(ε)), for the alternating communication model.

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )


[Haeupler 14]: Lower Bound: for alternating communication, even in the adversarial case(!).

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )


[Haeupler 14]: Lower Bound: for the adversarial noise channel of [GHS 14], where the order of communication is not pre-determined.

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

[Haeupler 14]: Lower Bound:

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

[Haeupler 14]: Lower Bound. Conjecture [H 14]: similar bounds for a pre-determined order of communication are false, without some regularity assumption on the order of communication.

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

Upper Bound: we give a function that proves this bound; that is, we prove a lower bound on its CC over the ε-noisy channel.

π‘ͺ (𝜺 )=π₯π’π¦π’π§πŸπ’β†’βˆž

𝐦𝐒𝐧{ 𝒇 :π‘ͺπ‘ͺ ( 𝒇 )=𝒏}( 𝒏

π‘ͺπ‘ͺ𝜺 (𝒇 ) )

Pointer Jumping Game: a tree of fixed degree and depth; one player owns the odd layers and the other owns the even layers. Each player gets an edge going out of every node that she owns. Goal: find the leaf reached by following these edges from the root.

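A small sketch of the game as just described; the parameter names (degree, depth) and the convention that Alice owns the odd layers are my own choices for the illustration.

```python
import random

def pointer_jumping_leaf(degree, depth, alice_edges, bob_edges):
    """Follow the chosen edges from the root of a complete `degree`-ary tree of
    the given depth. `alice_edges[node]` / `bob_edges[node]` give the child index
    (0..degree-1) chosen at each node a player owns; here Alice owns the odd
    layers (the root starts the first, odd step) and Bob owns the even layers."""
    node = ()  # a node is identified by the path of child indices from the root
    for layer in range(1, depth + 1):
        edges = alice_edges if layer % 2 == 1 else bob_edges
        node = node + (edges[node],)   # after `depth` steps, `node` is a leaf
    return node

# Random inputs for a tiny instance: degree 4, depth 3.
degree, depth, rng = 4, 3, random.Random(0)
nodes_by_layer = [[()]]
for _ in range(depth - 1):
    nodes_by_layer.append([n + (i,) for n in nodes_by_layer[-1] for i in range(degree)])
alice_edges = {n: rng.randrange(degree) for layer in nodes_by_layer[0::2] for n in layer}
bob_edges = {n: rng.randrange(degree) for layer in nodes_by_layer[1::2] for n in layer}
print(pointer_jumping_leaf(degree, depth, alice_edges, bob_edges))
```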

Pointer Jumping Game: our main result is a lower bound on the communication needed to solve the game over the ε-noisy channel. Hence, the upper bound on C(ε) follows.

High Level Idea: Alice starts by sending the first edge (say, k bits). With probability roughly εk, one of these bits was flipped.

Case I: Bob sends the next edge (k bits). With that probability, these bits are wasted (since Bob had the wrong first edge). In expectation: roughly εk² wasted bits.

Case II: Alice instead sends additional bits to correct the first edge. She needs to send extra bits to correct even one error.

In both cases, bits are wasted in expectation; a rough balancing of the two cases is sketched below.
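A back-of-the-envelope version of this trade-off, under assumptions of mine that fill in the quantities lost in this transcript: each edge takes k bits, Case I wastes about εk·k bits per edge in expectation, and Case II invests about c·log(1/ε) extra bits per edge. Balancing the two costs by AM-GM then gives an overhead of order √H(ε), consistent with the [KR 13] bound:

```latex
\[
  \underbrace{\varepsilon k}_{\text{Case I waste per useful bit}}
  \;+\;
  \underbrace{\frac{c\log(1/\varepsilon)}{k}}_{\text{Case II waste per useful bit}}
  \;\ge\; 2\sqrt{c\,\varepsilon\log(1/\varepsilon)}
  \;=\; \Theta\!\left(\sqrt{H(\varepsilon)}\right)
\]
```

Equality holds when k ≈ √(c·log(1/ε)/ε); under these assumed costs, the communication rate of such a protocol is therefore at most roughly 1 - Ω(√H(ε)).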

Some More Details: fix a distribution over the inputs for Alice and a distribution over the inputs for Bob, and consider the min-entropy of (the first edge of) Alice's input. A game whose distributions satisfy three such conditions is called a Nice Game. Lemma: for any nice game, the desired lower bound on the communication over the ε-noisy channel holds.

Proof: by induction on the depth. Consider the first block of transmitted bits, and how many of them were sent by Alice. After these bits are sent, fix the first edge and reveal the errors (if any), and focus on the remaining tree of smaller depth.
Case 1: the bits that Bob sent are wasted (using sub-additivity).
Case 2: Alice wasted bits.
In both cases, the remaining game is nice w.h.p.

Why it's not so easy:
1) ε is a constant, while the size of the game is unbounded, so we cannot afford any error that depends on that size.
2) With small probability, we get an un-nice game, so we must deal with un-nice games.
3) In the inductive formula, we completely ignored the error of the protocol.
4) We have to deal with both entropy and min-entropy in the same argument. We introduce a general way to deal with that (flattening a distribution).

Thank You!
