Interactive Channel Capacity
[Shannon 48]: A Mathematical Theory of Communication
An exact formula for the channel capacity of any noisy channel
ε-noisy channel: each bit is flipped independently with probability ε.
Alice wants to send n bits to Bob. They only have access to an ε-noisy channel.
How many bits does Alice need to send, so that Bob can retrieve the original n bits, with high probability?
[Figure: the binary symmetric channel. Each input bit (0 or 1) is delivered correctly with probability 1 − ε and flipped with probability ε.]
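The channel itself is easy to simulate; a minimal sketch (the function name `bsc` and its signature are my own, not from the talk):

```python
import random

def bsc(bits, eps, rng=random):
    """Send bits through the eps-noisy (binary symmetric) channel:
    each bit is flipped independently with probability eps."""
    return [b ^ (rng.random() < eps) for b in bits]
```

With ε = 0 the channel is noiseless, and with ε = 1/2 the output carries no information about the input.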
Channel Capacity [Shannon 48]:
1) ≈ n / (1 − H(ε)) bits are sufficient
2) ≈ n / (1 − H(ε)) bits are needed
Channel capacity: C = 1 − H(ε), where H is the binary entropy function.
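For the ε-noisy (binary symmetric) channel, Shannon's capacity 1 − H(ε) can be evaluated directly; a small numeric sketch (function names are my own):

```python
from math import log2

def binary_entropy(eps):
    """H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps)."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * log2(eps) - (1 - eps) * log2(1 - eps)

def shannon_capacity(eps):
    """Shannon capacity of the eps-noisy channel: 1 - H(eps).
    Sending n bits reliably costs about n / (1 - H(eps)) channel bits."""
    return 1.0 - binary_entropy(eps)
```

At ε = 1/2 the capacity is 0 (the output is pure noise); at ε = 0 it is 1.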
Communication Complexity [Yao 79]:
Alice gets x. Bob gets y. They need to compute f(x, y) (f is publicly known). How many bits do they need to communicate?
CC(f): the probabilistic CC of f (with negligible error for every input), with a shared random string.
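As a concrete example of a probabilistic protocol with a shared random string (not from the talk; the function name and parameters are my own), here is the classic protocol for Equality: Alice sends k random inner-product bits of her input, and Bob compares them against his own. It uses k bits of communication and has one-sided error at most 2^(−k):

```python
import random

def equality_protocol(x, y, k=20, seed=2024):
    """Shared-randomness protocol for Equality(x, y).
    Alice sends k inner-product (mod 2) bits of x with shared random
    strings; Bob compares against the same bits of y."""
    shared = random.Random(seed)   # the shared random string
    rs = [[shared.randrange(2) for _ in range(len(x))] for _ in range(k)]
    alice_msg = [sum(a * b for a, b in zip(x, r)) % 2 for r in rs]  # k bits sent
    bob_bits = [sum(a * b for a, b in zip(y, r)) % 2 for r in rs]
    return alice_msg == bob_bits   # Bob's output: "x == y?"
```

When x = y the protocol always answers correctly; when x ≠ y each round detects the difference with probability 1/2.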
CC over the ε-noisy channel: how many communication bits are needed to compute f over the ε-noisy channel?
CC_ε(f): the CC of f over the ε-noisy channel (with negligible error for every input), with a shared random string.
Interactive Channel Capacity: the best asymptotic ratio between the probabilistic CC of f and the CC of f over the ε-noisy channel.
(Note: n is not the input size.)
[Schulman 92] and follow-ups [Sch, BR, B, GMS, BK, BN]: simulation of any CC protocol in the presence of adversarial noise, with constant overhead. Hence, C(ε) = Ω(1).
[Shannon 48]: the non-interactive capacity is 1 − H(ε).
[Schulman 92]: Is C(ε) = 1 − Θ(H(ε))?
C(ε) = lim inf_{n → ∞}  min_{f : CC(f) = n}  ( n / CC_ε(f) )
[KR 13]: Upper Bound: C(ε) ≤ 1 − Ω(√(H(ε))). In particular, for small enough ε, C(ε) < 1 − H(ε) (with strict inequality): the interactive capacity is strictly smaller than Shannon's capacity.
Order of Communication Model: the order of communication in the protocol is pre-determined (i.e., non-adaptive)
(otherwise both players may try to send bits at the same time).
The main ideas seem to be valid for other communication models, but without the factor.
Order of Communication Models:
1) Pre-determined: at each time step, exactly one player sends a bit
2) Alternating: the players alternate in sending bits
3) Adaptive: if both send bits at the same time, these bits are lost
4) Two channels: each player sends a bit whenever she wants
[KR 13]: Lower Bound: C(ε) ≥ 1 − O(√(H(ε))), for the alternating communication model.
[Haeupler 14]: Lower Bound: For alternating communication
[Haeupler 14]: Lower Bound: for alternating communication, even in the adversarial case!!
[Haeupler 14]: Lower Bound: for the adversarial noise channel of [GHS 14].
For the adversarial noise channel of [GHS 14], the order of communication is not pre-determined.
Conjecture [H 14]: similar bounds for pre-determined order of communication are false, without some regularity assumption on the order of communication.
Upper Bound: we give a function that proves this, by proving a lower bound on its CC over the ε-noisy channel.
Pointer Jumping Game: a k-ary tree of depth d. Alice owns the odd layers, Bob owns the even layers.
Each player gets an edge going out of every node that she owns. Goal: find the leaf reached.
(deg = k, depth = d)
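The game can be sketched in a few lines (the path-tuple encoding of nodes and the function name are my own; Alice is taken to own the root's layer and thus moves first):

```python
def pointer_jumping(alice_edges, bob_edges, depth):
    """Walk from the root, at each layer following the out-edge
    owned by the current player (Alice at the root, then alternating).
    A node is encoded as the tuple of child indices on the path to it."""
    node = ()                                  # the root
    for layer in range(depth):
        edges = alice_edges if layer % 2 == 0 else bob_edges
        node = node + (edges[node],)           # follow that player's edge
    return node                                # the leaf reached

# depth-2 example on a binary tree (k = 2): Alice points the root
# to child 1, Bob points node (1,) to child 0.
leaf = pointer_jumping({(): 1}, {(1,): 0}, 2)
```

Solving the game noiselessly takes d·log k bits: the players simply announce their edges layer by layer.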
Pointer Jumping Game: our main result is a lower bound on the CC of the pointer jumping game over the ε-noisy channel.
Hence, C(ε) ≤ 1 − Ω(√(H(ε))).
High Level Idea: Alice starts by sending the first edge (log k bits). With probability ≈ ε log k, one of these bits was flipped.
Case I: Bob sends the next edge (log k bits). With probability ≈ ε log k these bits are wasted (since Bob had the wrong first edge). In expectation: ≈ ε log²k wasted bits.
High Level Idea: Alice starts by sending the first edge (log k bits). With probability ≈ ε log k, one of these bits was flipped.
Case II: Alice sends additional bits, to correct the first edge. She needs ≈ log(1/ε) bits to correct one error.
High Level Idea: Alice starts by sending the first edge (log k bits). With probability ≈ ε log k, one of these bits was flipped.
In both cases, ≈ log(1/ε) ≈ √(H(ε)) · log k bits were wasted (in expectation), for log k ≈ √(log(1/ε)/ε).
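A back-of-the-envelope check of how the two cases balance, under the choice log k ≈ √(log(1/ε)/ε) suggested by the argument (the function name and the specific ε below are mine):

```python
from math import log2, sqrt

def per_edge_waste(eps):
    """Balance the two cases by choosing log k = sqrt(log(1/eps)/eps)."""
    logk = sqrt(log2(1 / eps) / eps)
    case1 = eps * logk * logk   # Case I: a log k-bit edge is resent after an error
    case2 = log2(1 / eps)       # Case II: ~log(1/eps) bits to correct one error
    return logk, case1, case2

logk, c1, c2 = per_edge_waste(1e-3)
# both cases waste ~log(1/eps) bits per edge of log k bits, i.e. an
# overhead fraction of sqrt(eps * log(1/eps)) ~ sqrt(H(eps))
```

Algebraically, case1 / log k = ε·log k = √(ε log(1/ε)), which is where the √(H(ε)) loss in the capacity bound comes from.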
Some More Details:
a distribution over inputs for Alice
a distribution over inputs for Bob
the min-entropy of (the first edge of Alice)
Nice Game: 1) 2) 3)
Lemma: For any nice game,
Proof: By induction on the depth d.
Consider the first block of transmitted bits. Let t be the number of bits sent by Alice. After these bits are sent, fix the first edge and reveal the errors (if any). Focus on the remaining tree of smaller depth.
Case 1: t is large: the bits that Bob sent are wasted (using sub-additivity).
Case 2: t is small: Alice wasted bits.
In both cases, the remaining game is nice w.h.p.
Why it's not so easy:
1) ε is constant, while the depth d is unbounded, so we cannot afford any error that depends on d.
2) With small probability, we get an un-nice game, so we must deal with un-nice games.
3) In the inductive formula, we completely ignored the error of the protocol.
4) We have to deal with both entropy and min-entropy in the same argument. We introduce a general way to deal with that (flattening a distribution).
Thank You!