Entropy Power Inequalities: Results and Speculation

Venkat Anantharam
EECS Department, University of California, Berkeley

December 12, 2013
Workshop on Coding and Information Theory
Institute of Mathematical Research, The University of Hong Kong

Contributions involve joint work with Varun Jog

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 1 / 57



Page 2:

Outline

1 Setting the stage
2 The Brunn Minkowski inequality
3 Proofs of the EPI
4 Mrs. Gerber’s Lemma
5 Young’s inequality
6 Versions of the EPI for discrete random variables
7 EPI for Groups of Order 2^n
8 The Fourier transform on a finite abelian group
9 Young’s inequality on a finite abelian group
10 Brunn Minkowski inequality on a finite abelian group
11 Speculation


Page 4:

Differential entropy and entropy power

Let X be an Rn-valued random variable with density f .

The differential entropy of X is

h(X) := E [− log f (X)] .

The entropy power of X is

N(X) := (1/(2πe)) e^{(2/n) h(X)} .
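The definitions above can be checked numerically. A minimal sketch, assuming Python with numpy (the function names are mine, not from the slides): for X ∼ N(0, σ²Iₙ) the closed form h(X) = (n/2) log(2πeσ²) gives entropy power N(X) = σ², and a Monte Carlo estimate of E[−log f(X)] agrees with the closed form.

```python
# Hypothetical numeric check: for X ~ N(0, sigma2 * I_n), the entropy power
# equals the per-coordinate variance sigma2.
import math

import numpy as np

def gaussian_entropy(n, sigma2):
    """Differential entropy (nats) of N(0, sigma2 * I_n)."""
    return 0.5 * n * math.log(2 * math.pi * math.e * sigma2)

def entropy_power(n, h):
    """N(X) = (1/(2*pi*e)) * exp((2/n) * h(X))."""
    return math.exp(2.0 * h / n) / (2 * math.pi * math.e)

n, sigma2 = 3, 2.5
h = gaussian_entropy(n, sigma2)
assert abs(entropy_power(n, h) - sigma2) < 1e-12

# Monte Carlo estimate of h(X) = E[-log f(X)] from samples of X.
rng = np.random.default_rng(0)
x = rng.normal(0.0, math.sqrt(sigma2), size=(200_000, n))
log_f = -0.5 * n * math.log(2 * math.pi * sigma2) - (x ** 2).sum(axis=1) / (2 * sigma2)
h_mc = -log_f.mean()
```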


Page 8:

Shannon’s entropy power inequality

Theorem (Entropy Power Inequality)

If X and Y are independent Rn-valued random variables, then

e^{(2/n) h(X+Y)} ≥ e^{(2/n) h(X)} + e^{(2/n) h(Y)} ,

with equality if and only if X and Y are Gaussian with proportional covariance matrices.
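For Gaussian inputs the theorem can be verified in closed form. A sketch (assumption: numpy; covariance matrices are my own examples): for X ∼ N(0, K), h(X) = (1/2) log((2πe)ⁿ det K), so exp((2/n) h(X)) = 2πe · det(K)^{1/n}, and the EPI for Gaussians reduces to Minkowski’s determinant inequality det(K₁+K₂)^{1/n} ≥ det(K₁)^{1/n} + det(K₂)^{1/n}, with equality for proportional covariances.

```python
# Check the Gaussian case of the EPI via determinants.
import math

import numpy as np

def exp_2h_over_n(K):
    """exp((2/n) h(X)) for X ~ N(0, K)."""
    n = K.shape[0]
    return 2 * math.pi * math.e * np.linalg.det(K) ** (1.0 / n)

K1 = np.array([[2.0, 0.5], [0.5, 1.0]])    # hypothetical covariances
K2 = np.array([[1.0, -0.3], [-0.3, 3.0]])

lhs = exp_2h_over_n(K1 + K2)
rhs = exp_2h_over_n(K1) + exp_2h_over_n(K2)

# Equality when the covariances are proportional (K2 = 2 * K1):
lhs_p = exp_2h_over_n(K1 + 2.0 * K1)
rhs_p = exp_2h_over_n(K1) + exp_2h_over_n(2.0 * K1)
```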


Page 10:

Notable applications of the EPI-1

Page 11:

Notable applications of the EPI-2

Page 12:

Notable applications of the EPI-3

Page 13:

Notable applications of the EPI-4


Page 15:

The Minkowski sum of two subsets of Rn

Minkowski sum of a square and a circle

In general the Minkowski sum of two sets A,B ⊆ Rn is

A + B := {a + b : a ∈ A and b ∈ B} .

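The definition is easy to play with on finite point sets. An illustrative discrete sketch (my own example, not from the slides): the Minkowski sum of the vertex set of a unit square and a two-point horizontal segment in Z².

```python
# Minkowski sum of two finite point sets: A + B = {a + b : a in A, b in B}.
def minkowski_sum(A, B):
    return {(a[0] + b[0], a[1] + b[1]) for a in A for b in B}

square = {(x, y) for x in (0, 1) for y in (0, 1)}  # vertices of a unit square
segment = {(0, 0), (2, 0)}                         # a horizontal two-point segment

S = minkowski_sum(square, segment)
# S is the square's vertices plus a translate of them by (2, 0): 8 points.
```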


Page 18:

The Brunn Minkowski inequality

Let A, B ⊂ Rn be convex bodies (compact convex sets with nonempty interior). Then

Vol(A + B)^{1/n} ≥ Vol(A)^{1/n} + Vol(B)^{1/n} .

Equality holds iff A and B are homothetic, i.e. equal up to translation and dilation.

An equivalent form is that for convex bodies A, B ⊂ Rn and 0 < λ < 1 we have

Vol((1 − λ)A + λB)^{1/n} ≥ (1 − λ) Vol(A)^{1/n} + λ Vol(B)^{1/n} .

Another equivalent form is that for convex bodies A, B ⊂ Rn and 0 < λ < 1 we have

Vol((1 − λ)A + λB) ≥ min(Vol(A), Vol(B)) .
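A quick sanity check for a special case (my own example): for axis-aligned boxes A and B with side lengths aᵢ and bᵢ, the Minkowski sum A + B is the box with sides aᵢ + bᵢ, so Brunn-Minkowski reduces to superadditivity of the geometric mean, with equality for homothetic boxes.

```python
# Brunn-Minkowski for boxes: (prod(a_i + b_i))^(1/n) >= (prod a_i)^(1/n) + (prod b_i)^(1/n).
import math

def geo_mean(v):
    """Geometric mean, i.e. Vol(box)^(1/n) for a box with sides v."""
    return math.prod(v) ** (1.0 / len(v))

a = [1.0, 4.0, 2.0]   # hypothetical side lengths
b = [3.0, 1.0, 5.0]

lhs = geo_mean([ai + bi for ai, bi in zip(a, b)])
rhs = geo_mean(a) + geo_mean(b)

# Equality for homothetic boxes (b = t * a):
t = 2.5
lhs_h = geo_mean([ai + t * ai for ai in a])
rhs_h = geo_mean(a) + geo_mean([t * ai for ai in a])
```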


Page 24:

Parallels between the EPI and the Brunn Minkowski inequality

The distribution of a random variable corresponds to a (measurable) subset of Rn.

Addition of independent random variables corresponds to the Minkowski sum.

Balls play the role of spherically symmetric Gaussians.

For instance, an equivalent form of the EPI is that if X and Y are independent Rn-valued random variables, and 0 < λ < 1, then

h(√λ X + √(1 − λ) Y) ≥ λ h(X) + (1 − λ) h(Y) .
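For scalar Gaussians this equivalent form is transparent. A sketch (my own example): if X ∼ N(0, a) and Y ∼ N(0, b) are independent, then √λ X + √(1−λ) Y ∼ N(0, λa + (1−λ)b), and the inequality reduces to concavity of the logarithm.

```python
# Gaussian case of h(sqrt(lam) X + sqrt(1-lam) Y) >= lam h(X) + (1-lam) h(Y).
import math

def h_gauss(var):
    """Differential entropy (nats) of a scalar N(0, var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

a, b, lam = 1.0, 9.0, 0.3
lhs = h_gauss(lam * a + (1 - lam) * b)       # entropy of the combined Gaussian
rhs = lam * h_gauss(a) + (1 - lam) * h_gauss(b)
# lhs > rhs here (strictly, since a != b), by concavity of log.
```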



Page 30:

Some history

The EPI was conjectured by Shannon, who showed that X and Y being Gaussian with proportional covariance matrices is a stationary point of the difference between the entropy power of the sum and the sum of the entropy powers. This did not exclude the possibility of it being a local maximum or a saddle point.

Stam gave the first rigorous proof of the EPI in 1959, using de Bruijn’s identity, which relates Fisher information and differential entropy.

Blachman’s 1965 paper gives a simplified and succinct version of Stam’s proof.

Subsequently, several proofs have appeared, some of which will be mentioned here for our story.


Page 35:

Blachman’s (1965) proof strategy for the EPI, following Stam (1959)

Step 1: Prove de Bruijn’s identity for scalar random variables.

Let X_t = X + √t N, where N ∼ N(0, 1). Let X ∼ p(x), and X_t = X + √t N ∼ p_t(x).

De Bruijn’s Identity

d/dt h(X_t) = (1/2) J(X_t) = (1/2) ∫_{−∞}^{∞} ( (∂/∂x) p_t(x) )² / p_t(x) dx

The proof follows from differentiating

h(X_t) = − ∫_{−∞}^{∞} p_t(x) log p_t(x) dx ,

with respect to t, and using the heat equation

(∂/∂t) p_t(x) = (1/2) (∂²/∂x²) p_t(x) .
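De Bruijn’s identity can be checked in the Gaussian case, where everything is explicit. A numeric sketch (assumption: Gaussian initial law, my own parameter values): for X ∼ N(0, s), the flow gives X_t ∼ N(0, s + t), with h(X_t) = (1/2) log(2πe(s + t)) and J(X_t) = 1/(s + t), so d/dt h(X_t) = (1/2) J(X_t) can be verified by a finite difference.

```python
# Finite-difference check of de Bruijn's identity for a Gaussian initial law.
import math

def h_t(s, t):
    """h(X_t) for X ~ N(0, s): X_t = X + sqrt(t) N ~ N(0, s + t)."""
    return 0.5 * math.log(2 * math.pi * math.e * (s + t))

s, t, eps = 1.5, 0.7, 1e-6
dh_dt = (h_t(s, t + eps) - h_t(s, t - eps)) / (2 * eps)  # central difference
half_J = 0.5 / (s + t)                                   # (1/2) J(X_t) = 1/(2(s+t))
```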


Page 40:

Blachman’s proof strategy (contd.)

Step 2: Prove Stam’s inequality.

If X and Y are independent, and Z = X + Y, then

Stam’s Inequality

1/J(Z) ≥ 1/J(X) + 1/J(Y)

The proof goes by showing that

E( a p′_X(x)/p_X(x) + b p′_Y(y)/p_Y(y) | Z = z ) = (a + b) p′_Z(z)/p_Z(z) ,

for all a, b. Then squaring both sides, using Jensen’s inequality, and optimizing over the choice of a and b establishes the inequality.
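The Gaussian case shows the inequality is tight. A sketch (my own example): J(N(0, v)) = 1/v, and Z = X + Y has variance vx + vy, so 1/J(Z) = vx + vy = 1/J(X) + 1/J(Y), i.e. Stam’s inequality holds with equality exactly when both summands are Gaussian.

```python
# Stam's inequality saturates for independent Gaussians.
def J_gauss(v):
    """Fisher information of a scalar N(0, v)."""
    return 1.0 / v

vx, vy = 2.0, 3.0
lhs = 1.0 / J_gauss(vx + vy)                  # 1/J(Z), Z = X + Y ~ N(0, vx + vy)
rhs = 1.0 / J_gauss(vx) + 1.0 / J_gauss(vy)   # 1/J(X) + 1/J(Y)
# lhs == rhs == vx + vy in the Gaussian case.
```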


Page 44:

Blachman’s proof strategy (contd.)

Step 3: Set up the one-parameter flow defined by the heat equation, starting with the two initial distributions, with suitable time reparametrizations f(t), g(t) for each, respectively.

Let f(t) and g(t) be increasing functions tending to infinity with t, and consider the function

s(t) = ( e^{2h(X_f)} + e^{2h(Y_g)} ) / e^{2h(Z_{f+g})} .

Intuitively, lim_{t→∞} s(t) = 1, since both initial distributions become increasingly Gaussian as time progresses along the flow.

Choosing f′(t) = exp(2h(X_f)) and g′(t) = exp(2h(Y_g)), and using Stam’s inequality, gives s′(t) ≥ 0. This gives s(0) ≤ 1, proving the EPI.


Page 49: Entropy Power Inequalities: Results and Speculationghan/WCI/Venkat.pdf · V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 3 / 57 Di erential entropy and entropy power

Lieb’s EPI proof as interpreted by Verdu and Guo

Reparametrization of the de Bruijn identity gives the Guo-Shamai-Verdu I-MMSE relation:

d/dγ I(X; √γ X + N) = (1/2) mmse(X, γ),

and thus

h(X) = (1/2) log 2πe − (1/2) ∫₀^∞ [ 1/(1 + γ) − mmse(X, γ) ] dγ.

The EPI in its equivalent form,

h(X1 cos α + X2 sin α) ≥ cos²α h(X1) + sin²α h(X2),

can be proved using these ideas.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 19 / 57
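The integral representation can be checked numerically for a Gaussian input, where mmse(X, γ) = σ²/(1 + γσ²) in closed form. A small sketch, assuming SciPy is available (the function names here are ours, not from the slides):

```python
import math
from scipy.integrate import quad

# Illustrative sketch; names are ours, not from the slides.
def mmse_gaussian(var, g):
    # MMSE of estimating X ~ N(0, var) from sqrt(g)*X + N with N ~ N(0, 1)
    return var / (1 + g * var)

def h_via_immse(var):
    # h(X) = 0.5*log(2*pi*e) - 0.5 * int_0^inf [1/(1+g) - mmse(X, g)] dg
    integral, _ = quad(lambda g: 1.0 / (1 + g) - mmse_gaussian(var, g),
                       0, math.inf)
    return 0.5 * math.log(2 * math.pi * math.e) - 0.5 * integral

var = 0.3
exact = 0.5 * math.log(2 * math.pi * math.e * var)
assert abs(h_via_immse(var) - exact) < 1e-6
```

For var = 1 the integrand vanishes identically, recovering h = ½ log 2πe directly.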

Verdu and Guo’s proof: Proof strategy

The EPI in the form h(X1 cos α + X2 sin α) ≥ cos²α h(X1) + sin²α h(X2) follows from the integral representation, since

h(X1 cos α + X2 sin α) − cos²α h(X1) − sin²α h(X2)
= (1/2) ∫₀^∞ ( mmse(X1 cos α + X2 sin α, γ) − cos²α mmse(X1, γ) − sin²α mmse(X2, γ) ) dγ.

Consider Z1 = √γ X1 + N1, Z2 = √γ X2 + N2, and Z = Z1 cos α + Z2 sin α. Given (Z1, Z2) the estimation problem decouples, so

mmse(X1 cos α + X2 sin α | Z) ≥ mmse(X1 cos α + X2 sin α | Z1, Z2) = cos²α mmse(X1, γ) + sin²α mmse(X2, γ)

immediately gives that the term inside the integral is ≥ 0.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 20 / 57

Szarek and Voiculescu’s proof of the EPI

Szarek and Voiculescu (1996) give a proof of the EPI working directly with typical sets, using a ‘restricted’ version of the Brunn-Minkowski inequality.

The restricted Minkowski sum of sets A and B, based on Θ ⊆ A × B, is defined as

A +_Θ B := { x + y : (x, y) ∈ Θ }.

Szarek and Voiculescu’s restricted B-M inequality says: for every ε > 0, there exists δ > 0 such that if Θ is large enough, viz. V_{2n}(Θ) ≥ (1 − δ)^n V_n(A) V_n(B), then

V_n(A +_Θ B)^{2/n} ≥ (1 − ε)( V_n(A)^{2/n} + V_n(B)^{2/n} ).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 21 / 57
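The restricted Minkowski sum is straightforward to illustrate on finite sets; a small sketch (the set-of-integers setting is ours, chosen only for illustration):

```python
# Illustrative sketch; setting and names are ours, not from the slides.
# Restricted Minkowski sum: A +_Theta B = {x + y : (x, y) in Theta}.
def restricted_sum(theta):
    return {x + y for (x, y) in theta}

A = {0, 1, 2}
B = {0, 10}

# Theta = A x B recovers the ordinary Minkowski sum A + B.
full = {(x, y) for x in A for y in B}
assert restricted_sum(full) == {0, 1, 2, 10, 11, 12}

# A proper restriction Theta subset of A x B can only shrink the sum.
theta = {(0, 0), (1, 0), (2, 10)}
assert restricted_sum(theta) == {0, 1, 12}
assert restricted_sum(theta) <= restricted_sum(full)
```

With Θ = A × B the ordinary sum is recovered; the content of the restricted B-M inequality is that a Θ occupying almost all of A × B already forces almost the full Brunn-Minkowski volume bound.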

Szarek and Voiculescu’s proof (contd.)

To prove the EPI, Szarek and Voiculescu replace A and B by typical sets for n i.i.d. copies of X and Y respectively, and define Θ as all pairs (x^n, y^n) (each marginally typical) such that x^n + y^n is typical for X + Y.

For this choice of restriction, they determine a suitable sequence of typicality-defining constants (ε_n, δ_n) going to 0, thus proving the EPI.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 22 / 57

Outline

1 Setting the stage

2 The Brunn Minkowski inequality

3 Proofs of the EPI

4 Mrs. Gerber’s Lemma

5 Young’s inequality

6 Versions of the EPI for discrete random variables

7 EPI for Groups of Order 2^n

8 The Fourier transform on a finite abelian group

9 Young’s inequality on a finite abelian group

10 Brunn Minkowski inequality on a finite abelian group

11 Speculation

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 23 / 57

Mrs. Gerber’s Lemma for Gaussian random variables

(n/2) log(e^{2x/n} + e^{2y/n}) is the minimum achievable differential entropy of X + Y, where X and Y are independent R^n-valued random variables with differential entropies x and y respectively.

Note that

(x, y) ↦ (n/2) log(e^{2x/n} + e^{2y/n})

is convex in (x, y).

In particular, for fixed x,

y ↦ (n/2) log(e^{2x/n} + e^{2y/n})

is convex in y, and for fixed y,

x ↦ (n/2) log(e^{2x/n} + e^{2y/n})

is convex in x.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 24 / 57
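The claimed joint convexity holds because (u, v) ↦ log(e^u + e^v) is the convex log-sum-exp function composed with the affine maps x ↦ 2x/n, y ↦ 2y/n. A quick midpoint-convexity spot check in Python (our own sketch, not from the slides):

```python
import math, random

# Illustrative sketch; names are ours, not from the slides.
def F(x, y, n=1):
    # (n/2) * log(e^{2x/n} + e^{2y/n}): log-sum-exp composed with affine maps,
    # hence jointly convex in (x, y).
    return (n / 2) * math.log(math.exp(2 * x / n) + math.exp(2 * y / n))

random.seed(0)
for _ in range(1000):
    x1, y1, x2, y2 = (random.uniform(-5, 5) for _ in range(4))
    mid = F((x1 + x2) / 2, (y1 + y2) / 2)
    # midpoint convexity: F(midpoint) <= average of endpoint values
    assert mid <= (F(x1, y1) + F(x2, y2)) / 2 + 1e-12
```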

Outline

1 Setting the stage

2 The Brunn Minkowski inequality

3 Proofs of the EPI

4 Mrs. Gerber’s Lemma

5 Young’s inequality

6 Versions of the EPI for discrete random variables

7 EPI for Groups of Order 2^n

8 The Fourier transform on a finite abelian group

9 Young’s inequality on a finite abelian group

10 Brunn Minkowski inequality on a finite abelian group

11 Speculation

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 25 / 57

Young’s inequality

The Hausdorff-Young inequality gives an estimate on the norm of the Fourier transform: ‖Ff‖_{p′} ≤ ‖f‖_p, for 1 ≤ p ≤ 2.

The sharp constant in the inequality was found by Beckner:

‖Ff‖_{p′} ≤ C_p ‖f‖_p, where C_p := ( |p|^{1/p} / |p′|^{1/p′} )^{1/2}.

This leads to Young’s inequality. If p, q, r > 0 satisfy 1/p + 1/q = 1 + 1/r, then

‖f ∗ g‖_r ≤ (C_p C_q / C_r) ‖f‖_p ‖g‖_q, if p, q, r ≥ 1,
‖f ∗ g‖_r ≥ (C_p C_q / C_r) ‖f‖_p ‖g‖_q, if 1 ≥ p, q, r.

The second half of this is called the reverse Young’s inequality.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 26 / 57
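For Gaussian densities every quantity in Young’s inequality is available in closed form (‖φ_σ‖_p = (2πσ²)^{(1−p)/(2p)} p^{−1/(2p)}, and φ_{σ1} ∗ φ_{σ2} = φ_√(σ1²+σ2²)), which makes the sharp inequality easy to spot-check. A sketch with our own helper names:

```python
import math

# Illustrative sketch; names are ours, not from the slides.
def gauss_lp_norm(sigma, p):
    # ||phi_sigma||_p in closed form for the N(0, sigma^2) density
    return (2 * math.pi * sigma**2) ** ((1 - p) / (2 * p)) * p ** (-1 / (2 * p))

def beckner_C(p):
    # C_p = (p^{1/p} / p'^{1/p'})^{1/2}, with p' the conjugate exponent (p > 1)
    pp = p / (p - 1)
    return math.sqrt(p ** (1 / p) / pp ** (1 / pp))

# p = q = 4/3 forces r = 2 via 1/p + 1/q = 1 + 1/r.
p = q = 4.0 / 3.0
r = 2.0
s1, s2 = 1.0, 2.5
lhs = gauss_lp_norm(math.sqrt(s1**2 + s2**2), r)   # ||f * g||_r, convolution is Gaussian
rhs = (beckner_C(p) * beckner_C(q) / beckner_C(r)) * \
      gauss_lp_norm(s1, p) * gauss_lp_norm(s2, q)
assert lhs <= rhs + 1e-12
```

Equality in the sharp inequality is attained by Gaussians with suitably matched variances; for generic (s1, s2) the inequality is strict.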

EPI from Young’s inequality

For p > 1, h_p(X) := (p/(1 − p)) log ‖f‖_p is called the Renyi entropy of X ∼ f.

N_p(X) := (1/2π) p^{−p′/p} ‖f‖_p^{−2p′/n} is the Renyi entropy power of X ∼ f.

As p → 1 these converge respectively to the differential entropy and the entropy power of X.

Given r > 1 and 0 < λ < 1, define p(r) := r/((1 − λ) + λr) and q(r) := r/(λ + (1 − λ)r).

Young’s inequality gives

N_r(X + Y) ≥ ( N_p(X)/(1 − λ) )^{1−λ} ( N_q(Y)/λ )^λ.

Optimize over λ and let r → 1 to get the EPI.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 27 / 57
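A convenient check of the definitions: for X ∼ N(0, σ²) on R, the closed-form L_p norm of the Gaussian density makes N_p(X) = σ² for every p > 1, consistent with the p → 1 limit being the entropy power e^{2h(X)}/(2πe) = σ². A sketch (helper names are ours):

```python
import math

# Illustrative sketch; names are ours, not from the slides.
def gauss_lp_norm(sigma, p):
    # ||phi_sigma||_p for the N(0, sigma^2) density, in closed form
    return (2 * math.pi * sigma**2) ** ((1 - p) / (2 * p)) * p ** (-1 / (2 * p))

def renyi_entropy_power(sigma, p, n=1):
    # N_p(X) = (1/(2*pi)) * p^{-p'/p} * ||f||_p^{-2 p'/n}, with p' = p/(p-1)
    pp = p / (p - 1)
    return (1 / (2 * math.pi)) * p ** (-pp / p) * \
           gauss_lp_norm(sigma, p) ** (-2 * pp / n)

# For a Gaussian, the p-dependence cancels exactly: N_p(X) = sigma^2 for all p > 1,
# so in particular the p -> 1 limit is the usual entropy power.
sigma = 1.7
for p in (1.01, 1.5, 2.0, 5.0):
    assert abs(renyi_entropy_power(sigma, p) - sigma**2) < 1e-9
```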

Brunn Minkowski inequality from the reverse Young’s inequality

The limit as r → 0 in the reverse Young’s inequality leads to the Prekopa-Leindler inequality: if 0 < λ < 1 and f, g, h are nonnegative integrable functions on R^n satisfying

h((1 − λ)x + λy) ≥ f(x)^{1−λ} g(y)^λ

for all x, y ∈ R^n, then

∫_{R^n} h(x) dx ≥ ( ∫_{R^n} f(x) dx )^{1−λ} ( ∫_{R^n} g(x) dx )^λ.

Setting f := 1_A, g := 1_B, and h := 1_{(1−λ)A+λB}, this gives

Vol((1 − λ)A + λB) ≥ Vol(A)^{1−λ} Vol(B)^λ ≥ min(Vol(A), Vol(B)).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 28 / 57
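For axis-aligned boxes the multiplicative Brunn-Minkowski inequality reduces to AM-GM applied per axis, since (1 − λ)A + λB is again a box with side lengths (1 − λ)a_i + λb_i. A quick numerical illustration (our own sketch):

```python
import math, random

# Illustrative sketch; names are ours, not from the slides.
def box_vol(sides):
    # volume of an axis-aligned box with the given side lengths
    v = 1.0
    for s in sides:
        v *= s
    return v

random.seed(1)
for _ in range(200):
    n = 3
    a = [random.uniform(0.1, 5) for _ in range(n)]
    b = [random.uniform(0.1, 5) for _ in range(n)]
    lam = random.random()
    # (1-lam)A + lam B is the box with per-axis mixed side lengths
    mix = [(1 - lam) * ai + lam * bi for ai, bi in zip(a, b)]
    lhs = box_vol(mix)
    rhs = box_vol(a) ** (1 - lam) * box_vol(b) ** lam
    assert lhs >= rhs - 1e-9                                  # Brunn-Minkowski
    assert rhs >= min(box_vol(a), box_vol(b)) - 1e-9          # geometric-mean bound
```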

Outline

1 Setting the stage

2 The Brunn Minkowski inequality

3 Proofs of the EPI

4 Mrs. Gerber’s Lemma

5 Young’s inequality

6 Versions of the EPI for discrete random variables

7 EPI for Groups of Order 2^n

8 The Fourier transform on a finite abelian group

9 Young’s inequality on a finite abelian group

10 Brunn Minkowski inequality on a finite abelian group

11 Speculation

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 29 / 57

Mrs. Gerber’s lemma

Let h be the binary entropy function, and h⁻¹ : [0, log 2] → [0, 1/2] its inverse. Wyner and Ziv (1973) showed that for a binary X and arbitrary U, if Z ∼ Bern(p) is independent of (X, U), then

H(X ⊕ Z | U) ≥ h(h⁻¹(H(X | U)) ⋆ p),

where a ⋆ b := a(1 − b) + (1 − a)b. The result follows immediately from the following lemma, using Jensen’s inequality.

Lemma
The function x ↦ h(h⁻¹(x) ⋆ p) is convex on 0 ≤ x ≤ log 2.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 30 / 57
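The lemma’s convexity claim is easy to spot-check numerically; a sketch with our own helper names (entropy in nats, h⁻¹ computed by bisection):

```python
import math

# Illustrative sketch; names are ours, not from the slides.
def h(p):
    # binary entropy in nats
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def h_inv(x, tol=1e-12):
    # inverse of h on [0, 1/2], by bisection (h is increasing there)
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(mid) < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def star(a, b):
    # binary convolution: a * (1-b) + (1-a) * b
    return a * (1 - b) + (1 - a) * b

p = 0.11
f = lambda x: h(star(h_inv(x), p))

# midpoint-convexity check of x -> h(h^{-1}(x) * p) on (0, log 2)
xs = [0.01 + i * (math.log(2) - 0.02) / 50 for i in range(51)]
for x1 in xs[::5]:
    for x2 in xs[::5]:
        assert f((x1 + x2) / 2) <= (f(x1) + f(x2)) / 2 + 1e-9
```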

Wyner and Ziv’s proof of Mrs. Gerber’s lemma

Introduce the parametrization h⁻¹(x) = (1 − α)/2, let p = (1 − a)/2, and differentiate with respect to α, to get that

f″(x) ≥ 0 ⟺ a(1 − α²) log((1 + α)/(1 − α)) ≤ (1 − a²α²) log((1 + aα)/(1 − aα)),

which can be verified using the series expansion of log((1 + x)/(1 − x)).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 31 / 57

Shamai and Wyner’s EPI

Shamai and Wyner show the following.

EPI for binary sequences
For independent binary stationary processes X and Y with entropy rates H(X) and H(Y),

σ(Z) ≥ σ(X) ⋆ σ(Y),

where Z = X ⊕ Y and σ(X) = h⁻¹(H(X)), σ(Y) = h⁻¹(H(Y)), σ(Z) = h⁻¹(H(Z)).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 32 / 57

Shamai and Wyner’s proof strategy for the binary EPI

H(Z_0 | Z_{-n}^{-1}) ≥ H(Z_0 | Z_{-n}^{-1}, X_{-n}^{-1}, Y_{-n}^{-1}) = H(Z_0 | X_{-n}^{-1}, Y_{-n}^{-1})
= Σ_{x,y} P(X_{-n}^{-1} = x) P(Y_{-n}^{-1} = y) H(Z_0 | X_{-n}^{-1} = x, Y_{-n}^{-1} = y)
= Σ_{x,y} P(X_{-n}^{-1} = x) P(Y_{-n}^{-1} = y) h(α(x) ⋆ β(y)),

where α(x) = P(X_0 = 1 | X_{-n}^{-1} = x) and β(y) = P(Y_0 = 1 | Y_{-n}^{-1} = y).

Using convexity in MGL twice, summing over x and then y, we get

H(Z_0 | Z_{-n}^{-1}) ≥ h(h⁻¹(H(X_0 | X_{-n}^{-1})) ⋆ h⁻¹(H(Y_0 | Y_{-n}^{-1}))).

Taking n → ∞ proves the result.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 33 / 57

Harremoes’s EPI for the Binomial family

Let X_n ∼ Binomial(n, 1/2), with X_n and X_m independent. Then for all m, n ≥ 1,

e^{2H(X_n + X_m)} ≥ e^{2H(X_n)} + e^{2H(X_m)}.

Since X_n + X_m ∼ Binomial(n + m, 1/2), the proof follows by showing that Y_n = e^{2H(X_n)}/n is a nondecreasing sequence, which makes e^{2H(X_n)} = nY_n superadditive, i.e. e^{2H(X_{m+n})} ≥ e^{2H(X_m)} + e^{2H(X_n)}.

Figure: Values of Y_n

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 34 / 57
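The monotonicity of Y_n (the quantity plotted in the figure) is easy to reproduce numerically from the exact Binomial(n, 1/2) pmf; a sketch (helper names are ours):

```python
import math

# Illustrative sketch; names are ours, not from the slides.
def binom_entropy(n):
    # H of Binomial(n, 1/2) in nats, from the exact pmf via log-binomials
    logs = [math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            - n * math.log(2) for k in range(n + 1)]
    return -sum(math.exp(lp) * lp for lp in logs)

Y = {n: math.exp(2 * binom_entropy(n)) / n for n in range(1, 31)}

# Y_n is nondecreasing, hence n*Y_n = e^{2H(X_n)} is superadditive,
# which is the claimed EPI since X_n + X_m ~ Binomial(n + m, 1/2).
assert all(Y[n + 1] >= Y[n] - 1e-9 for n in range(1, 30))
for n, m in [(1, 1), (3, 5), (10, 17)]:
    assert math.exp(2 * binom_entropy(n + m)) >= \
           math.exp(2 * binom_entropy(n)) + math.exp(2 * binom_entropy(m)) - 1e-9
```

Note Y_1 = Y_2 = 4 exactly; the sequence is strictly increasing afterwards, approaching πe/2.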


Our approach

Suppose the random variables take values in a finite abelian group G. We define the function f_G : [0, log |G|] × [0, log |G|] → [0, log |G|] by

f_G(x, y) = min_{H(X) = x, H(Y) = y} H(X + Y),

where the minimum is over independent G-valued X and Y. We now ask:

Does f_G have a succinct description?

Does f_G have any nice properties?

We try to exploit the group structure to answer these questions.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 35 / 57
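A brute-force sketch (not the authors' method) of what f_G looks like: sampling random distribution pairs on Z_3 traces out an outer envelope of f_{Z_3}, and illustrates the elementary bound f_G(x, y) ≥ max(x, y), which follows from H(X + Y) ≥ H(X + Y | Y) = H(X) on any group.

```python
import math, random

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def cyclic_convolve(p, q):
    """Distribution of X + Y (mod n) for independent X ~ p, Y ~ q on Z_n."""
    n = len(p)
    return [sum(p[i] * q[(k - i) % n] for i in range(n)) for k in range(n)]

def random_dist(n, rng):
    w = [rng.random() for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(0)
for _ in range(200):
    p, q = random_dist(3, rng), random_dist(3, rng)
    # Each sample gives a feasible point (H(X), H(Y), H(X+Y)) above the graph
    # of f_{Z_3}; the convolution entropy can never drop below max(H(X), H(Y)).
    assert entropy(cyclic_convolve(p, q)) >= max(entropy(p), entropy(q)) - 1e-12
```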


Mimicking Blachman's proof?

Find a 1-parameter group to evolve the probability distributions towards the uniform distribution on the group?

Perhaps the analogs of the Gaussian distributions are the distributions of the form

p(0) = 1 − ε,  p(g) = ε / (|G| − 1) for g ≠ 0,  0 ≤ ε ≤ (|G| − 1)/|G|?

But these are not extremal for the EPI. For instance, for G = Z_2 ⊕ Z_2, if ε is chosen so that h(ε) + ε log 3 = log 2 (note that ε < 3/4), then we have

h(1 − 2ε + (4/3)ε²) + 2ε(1 − (2/3)ε) log 3 > log 2.

However, for the uniform distribution on a subgroup of order 2, which also has entropy log 2, the convolution with itself has entropy log 2.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 36 / 57
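The Z_2 ⊕ Z_2 counterexample can be verified numerically: solve h(ε) + ε log 3 = log 2 by bisection (the left-hand side is increasing on (0, 3/4)), then evaluate the entropy of the self-convolution using the closed form on the slide.

```python
import math

def h(p):
    """Binary entropy in nats."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log(p) - (1 - p) * math.log(1 - p)

# Solve h(eps) + eps*log 3 = log 2 on (0, 3/4), where the LHS is increasing.
lo, hi = 1e-9, 0.75
for _ in range(200):
    mid = (lo + hi) / 2
    if h(mid) + mid * math.log(3) < math.log(2):
        lo = mid
    else:
        hi = mid
eps = (lo + hi) / 2

# Entropy of p * p for p(0) = 1 - eps, p(g) = eps/3 otherwise, on Z_2 + Z_2:
# (p*p)(0) = 1 - 2 eps + (4/3) eps^2, and the three nonzero elements share
# the remaining mass 2 eps (1 - (2/3) eps) equally.
H_conv = h(1 - 2 * eps + (4 / 3) * eps ** 2) + 2 * eps * (1 - (2 / 3) * eps) * math.log(3)
assert H_conv > math.log(2)  # strictly above the subgroup-uniform's log 2
```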


Mimicking Lieb's proof?

Given group-valued random variables X_1 and X_2, what is the analog of X_1 cos α + X_2 sin α?

There is a version of Stam's inequality for finite abelian groups (Gibilisco and Isola, 2008). Given a set of generators Γ := {γ_1, γ_2, …, γ_m} for the group G, define the "Fisher information" of a random variable X by

J(X) := ∑_{γ ∈ Γ} ∑_{g ∈ G} ((p_X(g) − p_X(γ^{-1}g)) / p_X(g))² p_X(g).

Then one has 1/J(X_1 + X_2) ≥ 1/J(X_1) + 1/J(X_2) for independent G-valued random variables X_1 and X_2.

However, this notion of "Fisher information" merely mimics formal properties of the Fisher information from the continuous case. It seems to have little to do with estimation, which, for finite groups, ought to refer to likelihood-ratio-type quantities.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 37 / 57
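The slide's J(X) is straightforward to compute. A minimal sketch on Z_n (group written additively, so γ^{-1}g becomes g − γ mod n; assumes p is strictly positive), verifying only two sure properties: J ≥ 0, with J = 0 exactly at the uniform distribution.

```python
import math

def fisher_information(p, generators, n):
    """Gibilisco-Isola-style "Fisher information" of a distribution p on Z_n.
    A sketch of the slide's formula, not the authors' code; assumes p > 0."""
    total = 0.0
    for gamma in generators:
        for g in range(n):
            # ((p(g) - p(g - gamma)) / p(g))^2 * p(g) simplifies to the term below.
            total += (p[g] - p[(g - gamma) % n]) ** 2 / p[g]
    return total

n = 5
assert abs(fisher_information([1 / n] * n, [1], n)) < 1e-12  # uniform: J = 0
assert fisher_information([0.5, 0.2, 0.1, 0.1, 0.1], [1], n) > 0
```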


Mimicking Szarek and Voiculescu's proof?

This is a promising direction.

The question remains: what is the correct analog of the restricted Minkowski sum?

The key lemma driving the proof of Szarek and Voiculescu appears to be Euclidean in character.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 38 / 57


Outline
1 Setting the stage
2 The Brunn Minkowski inequality
3 Proofs of the EPI
4 Mrs. Gerber's Lemma
5 Young's inequality
6 Versions of the EPI for discrete random variables
7 EPI for Groups of Order 2^n
8 The Fourier transform on a finite abelian group
9 Young's inequality on a finite abelian group
10 Brunn Minkowski inequality on a finite abelian group
11 Speculation

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 39 / 57

Our results

For a group G of order 2^n, we explicitly describe f_G in terms of f_{Z_2}.

We also describe the minimum-entropy-achieving distributions on G; these distributions are analogs of Gaussians in this sense.

The f_G(x, y) function obtained has the property that it is convex in x for a fixed y. This is yet another version of Mrs. Gerber's Lemma.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 40 / 57


Statement of our results

Theorem. f_G depends only on the size of G, and is denoted by f_{2^n}, where

f_{2^n}(x, y) = f_2(x − k log 2, y − k log 2) + k log 2,  if k log 2 ≤ x, y ≤ (k + 1) log 2,
f_{2^n}(x, y) = max(x, y)  otherwise.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 41 / 57
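Combining the theorem with the explicit formula f_2(x, y) = h(h^{-1}(x) ⋆ h^{-1}(y)) given later in the deck, the result can be turned into code. A sketch, not the authors' implementation: entropies are in nats, h^{-1} is taken in [0, 1/2], and the handling of block boundaries (x or y an exact multiple of log 2) is a choice made here.

```python
import math

LOG2 = math.log(2)

def h(p):
    """Binary entropy (nats)."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log(p) - (1 - p) * math.log(1 - p)

def h_inv(x):
    """Inverse of h on [0, 1/2], by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if h(mid) < x else (lo, mid)
    return (lo + hi) / 2

def f2(x, y):
    p, q = h_inv(x), h_inv(y)
    return h(p * (1 - q) + q * (1 - p))  # h(p * q), * the binary convolution

def f2n(x, y, n):
    """The theorem's formula for a group of order 2^n (entropies in nats)."""
    assert 0 <= x <= n * LOG2 and 0 <= y <= n * LOG2
    kx, ky = int(x / LOG2), int(y / LOG2)
    if kx == ky:  # same block: k log 2 <= x, y <= (k+1) log 2
        k = kx
        return f2(x - k * LOG2, y - k * LOG2) + k * LOG2
    return max(x, y)

# Two subgroup-uniform entropies log 2 give back log 2; different blocks give max.
assert abs(f2n(LOG2, LOG2, 3) - LOG2) < 1e-9
assert abs(f2n(0.3, 1.5, 3) - 1.5) < 1e-9
```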



Visualizing our results

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 42 / 57


Visualizing our results (contd.)

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 43 / 57

A conjecture

Generalized Mrs. Gerber's Lemma. If G is a finite group, then f_G(x, y) is convex in x for a fixed y, and by symmetry convex in y for a fixed x.

If one is less optimistic, one might make this conjecture only for finite abelian groups.

Simulations for Z_3 and Z_5 seem to support this conjecture.

We saw already that f_{2^n} satisfies the conjecture.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 44 / 57


EPI for abelian groups of order 2^n: proof strategy

First observe that for Z_2 the function f_{Z_2}(x, y) is explicitly given by

f_{Z_2}(x, y) = h(h^{-1}(x) ⋆ h^{-1}(y)).

The proof proceeds by first proving the EPI for Z_4, using convexity properties of f_{Z_2} and concavity of entropy.

We then use induction, where we assume that the result holds for all groups of size 2^n and prove it for groups of size 2^{n+1}.

The induction part of the proof has almost exactly the same structure as the proof for Z_4. Thus proving the result for Z_4 is the key step in our proof.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 45 / 57
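The convexity of f_{Z_2} invoked here (Mrs. Gerber's Lemma: x ↦ h(h^{-1}(x) ⋆ h^{-1}(y)) is convex for fixed y) can be sanity-checked numerically via second differences on a grid; a sketch in nats, with h^{-1} taken in [0, 1/2].

```python
import math

def h(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log(p) - (1 - p) * math.log(1 - p)

def h_inv(x):
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if h(mid) < x else (lo, mid)
    return (lo + hi) / 2

def f_z2(x, y):
    p, q = h_inv(x), h_inv(y)
    return h(p * (1 - q) + q * (1 - p))

# Convexity in x for fixed y: second differences on a grid must be nonnegative.
log2 = math.log(2)
dx = log2 / 200
for y in [0.1, 0.3, 0.5, 0.65]:
    vals = [f_z2(k * dx, y) for k in range(201)]
    assert all(vals[k - 1] + vals[k + 1] - 2 * vals[k] >= -1e-9
               for k in range(1, 200))
```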


Proof for Z_4

For Z_4, splitting the support of distributions on Z_4 into two parts, {0, 2} (even part) and {1, 3} (odd part), and using precisely two ingredients, concavity of entropy and convexity in MGL, we arrive at the lower bound

Lower bound: f_4(x, y) ≥ min_{u,v} [f_2(u, v) + f_2(x − u, y − v)].

To evaluate the minimum, we prove some additional properties of f_{Z_2}, specifically regarding its behavior along lines through the origin. The key lemma we use is:

Lemma. ∂f_{Z_2}/∂x (and, by symmetry, ∂f_{Z_2}/∂y) strictly decreases along lines through the origin.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 46 / 57


Proving the lemma

We use the parametrization h^{-1}(x) = p and h^{-1}(y) = q. Our strategy involves brute-force differentiation, and a series of steps where we conclude that proving the lemma is equivalent to proving

F(p) = p²(log p)² − (1 − p)²(log(1 − p))² + (1 − 2p)(log p · log(1 − p) + p log p + (1 − p) log(1 − p)) ≤ 0.

We show that proving F(p) ≤ 0 is the same as proving (d⁵/dp⁵)F(p) ≤ 0; we differentiate F five times, use a polynomial approximation of log, and finally use Sturm's theorem to prove that the resulting polynomial is non-positive.

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 47 / 57
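A quick numerical sanity check of F(p) ≤ 0, not a proof: since p = h^{-1}(x) here lies in [0, 1/2], the inequality is checked on a grid over (0, 1/2] (the restriction to (0, 1/2] is an assumption drawn from the parametrization, and F changes sign past p = 1/2).

```python
import math

def F(p):
    lp, lq = math.log(p), math.log(1 - p)
    return (p ** 2 * lp ** 2 - (1 - p) ** 2 * lq ** 2
            + (1 - 2 * p) * (lp * lq + p * lp + (1 - p) * lq))

# Check F <= 0 on a grid over (0, 1/2]; the bound is very tight near p = 1/2,
# where F(1/2) = 0 exactly.
for k in range(1, 501):
    p = k / 1000
    assert F(p) <= 1e-9, (p, F(p))
```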


Outline
1 Setting the stage
2 The Brunn Minkowski inequality
3 Proofs of the EPI
4 Mrs. Gerber's Lemma
5 Young's inequality
6 Versions of the EPI for discrete random variables
7 EPI for Groups of Order 2^n
8 The Fourier transform on a finite abelian group
9 Young's inequality on a finite abelian group
10 Brunn Minkowski inequality on a finite abelian group
11 Speculation

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 48 / 57

Norm of the Fourier transform on a finite abelian group

Every finite abelian group G has a dual group Ĝ comprised of the characters of G, i.e. the homomorphisms from G to the unit circle in the complex plane.

Ĝ has the same cardinality and structure as G, and G can be thought of as the dual of Ĝ.

The Fourier transform on the group is the map from functions on G to functions on Ĝ given by F(f)(γ) := |G|^{-1/2} ∑_g γ(g) f(g).

With counting measure on G and Ĝ, we have, for p, q > 0, F : L^p(G) → L^q(Ĝ). What is the norm of this map?

Resolved by Gilbert and Rzeszotnik (2010).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 49 / 57
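For G = Z_n the characters are g ↦ exp(2πi k g / n), and the map above is the unitarily normalized DFT. A minimal sketch verifying two easy endpoints of the norm question: the L² → L² norm is 1 (Parseval), and |F(f)(γ)| ≤ |G|^{-1/2} ||f||₁ pointwise.

```python
import cmath, math

def fourier(f):
    """Fourier transform on Z_n with the |G|^{-1/2} normalization;
    the characters of Z_n are g -> exp(2*pi*i*k*g/n), k = 0, ..., n-1."""
    n = len(f)
    return [sum(f[g] * cmath.exp(2j * math.pi * k * g / n) for g in range(n))
            / math.sqrt(n) for k in range(n)]

def lp_norm(f, p):
    return sum(abs(x) ** p for x in f) ** (1 / p)

f = [1.0, -2.0, 0.5, 3.0, 0.25]
Ff = fourier(f)

# With this normalization F is unitary, so the L^2 -> L^2 norm is 1 (Parseval).
assert abs(lp_norm(f, 2) - lp_norm(Ff, 2)) < 1e-9

# Pointwise bound: |F(f)(k)| <= |G|^{-1/2} * ||f||_1 for every character k.
assert max(abs(x) for x in Ff) <= lp_norm(f, 1) / math.sqrt(len(f)) + 1e-12
```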


|G |−1/2

∑g γ(g)f (g).

With counting measure on G and G , we have, for p, q > 0,F : Lp(G ) 7→ Lq(G ). What is the norm of this map?

Resolved by Gilbert and Rzeszotnik (2010).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 49 / 57

Page 145: Entropy Power Inequalities: Results and Speculationghan/WCI/Venkat.pdf · V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 3 / 57 Di erential entropy and entropy power

Norm of the Fourier transform on a finite abelian

group

Every finite abelian group G has a dual group G comprised ofthe characters of G , i.e. the homomorphisms from G to the unitcircle in the complex plane.

G has the same cardinality and structure as G and G can bethought of as the dual of G .

The Fourier transform on the group is the map from functions onG to functions on G given by F(f )(γ) := 1

|G |−1/2

∑g γ(g)f (g).

With counting measure on G and G , we have, for p, q > 0,F : Lp(G ) 7→ Lq(G ). What is the norm of this map?

Resolved by Gilbert and Rzeszotnik (2010).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 49 / 57

Page 146: Entropy Power Inequalities: Results and Speculationghan/WCI/Venkat.pdf · V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 3 / 57 Di erential entropy and entropy power

Norm of the Fourier transform on a finite abelian

group

Every finite abelian group G has a dual group G comprised ofthe characters of G , i.e. the homomorphisms from G to the unitcircle in the complex plane.

G has the same cardinality and structure as G and G can bethought of as the dual of G .

The Fourier transform on the group is the map from functions onG to functions on G given by F(f )(γ) := 1

|G |−1/2

∑g γ(g)f (g).

With counting measure on G and G , we have, for p, q > 0,F : Lp(G ) 7→ Lq(G ). What is the norm of this map?

Resolved by Gilbert and Rzeszotnik (2010).

V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 49 / 57

Page 147: Entropy Power Inequalities: Results and Speculationghan/WCI/Venkat.pdf · V. Jog and V. Anantharam (UC Berkeley) EPI December 12, 2013 3 / 57 Di erential entropy and entropy power

Visualizing the norm of the FT on a finite abelian group

[Figure: norm of the Fourier transform on a finite abelian group.]


Young's inequality on a finite abelian group

There is a forward Young's inequality, which reads: for every p, q, r ≥ 1 such that 1/p + 1/q = 1 + 1/r, we have

‖f ∗ g‖_r ≤ ‖f‖_p ‖g‖_q .
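The inequality is easy to sanity-check numerically on a cyclic group. A sketch with the admissible exponents p = q = 4/3, r = 2 (so 1/p + 1/q = 3/2 = 1 + 1/r); the random test functions and the helper names are illustrative choices:

```python
import random

def cyclic_convolve(f, g):
    """Convolution on Z/n with counting measure:
    (f*g)(x) = sum_y f(y) g(x - y)."""
    n = len(f)
    return [sum(f[y] * g[(x - y) % n] for y in range(n)) for x in range(n)]

def lp_norm(f, p):
    return sum(abs(v) ** p for v in f) ** (1 / p)

# Exponents satisfying 1/p + 1/q = 1 + 1/r.
p, q, r = 4 / 3, 4 / 3, 2.0
random.seed(0)
for _ in range(100):
    n = random.randint(2, 12)
    f = [random.uniform(-1, 1) for _ in range(n)]
    g = [random.uniform(-1, 1) for _ in range(n)]
    lhs = lp_norm(cyclic_convolve(f, g), r)
    rhs = lp_norm(f, p) * lp_norm(g, q)
    assert lhs <= rhs + 1e-9
```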


Brunn Minkowski inequality on a finite abelian group

For a finite abelian group G, consider the function

µ_G(r, s) := min{|A + B| : A, B ⊂ G, |A| = r, |B| = s} .

Then we have µ_G(r, s) = min_{d | |G|} (⌈r/d⌉ + ⌈s/d⌉ − 1) · d .

In particular, for G a finite abelian group of size n = 2^k, we have

µ_G(r, s) = r ◦ s

where r ◦ s is the Hopf-Stiefel function, defined as the smallest positive integer m for which the polynomial (x + y)^m falls in the ideal generated by x^r and y^s in the polynomial ring F_2[x, y].

Involved history of partial results; see Eliahou and Kervaire (2007) for the history.
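The two descriptions of µ_G for |G| = 2^k can be cross-checked directly. A sketch (function names are illustrative), using the standard F_2 characterization that (x + y)^m lies in the ideal (x^r, y^s) iff the binomial coefficient C(m, j) is even for every j with m − s < j < r:

```python
from math import comb

def mu_divisor_formula(r, s, n):
    """min over divisors d of n of (ceil(r/d) + ceil(s/d) - 1) * d."""
    return min(
        (-(-r // d) + -(-s // d) - 1) * d  # -(-a // d) is ceil(a/d)
        for d in range(1, n + 1)
        if n % d == 0
    )

def hopf_stiefel(r, s):
    """Smallest m with (x+y)^m in the ideal (x^r, y^s) over F_2, i.e.
    C(m, j) even for all j with m - s < j < r."""
    m = 1
    while not all(
        comb(m, j) % 2 == 0
        for j in range(max(0, m - s + 1), min(r, m + 1))
    ):
        m += 1
    return m

# For a group of size 2^k the two expressions agree.
n = 16
for r in range(1, n + 1):
    for s in range(1, n + 1):
        assert mu_divisor_formula(r, s, n) == hopf_stiefel(r, s)
```

For instance, 3 ◦ 3 = 4: (x + y)^4 = x^4 + y^4 over F_2, which lies in (x^3, y^3), while (x + y)^3 has the odd middle coefficients C(3, 1) = C(3, 2) = 3.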


Questions

What form does the reverse Young's inequality take on finite abelian groups?

Less ambitiously, what form does the reverse Young's inequality take on finite groups of size 2^k?

Is the EPI we proved for finite abelian groups of size 2^k derivable as a limit from the Young's inequality for such groups?

Is the Brunn Minkowski inequality for finite abelian groups of size 2^k (expressed via the Hopf-Stiefel function) derivable as a limit from the appropriate reverse Young's inequality for such groups?

Is the general Brunn Minkowski inequality for finite abelian groups derivable as a limit from the appropriate reverse Young's inequality for such groups?

Will a limit of the Young's inequality for a general finite abelian group give rise to an EPI for each such group?


The End
