
Sum Rate of Gaussian Multiterminal Source Coding

Pramod Viswanath

University of Illinois, Urbana-Champaign

March 19, 2003

Gaussian Multiterminal Source Coding

[Figure: K separate encoders; encoder k observes y_k = h_k^t x of the source x = (x_1, …, x_N) and sends message m_k to a joint decoder, which outputs the reconstruction x̂.]

Sum of rates of encoders R and distortion metric d(x, x̂)

Quadratic Gaussian CEO Problem

[Figure: K separate encoders; encoder k observes a noisy copy y_k = x + z_k of the source x and sends message m_k to a joint decoder, which outputs the reconstruction x̂.]

Sum of rates of encoders R and distortion metric d(x, x̂)

Result: Quadratic Gaussian CEO

• Quadratic distortion metric d(x, x̂) = (x − x̂)²

• For a large number of encoders K,

  R(D) = (1/2) log⁺(σ_x²/D) + (σ_z²/(2σ_x²)) (σ_x²/D − 1)⁺

– Second term is loss w.r.t. cooperating encoders
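The asymptotic formula above is easy to evaluate numerically; a minimal sketch (function and variable names are mine, not from the talk):

```python
import math

def ceo_sum_rate(D, sigma_x2, sigma_z2):
    """Asymptotic (K -> infinity) sum rate, in nats, for the quadratic
    Gaussian CEO problem:
        R(D) = (1/2) log+(sigma_x^2 / D)
             + (sigma_z^2 / (2 sigma_x^2)) (sigma_x^2 / D - 1)+
    The first term is the rate with cooperating encoders; the second is
    the loss due to distributed encoding."""
    ratio = sigma_x2 / D
    cooperative = 0.5 * max(math.log(ratio), 0.0)
    loss = (sigma_z2 / (2.0 * sigma_x2)) * max(ratio - 1.0, 0.0)
    return cooperative + loss

# Unit-variance source and observation noise, target distortion 0.5
print(ceo_sum_rate(0.5, 1.0, 1.0))
```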

Outline

• Problem Formulation:

  tradeoff between sum rate R and distortion (metric d(x, x̂))

• Main Result:

  characterize a class of distortion metrics for which there is no loss in sum rate compared with encoder cooperation

– A multiple antenna test channel

Random Binning of Slepian-Wolf

[Figure: realizations of (y1, y2) in the plane, each tagged with one of the quantizer symbols O, ×, ∆ — a random binning of the (y1, y2) space.]

• Rate is number of quantizers

Encoding in Slepian-Wolf

[Figure: the same binning of the (y1, y2) plane; each encoder locates its own realization among the tagged points.]

• Quantizer closest to realization

Decoding in Slepian-Wolf

• Decoder knows joint distribution of y1, y2

• It is given the two quantizer numbers from the encoders

• Picks the pair of points in the quantizers which best matches

the joint distribution

– For jointly Gaussian y1, y2, a nearest-neighbor type test

• R1 + R2 = H (y1, y2) is sufficient for zero distortion
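The binning-and-search story above can be simulated on a toy pair of correlated binary sequences (a sketch with hypothetical parameters; Python's built-in tuple hash stands in for the random bin assignment):

```python
import itertools
import random

def bin_index(seq, n_bins):
    """Random binning: a fixed hash assigns each sequence to one of n_bins."""
    return hash(seq) % n_bins

def demo(n=20, flips=2, n_bins=4096, seed=1):
    rng = random.Random(seed)
    y1 = tuple(rng.randint(0, 1) for _ in range(n))
    flip_pos = set(rng.sample(range(n), flips))
    y2 = tuple(b ^ (i in flip_pos) for i, b in enumerate(y1))  # correlated copy
    b = bin_index(y2, n_bins)  # encoder 2 sends only log2(n_bins) bits
    # Decoder: among sequences within Hamming distance `flips` of y1
    # (the plausible realizations under the joint distribution), keep
    # those landing in the announced bin.
    matches = []
    for k in range(flips + 1):
        for pos in itertools.combinations(range(n), k):
            cand = tuple(v ^ (i in pos) for i, v in enumerate(y1))
            if bin_index(cand, n_bins) == b:
                matches.append(cand)
    # Decoding succeeds when the bin pins down the true y2 (it is always
    # among the matches; ambiguity arises only on a hash collision).
    return (y2 in matches), len(matches)

ok, n_matches = demo()
print(ok, n_matches)
```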

Deterministic Broadcast Channel

[Figure: deterministic broadcast channel — input x produces outputs y1 = f(x) and y2 = g(x).]

• Pick a distribution on x such that y1, y2 have the desired joint distribution (Cover 98)


Slepian-Wolf code for Broadcast Channel

• Encoding: implement Slepian-Wolf decoder

– given two messages, find the appropriate pair y1, y2 in the

two quantizers

– transmit x that generates this pair y1, y2.

• Decoding: implement Slepian-Wolf encoder

– quantize y1, y2 to nearest point

– messages are the quantizer numbers

Lossy Slepian-Wolf Source Coding

[Figure: the binning picture again, now with codewords u1, u2 in place of the source realizations y1, y2.]

• Approximate y1, y2 by u1, u2


Lossy Slepian-Wolf Source Coding

• Encoding: Find ui that matches source yi, separately for each i

– For jointly Gaussian r.v.s, a nearest-neighbor calculation

– Each encoder sends quantizer number containing the u

picked

• Decoding: Reconstruct the u’s picked by the encoders

– reconstruction based on joint distribution of u’s

– Previously ui = yi were correlated

– Here u’s are independently picked


Lossy Slepian-Wolf

• We require the Markov condition

  p[u1, …, uK | y1, …, yK] = Π_{i=1}^K p[ui | yi]

• Generate the reconstruction x̂ deterministically from the reconstructed u's.

• Need sum rate

  Rsum = I(u1, …, uK; y1, …, yK).

• Distortion equal to

  E[d(x, x̂)].
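For a single jointly Gaussian pair, the required rate I(u; y) has a closed form in the correlation coefficient; a scalar sketch (the talk's K-encoder quantity is the vector analogue):

```python
import math

def gaussian_mutual_info(rho):
    """I(u; y) in nats for jointly Gaussian (u, y) with correlation rho:
    I(u; y) = -(1/2) log(1 - rho^2)."""
    return -0.5 * math.log(1.0 - rho * rho)

# Stronger correlation between u and y costs more rate
print(gaussian_mutual_info(0.5), gaussian_mutual_info(0.9))
```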

Marton Coding for Broadcast Channel

[Figure: Marton coding — the encoder maps (u1, u2) to channel input x; the channel applies H^t and adds noise w ∼ N(0, I); decoders Dec1 and Dec2 recover u1 and u2.]

• Reversed encoding and decoding operations

• Sum rate I(u1, …, uK; y1, …, yK).

• No use for the Markov property

  p[u1, …, uK | y1, …, yK] = Π_{i=1}^K p[ui | yi]

Achievable Rates: Costa Precoding

[Figure: messages m1, m2 are carried by codewords u1, u2, combined into channel input x, passed through H^t with additive noise w; decoders recover m1 and m2.]

• Users’ data modulated onto spatial signatures u1,u2

Stage 1: Costa Precoding

[Figure: same channel; Dec1 recovers m1.]

• Encoding for user 1 treating signal from user 2 as known

interference at transmitter

Stage 2

[Figure: same channel; Dec2 recovers m2.]

• Encode user 2 treating signal for user 1 as noise
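A scalar sketch of the two stages (the powers P1, P2 and noise level N0 are hypothetical, not from the talk): Costa precoding makes user 1's rate independent of user 2's signal, while user 2 simply absorbs user 1's power into the noise.

```python
import math

def costa_rate(P1, N0):
    """Stage 1: dirty-paper coding against interference known at the
    transmitter achieves the interference-free rate (nats)."""
    return 0.5 * math.log(1.0 + P1 / N0)

def rate_treating_as_noise(P2, P1, N0):
    """Stage 2: user 2 treats user 1's signal (power P1) as noise."""
    return 0.5 * math.log(1.0 + P2 / (N0 + P1))

P1, P2, N0 = 1.0, 1.0, 1.0
print(costa_rate(P1, N0) + rate_treating_as_noise(P2, P1, N0))  # sum rate
```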

Adaptation to Lossy Slepian-Wolf

[Figure: the encoder maps (u1, u2) to x; the decoder observes y = H^t x + z with z ∼ N(0, Kz) and recovers u1, u2.]

• Joint distribution of u’s and y’s depends on noise z

• Performance independent of correlation in z

Noise Coloring

• Fix a particular Costa coding scheme; this fixes the u's and x.

• Idea:

Choose z such that

p[u1, …, uK | y1, …, yK] = Π_{i=1}^K p[ui | yi]

and (Kz)ii = 1

• Then can adapt to Lossy Multiterminal Source Coding


Markov Condition and Broadcast Channel

• The Markov condition

  p[u1, …, uK | y1, …, yK] = Π_{i=1}^K p[ui | yi]

  is of independent interest in the broadcast channel; by the chain rule,

  p[u1, …, uK | y1, …, yK] = Π_{i=1}^K p[ui | y1, …, yK, u1, u2, …, u_{i−1}]

• Equivalent to: given u1, …, u_{i−1},

  ui −→ yi −→ y1, …, y_{i−1}, y_{i+1}, …, yK

Implication

[Figure: the encoder maps (u1, u2) to x; the decoder observes y = H^t x + z with z ∼ N(0, Kz) and recovers u1, u2.]

• Need only y1 to decode u1

• Given u1, need only y2 to decode u2

Performance of the Costa scheme equals that when the receivers cooperate.

Markov Condition and Noise Covariance

• The sum capacity is also achieved by such a scheme

(CS 01, YC 01, VT 02, VJG 02)

• For every Costa scheme, there is a choice of Kz such that

Markov condition holds (Yu and Cioffi, 01)


Sum Capacity

[Figure: diagram relating the sum capacities of three problems — the broadcast channel (Enc → H^t, noise w → two decoders), the point-to-point channel with cooperating receivers and colored noise z ∼ N(0, Kz) (Sato bound), and the multiple access channel with x1, x2 independent and cooperating transmitters (channel H). The links are labeled Reciprocity, Convex Duality, and the constraint E[x^t Kz x] ≤ P.]

Gaussian Multiterminal Source Coding

[Figure: as before — encoder k observes y_k = h_k^t x of the source x = (x_1, …, x_N) and sends message m_k to a joint decoder, which outputs x̂.]

H = [h1, …, hK]

Sum of rates of encoders R and distortion metric d(x, x̂)

Main Result

• Distortion metric

  d(x, x̂) = (1/N) (x − x̂)^t (I + H diag{p1, …, pK} H^t) (x − x̂)

  – Here p1, …, pK are the powers of the users in the reciprocal MAC

• Rate distortion function

  R(D) = (sum rate of MAC) − N log D
       = (sum rate of broadcast channel) − N log D
       = log det(I + H diag{p1, …, pK} H^t) − N log D
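The closed form above can be evaluated directly; a sketch using NumPy (the matrix H and powers here are illustrative, and rates are in nats):

```python
import numpy as np

def sum_rate_distortion(H, powers, D):
    """Main-result rate-distortion function:
        R(D) = log det(I + H diag(powers) H^t) - N log D,
    with H of size N x K and `powers` the reciprocal-MAC user powers."""
    N = H.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(N) + H @ np.diag(powers) @ H.T)
    return logdet - N * np.log(D)

# Illustrative N = 2 source components, K = 3 encoders
H = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])
print(sum_rate_distortion(H, [1.0, 1.0, 1.0], 0.1))
```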

Bells and Whistles

• For the quadratic distortion metric

  d(x, x̂) = (1/N) (x − x̂)^t (x − x̂)

  the set of matrices H can be characterized

• Analogy with the CEO problem:

  for a large number of encoders and random H, R(D) can be characterized almost surely



Discussion

• A “connection” made between coding schemes for multiterminal source and channel coding

• Connection somewhat superficial

  – relation between source coding and the broadcast channel through a common random coding argument (PR 02, CC 02)

  – relation between source coding and the multiple access channel through a change of variable (VT 02, JVG 01)

• Connection is suggestive

  – a codebook-level duality
