Exploiting Tree Sparse Priors


Uploaded by sonix022 on 09-Jun-2015

DESCRIPTION

Slides based on our papers -- http://arxiv.org/abs/1306.4391 and http://www.ece.umn.edu/~jdhaupt/publications/asilomar11_hierarchical.pdf

TRANSCRIPT

Page 1: Exploiting Tree Sparse Priors

Knowledge Enhanced Compressive Measurements

Training Data

Structured Sparsity

Adaptive Sensing

LASeR: Learning Adaptive Sensing Representations

[Tree of dictionary atoms: a_ℓ(1) at the root; a_ℓ(2), a_ℓ(5) below it; a_ℓ(3), a_ℓ(4), a_ℓ(6), a_ℓ(7) at the leaves]

Akshay Soni, University of Minnesota, www.tc.umn.edu/~sonix022

KECoM Student Workshop 2012

Page 2: Exploiting Tree Sparse Priors

A Sparse Signal Model

Page 3: Exploiting Tree Sparse Priors

[Two plots: Reconstruction SNR (dB) vs. #projections]

Can knowledge buy something?

12 dB gain

CS DCT Lasso

CS Random Lasso

Page 4: Exploiting Tree Sparse Priors

[Two plots: Reconstruction SNR (dB) vs. #projections]

Can knowledge buy something?

15 dB gain

CS DCT Lasso

CS Random Lasso

Page 5: Exploiting Tree Sparse Priors

A Sparse Signal Model

$$|x_i| \begin{cases} \geq \mu > 0, & i \in S, \\ = 0, & i \notin S. \end{cases}$$
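A minimal Python sketch of drawing a signal from this model. The dimension, sparsity, amplitude floor, and the uniform amplitude distribution are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, mu = 128, 8, 2.0                    # ambient dimension, sparsity, amplitude floor (illustrative)
S = rng.choice(n, size=k, replace=False)  # support set S

# Model: |x_i| >= mu > 0 for i in S, and x_i = 0 for i not in S
x = np.zeros(n)
x[S] = rng.choice([-1.0, 1.0], size=k) * (mu + rng.random(k))

assert np.all(np.abs(x[S]) >= mu)   # every on-support amplitude clears the floor
assert np.count_nonzero(x) == k     # exactly k-sparse
```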

Page 6: Exploiting Tree Sparse Priors

Exact Support Recovery (ESR)

CS: non-adaptive & non-structured

$$|x_i| \begin{cases} \geq \mu > 0, & i \in S, \\ = 0, & i \notin S. \end{cases}$$

Page 7: Exploiting Tree Sparse Priors

The Big Picture: Minimum Signal Amplitudes for ESR

Can we exploit structure, adaptivity, or both?

[*] D. Donoho and J. Jin, "Higher criticism for detecting sparse heterogeneous mixtures," Ann. Statist., vol. 32, no. 3, pp. 962–994, 2004.

[*] S. Aeron, V. Saligrama, and M. Zhao, "Information theoretic bounds for compressed sensing," IEEE Transactions on Information Theory, vol. 56, no. 10, pp. 5111–5130, Oct. 2010.

Uncompressed / compressed:

$$\mu \gtrsim \sqrt{\tfrac{n}{R}\,\log n}$$

Page 8: Exploiting Tree Sparse Priors

The Big Picture: Minimum Signal Amplitudes for ESR

[*] M. Malloy and R. Nowak, "On the limits of sequential testing in high dimensions," preprint, 2011.

[*] J. Haupt, R. Baraniuk, R. Castro and R. Nowak, "Sequentially Designed Compressed Sensing," SSP, 2012.

Sequential but non-structured / uncompressed:

$$\mu \gtrsim \sqrt{\tfrac{n}{R}\,\log n} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}\,\log k}$$

Page 9: Exploiting Tree Sparse Priors

Tree Sparse Signal Model

Can we exploit this tree structure for the ESR problem?

Page 10: Exploiting Tree Sparse Priors

[Two plots: Reconstruction SNR (dB) vs. #projections]

Can structure buy something?

Tree Structured

Random CS

DCT CS

Page 11: Exploiting Tree Sparse Priors

[Two plots: Reconstruction SNR (dB) vs. #projections]

Can structure buy something?

Random CS

DCT CS

Page 12: Exploiting Tree Sparse Priors

The Big Picture: Minimum Signal Amplitudes for ESR

[*] Arias-Castro, E., Candès, E. J., Helgason, H. and Zeitouni, O. (2008). Searching for a trail of evidence in a maze. Ann. Statist. 36, 1726–1757.

Uncompressed search for a simple trail:

$$\mu \gtrsim \sqrt{\tfrac{n}{R}\,\log n} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}\,\log k} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}}$$

Page 13: Exploiting Tree Sparse Priors

The Big Picture: Minimum Signal Amplitudes for ESR

[*] A. Soni and J. Haupt, "Efficient adaptive compressive sensing using sparse hierarchical learned dictionaries," in Proc. Asilomar Conf. on Signals, Systems, and Computers, 2011, pp. 1250–1254.

$$\mu \gtrsim \sqrt{\tfrac{n}{R}\,\log n} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}\,\log k} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}} \qquad \mu \gtrsim \sqrt{\tfrac{k}{R}\,\log k}$$
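Collecting the four amplitude scalings into one display (a sketch; the pairing of each scaling with its sensing regime follows the slide ordering, with n the dimension, k the sparsity, and R the measurement budget):

```latex
% Minimum signal amplitude \mu for exact support recovery:
% n = dimension, k = sparsity, R = measurement budget.
\begin{align*}
\text{non-adaptive, unstructured:}          &\quad \mu \gtrsim \sqrt{\tfrac{n}{R}\log n} \\
\text{adaptive (sequential), unstructured:} &\quad \mu \gtrsim \sqrt{\tfrac{n}{R}\log k} \\
\text{non-adaptive, structured (trail):}    &\quad \mu \gtrsim \sqrt{\tfrac{n}{R}} \\
\text{adaptive, tree-structured:}           &\quad \mu \gtrsim \sqrt{\tfrac{k}{R}\log k}
\end{align*}
```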

Page 14: Exploiting Tree Sparse Priors

Structure Dependent Adaptive Support Recovery – An Example

1
2 5
3 4 6 7

Stack / Queue (both initialized to the index of the root)

Pop if Queue/Stack not empty; measure the popped node. If the threshold test passes, insert the indices of the node's children into the Queue/Stack. Repeat for the next Queue/Stack element.

Unknown signal

Threshold test: $|y(i, k)| \geq \tau$?   Measurement: $y(j) = (\gamma\, d_j)^T x + \mathcal{N}(0, 1)$
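The traversal described on this slide can be sketched in Python. As hedged assumptions (not from the slides): measurements here are point probes y(j) = x[j] + N(0,1) rather than the slide's projective measurements, and the function name, threshold, and example amplitudes are illustrative.

```python
import numpy as np
from collections import deque

def tree_support_recovery(x, children, tau, rng):
    """Sketch of the slide's traversal: pop a node while the queue is
    non-empty, take a noisy measurement of it, and insert its children
    only when the threshold test passes. A FIFO queue gives breadth-first
    order; popping from the right instead (a stack) gives depth-first."""
    S_hat, n_meas = set(), 0
    q = deque([0])                      # initialized to the index of the root
    while q:                            # pop if queue not empty
        j = q.popleft()
        y = x[j] + rng.normal()         # point probe y(j) = x[j] + N(0, 1)
        n_meas += 1
        if abs(y) >= tau:               # threshold test |y(j)| >= tau
            S_hat.add(j)
            q.extend(children.get(j, ()))
    return S_hat, n_meas

# The slide's 7-node tree (1 -> 2,5; 2 -> 3,4; 5 -> 6,7), 0-based here
children = {0: (1, 4), 1: (2, 3), 4: (5, 6)}
x = np.zeros(7)
x[[0, 1, 2]] = 10.0                     # rooted, connected support with large amplitude
S_hat, n_meas = tree_support_recovery(x, children, tau=5.0,
                                      rng=np.random.default_rng(1))
```

Only nodes on or adjacent to the support are ever measured, which is where the k-dependent (rather than n-dependent) measurement count comes from.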

Page 15: Exploiting Tree Sparse Priors

Theorem (2011): A. Soni & J. Haupt

Tree Structured Adaptive Support Recovery

Page 16: Exploiting Tree Sparse Priors

[Plot: probability of exact support recovery Pr(Ŝ = S) vs. α_min]

The Big Picture: Minimum Signal Amplitudes for ESR

$$\mu \gtrsim \sqrt{\tfrac{n}{R}\,\log n} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}\,\log k} \qquad \mu \gtrsim \sqrt{\tfrac{n}{R}} \qquad \mu \gtrsim \sqrt{\tfrac{k}{R}\,\log k}$$

Page 17: Exploiting Tree Sparse Priors

[Plot: probability of exact support recovery Pr(Ŝ = S) vs. α_min]

The Big Picture: Minimum Signal Amplitudes for ESR

This is a sufficient condition. Can we improve it? What are the necessary conditions?

Page 18: Exploiting Tree Sparse Priors

Tree Structured Signal Reconstruction: Two-Step Reconstruction

1. Adaptive Support Recovery
2. Measure Support Locations

Corollary (2011): A. Soni & J. Haupt
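A minimal sketch of the second step, under the assumption (mine, for illustration) that each recovered support location can be re-measured directly with unit-variance noise and the remaining budget is spent averaging those measurements:

```python
import numpy as np

def measure_support(x, S_hat, reps, rng):
    """Step 2 sketch: re-measure each recovered support location `reps`
    times under unit-variance noise and average; the estimate's variance
    shrinks as 1/reps. Off-support entries are set to zero."""
    x_hat = np.zeros_like(x)
    for j in S_hat:
        x_hat[j] = (x[j] + rng.normal(size=reps)).mean()
    return x_hat

rng = np.random.default_rng(0)
x = np.zeros(64)
x[:4] = 5.0                                  # true signal on a known support
x_hat = measure_support(x, {0, 1, 2, 3}, reps=25, rng=rng)
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```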

Page 19: Exploiting Tree Sparse Priors

Learning Adaptive Sensing Representations (LASeR)

Learning a Tree-Sparsifying Dictionary

[http://spams-devel.gforge.inria.fr/]

Page 20: Exploiting Tree Sparse Priors

R = (128 x 128)

Qualitative Results - I

[Image grid: Direct Wavelet Sensing, PCA, CS LASSO, CS Tree LASSO, LASeR at m = 20, 50, 80; image from the PICS database]

Page 21: Exploiting Tree Sparse Priors

R = (128 x 128)/32

Qualitative Results - II

[Image grid: Direct Wavelet Sensing, PCA, CS LASSO, CS Tree LASSO, LASeR at m = 20, 50, 80; image from the PICS database]

Page 22: Exploiting Tree Sparse Priors

Quantitative Results

[Three plots: Reconstruction SNR (dB) vs. #projections]

Page 23: Exploiting Tree Sparse Priors

Future Directions for Tree Sensing

Thank You.

Contact: Akshay Soni, [email protected]

1. LASeR with a clutter signal model:

   y = Φ(x + c) + w

   (clever regularization for different signal classes, e.g., diffusion of the clutter over the whole signal space using an ℓ2 rather than an ℓ1 penalty)

2. LASeR with non-orthonormal learned dictionaries.

3. Exploiting signal amplitude correlation in LASeR.
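Item 1's regularization idea (ℓ1 on the sparse component, ℓ2 to diffuse the clutter) can be sketched with a proximal-gradient solver. Everything below (function name, penalty weights, step size, problem sizes) is an illustrative assumption, not taken from the slides:

```python
import numpy as np

def lasso_with_clutter(y, Phi, lam1=0.05, lam2=5.0, iters=2000):
    """Sketch of decomposing y = Phi (x + c) + w: an l1 penalty keeps x
    sparse while an l2 (ridge) penalty diffuses the clutter c, solved by
    proximal gradient on the stacked variable (x, c)."""
    n = Phi.shape[1]
    step = 0.5 / np.linalg.norm(Phi, 2) ** 2   # 1/L; L = 2*||Phi||^2 for the stacked operator
    x, c = np.zeros(n), np.zeros(n)
    for _ in range(iters):
        g = Phi.T @ (Phi @ (x + c) - y)        # shared gradient of the data-fit term
        u = x - step * g
        x = np.sign(u) * np.maximum(np.abs(u) - step * lam1, 0.0)  # soft-threshold (l1 prox)
        c = (c - step * g) / (1.0 + step * lam2)                   # ridge prox
    return x, c

rng = np.random.default_rng(0)
m, n = 30, 40
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
x0 = np.zeros(n); x0[[3, 17]] = 4.0        # sparse part
c0 = 0.2 * rng.normal(size=n)              # small diffuse clutter
y = Phi @ (x0 + c0)                        # noiseless measurements for the demo
x_hat, c_hat = lasso_with_clutter(y, Phi)
```

The ridge prox shrinks the clutter toward zero every iteration rather than thresholding it, which is exactly the "diffusion" behavior the slide suggests for the ℓ2 penalty.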