Lecture 18 – Compressive Sensing

CS598PS – Machine Learning for Signal Processing (Fall 2015)
Sparsity, Compressive Sensing and Random Projections


Today’s lecture

• Sparsity

• Compressive sensing

• Quantization, randomness and high dimensions


What is sparsity?

• Depends who you ask

• Basic idea: We want most numbers in a collection to be zero

• Too many ways to express that


A starting point

• Linear equation with multiple solutions:

• Which solution would you pick?

$$ y = a \cdot x \;\Rightarrow\; 2 = \begin{bmatrix} 1 & 1 \end{bmatrix} \cdot \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} $$

$$ x = \begin{bmatrix} 2 \\ 0 \end{bmatrix} \qquad x = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \qquad x = \begin{bmatrix} 4 \\ -2 \end{bmatrix} $$

A sparse answer / What MATLAB gives / This one is fine too!


Infinite solutions

[Figure: the solution set of 2 = x₁ + x₂ is a line through (0, 2) and (2, 0) in the (x₁, x₂) plane; every point (−λ, 2 + λ) on it is a solution.]

• All solutions lie on a line

• Which one do we pick?

• Why did MATLAB pick [1, 1]?

• Does it make a difference?


The generic answer

• Least squares problem:

• Find the minimum-norm x that minimizes the error
• But why $\|x\|_2$?

$$ y = A \cdot x \;\Rightarrow\; x = A^{+} \cdot y \equiv \arg\min_x \left( \| A \cdot x - y \|_2 + \| x \|_2 \right) $$
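As a quick numerical sketch (a minimal example assuming NumPy, reusing the 2 = [1 1]·x toy system from the earlier slide), the pseudoinverse returns the minimum-ℓ2-norm solution:

```python
import numpy as np

A = np.array([[1.0, 1.0]])   # one equation, two unknowns
y = np.array([2.0])

# Minimum-l2-norm solution via the Moore-Penrose pseudoinverse
x_min_norm = np.linalg.pinv(A) @ y
print(x_min_norm)            # [1. 1.], the "what MATLAB gives" answer from the earlier slide

# Other points on the solution line also satisfy the constraint, e.g. the sparse one
x_sparse = np.array([2.0, 0.0])
print(np.allclose(A @ x_sparse, y))   # True
```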


Least squares and ℓ2

• Use Lagrange multipliers:

$$ \| x \|_2^2 + \lambda^\top (A \cdot x - y) \;\Rightarrow\; \hat{x} = -\tfrac{1}{2} A^\top \lambda $$

• Put back in original equation:

$$ A \cdot \hat{x} = -\tfrac{1}{2} A A^\top \lambda = y \;\Rightarrow\; \lambda = -2 \left( A A^\top \right)^{-1} y \;\Rightarrow\; \hat{x} = A^\top \left( A A^\top \right)^{-1} y = A^{+} \cdot y $$


Many more norms

• p-Norm (or Lp / ℓp)
• ℓ2 norm is the Euclidean norm
• ℓ1 norm is the sum of absolute values
• ℓ0 norm is the number of non-zero values
• ℓ∞ norm is the maximum absolute value

$$ \| x \|_p = \left( \sum_i | x_i |^p \right)^{1/p} $$
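A small sketch of these norms on one vector (assuming NumPy):

```python
import numpy as np

x = np.array([3.0, 0.0, -4.0, 0.0])

l2   = np.linalg.norm(x, 2)        # Euclidean norm: 5.0
l1   = np.linalg.norm(x, 1)        # sum of absolute values: 7.0
l0   = np.count_nonzero(x)         # "l0 norm": number of non-zero values: 2
linf = np.linalg.norm(x, np.inf)   # maximum absolute value: 4.0

# General p-norm, matching the formula above
p = 4
lp = np.sum(np.abs(x) ** p) ** (1.0 / p)
```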


How do they look?

• Unit norms in 2D

• Unit norms in 3D

[Figure: unit balls in 2D and 3D for the ℓ2, ℓ4, ℓ10, ℓ1 and ℓ0 norms, among others.]


ℓ2-based pseudoinverse

• All possible solutions lie on a hyperplane
• Minimum ℓ2 solution will be the point where the smallest possible ℓ2-ball touches the solution hyperplane

[Figure: in 2D the smallest ℓ2-ball touches the solution line at {x ≈ 0.882, y ≈ 1.471}; in 3D it touches the solution plane at {x ≈ 0.940, y ≈ 0, z ≈ 1.342}.]


Other ℓp-norm solutions

• With different norms the solution will change
• Larger p will produce a "busier" solution
• Smaller p will produce a sparser solution

[Figure: minimum-norm solutions on the same solution line. ℓ4: {x ≈ 1.120, y ≈ 1.328}; ℓ1: {x ≈ 0, y ≈ 2}; ℓ0.5: {x ≈ 0, y ≈ 2}. Smaller p lands on the sparse point (0, 2).]


Which one to use?

• For sparsity we ideally want the minimum ℓ0
• Directly results in the smallest number of non-zero values
• But, this is an inconvenient form …
• Discontinuous, no derivative, not convex, NP-hard, etc.

$$ \hat{x} = \,? \qquad \| x \|_0 + \lambda^\top (A \cdot x - y) $$

$$ \ell_0(x) = \sum_i \big[\, x_i \neq 0 \,\big] $$

(Iverson bracket: if the content evaluates to true it returns 1, otherwise it returns 0)


Let’s try something simpler then

• How about using the ℓ1 instead?
• Seems to produce the same solution

[Figure: the minimum-ℓ1 solution on the solution line.]


Why does this work?

• The ℓ1 case is (sort of) convex

• There’s one ill-defined scenario,

but it isn’t a big problem

• Which is it?

• So let’s solve that instead



The problem to solve

• We now have:

• Minor glitch: We can't differentiate the absolute values in the ℓ1 norm!
• But we can use other tools

$$ \arg\min_x \left( \| A \cdot x - y \|_2 + \| x \|_1 \right) $$

$$ \sum_i | x_i | + \lambda^\top (A \cdot x - y) $$


Linear programming

• A linear program is defined as:

$$ \text{minimize } c^\top \cdot x \quad \text{subject to } A \cdot x \geq y \;\text{ and }\; x \geq 0 $$

• A Nobel-prize staple of optimization theory, resource allocation, economics, etc.


Doesn't exactly match the ℓ1 problem

• We would like to change our problem definition to match the linear programming formulation

What we have:
$$ \text{minimize } \| x \|_1 \quad \text{subject to } A \cdot x = y $$

What we can solve:
$$ \text{minimize } c^\top \cdot x \quad \text{subject to } A \cdot x \geq y \;\text{ and }\; x \geq 0 $$


With some shuffling around

• We rewrite the unknown vector x as a difference of two positive-valued vectors:

$$ x = u - v, \quad u_i, v_i \geq 0, \quad z = \begin{bmatrix} u \\ v \end{bmatrix} $$

• Now our problem can be written as:

$$ A \cdot x = y \;\Rightarrow\; \begin{bmatrix} A, & -A \end{bmatrix} \cdot \begin{bmatrix} u \\ v \end{bmatrix} = y \;\Rightarrow\; \begin{bmatrix} A, & -A \end{bmatrix} \cdot z = y $$


Now it is a linear program

• And we can solve our problem

Minimum ℓ1 problem:
$$ \text{minimize } \| x \|_1 \quad \text{subject to } A \cdot x = y $$

Equivalent linear program:
$$ \text{minimize } \| z \|_1 = \mathbf{1}^\top z \quad \text{subject to } [A, \, -A] \cdot z = y \;\text{ and }\; z \geq 0 $$
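A minimal sketch of this equivalence (assuming NumPy and SciPy's `linprog`; the 2 = [1 1]·x toy system from the earlier slides, with illustrative variable names):

```python
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0]])       # underdetermined system A x = y
y = np.array([2.0])
n = A.shape[1]

# Variable split x = u - v with u, v >= 0, stacked as z = [u; v]
c = np.ones(2 * n)               # minimize 1^T z  (= ||x||_1 at the optimum)
A_eq = np.hstack([A, -A])        # [A, -A] z = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))

x_l1 = res.x[:n] - res.x[n:]
print(x_l1)   # a sparse vertex such as [2. 0.]
# Note: for this A every point between (2,0) and (0,2) has the same l1 norm,
# so the optimum is not unique (the "ill-defined scenario" mentioned earlier).
```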


A simple example

• Suppose you measure this sinusoid

• Can you recover the original signal using only the small number of measurements that we made?

[Figure: "A partially observed signal": a sinusoid over 100 samples with only a handful of measured points.]


Key observation

• The original signal is sparse in the frequency domain

• How about we use that to construct a problem?

[Figure: the partially observed signal and the DCT of the signal: the DCT is sparse, with only a couple of significant coefficients.]


The problem to solve

• Find a set of small coefficients x in the DCT domain and make sure that they explain all our data y
• Where matrix A selects only the indices that we observed
• Simple least squares problem using the minimum-ℓ2-norm solution

$$ A \cdot C^{-1} \cdot x = y \;\Rightarrow\; x = \left( A \cdot C^{-1} \right)^{+} \cdot y $$
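A sketch of this setup (assumed, illustrative code rather than the lecture's): build a signal that is exactly sparse in the DCT domain, keep a few of its samples, and solve with the pseudoinverse. Here `scipy.fft.idct` plays the role of C⁻¹.

```python
import numpy as np
from scipy.fft import idct

N = 100
Cinv = idct(np.eye(N), axis=0, norm='ortho')     # columns are DCT basis vectors (C^-1)

x_true = np.zeros(N)                             # DCT coefficients: only two are active
x_true[3], x_true[7] = 5.0, 2.0
signal = Cinv @ x_true                           # the full, densely sampled signal

rng = np.random.default_rng(0)
observed = np.sort(rng.choice(N, size=15, replace=False))   # the few samples we measured
A = np.eye(N)[observed]                          # selection matrix A
y = signal[observed]                             # the measurements

# Minimum-l2-norm DCT coefficients that explain the measurements
x_l2 = np.linalg.pinv(A @ Cinv) @ y
recon_l2 = Cinv @ x_l2   # fits y exactly, but x_l2 is dense and the reconstruction is poor
```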


And it doesn’t really do much

• But you expected that!

[Figure: the minimum-ℓ2 reconstruction of the partially observed signal and its DCT: the result does not match the original.]


What happened?

• Minimizing the ℓ2 resulted in adding more frequencies in the signal
• ℓ2 doesn't give sparsity
• Small coefficients ⇒ min ℓ2 (many small values rather than a few large ones)
• We should instead find a minimal ℓ1 solution
• Because it actually enforces sparsity


And the result

• A much better reconstruction

[Figure: the ℓ1 reconstruction of the partially observed signal and its DCT: the recovered signal matches the original and only the few true DCT coefficients are active.]
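Continuing the sketch above (same assumed setup), swapping the pseudoinverse for the ℓ1 linear program typically recovers the original signal from the same few measurements:

```python
from scipy.optimize import linprog

# Solve  min ||x||_1  subject to  (A C^-1) x = y  via the split z = [u; v], x = u - v
M = A @ Cinv
res = linprog(np.ones(2 * N), A_eq=np.hstack([M, -M]), b_eq=y, bounds=(0, None))
x_l1 = res.x[:N] - res.x[N:]

recon_l1 = Cinv @ x_l1                           # typically matches the original signal
print(np.count_nonzero(np.abs(x_l1) > 1e-6))     # only the few true DCT coefficients survive
```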


A realization

• According to the rules of sampling this is impossible!
• What is the magic taking place here?
• Why is sparsity special?


Why sparsity?

• Sparsity implies structure, and structure is everywhere
• Signals often exhibit sparsity after undergoing the right transformation


Sparsity in signals

• Many signals are sparse in certain domains
• e.g. sound spectra
• Image wavelets
• …
• This is an opportunity to measure signals better
• or to describe them more easily

[Figures: a 64 × 64 image; a bat chirp waveform and its spectrogram (frequency in Hz).]


Vector spaces of signals

• Signals can be sparse in various ways

• And some are “compressible”

[Figure: 1-sparse signal space, 2-sparse signal space, compressible signal space.]


Sparse approximations

• Represent signals using:

$$ f = \sum_k a_k b_k $$

where the b_k are the bases / dictionary and the a_k are the coefficients

• Two goals:
• Analysis: Study f through the structure of the a_k and b_k
• Approximation: Reconstruct f with a minimal number of coefficients


Exposing sparsity via dictionaries

• Can we use dictionaries that produce sparse coefficients?
• Why would they be useful and how would we implement them?


Example case: Wavelets

• Note how most coefficients are zero

• A sparse representation



A simple example

• A sinusoid with a couple of spikes

[Figure: a sinusoid with a couple of added spikes.]


Using a generic dictionary

• Analyzed via the DCT

• Resulting coefficients are not sparse
• Multiple sines are used to approximate the spikes

[Figure: the DCT dictionary, the input (sinusoid plus spikes), and the resulting coefficients: energy is spread over many DCT atoms.]


A better dictionary

• Use both sinusoids and spikes!

• Now we won't use as many sines to represent the spikes

[Figure: the combined dictionary: DCT (sinusoid) atoms concatenated with impulse (spike) atoms.]


Applying the dictionary

• Using the straightforward decomposition won't work
• Spike elements are not utilized
• Minimal ℓ2 cost penalizes the bases describing the loud spikes

[Figure: the combined dictionary, the input, and the minimum-ℓ2 coefficients: the spike atoms remain unused.]


Doing it the right way

• This time we ask for minimum ℓ1 coefficients

• And we get a perfect description of the input!

[Figure: the combined dictionary, the input, and the minimum-ℓ1 coefficients: a few DCT atoms capture the sinusoid and the impulse atoms capture the spikes.]
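A compact sketch of this experiment (assumed setup with illustrative names, not the lecture's code): concatenate DCT atoms with impulse atoms and ask for minimum-ℓ1 coefficients.

```python
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

N = 64
dct_atoms = idct(np.eye(N), axis=0, norm='ortho')
D = np.hstack([dct_atoms, np.eye(N)])            # overcomplete dictionary: DCT + impulses

signal = 10.0 * dct_atoms[:, 5]                  # a sinusoid (one DCT atom) ...
signal[20] += 8.0                                # ... plus a couple of spikes
signal[45] -= 6.0

# Minimum-l1 coefficients a with D a = signal, via the usual variable split
K = D.shape[1]
res = linprog(np.ones(2 * K), A_eq=np.hstack([D, -D]), b_eq=signal, bounds=(0, None))
a = res.x[:K] - res.x[K:]
print(np.flatnonzero(np.abs(a) > 1e-6))   # typically three atoms: DCT atom 5 plus the two impulses
```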


Overcomplete dictionaries

• Use dictionaries that contain “everything”!

• Use compact descriptions of elements

• Some problems

• Large size/computations

• Lack of fast algorithms (e.g. FFT)
• Problems with "coherence"
• This is a big one

Dictionary coherence


Dictionary coherence

• Make sure that the dictionary elements don't result in ambiguous coefficients
• A measure of coherence:

$$ \mu = \max_{i \neq j} \left| \left\langle d_i, d_j \right\rangle \right| $$

[Figure: Gram matrices of two dictionaries: small off-diagonal entries (good) vs. large off-diagonal entries (bad).]
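A short sketch of this measure (assuming NumPy, dictionary atoms as columns):

```python
import numpy as np

def mutual_coherence(D):
    """Largest absolute inner product between distinct, normalized dictionary atoms."""
    Dn = D / np.linalg.norm(D, axis=0, keepdims=True)   # normalize each column
    G = np.abs(Dn.T @ Dn)                               # Gram matrix of the atoms
    np.fill_diagonal(G, 0.0)                            # ignore self-similarity
    return G.max()

rng = np.random.default_rng(0)
good = rng.standard_normal((64, 64))                     # random atoms: fairly incoherent
bad = good.copy()
bad[:, 1] = bad[:, 0] + 0.01 * rng.standard_normal(64)   # two nearly identical atoms
print(mutual_coherence(good), mutual_coherence(bad))     # small value vs. close to 1
```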


Examples of incoherent dictionaries

[Figure: Fourier-like bases, impulse bases, Gaussian noise bases, and binary noise bases.]


Getting greedy

• The linear programming approach will not always work
• It is slow for large dictionaries
• It looks for exact equality, not approximation
• We can instead use a greedy approach to resolving sparse approximations


Matching pursuit (MP)

• Family of many approaches based on successive fits
• Each new fit explains what the previous ones couldn't

[Diagram: the MP loop]
1. Measure the input against the dictionary: correlations α = Dᵀ·f, i.e. αₖ = ⟨dₖ, f⟩
2. Select the largest correlation αₖ
3. Add it to the representation: aₖ ← aₖ + αₖ
4. Repeat using the residual: f ← f − αₖ·dₖ
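A minimal matching-pursuit sketch (assuming NumPy and a dictionary D with unit-norm columns), following the loop above:

```python
import numpy as np

def matching_pursuit(D, f, n_iter=10, tol=1e-6):
    """Greedy MP: approximate f as a sparse combination of the columns of D."""
    a = np.zeros(D.shape[1])              # coefficients being built up
    residual = f.astype(float).copy()
    for _ in range(n_iter):
        alpha = D.T @ residual            # correlations with every atom
        k = np.argmax(np.abs(alpha))      # atom that explains the most
        a[k] += alpha[k]                  # add it to the representation
        residual -= alpha[k] * D[:, k]    # repeat using what is left unexplained
        if np.linalg.norm(residual) < tol:
            break
    return a, residual
```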


On a familiar example

• Each iteration knocks off an element
• By the 4th iteration there is nothing significant left to represent
• Faster! For N = 1024, MP: 0.005 sec, LP: 63 sec

[Figure: MP on the sinusoid-plus-spikes example: the residual input and the accumulated coefficients after iterations 1, 2 and 3, alongside the dictionary.]


CoSaMP (Compressive Sensing MP)

• A useful variation to get K non-zero coefficients

[Diagram: the CoSaMP loop]
1. Measure the input against the dictionary: αₖ = ⟨dₖ, f⟩
2. Select the locations of the largest 2K correlations αₖ
3. Add them to the support T
4. Invert over the support: b = D_T⁺·f
5. Truncate to the K largest entries (a = b|_K) and compute the residual
6. Repeat using the residual
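A rough CoSaMP sketch under the same assumptions (unit-norm dictionary D, target sparsity K); this follows the textbook form of the algorithm rather than any code from the lecture:

```python
import numpy as np

def cosamp(D, f, K, n_iter=20, tol=1e-6):
    """CoSaMP: approximate f with at most K atoms of D."""
    a = np.zeros(D.shape[1])
    residual = f.astype(float).copy()
    for _ in range(n_iter):
        alpha = D.T @ residual
        # merge the 2K strongest new locations with the current support
        support = np.union1d(np.argsort(np.abs(alpha))[-2 * K:], np.flatnonzero(a))
        b = np.zeros_like(a)
        b[support] = np.linalg.lstsq(D[:, support], f, rcond=None)[0]  # invert over support
        keep = np.argsort(np.abs(b))[-K:]                              # truncate to K terms
        a = np.zeros_like(a)
        a[keep] = b[keep]
        residual = f - D @ a                                           # new residual
        if np.linalg.norm(residual) < tol:
            break
    return a
```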


Revisiting sampling

• Traditional acquisition samples uniformly

• e.g. constant sample rates in audio, CCD grids in cameras
• Foundation: Nyquist/Shannon sampling theorem
• Sample at twice the highest frequency
• Projects to a complete basis that spans all the signal space

A redundancy in the loop


• Take a picture

• Using dense sampling → lots of data
• Transform to a sparse domain and quantize
• i.e. MPEG/JPEG compression → fewer data
• Process, transmit, view, etc.

Compressive sensing


• Why sample and then compress?

• Do both at once!

• Take fewer samples and use signal sparsity
• Helps in finding a unique and plausible sparse signal

The compressive sensing pipeline


• Acquire the signal using underdetermined measurements
• e.g. linear combinations of a few samples:

$$ y_i = P_i \cdot x $$

• Don't sample densely, don't sample all the data
• Reconstruct the signal assuming some sparsity
• Sparsity constrains the signal space w.r.t. the measurements
• Allows for a plausible reconstruction

What’s a good measurement matrix


• We need:

$$ P \cdot x_1 \neq P \cdot x_2 \quad \text{for all } K\text{-sparse } x_1 \neq x_2 $$

• i.e. ensure that we can distinguish different inputs
• Necessary condition: P must have at least 2K rows
• Assuming noiseless and well-behaved data

Example 1-sparse case


• Some poor measurement matrices

$$ P = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad P = \begin{bmatrix} 1 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} $$

[Figure: both choices fail for 1-sparse inputs: the first never observes x₁, and the second cannot tell a 1-sparse signal along x₁ from one along x₂, since both show up only in y₁.]

Example 1-sparse case


• Some good measurement matrices

$$ P = \begin{bmatrix} 0 & 1 & 0 \\ \tfrac{1}{2} & 0 & 1 \end{bmatrix} \qquad P = \begin{bmatrix} 1 & -\tfrac{1}{2} & -\tfrac{1}{2} \\ 0 & \tfrac{\sqrt{3}}{2} & -\tfrac{\sqrt{3}}{2} \end{bmatrix} $$

[Figure: with these matrices each coordinate axis x₁, x₂, x₃ maps to a distinct direction in the (y₁, y₂) measurement plane, so 1-sparse inputs can be told apart.]

Restricted Isometry Property (RIP)


• A measurement matrix P has order- K  RIP if:

$$ (1 - \delta_K) \;\leq\; \frac{\| P \cdot x \|_2^2}{\| x \|_2^2} \;\leq\; (1 + \delta_K) \quad \text{for all } K\text{-sparse } x $$

• How do we check? Difficult ...
• But we have some good choices
• Gaussian, Bernoulli, Fourier, etc.
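RIP is hard to verify exactly, but a quick empirical sketch (assuming NumPy) shows the concentration for a random Gaussian P by sampling random K-sparse vectors; this only estimates the spread, it does not check all of them:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 5
P = rng.standard_normal((M, N)) / np.sqrt(M)      # random Gaussian measurement matrix

ratios = []
for _ in range(2000):
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)
    x[support] = rng.standard_normal(K)           # a random K-sparse vector
    ratios.append(np.sum((P @ x) ** 2) / np.sum(x ** 2))

print(min(ratios), max(ratios))   # ||Px||^2 / ||x||^2 stays concentrated around 1
```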

Embedding viewpoint


• This is similar to the subspace/manifold methods
• Find a low-rank projection that preserves cluster characteristics
• Special case: Random projection
• For n points in p dims there exists a q-dim projection that preserves distances to within a factor of 1 + ε, with q ≥ O(ε⁻² log n)
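A small numerical sketch of this claim (assuming NumPy and SciPy; the sizes are arbitrary): project n points down with a random Gaussian matrix and compare all pairwise distances before and after.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n, p, q = 200, 5000, 300
X = rng.standard_normal((n, p))                  # n points in p dimensions

R = rng.standard_normal((p, q)) / np.sqrt(q)     # random projection to q dimensions
Y = X @ R

d_orig = pdist(X)                                # all pairwise distances, before
d_proj = pdist(Y)                                # and after the projection
ratio = d_proj / d_orig
print(ratio.min(), ratio.max())                  # both close to 1 for q ~ O(eps^-2 log n)
```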

Putting compressive sensing to work


• The single-pixel camera

• Measure image intensity using multiple random masks
• Reconstruct assuming sparsity
• In some domain ...

[Figure: single-pixel camera: the object is imaged through a random mask (the "measurement matrix") onto a 1-pixel photosensor that averages all masked measurements; compare with a 256 × 256 raster image, i.e. 65,536 measurements. The image is then recovered by solving for sparsity.]

A simpler version of that


• Simple 64 × 64 input
• Take a set of masked measurements, store only averages
• Measure using:

$$ y_i = p_i^\top \cdot \mathrm{vec}(x), \quad p_i \in \{0, 1\} $$

[Figure: the sample image, a few of the random binary masks (Measurements 1-4), and the resulting sequence of measurements.]

Resolving via the DCT


• Model is:

$$ y = P \cdot x = P \cdot \left( C^{-1} \cdot z \right) $$

• Assume sparsity in z
• Resolve:

$$ \min_z \; \| z \|_1 + \| y - P \cdot C^{-1} \cdot z \| $$

• Reconstruct:

$$ \hat{x} = C^{-1} \cdot z $$

• Using ~2% of samples!

[Figure: the 64 × 64 input and its reconstruction using 1.95% of the measurements.]
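A toy sketch of this pipeline (assumed setup: a much smaller image than the slide's 64 × 64 and proportionally more measurements, noiseless, so the penalized objective above is solved in its equality-constrained form):

```python
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 16                                        # toy 16 x 16 "image"
Cinv1d = idct(np.eye(n), axis=0, norm='ortho')
Cinv = np.kron(Cinv1d, Cinv1d)                # separable 2D inverse DCT acting on vec(x)

z_true = np.zeros(n * n)                      # DCT coefficients: 3 non-zeros
z_true[[0, 5, 40]] = [20.0, 8.0, -5.0]
x = Cinv @ z_true                             # vec of the image

m = 80                                        # number of random binary masks
P = (rng.random((m, n * n)) < 0.5).astype(float)
y = P @ x                                     # y_i = p_i . vec(x)

# min ||z||_1  subject to  (P C^-1) z = y
M = P @ Cinv
res = linprog(np.ones(2 * n * n), A_eq=np.hstack([M, -M]), b_eq=y, bounds=(0, None))
z_hat = res.x[:n * n] - res.x[n * n:]
x_hat = Cinv @ z_hat                          # reconstructed image (as a vector)
print(np.allclose(x_hat, x, atol=1e-4))       # typically True
```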

Other kinds of sparsity


• Up to now we talked about a simple form of sparsity
• Sparsity over all coefficients separately
• Some problems need more elaborate definitions
• e.g. block structure, joint structure, temporal structure

Block sparsity


• Have sparsity appear in non-overlapping blocks

• Useful for some imaging operations

[Figure: the M × 1 measurement vector y equals the M × N measurement matrix times the N × 1 sparse representation x, whose K non-zero entries appear in non-overlapping blocks.]

 Joint sparsity


• Obtain multiple solutions with the same sparsity pattern

• Useful for multimodal/multichannel data

[Figure: the same measurement picture, but with several measurement and representation columns side by side; all columns of the sparse representation share the same K non-zero locations.]

Learning and sparsity


• How about other models?

• Add sparsity as an extra “regularizer”

• Sparse decompositions

• Sparse NMF, sparse PCA, etc ...

• ICA is already (sort of) sparse!


Recap


• Sparsity and ℓp norms
• Different definitions of sparsity
• Minimum-ℓ1 coefficient algorithms

• Linear programming

• Greedy methods

• Compressive sensing and random projections

• 1-pixel camera


Reading material


• Compressive sensing
• http://dsp.rice.edu/sites/dsp.rice.edu/files/cs/baraniukCSle

• Compressive Sensing page

• http://dsp.rice.edu/cs 

• Experiments with Random Projections

• http://dimacs.rutgers.edu/Research/MMS/PAPERS/rp.


Next lecture


• Deep learning!
• aka neural nets v2.0