
Page 1: Introduction to Compressive Sensing

Introduction to

Compressive Sensing

BY / ENG. AHMED NASSER AHMED

DEMONSTRATOR AT FACULTY OF ENGINEERING

SUEZ CANAL UNIVERSITY


Page 2: Introduction to Compressive Sensing

Contents

1- What is compressive sensing (CS)?

• Sparsity

• Incoherent Sampling

2- Undersampling and Sparse Signal Recovery

3- Robust Compressive Sensing

• Restricted Isometry Property (RIP)

• Random Sensing

• General Data Recovery from Undersampled Data

• Robust Signal Recovery from Noisy Data

4- Compressive Sensing Applications

Page 3: Introduction to Compressive Sensing

Nyquist Rate


The Nyquist criterion: sampling rate > 2 × maximum signal frequency. What about 4K HD videos?!

The solution is: compressed sensing.
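As a quick worked example of the rate formula (a minimal sketch; the 20 kHz audio bandwidth and the 4K frame parameters are familiar illustrative figures, not from the slides):

```python
# Nyquist: to avoid aliasing, the sampling rate must exceed 2 * f_max.
def nyquist_rate(f_max_hz: float) -> float:
    return 2.0 * f_max_hz

print(nyquist_rate(20_000))  # audio with 20 kHz bandwidth -> 40000.0 samples/s

# Why this hurts for video: raw 4K at 8-bit RGB and 60 frames/s
bytes_per_sec = 3840 * 2160 * 3 * 60
print(f"{bytes_per_sec / 1e9:.2f} GB/s of raw samples")  # ~1.49 GB/s
```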

Page 4: Introduction to Compressive Sensing

What is compressive sensing (CS)

Compressive sensing (CS) theory asserts that one can recover certain signals and images from far fewer samples or measurements than traditional methods use.

CS relies on two principles:

1. Sparsity, which pertains to the signal of interest.

2. Incoherence, which pertains to the sensing modality.


Page 5: Introduction to Compressive Sensing

Sparsity

Sparsity expresses the idea that the “information rate” of a continuous

time signal may be much smaller than suggested by its bandwidth, or that

a discrete-time signal depends on a number of degrees of freedom which

is comparably much smaller than its (finite) length.

CS exploits the fact that many natural signals are sparse or compressible in

the sense that they have concise representations when expressed in the

proper basis Ψ.


Page 6: Introduction to Compressive Sensing

Sparsity

The basis Ψ ∈ Rⁿˣⁿ must be an orthonormal basis (orthogonal columns, each normalized to unit length).

2-D axis (R²): the identity basis leaves the coordinates unchanged:

[1 0; 0 1] [2; 3] = [2; 3]

3-D axis (R³):

[1 0 0; 0 1 0; 0 0 1] [2; 4; 5] = [2; 4; 5]
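A minimal NumPy sketch of the same idea: the identity examples above, plus the two properties that make a general basis orthonormal (ΨᵀΨ = I) and let us read coefficients off as x = Ψᵀƒ. The random basis here is just an illustration:

```python
import numpy as np

# The identity basis leaves coordinates unchanged, as in the examples above.
print(np.eye(2) @ np.array([2.0, 3.0]))       # [2. 3.]
print(np.eye(3) @ np.array([2.0, 4.0, 5.0]))  # [2. 4. 5.]

# A general orthonormal basis: Psi.T @ Psi = I, and coefficients are x = Psi.T @ f.
Psi, _ = np.linalg.qr(np.random.randn(4, 4))  # random orthonormal basis of R^4
assert np.allclose(Psi.T @ Psi, np.eye(4))
f = Psi @ np.array([7.0, 0.0, 0.0, 0.0])      # f is 1-sparse in this basis
print(Psi.T @ f)                              # recovers [7. 0. 0. 0.] up to rounding
```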


Page 7: Introduction to Compressive Sensing

Sparsity


Page 8: Introduction to Compressive Sensing

Sparsity

Many natural signals have concise representations

when expressed in a convenient basis.

Consider, for example, the image in Figure 1(a) and

its wavelet transform in (b). Although nearly all the

image pixels have nonzero values, the wavelet

coefficients offer a concise summary: most

coefficients are small, and the relatively few large

coefficients capture most of the information.
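A small sketch of this observation, assuming the PyWavelets package (pywt) is available; the synthetic disk image and the "db4" wavelet are illustrative stand-ins for Figure 1:

```python
import numpy as np
import pywt  # PyWavelets, assumed installed (pip install PyWavelets)

# Piecewise-smooth test image: a bright disk on a dark background.
n = 256
yy, xx = np.mgrid[:n, :n]
img = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2).astype(float)

# 2-D wavelet transform: nearly all pixels are "active", but most wavelet
# coefficients are tiny and the few large ones carry the information.
coeffs = pywt.wavedec2(img, "db4", level=4)
flat = np.concatenate([coeffs[0].ravel()]
                      + [d.ravel() for level in coeffs[1:] for d in level])
small = np.mean(np.abs(flat) < 1e-3 * np.abs(flat).max())
print(f"{small:.1%} of wavelet coefficients are below 0.1% of the peak magnitude")
```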


Page 9: Introduction to Compressive Sensing

Sparsity

If we have a vector ƒ ∈ Rⁿ which we expand in an orthonormal basis Ψ (such as a wavelet basis):

ƒ = Ψx = Σᵢ xᵢ ψᵢ , i = 1, …, n

where:

x is the coefficient sequence of ƒ, with xᵢ = ⟨ƒ, ψᵢ⟩

Ψ is the n × n matrix with ψ₁, …, ψₙ as columns

S-sparse signal: a signal whose coefficient sequence x has at most S nonzero entries.
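A minimal construction under these definitions (the random orthobasis is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n, S = 32, 3
Psi, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal basis of R^n
x = np.zeros(n)                                     # coefficient sequence of f
x[rng.choice(n, S, replace=False)] = rng.standard_normal(S)
f = Psi @ x                                         # f = Psi x is S-sparse in Psi
print(np.count_nonzero(x), "nonzero coefficients out of", n)
```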


Page 10: Introduction to Compressive Sensing

Incoherent Sampling


Page 11: Introduction to Compressive Sensing

Incoherent Sampling

Suppose we are given a pair (Φ, Ψ) of orthobases of Rⁿ. The first basis Φ is used for sensing the object ƒ; the second, Ψ, is used to represent ƒ.

The coherence measures the largest correlation between any two elements of Φ and Ψ:

μ(Φ, Ψ) = √n · max |⟨φₖ, ψⱼ⟩| over 1 ≤ k, j ≤ n , so that μ(Φ, Ψ) ∈ [1, √n].

If Φ and Ψ contain correlated elements, the coherence is large; otherwise, it is small.

The smaller the coherence, the fewer samples are needed.
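A small numerical sketch of μ(Φ, Ψ), assuming NumPy and SciPy are available; the spike/DCT pair and the random orthobasis are illustrative choices:

```python
import numpy as np
from scipy.fft import dct

def coherence(Phi, Psi):
    """mu(Phi, Psi) = sqrt(n) * max_{k,j} |<phi_k, psi_j>| for orthobases of R^n."""
    n = Phi.shape[0]
    return np.sqrt(n) * np.abs(Phi.T @ Psi).max()

n = 256
spikes = np.eye(n)                                # "spike" (identity) basis
dct_basis = dct(np.eye(n), axis=0, norm="ortho")  # orthonormal DCT basis
Q, _ = np.linalg.qr(np.random.randn(n, n))        # random orthobasis

print(coherence(spikes, dct_basis))  # low coherence, about sqrt(2)
print(coherence(spikes, Q))          # about sqrt(2 log n), as on the next slide
```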


Page 12: Introduction to Compressive Sensing

Incoherent Sampling

Random matrices are largely incoherent with any fixed basis Ψ.

Select an orthobasis Φ uniformly at random, which can be done by orthonormalizing n vectors sampled independently and uniformly on the unit sphere. Then with high probability, the coherence between Φ and Ψ is about √(2 log n).

By extension, random waveforms φₖ(t) with independent identically distributed (i.i.d.) entries, e.g., Gaussian or ±1 binary entries, will also exhibit a very low coherence with any fixed representation Ψ.


Page 13: Introduction to Compressive Sensing

Contents

1- What is compressive sensing (CS)?

• Sparsity

• Incoherent Sampling

2- Undersampling and Sparse Signal Recovery

3- Robust Compressive Sensing

• Restricted Isometry Property (RIP)

• Random Sensing

• General Data Recovery from Undersampled Data

• Robust Signal Recovery from Noisy Data

4- Compressive Sensing Applications

Page 14: Introduction to Compressive Sensing

Undersampling and Sparse Signal Recovery

We can use ℓ1-minimization to recover the sparse signal: suppose we observe the m measurements

yₖ = ⟨ƒ, φₖ⟩ , k ∈ M ,

where M ⊂ {1, …, n} is a subset of size m < n. The proposed reconstruction ƒ* is given by ƒ* = Ψx*, where x* is the solution to the convex optimization program

min ||x̃||ℓ1 subject to yₖ = ⟨φₖ, Ψx̃⟩ for all k ∈ M.   (5)

Among all objects consistent with the data, we pick the one whose coefficient sequence has minimal ℓ1 norm.
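A minimal sketch of program (5) in the random-sensing setting, recast as a linear program via the standard split x̃ = u − v with u, v ≥ 0 (assuming NumPy and SciPy; the problem sizes are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. A x = y as an LP with x = u - v, u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                       # objective: sum(u) + sum(v) = ||x||_1
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(0)
n, m, S = 128, 40, 5
x = np.zeros(n)
x[rng.choice(n, S, replace=False)] = rng.standard_normal(S)
A = rng.standard_normal((m, n)) / np.sqrt(m)      # random sensing matrix
x_hat = basis_pursuit(A, A @ x)                   # recover from m << n samples
print("max recovery error:", np.abs(x_hat - x).max())
```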


Page 15: Introduction to Compressive Sensing

Undersampling and Sparse Signal Recovery

The result asserts that when ƒ is sufficiently sparse, recovery via ℓ1-minimization is provably exact.

ℓ1-minimization is not the only way to recover sparse solutions; other methods, such as greedy algorithms, have also been proposed (see the sketch below).
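As one member of the greedy family, here is a minimal Orthogonal Matching Pursuit (OMP) sketch; the known-sparsity stopping rule and the Gaussian test matrix are illustrative assumptions, not the slides' prescription:

```python
import numpy as np

def omp(A, y, S):
    """Greedy recovery: repeatedly pick the column of A most correlated
    with the residual, then re-fit y on the selected columns."""
    support, residual = [], y.copy()
    for _ in range(S):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(1)
n, m, S = 128, 40, 5
x = np.zeros(n)
x[rng.choice(n, S, replace=False)] = rng.standard_normal(S)
A = rng.standard_normal((m, n)) / np.sqrt(m)
print("max recovery error:", np.abs(omp(A, A @ x, S) - x).max())
```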


Page 16: Introduction to Compressive Sensing

Undersampling and Sparse Signal Recovery

Theorem 1:

Fix ƒ ∈ Rⁿ and suppose that the coefficient sequence x of ƒ in the basis Ψ is S-sparse. Select m measurements in the Φ domain uniformly at random. Then if

m ≥ C · μ²(Φ, Ψ) · S · log n

for some positive constant C, the solution to (5) is exact with overwhelming probability (the probability of success exceeds 1 − δ if m ≥ C · μ²(Φ, Ψ) · S · log(n/δ)).


Page 17: Introduction to Compressive Sensing

Undersampling and Sparse Signal Recovery

Theorem 1 calls for three comments:

1) The smaller the coherence, the fewer samples are needed; hence the emphasis on low-coherence systems.

2) One suffers no information loss by measuring just about any set of m coefficients, which may be far fewer than the signal size apparently demands. If μ(Φ, Ψ) is equal or close to one, then on the order of S log n samples suffice instead of n.

3) The signal ƒ can be exactly recovered from our condensed data set by minimizing a convex functional which does not assume any knowledge about the number of nonzero coordinates of x, their locations, or their amplitudes, all of which are completely unknown a priori. We just run the algorithm, and if the signal happens to be sufficiently sparse, exact recovery occurs.


Page 18: Introduction to Compressive Sensing

Contents

1- What is compressive sensing (CS)?

• Sparsity

• Incoherent Sampling

2- Undersampling and Sparse Signal Recovery

3- Robust Compressive Sensing

• Restricted Isometry Property (RIP)

• Random Sensing

• General Data Recovery from Undersampled Data

• Robust Signal Recovery from Noisy Data

4- Compressive Sensing Applications

Page 19: Introduction to Compressive Sensing

Robust Compressive Sensing

In order to be really powerful, CS needs to be able to deal with both nearly sparse signals and with noise: general objects of interest are not exactly sparse but approximately sparse. CS must deal with two issues:

1) First, whether or not it is possible to obtain accurate reconstructions of such objects from highly undersampled measurements.

2) Second, in any real application the measured data will invariably be corrupted by at least a small amount of noise, as sensing devices do not have infinite precision.

In general, y = Ax + z

where:

A = RΦΨ is an m × n "sensing matrix" giving us information about x

R is the m × n matrix extracting the m sampled coordinates in M

z is a stochastic or deterministic unknown error term
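A concrete sketch of this model with illustrative choices (Ψ a DCT basis, Φ the spike basis, R a random row selector; assuming NumPy and SciPy):

```python
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(2)
n, m, S = 256, 64, 5

Psi = dct(np.eye(n), axis=0, norm="ortho")   # sparsity basis (orthonormal DCT)
Phi = np.eye(n)                              # sensing basis (spikes)
rows = rng.choice(n, m, replace=False)       # M: the m sampled coordinates
A = (Phi @ Psi)[rows, :]                     # A = R Phi Psi, an m x n sensing matrix

x = np.zeros(n)                              # sparse coefficient sequence
x[rng.choice(n, S, replace=False)] = rng.standard_normal(S)
z = 0.01 * rng.standard_normal(m)            # small measurement noise
y = A @ x + z                                # the observed data
```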


Page 20: Introduction to Compressive Sensing

Restricted Isometry Property (RIP)

The Restricted Isometry Property (RIP) has proved very useful for studying the general robustness of CS.

The RIP measures the orthogonality of all subsets of S columns taken from A, which is crucial for sparse signal recovery.

For each integer S = 1, 2, . . . , define the isometry constant δ_S of a matrix A as the smallest number such that

(1 − δ_S) ||x||ℓ2² ≤ ||Ax||ℓ2² ≤ (1 + δ_S) ||x||ℓ2²

holds for all S-sparse vectors x.

We will loosely say that a matrix A obeys the RIP of order S if δ_S is not too close to one.

When the RIP holds, A approximately preserves the Euclidean length of S-sparse signals, which in turn implies that S-sparse vectors cannot be in the null space of A.

Equivalently, it means that all subsets of S columns taken from A are in fact nearly orthogonal.
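Computing δ_S exactly requires checking every S-column subset, which is intractable; the sketch below just samples random subsets of a Gaussian A and reports an empirical lower bound on δ_S (sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, S = 60, 200, 6
A = rng.standard_normal((m, n)) / np.sqrt(m)

worst = 0.0
for _ in range(2000):
    cols = rng.choice(n, S, replace=False)           # a random subset of S columns
    s = np.linalg.svd(A[:, cols], compute_uv=False)  # singular values of the submatrix
    worst = max(worst, s.max() ** 2 - 1.0, 1.0 - s.min() ** 2)
print("empirical lower bound on delta_S:", worst)    # small -> columns nearly orthogonal
```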


Page 21: Introduction to Compressive Sensing

Random Sensing

We would like to find sensing matrices such that column vectors taken from arbitrary subsets are nearly orthogonal. The following sensing matrices can be considered (a sketch of constructions 1, 2, and 4 follows below):

1. Form A by sampling n column vectors uniformly at random on the unit sphere of Rᵐ.

2. Form A by sampling i.i.d. entries from the normal distribution with mean 0 and variance 1/m.

3. Form A by sampling a random projection P as in "Incoherent Sampling" and normalizing: A = √(n/m) · P.

4. Form A by sampling i.i.d. entries from a symmetric Bernoulli distribution (P(Aᵢⱼ = ±1/√m) = 1/2) or another sub-Gaussian distribution.

5. The RIP can also hold for sensing matrices A = ΦΨ, where Ψ is an arbitrary orthobasis and Φ is an m × n measurement matrix drawn randomly from a suitable distribution.

With overwhelming probability, all these matrices obey the RIP provided that

m ≥ C · S · log(n/S).
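A short sketch of constructions 1, 2, and 4 from the list above (NumPy assumed; m and n are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 64, 256

# 1) columns sampled uniformly at random on the unit sphere of R^m
A1 = rng.standard_normal((m, n))
A1 /= np.linalg.norm(A1, axis=0)

# 2) i.i.d. Gaussian entries with mean 0 and variance 1/m
A2 = rng.standard_normal((m, n)) / np.sqrt(m)

# 4) symmetric Bernoulli entries, P(A_ij = +/- 1/sqrt(m)) = 1/2
A4 = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
```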


Page 22: Introduction to Compressive Sensing

General Data Recovery from Undersampled Data

Theorem 2:

Assume that the RIP holds with δ_2S < √2 − 1. Then the solution x* to the linear program

min ||x̃||ℓ1 subject to Ax̃ = y

gives an accurate reconstruction of the sparse signal: x* obeys

||x* − x||ℓ2 ≤ C₀ · ||x − x_S||ℓ1 / √S and ||x* − x||ℓ1 ≤ C₀ · ||x − x_S||ℓ1 ,

where x_S is the vector x with all but the largest S components set to 0.


Page 23: Introduction to Compressive Sensing

General Data Recovery from Undersampled Data

The conclusions of Theorem 2 are stronger than those of Theorem 1:

1. The new theorem deals with all signals. If x is not S-sparse, the theorem asserts that the quality of the recovered signal is as good as if one knew ahead of time the locations of the S largest values of x and decided to measure those directly.

2. In other words, the reconstruction is nearly as good as that provided by an oracle which, with full and perfect knowledge about x, extracts the S most significant pieces of information for us.

3. Another striking difference from the earlier result is that it is deterministic; it involves no probability. If we are fortunate enough to hold a sensing matrix A obeying the hypothesis of the theorem, we are guaranteed to recover all S-sparse vectors exactly, and essentially the S largest entries of all vectors otherwise; i.e., there is no probability of failure.


Page 24: Introduction to Compressive Sensing

Robust Signal Recovery from Noisy Data

Theorem 3:

Suppose we are given noisy data y = Ax + z with ||z||ℓ2 ≤ ε, and we use ℓ1-minimization with relaxed constraints for reconstruction:

min ||x̃||ℓ1 subject to ||Ax̃ − y||ℓ2 ≤ ε.

Assuming δ_2S < √2 − 1, the solution x* obeys

||x* − x||ℓ2 ≤ C₁ · ||x − x_S||ℓ1 / √S + C₂ · ε ,

where the constants C₁ and C₂ are typically small.
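The relaxed program is commonly solved in its Lagrangian (LASSO) form; below is a minimal sketch assuming scikit-learn is available, with the penalty α standing in for the noise level ε (all constants are illustrative):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n, m, S, sigma = 256, 80, 8, 0.01
x = np.zeros(n)
x[rng.choice(n, S, replace=False)] = rng.standard_normal(S)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x + sigma * rng.standard_normal(m)   # noisy data, y = Ax + z

# Lagrangian form of: min ||x||_1 subject to ||Ax - y||_2 <= eps
lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000)
lasso.fit(A, y)
print("l2 recovery error:", np.linalg.norm(lasso.coef_ - x))
```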


Page 25: Introduction to Compressive Sensing

Applications

Data Compression

Channel Coding

Inverse Problems

Data Acquisition

Wireless Channel Estimation


Page 26: Introduction to Compressive Sensing

References

[1] E. J. Candès and M. B. Wakin, "An introduction to compressive sampling," IEEE Signal Processing Magazine, vol. 25, no. 2, pp. 21-30, 2008.

[2] R. G. Baraniuk, "Compressive sensing," IEEE Signal Processing Magazine, vol. 24, no. 4, 2007.


Page 27: Introduction to Compressive Sensing


Thank You

Contact me:

Web site: www.ahmed_nasser_eng.staff.scuegypt.edu.eg

Email:

[email protected]

[email protected]