Source: …cam6.web.rice.edu/talks/asilomar_v4.pdf


Connecting Bayesian and Denoising-Based Approximate Message Passing

Chris Metzler, Richard Baraniuk

Rice University

Arian Maleki

Columbia University

Compressive Sensing Problem
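The compressive sensing problem the talk targets, in the field's standard notation (not spelled out on this slide): recover a signal from far fewer linear measurements than unknowns,

```latex
y = \Phi x + w, \qquad \Phi \in \mathbb{R}^{m \times n}, \quad m \ll n,
```

which is underdetermined without further assumptions; hence the structural priors listed next.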

Solution: Assume Structure

• Sparse: OMP [Tropp 04], IST [Figueiredo et al. 07], AMP [Donoho et al. 09]

• Minimal Total Variation: TVAL3 [Li et al. 09], TV-AMP [Donoho et al. 13]

• Tree-Sparse: Model-CoSaMP [Baraniuk et al. 10], Turbo-AMP [Som and Schniter 12]

• Nonlocal low-rank: NLR-CS [Dong et al. 14]

Using Structure is Hard

• Write as penalty or constraint

• How to efficiently solve for non-convex R(x)?

• What is R(x) for natural images?

• What is R(x) for RF, microscopy, and other applications?
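The "penalty or constraint" formulation referred to above is, in standard form (a sketch with R and λ as in the slide's bullets, not necessarily the talk's exact notation):

```latex
\hat{x} \;=\; \arg\min_{x}\; \tfrac{1}{2}\,\|y - \Phi x\|_2^2 \;+\; \lambda\, R(x),
```

and the questions above are precisely about choosing R(x) and optimizing over it when it is non-convex.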

This Talk

• Develop algorithm that can easily use almost any structure

• Predict performance with accurate state evolution

• Derive theoretical guarantees
  • Measurements required
  • Robust to noise
  • Optimality and suboptimality

• Demonstrate state-of-the-art performance
  • 10dB better than wavelet sparsity
  • 50x faster than group-sparsity

Insight: Denoisers Use Structure

• Gaussian kernel
  • Smooth
• Soft wavelet thresholding [Donoho and Johnstone 94]
  • Sparse wavelet representation
• BLS-GSM [Portilla et al. 03]
  • Coefficients follow a Gaussian mixture model
• NLM [Buades et al. 05]
  • Correlated structures
• BM3D [Dabov et al. 07]
  • Group-sparse in DCT/wavelet representation
• BM3D-SAPCA [Dabov et al. 09]
  • Group-sparse in an adaptive basis

Denoisers as Black Boxes

[Figure: noisy signal → Denoiser → denoised estimate]
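The black-box view needs nothing from the denoiser beyond a noisy-in, estimate-out interface. A minimal sketch using soft thresholding (from the previous slide) as a stand-in for NLM/BM3D; the signal and noise level are illustrative assumptions:

```python
import numpy as np

def soft_threshold_denoiser(v, sigma):
    """Black-box denoiser interface: noisy signal in, estimate out.
    Soft thresholding with the Donoho-Johnstone universal threshold;
    it implicitly assumes the clean signal is sparse."""
    tau = sigma * np.sqrt(2 * np.log(v.size))
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

# Illustrative usage: denoise a sparse signal corrupted by Gaussian noise.
rng = np.random.default_rng(0)
x_clean = np.zeros(100)
x_clean[:5] = 10.0
sigma = 0.5
x_hat = soft_threshold_denoiser(x_clean + sigma * rng.standard_normal(100), sigma)
```

Any denoiser exposing this interface, however internally complex, can be dropped into the algorithms that follow.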

Denoisers as Projections

[Figure: a denoiser acts as an approximate projection onto the set C of structured signals]

Naïve Algorithm: Denoising-Based Iterative Thresholding (D-IT)

[Figure: D-IT iterations; the denoiser encodes our prior on x]
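D-IT alternates a gradient step toward the measurements with one pass through the denoiser: x^{t+1} = D(x^t + A^T(y − A x^t)). A runnable sketch, again with soft thresholding standing in for an image denoiser; the orthonormal-row A and all sizes are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def d_it(y, A, iters=200, tau=0.05):
    """Denoising-based Iterative Thresholding:
    x^{t+1} = D(x^t + A^T (y - A x^t)), with no Onsager correction."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        residual = y - A @ x
        x = soft_threshold(x + A.T @ residual, tau)  # denoise the pseudo-data
    return x

# Illustrative usage: recover a sparse signal from m < n measurements.
rng = np.random.default_rng(1)
n, m, k = 200, 100, 5
A = np.linalg.qr(rng.standard_normal((n, m)))[0].T  # orthonormal rows => stable unit step
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = 1.0
x_hat = d_it(A @ x_true, A)
```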

Failure of D-IT: Systematic Errors

Systematic Errors: Overshooting

[Figure: successive D-IT estimates overshoot (too high) and undershoot (too low) the true signal]

Use residuals from previous iterations to avoid over/under-shooting.

New Algorithm: D-AMP

Onsager Correction
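Schematically, D-AMP augments the residual with the Onsager term (z^{t−1}/m)·div D. A sketch using the soft-thresholding denoiser, whose divergence is simply the number of nonzeros in its output; the matrix, sizes, and the 2σ threshold rule are illustrative assumptions, not the talk's tuning:

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def d_amp(y, A, iters=30):
    """D-AMP sketch:
      x^{t+1} = D_sigma(x^t + A^T z^t)
      z^{t+1} = y - A x^{t+1} + (z^t / m) * div D   <-- Onsager correction
    For soft thresholding, div D = number of nonzero outputs."""
    m, n = A.shape
    x, z = np.zeros(n), y.copy()
    for _ in range(iters):
        sigma = np.linalg.norm(z) / np.sqrt(m)          # effective noise estimate
        x = soft_threshold(x + A.T @ z, 2.0 * sigma)    # denoiser step
        z = y - A @ x + (z / m) * np.count_nonzero(x)   # Onsager-corrected residual
    return x

# Illustrative usage on the standard AMP setup: iid Gaussian A, unit-norm columns.
rng = np.random.default_rng(2)
n, m, k = 200, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = 1.0
x_hat = d_amp(A @ x_true, A)
```

Note that the new residual is formed from the *old* z, which is how past residuals feed back into the update.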

D-AMP Benefits

D-IT

• Updates proportional to residual (P-controller).

• 5dB improvement over L1

D-AMP

• Updates proportional to current and previous residuals (PI-controller).

• 10dB improvement over L1 (state-of-the-art)

• Onsager Correction


Onsager Correction:
• Where did it come from?
  • Approximation of a message passing algorithm
• Why does it help?
  • z^t stores residuals over many iterations (momentum)
  • Corrects for bias in denoiser solutions
  • Makes errors uncorrelated (Gaussian) and thus easy to remove
• How is it calculated?
  • Approximation from Monte Carlo SURE [Ramani et al. 08]
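The Monte Carlo SURE estimate treats the denoiser as a black box and probes its divergence with a random perturbation. A sketch of that estimator; the probe size eps and the sanity-check denoiser below are illustrative:

```python
import numpy as np

def mc_divergence(denoiser, v, sigma, eps=1e-3, rng=None):
    """Monte Carlo divergence estimate [Ramani et al. 08]:
    div D(v) ~= b^T (D(v + eps*b, sigma) - D(v, sigma)) / eps,  b ~ N(0, I).
    Averaging several draws reduces the variance; this is what lets
    D-AMP form the Onsager correction for black-box denoisers."""
    rng = np.random.default_rng() if rng is None else rng
    b = rng.standard_normal(v.shape)
    return b @ (denoiser(v + eps * b, sigma) - denoiser(v, sigma)) / eps
```

For soft thresholding the true divergence is just the number of entries above threshold, which makes an easy sanity check.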

D-AMP Avoids Systematic Errors

D-AMP Theoretical Properties

• State evolution predicts performance

• Explicit phase transition

• Robust to noise

• No algorithm can uniformly outperform D-AMP

• Single-class suboptimal

• Easy to tune

Bayesian-AMP and Bayesian

State Evolution

• Algorithm

• State Evolution

Denoising-based AMP and

Deterministic State Evolution

• Algorithm

• State Evolution

State Evolution Comparison

• With the minimax denoiser, the suprema of the deterministic and Bayesian state evolutions are equivalent

• Significance of the deterministic state evolution:
  • Can apply without knowing x’s distribution
  • Can apply to natural images and other complex signals
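For reference, the deterministic state evolution in the companion paper ("From Denoising to Compressed Sensing") takes roughly the following form; treat the notation as a sketch, with δ = m/n the sampling ratio and σ_w² the measurement-noise variance:

```latex
\theta^{t+1}(x_o, \delta, \sigma_w^2)
  = \frac{1}{n}\,\mathbb{E}\bigl\| D_{\sigma_t}\!\left(x_o + \sigma_t \epsilon\right) - x_o \bigr\|_2^2,
\qquad
\sigma_t^2 = \sigma_w^2 + \frac{\theta^t}{\delta},
\quad \epsilon \sim \mathcal{N}(0, I).
```

It tracks the denoiser's MSE on the fixed signal x_o rather than an expectation over a prior, which is why it applies without knowing x's distribution.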

State Evolution of D-IT and D-AMP

State Evolution is Accurate for

Many Denoisers

State Evolution for Discontinuous

Denoisers

State Evolution for Smoothed

Discontinuous Denoisers

Main Theoretical Results

• Denoiser Performance

• Phase Transition: Determined by denoiser

• Noise Sensitivity: Graceful failure

• No algorithm can uniformly outperform D-AMP

• D-AMP is single-class suboptimal

Parameter Tuning

• Denoiser parameters

• Problem: Tune denoiser parameters over multiple iterations

• Result: Greedy parameter selection is optimal
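"Greedy is optimal" suggests tuning each iteration's denoiser parameter by minimizing an estimate of the current iteration's MSE and never looking ahead. A sketch for soft thresholding, where SURE gives that estimate in closed form; the grid and test signal are illustrative:

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sure_soft(v, tau, sigma):
    """Stein's unbiased risk estimate of ||D_tau(v) - x||^2 for v = x + N(0, sigma^2 I)."""
    return (-v.size * sigma**2
            + np.sum(np.minimum(v**2, tau**2))
            + 2 * sigma**2 * np.count_nonzero(np.abs(v) > tau))

def greedy_tau(v, sigma, grid):
    """Greedy per-iteration tuning: pick the parameter that minimizes
    the estimated MSE of this denoising step alone."""
    return min(grid, key=lambda tau: sure_soft(v, tau, sigma))
```

In a D-IT/D-AMP loop, `greedy_tau` would be called once per iteration on the pseudo-data before the denoising step.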

3x Under-Sampling

20x Under-Sampling

[Figure: reconstructions with Wavelet Sparsity (L1) vs. BM3D-AMP (our algorithm)]

10x Under-Sampling with Noise

[Figure: reconstructions with NLR-CS vs. BM3D-AMP]

Performance without Noise

Computation Time

[Figure: runtime comparison; annotations: 30x faster, 70x faster]

Performance with Noise

D-AMP Summary

• Arbitrary denoiser
  • NLM
  • BM3D

• Useful state evolution

• State-of-the-art performance

• Resilient to noise

• >97% reduction in average computation time

C. Metzler, A. Maleki, R. G. Baraniuk, “From Denoising to Compressed Sensing,” arXiv:1406.4175.

D-AMP vs. AMP¹, G-AMP², Turbo-AMP³, TV-AMP⁴, GrAMPA⁵, etc.

Similarities

• Same basic AMP iterations

• Solve a series of denoising problems

• Better denoisers lead to better phase transition and noise sensitivity

Differences

• Non-separable denoisers without scale invariance

• Signal x can be denoised but need not have generalized sparsity or a known p_x

• Approximate Onsager correction

• New deterministic state evolution

• State evolution holds for non-separable, but continuous, denoisers

• Derive phase transition and noise sensitivity of non-sparse signals

• Derive optimality/sub-optimality

• Optimal tuning strategy

1. Donoho et al. 09

2. Rangan 12

3. Som and Schniter 12

4. Donoho et al. 13

5. Borgerding et al. 14

Near Proper Denoiser

• Denoiser Performance

• Phase Transition: Determined by denoiser

• Noise Sensitivity: Graceful failure
