TRANSCRIPT
[Page 1]
Tutorial: Sparse Signal Recovery
Anna C. Gilbert
Department of Mathematics, University of Michigan
[Page 2]
(Sparse) Signal recovery problem
[Diagram: Φx = y. x is the signal or population, length N, with k important features; y is the vector of measurements or tests, length m; Φ is the m × N measurement matrix.]
Under-determined linear system: Φx = y
Given Φ and y, recover information about x
Solving least squares, arg min ‖Φx − y‖_2, returns a non-sparse x
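A quick numerical illustration (a minimal NumPy sketch, not from the slides): the minimum-norm least-squares solution of an under-determined system is generically dense, even when the true x is k-sparse.

```python
# Minimum-norm least squares on an under-determined system y = Phi x:
# the result is dense even though the true signal is k-sparse.
import numpy as np

rng = np.random.default_rng(0)
N, m, k = 200, 40, 5                 # signal length, measurements, sparsity

x = np.zeros(N)
x[rng.choice(N, k, replace=False)] = rng.standard_normal(k)  # k-sparse signal

Phi = rng.standard_normal((m, N)) / np.sqrt(m)               # measurement matrix
y = Phi @ x

x_ls = np.linalg.pinv(Phi) @ y       # arg min ||x||_2 s.t. Phi x = y
print("nonzeros in x:   ", np.count_nonzero(x))                      # k
print("nonzeros in x_ls:", np.count_nonzero(np.abs(x_ls) > 1e-8))    # ~N, not k
```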
[Page 3]
Two main examples: group testing and compressed sensing
Φx = y
Group testing:
• Φ binary = pooling design
• x binary, 1 ⟹ defective, 0 ⟹ acceptable
• OUTPUT: defective set
• success = find defective set exactly
• Arithmetic: OR
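To make the OR arithmetic concrete, here is a hedged toy sketch (NumPy; parameters illustrative). The eliminate-by-negative-pools decoder is the classic COMP-style rule, used here only for illustration; it may return a few false positives.

```python
# Group testing with OR arithmetic: a pool is positive iff it contains a
# defective. Decode by declaring "possibly defective" every item that never
# appears in a negative pool.
import numpy as np

rng = np.random.default_rng(1)
N, m, k = 100, 30, 2                           # items, pools, defectives

Phi = (rng.random((m, N)) < 0.1).astype(int)   # random pooling design
x = np.zeros(N, dtype=int)
x[rng.choice(N, k, replace=False)] = 1         # 1 = defective

y = (Phi @ x > 0).astype(int)                  # OR arithmetic, not sums

# An item survives only if every pool containing it tested positive.
candidates = np.all(Phi[y == 0] == 0, axis=0)
print("true defectives:      ", np.flatnonzero(x))
print("decoded candidate set:", np.flatnonzero(candidates))
```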
[Page 4]
Example: Combinatorial group testing
Rat dies only 1 week after drinking poisoned wine
[Page 5]
Being good (computer) scientists, they do the following:
[Page 6]
[Page 7]
Unique encoding of each bottle
[Page 8]
If bottle 5 were poisoned...
[Page 9]
...after 1 week
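The trick behind the story (a small sketch, assuming the classic binary-encoding scheme): with r rats one can find the single poisoned bottle among 2^r bottles in one round, because the set of dead rats spells out the bottle's index in binary.

```python
# Binary encoding of bottles: rat j drinks from every bottle whose j-th
# binary bit is 1. After a week, the dead rats encode the poisoned index.
import math

n_bottles, poisoned = 8, 5                    # bottle 5, as on the slides
n_rats = math.ceil(math.log2(n_bottles))      # 3 rats suffice for 8 bottles

dead = [(poisoned >> j) & 1 for j in range(n_rats)]      # outcomes after 1 week
decoded = sum(bit << j for j, bit in enumerate(dead))    # read off the index
print("dead rats:", dead, "-> poisoned bottle:", decoded)  # -> 5
```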
[Page 10]
Two main examples: group testing and compressed sensing
Φ x = y
Compressed sensing:
• Φ = measurement matrix
• x = signal, sparse/compressible
• OUTPUT: x̂, a good approximation to x
• success = ‖x − x̂‖_p versus ‖x − x_k‖_q
• Arithmetic: ℝ or ℂ
[Figure: sorted coefficients of x; the "head" is the k largest entries, forming x_k, and the "tail" is x − x_k, with k ≪ N.]
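In code, the head/tail split is just a top-k selection; this small NumPy sketch (illustrative decay profile, not from the slides) computes x_k and the tail norms that appear in the error guarantees.

```python
# Best k-term approximation x_k: keep the k largest-magnitude entries (head);
# the approximation error lives in the tail x - x_k.
import numpy as np

rng = np.random.default_rng(2)
N, k = 100, 10
x = rng.standard_normal(N) * (0.9 ** np.arange(N))   # compressible decay

head = np.argsort(np.abs(x))[-k:]      # indices of the k largest entries
x_k = np.zeros(N)
x_k[head] = x[head]

print("tail l1 norm ||x - x_k||_1:", np.linalg.norm(x - x_k, 1))
print("tail l2 norm ||x - x_k||_2:", np.linalg.norm(x - x_k, 2))
```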
[Page 11]
Biological applications are a combination of these examples
• Sparse (binary?) matrices
• Quantitative (non-Boolean) measurements
• Noisy or missing measurements
• Various signal models
[Page 12]
Theoretical design problems: jointly design matrices and algorithms
Design/characterize Φ with m < N rows and a recovery algorithm s.t. the output x̂ is close to a good approximation of x:
‖x̂ − x‖_p ≤ C ‖x − x_k‖_q.
[Page 13]
Algorithmic resources
• Computational time: return x̂ in time proportional to N or to m? (impacts number of items returned)
• Computational memory: use working space proportional to N or to m?
• Small-space random constructions: can we generate rows/cols of Φ from collections of random variables with limited independence?
• Explicit constructions: how do we construct Φ? Randomly or deterministically? Time to calculate each entry?
• Matrix density: number of non-zero entries in Φ?
[Page 14]
Signal models
• Adversarial or "for all": recover all x with the guarantee
‖x̂ − x‖_2 ≤ (C/√k) ‖x − x_k‖_1.
When x is compressible, i.e., ‖x − x_k‖_1 ≤ √k ‖x − x_k‖_2, we obtain the better result
‖x̂ − x‖_2 ≤ C ‖x − x_k‖_2.
• Probabilistic or "for each": recover all x that satisfy a statistical constraint:
  - a fixed signal x, recovered with high probability over the construction of Φ
  - uniform distribution over k-sparse signals, i.e., a random signal model
[Page 15]
Empirical algorithmic performance
• Polynomial time algorithms: typically fewer measurements required for desired accuracy, by a constant factor
1. Convex optimization: constrained minimization (e.g., Bregman iteration)
arg min ‖x‖_1 s.t. ‖y − Φx‖_2 ≤ δ.
Unconstrained minimization (e.g., ℓ1-regularized least squares)
minimize L(x; γ, y) = (1/2) ‖y − Φx‖_2² + γ ‖x‖_1.
2. Greedy algorithms: iterative thresholding, subspace iteration
z = Φ*y, then threshold z: keep the largest T entries (a toy sketch follows this list)
• Sublinear time algorithms: for sparse, binary matrices only (currently)
Strive to work in space and time proportional to m, with specially designed matrices/distributions and algorithms
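To make item 2 above concrete, here is a toy iterative hard thresholding (IHT) sketch in NumPy. The unit step size and the parameter choices are assumptions for illustration only; this is not the exact variant analyzed in the works cited later, and real uses may need a tuned step size or more iterations.

```python
# Toy iterative hard thresholding: gradient step, then keep the top-k entries.
import numpy as np

def iht(Phi, y, k, iters=200):
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        z = x + Phi.T @ (y - Phi @ x)       # z = x + Phi^T (residual)
        keep = np.argsort(np.abs(z))[-k:]   # hard threshold: largest k entries
        x = np.zeros_like(z)
        x[keep] = z[keep]
    return x

rng = np.random.default_rng(3)
N, m, k = 200, 100, 4
Phi = rng.standard_normal((m, N)) / np.sqrt(m)   # columns have near-unit norm
x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k)

x_hat = iht(Phi, Phi @ x_true, k)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```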
[Page 16]
Sparse matrices: Expander graphs
[Figure: bipartite graph with a set S of left vertices and its neighbor set N(S) on the right.]
• Adjacency matrix Φ of a d-regular (k, ε) expander graph
• Graph G = (X, Y, E), |X| = n, |Y| = m
• For any S ⊂ X with |S| ≤ k, the neighbor set satisfies |N(S)| ≥ (1 − ε) d |S|
• Probabilistic construction: d = O(log(n/k)/ε), m = O(k log(n/k)/ε²)
• Deterministic construction: d = O(2^{O(log³(log(n)/ε))}), m = (k/ε) · 2^{O(log³(log(n)/ε))}
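A minimal sketch of the probabilistic construction (NumPy; parameters illustrative): each left vertex picks d random right neighbors, yielding a sparse binary Φ with exactly d ones per column. Whether the resulting graph actually expands is a probabilistic statement, not checked here.

```python
# Random d-regular-on-the-left bipartite adjacency matrix: columns are left
# vertices, rows are right vertices, d ones per column.
import numpy as np

def random_bipartite_adjacency(n, m, d, seed=0):
    rng = np.random.default_rng(seed)
    Phi = np.zeros((m, n), dtype=int)
    for col in range(n):
        rows = rng.choice(m, size=d, replace=False)  # d distinct neighbors
        Phi[rows, col] = 1
    return Phi

Phi = random_bipartite_adjacency(n=256, m=40, d=8)
print(Phi.sum(axis=0)[:10])   # every column has exactly d = 8 ones
```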
[Page 17]
Bipartite graph and its adjacency matrix:
1 0 1 1 0 1 1 0
0 1 0 1 1 1 0 1
0 1 1 0 0 1 1 1
1 0 0 1 1 0 0 1
1 1 1 0 1 0 1 0
Measurement matrix (larger example):
[Image: spy plot of a larger sparse binary measurement matrix, roughly 20 rows × 250 columns.]
[Page 18]
RIP(p)
A measurement matrix Φ satisfies the RIP(p, k, δ) property if, for any k-sparse vector x,
(1 − δ) ‖x‖_p ≤ ‖Φx‖_p ≤ (1 + δ) ‖x‖_p.
Examples:
• RIP(2): Gaussian random matrix, random rows of the discrete Fourier matrix
• RIP(1): adjacency matrix of expander
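A rough empirical probe of RIP(2) (a NumPy sketch under illustrative parameters): sample random k-sparse unit vectors and watch how far ‖Φx‖_2 strays from 1. This only illustrates the bound; certifying RIP would require checking all supports, which is intractable.

```python
# Empirical RIP(2) probe for a scaled Gaussian matrix: for random k-sparse
# unit vectors x, ||Phi x||_2 should stay within [1 - delta, 1 + delta].
import numpy as np

rng = np.random.default_rng(4)
N, m, k = 256, 80, 5
Phi = rng.standard_normal((m, N)) / np.sqrt(m)   # scaled Gaussian matrix

ratios = []
for _ in range(1000):
    x = np.zeros(N)
    S = rng.choice(N, k, replace=False)
    x[S] = rng.standard_normal(k)
    x /= np.linalg.norm(x)                       # unit-norm k-sparse vector
    ratios.append(np.linalg.norm(Phi @ x))

print("min, max of ||Phi x||_2:", min(ratios), max(ratios))  # both near 1
```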
[Page 19]
RIP(p) ⇐⇒ expander
Theorem. (k, ε) expansion implies
(1 − 2ε) d ‖x‖_1 ≤ ‖Φx‖_1 ≤ d ‖x‖_1
for any k-sparse x, where d is the degree of the expander.
Theorem. RIP(1) + binary sparse matrix implies a (k, ε) expander for
ε = (1 − 1/(1 + δ)) / (2 − √2).
Joint work with Berinde, Indyk, Karloff, Strauss
[Page 20]
RIP(2) and LP decoding
Let Φ be an RIP(2) matrix and collect measurements Φx = y. Then LP recovers a good approximation to the top k entries of x:
arg min ‖x‖_1 s.t. Φx = y
LP returns x̂ with ‖x̂ − x‖_2 ≤ (C/√k) ‖x − x_k‖_1. A noise-resilient version holds too. [Candès, Romberg, Tao; Donoho]
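A hedged sketch of this LP in SciPy: the split x = x⁺ − x⁻ is the standard reduction of basis pursuit to a linear program. The name lp_decode is illustrative, and a dedicated ℓ1 solver would be preferable in practice.

```python
# Basis pursuit via linprog: min ||x||_1 s.t. Phi x = y becomes a standard
# LP by writing x = xp - xm with xp, xm >= 0 and minimizing sum(xp) + sum(xm).
import numpy as np
from scipy.optimize import linprog

def lp_decode(Phi, y):
    m, N = Phi.shape
    c = np.ones(2 * N)                      # objective: sum(xp) + sum(xm)
    A_eq = np.hstack([Phi, -Phi])           # Phi (xp - xm) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N))
    return res.x[:N] - res.x[N:]

rng = np.random.default_rng(5)
N, m, k = 120, 50, 4
Phi = rng.standard_normal((m, N)) / np.sqrt(m)
x = np.zeros(N)
x[rng.choice(N, k, replace=False)] = rng.standard_normal(k)

x_hat = lp_decode(Phi, Phi @ x)
print("recovery error:", np.linalg.norm(x_hat - x))
```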
[Page 21]
Expansion ⟹ LP decoding
Theorem. Let Φ be the adjacency matrix of a (2k, ε) expander. Consider two vectors x, x* such that Φx = Φx* and ‖x*‖_1 ≤ ‖x‖_1. Then
‖x − x*‖_1 ≤ (2/(1 − 2α(ε))) ‖x − x_k‖_1,
where x_k is the optimal k-term representation for x and α(ε) = 2ε/(1 − 2ε).
• Guarantees that the linear program recovers a good sparse approximation
• Noise-resilient version too
[Page 22]
RIP(1) ≠ RIP(2)
• Any binary sparse matrix which satisfies RIP(2) must have Ω(k²) rows [Chandar '07]
• A Gaussian random matrix with m = O(k log(n/k)) rows (scaled) satisfies RIP(2) but not RIP(1):
x^T = (0 ⋯ 0 1 0 ⋯ 0), y^T = (1/k ⋯ 1/k 0 ⋯ 0)
‖x‖_1 = ‖y‖_1 but ‖Gx‖_1 ≈ √k ‖Gy‖_1 (a numerical check follows this list)
• For biological applications, it may be useful to use sparse, binary RIP(1)+RIP(2) matrices with many rows.
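The numerical check promised above (NumPy; sizes illustrative): the spike x and the spread-out y have equal ℓ1 norm, yet ‖Gx‖_1 exceeds ‖Gy‖_1 by roughly √k, so no two-sided ℓ1 bound like RIP(1) can hold for a Gaussian G.

```python
# l1 norms under a Gaussian G: a unit spike vs. k entries of 1/k.
import numpy as np

rng = np.random.default_rng(6)
N, m, k = 400, 100, 64
G = rng.standard_normal((m, N))

x = np.zeros(N); x[0] = 1.0               # single spike
y = np.zeros(N); y[:k] = 1.0 / k          # k entries of 1/k

print("||x||_1 =", np.linalg.norm(x, 1), " ||y||_1 =", np.linalg.norm(y, 1))
print("||Gx||_1 / ||Gy||_1 ~", np.linalg.norm(G @ x, 1) / np.linalg.norm(G @ y, 1),
      " sqrt(k) =", np.sqrt(k))
```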
[Page 23]
RIP(1) ⟹ LP decoding
• ℓ1 uncertainty principle
Lemma. Let y satisfy Φy = 0 and let S be the set of the k largest coordinates of y. Then
‖y_S‖_1 ≤ α(ε) ‖y‖_1.
• LP guarantee
Theorem. Consider any two vectors u, v such that y = u − v satisfies Φy = 0 and ‖v‖_1 ≤ ‖u‖_1. Let S be the set of the k largest entries of u. Then
‖y‖_1 ≤ (2/(1 − 2α(ε))) ‖u_{S^c}‖_1.
[Page 24]
Greedy, iterative algorithms for sparse matrices
• Polynomial time algorithms: several variants, similar to IHT [Berinde, Indyk, Ružić]
• Sublinear time algorithms: several variants, depending on norms and signal models [Gilbert, Li, Porat, Strauss; Porat, Strauss; Ngo, Porat, Rudra; Price, Woodruff]
[Page 25]
High Throughput Screening (HTS)
HTS is an essential step in drug discovery (and elsewhere in biology).
• Large chemical libraries screened on a biological target for activity
• Basic yes/no (0, 1) type biological assays to find active compounds
• Usually a small number of compounds found
• One-at-a-time screening: power comes from automation and miniaturization
• Noisy assays with false positive and false negative errors
[Page 26]
Current HTS Design
Current HTS uses a one-at-a-time testing scheme (with repeated trials).
[Page 27]
Mathematical set-up
• Binary measurement matrix: adjacency matrix of an unbalanced expander graph [Berinde et al. (2008), Combining geometry and combinatorics: a unified approach to sparse signal recovery]
• Appropriate linear biochemical model
• Decoding via linear programming
[Page 28]
Potential Advantages
• Propose the use of pooled testing of compounds
• Uses fewer tests
• Work is moved from testing (costly) to computational analysis (cheap)
• Handles errors in testing better due to built-in replication
• Additional quantitative information
[Page 29]
HTS and signal recovery
To use the compressed sensing approach, we need a linear model.
[Page 30]
Experimental set-up
• Deterministic binary matrix = Shifted Transversal Design (STD)
• Time-resolved FRET screen to identify inhibitors of RGS4C
• Singleplex: 316 compounds screened twice
• Multiplex: 316 compounds screened 4 per well using 316 pooled wells (4×)
[Page 31]
Experimental results
• Multiplex results match Singleplex 1 very closely
• Decoding in the presence of large errors
[Page 32]
Theoretical → Practical challenges in HTS applications
• Accuracy of output, not speed of algorithm
• Joint matrix and algorithm design: maximize recovered information within budget
• Incorporate prior information
• Incorporate noise
• Incorporate measurement device configuration