
ELE 538B: Mathematics of High-Dimensional Data

Spectral methods

Yuxin Chen

Princeton University, Fall 2018


Outline

• A motivating application: graph clustering

• Distance and angles between two subspaces

• Eigen-space perturbation theory

• Extension: singular subspaces

• Extension: eigen-space for asymmetric transition matrices

Spectral methods 2-2


A motivating application: graph clustering


Graph clustering / community detection

Community structures are common in many social networks

figure credits: The Future Buzz; S. Papadopoulos

Goal: partition users into several clusters based on their friendships / similarities

Spectral methods 2-4


A simple model: stochastic block model (SBM)

xi = 1: 1st community;  xi = −1: 2nd community

• n nodes {1, · · · , n}

• 2 communities

• n unknown variables: x1, · · · , xn ∈ {1, −1}

◦ encode community memberships

Spectral methods 2-5


A simple model: stochastic block model (SBM)

• observe a graph G:

(i, j) ∈ G with prob. { p, if i and j are from the same community; q, else }

Here, p > q and p, q ≳ log n / n

• Goal: recover community memberships of all nodes, i.e. {xi}

Spectral methods 2-6


Adjacency matrix

Consider the adjacency matrix A ∈ {0, 1}n×n of G:

Ai,j = { 1, if (i, j) ∈ G; 0, else }

• WLOG, suppose x1 = · · · = xn/2 = 1 and xn/2+1 = · · · = xn = −1

Spectral methods 2-7


Adjacency matrix

A = E[A] (rank 2) + (A − E[A])

E[A] = [ p11ᵀ  q11ᵀ ;  q11ᵀ  p11ᵀ ] = (p + q)/2 · 11ᵀ (uninformative bias) + (p − q)/2 · [1; −1] · [1ᵀ, −1ᵀ]

where [1; −1] = x := [xi]1≤i≤n

Spectral methods 2-8


Spectral clustering

A = E[A] (rank 2) + (A − E[A])

1. compute the leading eigenvector u = [ui]1≤i≤n of A − (p + q)/2 · 11ᵀ

2. rounding: output xi = { 1, if ui > 0; −1, if ui < 0 }

Spectral methods 2-9
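To make the two-step procedure concrete, here is a minimal numpy simulation; all parameter choices (n, p, q, the RNG seed) are illustrative and not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 1000, 0.2, 0.05                  # two communities of size n/2, p > q
x = np.concatenate([np.ones(n // 2), -np.ones(n // 2)])  # true memberships

# sample the SBM adjacency matrix A (symmetric, zero diagonal)
prob = np.where(np.equal.outer(x, x), p, q)
upper = np.triu(rng.random((n, n)) < prob, k=1)
A = (upper | upper.T).astype(float)

# step 1: leading eigenvector of A - (p+q)/2 * 11^T
M_tilde = A - (p + q) / 2 * np.ones((n, n))
eigvals, eigvecs = np.linalg.eigh(M_tilde)   # eigenvalues in ascending order
u = eigvecs[:, -1]                           # leading eigenvector

# step 2: rounding by sign
x_hat = np.where(u > 0, 1.0, -1.0)
accuracy = max(np.mean(x_hat == x), np.mean(-x_hat == x))  # up to global sign
print(f"fraction of correctly clustered nodes: {accuracy:.3f}")
```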


Spectral clustering

Rationale: recovery is reliable if the perturbation A − E[A] is sufficiently small

• if A − E[A] = 0, then

u ∝ ± [1; −1] ⟹ perfect clustering

Question: how to quantify the effect of perturbation A−E[A] on u?

Spectral methods 2-10


Distance and angles between two subspaces


Setup and notation

Consider two symmetric matrices M and M̃ = M + H ∈ Rⁿˣⁿ with eigendecompositions

M = ∑_{i=1}^n λᵢ uᵢuᵢᵀ   and   M̃ = ∑_{i=1}^n λ̃ᵢ ũᵢũᵢᵀ

where λ1 ≥ · · · ≥ λn and λ̃1 ≥ · · · ≥ λ̃n. For simplicity, write

M = [U0, U1] · diag(Λ0, Λ1) · [U0ᵀ; U1ᵀ],   M̃ = [Ũ0, Ũ1] · diag(Λ̃0, Λ̃1) · [Ũ0ᵀ; Ũ1ᵀ]

Here, U0 = [u1, · · · , ur], Λ0 = diag([λ1, · · · , λr]), · · ·

Spectral methods 2-12


Setup and notation

M = [ u1 · · · ur  ur+1 · · · un ] · diag( λ1, · · · , λr, λr+1, · · · , λn ) · [ u1, · · · , un ]ᵀ

where the first r columns u1, · · · , ur form U0 and the remaining columns ur+1, · · · , un form U1; likewise Λ0 = diag(λ1, · · · , λr) and Λ1 = diag(λr+1, · · · , λn)

Spectral methods 2-13


Setup and notation

• ‖M‖: spectral norm (largest singular value of M)

• ‖M‖F: Frobenius norm ( ‖M‖F = √(tr(MᵀM)) = √(∑i,j M²i,j) )

Spectral methods 2-14


Eigen-space perturbation theory

Main focus: how does the perturbation H affect the distance between U0 and Ũ0?

Question #0: how to define distance between two subspaces?

• ‖Ũ0 − U0‖F and ‖Ũ0 − U0‖ are not appropriate metrics, since they fall short of accounting for global orthonormal transformations (for any orthonormal R ∈ Rʳˣʳ, U and UR represent the same subspace)

Spectral methods 2-15


Distance between two eigen-spaces

One metric that takes care of global orthonormal transformation is

dist(X0, Z0) := ‖X0X0ᵀ − Z0Z0ᵀ‖   (2.1)

This metric has several equivalent expressions:

Lemma 2.1

Suppose X := [X0, X1] and Z := [Z0, Z1] are orthonormal matrices, where X1 and Z1 span the respective complement subspaces. Then

dist(X0, Z0) = ‖X0ᵀZ1‖ = ‖Z0ᵀX1‖

• sanity check: if X0 = Z0, then dist(X0, Z0) = ‖X0ᵀZ1‖ = 0

Spectral methods 2-16
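A quick numerical sanity check of definition (2.1) and Lemma 2.1, using random orthonormal matrices built via QR (a sketch; the sizes n and r are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 8, 3
X, _ = np.linalg.qr(rng.standard_normal((n, n)))   # X = [X0, X1]
Z, _ = np.linalg.qr(rng.standard_normal((n, n)))   # Z = [Z0, Z1]
X0, X1 = X[:, :r], X[:, r:]
Z0, Z1 = Z[:, :r], Z[:, r:]

d1 = np.linalg.norm(X0 @ X0.T - Z0 @ Z0.T, 2)      # definition (2.1)
d2 = np.linalg.norm(X0.T @ Z1, 2)                  # Lemma 2.1
d3 = np.linalg.norm(Z0.T @ X1, 2)
print(d1, d2, d3)                                  # all three coincide

# invariance: X0 @ R spans the same subspace for any orthonormal R
R, _ = np.linalg.qr(rng.standard_normal((r, r)))
print(np.linalg.norm((X0 @ R) @ (X0 @ R).T - X0 @ X0.T, 2))   # ~ 0
```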


Principal angles between two eigen-spaces

In addition to “distance”, one might also be interested in “angles”


We can quantify the similarity between two lines (represented resp. by unit vectors x0 and z0) by the angle between them

θ = arccos〈x0, z0〉

Spectral methods 2-17


Principal angles between two eigen-spaces

For r-dimensional subspaces, one needs r angles

Specifically, given that ‖X0ᵀZ0‖ ≤ 1 (so all its singular values lie in [0, 1]), we write the SVD of X0ᵀZ0 ∈ Rʳˣʳ as

X0ᵀZ0 = U · diag( cos θ1, · · · , cos θr ) · Vᵀ := U cos Θ Vᵀ

where {θ1, · · · , θr} are called the principal angles between X0 and Z0

Spectral methods 2-18
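Numerically, the principal angles are just the arccosines of the singular values of X0ᵀZ0; the sketch below (with random X0, Z0 as before) computes them and checks, ahead of the next slide, how they relate to dist(·, ·):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 8, 3
X, _ = np.linalg.qr(rng.standard_normal((n, n)))
Z, _ = np.linalg.qr(rng.standard_normal((n, n)))
X0, Z0, Z1 = X[:, :r], Z[:, :r], Z[:, r:]

sigma = np.linalg.svd(X0.T @ Z0, compute_uv=False)  # = cos(theta_1..theta_r)
theta = np.arccos(np.clip(sigma, -1.0, 1.0))        # principal angles

print(np.max(np.abs(np.sin(theta))))                # ||sin Theta||
print(np.linalg.norm(X0.T @ Z1, 2))                 # same value (Lemma 2.2)
```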


Relations between principal angles and dist(·, ·)

As expected, principal angles and distance are closely related

Lemma 2.2

Suppose X := [X0, X1] and Z := [Z0, Z1] are orthonormal matrices. Then

‖X0ᵀZ1‖ = ‖sin Θ‖ = max{ |sin θ1|, · · · , |sin θr| }

Lemmas 2.1 and 2.2 taken collectively give

dist(X0, Z0) = max{ |sin θ1|, · · · , |sin θr| }   (2.2)

Spectral methods 2-19


Proof of Lemma 2.2

‖X0ᵀZ1‖ = ‖X0ᵀ Z1Z1ᵀ X0‖^(1/2)

= ‖X0ᵀX0 − X0ᵀZ0Z0ᵀX0‖^(1/2)   (since Z1Z1ᵀ = I − Z0Z0ᵀ)

= ‖I − U cos²Θ Uᵀ‖^(1/2)   (since X0ᵀZ0 = U cos Θ Vᵀ)

= ‖I − cos²Θ‖^(1/2)

= ‖sin Θ‖

Spectral methods 2-20

Proof of Lemma 2.1

We first claim that the SVD of X1ᵀZ0 can be written as

X1ᵀZ0 = Ũ sin Θ Vᵀ   (2.3)

for some orthonormal Ũ (to be proved later). With this claim in place, one has

Z0 = [X0, X1] · [X0ᵀ; X1ᵀ] · Z0 = [X0, X1] · [ U cos Θ Vᵀ; Ũ sin Θ Vᵀ ]

⟹ Z0Z0ᵀ = [X0, X1] · [ U cos²Θ Uᵀ   U cosΘ sinΘ Ũᵀ ;  Ũ cosΘ sinΘ Uᵀ   Ũ sin²Θ Ũᵀ ] · [X0ᵀ; X1ᵀ]

As a consequence,

X0X0ᵀ − Z0Z0ᵀ = [X0, X1] · [ I − U cos²Θ Uᵀ   −U cosΘ sinΘ Ũᵀ ;  −Ũ cosΘ sinΘ Uᵀ   −Ũ sin²Θ Ũᵀ ] · [X0ᵀ; X1ᵀ]

Spectral methods 2-21


Proof of Lemma 2.1 (cont.)

This further gives

‖X0X0ᵀ − Z0Z0ᵀ‖ = ‖ diag(U, Ũ) · [ sin²Θ   −cosΘ sinΘ ;  −cosΘ sinΘ   −sin²Θ ] · diag(Uᵀ, Ũᵀ) ‖

= ‖ [ sin²Θ   −cosΘ sinΘ ;  −cosΘ sinΘ   −sin²Θ ] ‖   (each block is a diagonal matrix; ‖·‖ is rotationally invariant)

= max_{1≤i≤r} ‖ [ sin²θi   −cosθi sinθi ;  −cosθi sinθi   −sin²θi ] ‖

= max_{1≤i≤r} ‖ sinθi · [ sinθi   −cosθi ;  −cosθi   −sinθi ] ‖

= max_{1≤i≤r} |sinθi| = ‖sin Θ‖

where the last step uses that the 2×2 matrix [ sinθi, −cosθi; −cosθi, −sinθi ] is orthogonal and hence has unit spectral norm

Spectral methods 2-22


Proof of Lemma 2.1 (cont.)

It remains to justify (2.3). To this end, observe that

Z0ᵀX1X1ᵀZ0 = Z0ᵀZ0 − Z0ᵀX0X0ᵀZ0 = I − V cos²Θ Vᵀ = V sin²Θ Vᵀ

and hence the right singular space (resp. the singular values) of X1ᵀZ0 is given by V (resp. sin Θ)

Spectral methods 2-23


Eigen-space perturbation theory


Davis-Kahan sinΘ Theorem: a simple case

(photos: Chandler Davis, William Kahan)

Theorem 2.3

Suppose M ⪰ 0 and has rank r. If ‖H‖ < λr(M), then

dist(Ũ0, U0) ≤ ‖HU0‖ / ( λr(M) − ‖H‖ ) ≤ ‖H‖ / ( λr(M) − ‖H‖ )

• depends on the smallest non-zero eigenvalue of M (which is the eigen-gap between λr and λr+1 = 0, as rank(M) = r) and on the perturbation size

Spectral methods 2-25


Proof of Theorem 2.3

We intend to control Ũ1ᵀU0 by studying their interactions through H:

‖Ũ1ᵀHU0‖ = ‖Ũ1ᵀ( ŨΛ̃Ũᵀ − UΛUᵀ )U0‖   (M̃ = M + H = ŨΛ̃Ũᵀ and M = UΛUᵀ)

= ‖Λ̃1Ũ1ᵀU0 − Ũ1ᵀU0Λ0‖   (since Ũ1ᵀŨ0 = 0 and U1ᵀU0 = 0)

≥ ‖Ũ1ᵀU0Λ0‖ − ‖Λ̃1Ũ1ᵀU0‖   (triangle inequality)

≥ ‖Ũ1ᵀU0‖ λr − ‖Ũ1ᵀU0‖ ‖Λ̃1‖   (2.4)

In view of Weyl's theorem, ‖Λ̃1‖ ≤ ‖H‖ (as λi(M) = 0 for i > r), which combined with (2.4) gives

‖Ũ1ᵀU0‖ ≤ ‖Ũ1ᵀHU0‖ / ( λr − ‖H‖ ) ≤ ‖Ũ1‖ · ‖HU0‖ / ( λr − ‖H‖ ) = ‖HU0‖ / ( λr − ‖H‖ )

This together with Lemma 2.1 completes the proof

Spectral methods 2-26


Davis-Kahan sinΘ Theorem: more general case

Theorem 2.4 (Davis-Kahan sin Θ Theorem)

Suppose λr(M) ≥ a and λr+1(M̃) ≤ a − Δ for some Δ > 0. Then

dist(Ũ0, U0) ≤ ‖HU0‖ / Δ ≤ ‖H‖ / Δ

• immediate consequence: if λr(M) > λr+1(M) + ‖H‖, then

dist(Ũ0, U0) ≤ ‖H‖ / ( λr(M) − λr+1(M) − ‖H‖ )   (2.5)

where λr(M) − λr+1(M) is the spectral gap and ‖H‖ the perturbation size

Spectral methods 2-27
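As a sanity check, the sketch below verifies (2.5) on a random symmetric instance (sizes, spectrum, and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 50, 3
# ground truth with a clear gap between lambda_r and lambda_{r+1}
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.concatenate([[10.0, 9.0, 8.0], rng.uniform(0, 1, n - r)])
M = (Q * lam) @ Q.T                       # M = Q diag(lam) Q^T

G = rng.standard_normal((n, n))
H = 0.05 * (G + G.T)                      # small symmetric perturbation
M_tilde = M + H

def top_r_eigvecs(S, r):
    w, V = np.linalg.eigh(S)              # ascending eigenvalues
    return V[:, -r:]                      # eigenvectors of the r largest

U0, U0t = top_r_eigvecs(M, r), top_r_eigvecs(M_tilde, r)
dist = np.linalg.norm(U0 @ U0.T - U0t @ U0t.T, 2)

gap = 8.0 - np.max(lam[r:])               # lambda_r - lambda_{r+1}
normH = np.linalg.norm(H, 2)
print(dist, normH / (gap - normH))        # dist should not exceed the bound
```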


Back to stochastic block model ...

Let M = E[A] − (p + q)/2 · 11ᵀ = (p − q)/2 · [1; −1][1ᵀ, −1ᵀ], M̃ = A − (p + q)/2 · 11ᵀ, and u = (1/√n) · [1; −1], and let ũ denote the leading eigenvector of M̃

Then the Davis-Kahan sin Θ theorem yields

dist(ũ, u) ≤ ‖M̃ − M‖ / ( λ1(M) − ‖M̃ − M‖ ) = ‖A − E[A]‖ / ( (p − q)n/2 − ‖A − E[A]‖ )   (2.6)

Question: how to bound ‖A − E[A]‖?

Spectral methods 2-28


A hammer: matrix Bernstein inequality

Consider a sequence of independent random matrices { Xl ∈ R^(d1×d2) } with

• E[Xl] = 0 and ‖Xl‖ ≤ B for each l

• variance statistic: v := max{ ‖E[∑l XlXlᵀ]‖, ‖E[∑l XlᵀXl]‖ }

Theorem 2.5 (Matrix Bernstein inequality)

For all τ ≥ 0,

P{ ‖∑l Xl‖ ≥ τ } ≤ (d1 + d2) · exp( (−τ²/2) / (v + Bτ/3) )

Spectral methods 2-29


A hammer: matrix Bernstein inequality

P{ ‖∑l Xl‖ ≥ τ } ≤ (d1 + d2) · exp( (−τ²/2) / (v + Bτ/3) )

• moderate-deviation regime (τ is small): sub-Gaussian tail behavior exp(−τ²/(2v))

• large-deviation regime (τ is large): sub-exponential tail behavior exp(−3τ/(2B)) (slower decay)

• user-friendly form (exercise): with prob. 1 − O((d1 + d2)⁻¹⁰),

‖∑l Xl‖ ≲ √( v log(d1 + d2) ) + B log(d1 + d2)   (2.7)

Spectral methods 2-30
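To see (2.7) in action, the following sketch sums L independent random sign matrices Xl = εl · e_{il} e_{jl}ᵀ, for which B = 1 and v = L/d; the construction and all parameters are illustrative, and the constants hidden in (2.7) are ignored:

```python
import numpy as np

rng = np.random.default_rng(4)
d, L = 100, 2000
i_idx = rng.integers(0, d, L)
j_idx = rng.integers(0, d, L)
eps = rng.choice([-1.0, 1.0], L)           # random signs: E[X_l] = 0, B = 1

S = np.zeros((d, d))
np.add.at(S, (i_idx, j_idx), eps)          # S = sum_l X_l

v = L / d                                  # since E[X_l X_l^T] = (1/d) I
bound = np.sqrt(v * np.log(2 * d)) + np.log(2 * d)
print(np.linalg.norm(S, 2), bound)         # observed norm vs. (2.7)
```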


Bounding ‖A− E[A]‖

The Matrix Bernstein inequality yields

Lemma 2.6

Consider the SBM with p > q ≳ log n / n. Then with high prob.,

‖A − E[A]‖ ≲ √(np log n)   (2.8)

Spectral methods 2-31
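An empirical look at (2.8): draw one SBM instance and compare ‖A − E[A]‖ with √(np log n) (arbitrary parameters; the constant hidden in ≲ is not tracked):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, q = 2000, 0.05, 0.01
x = np.concatenate([np.ones(n // 2), -np.ones(n // 2)])
prob = np.where(np.equal.outer(x, x), p, q)

upper = np.triu(rng.random((n, n)) < prob, k=1)
A = (upper | upper.T).astype(float)
EA = prob.copy()
np.fill_diagonal(EA, 0.0)                  # no self-loops, so E[A_ii] = 0

print(np.linalg.norm(A - EA, 2), np.sqrt(n * p * np.log(n)))
```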


Statistical accuracy of spectral clustering

Substitute (2.8) into (2.6) to reach

dist(ũ, u) ≤ ‖A − E[A]‖ / ( (p − q)n/2 − ‖A − E[A]‖ ) ≲ √(np log n) / ( (p − q)n )

provided that (p − q)n ≫ √(np log n)

Thus, under the condition (p − q)/√p ≫ √(log n / n), with high prob. one has

dist(ũ, u) ≪ 1 ⟹ nearly perfect clustering

Spectral methods 2-32


Statistical accuracy of spectral clustering

(p − q)/√p ≫ √(log n / n) ⟹ nearly perfect clustering

• dense regime: if p ≍ q ≍ 1, then this condition reads p − q ≫ √(log n / n)

• "sparse" regime: if p = a log n / n and q = b log n / n for a, b ≍ 1, then it reads a − b ≫ √a

This condition is information-theoretically optimal (up to log factor)
— Mossel, Neeman, Sly '15, Abbe '18

Spectral methods 2-33


Proof of Lemma 2.6

To simplify presentation, assume Ai,j and Aj,i are independent

(check: why this assumption does not change our bounds)

Spectral methods 2-34


Proof of Lemma 2.6

Write A − E[A] as ∑i,j Xi,j, where Xi,j = ( Ai,j − E[Ai,j] ) · eiejᵀ

• Since Var(Ai,j) ≤ p, one has E[Xi,jXi,jᵀ] ⪯ p · eieiᵀ, which gives

∑i,j E[Xi,jXi,jᵀ] ⪯ ∑i,j p · eieiᵀ ⪯ np · I

Similarly, ∑i,j E[Xi,jᵀXi,j] ⪯ np · I. As a result,

v = max{ ‖∑i,j E[Xi,jXi,jᵀ]‖, ‖∑i,j E[Xi,jᵀXi,j]‖ } ≤ np

• In addition, ‖Xi,j‖ ≤ 1 =: B

• Apply the matrix Bernstein inequality to conclude that with high prob.,

‖A − E[A]‖ ≲ √(v log n) + B log n ≲ √(np log n)   (since p ≳ log n / n)

Spectral methods 2-35


Extension: singular subspaces


Singular value decomposition

Consider two matrices M and M̃ = M + H with SVDs

M = [U0, U1] · [ Σ0, 0; 0, Σ1; 0, 0 ] · [V0ᵀ; V1ᵀ],   M̃ = [Ũ0, Ũ1] · [ Σ̃0, 0; 0, Σ̃1; 0, 0 ] · [Ṽ0ᵀ; Ṽ1ᵀ]

where U0 (resp. Ũ0) and V0 (resp. Ṽ0) represent the top-r singular subspaces of M (resp. M̃)

Spectral methods 2-37


Wedin sinΘ Theorem

The Davis-Kahan theorem generalizes to singular subspace perturbation:

Theorem 2.7 (Wedin sin Θ Theorem)

Suppose σr(M) ≥ a (σr: the rth largest singular value) and σr+1(M̃) ≤ a − Δ for some Δ > 0. Then

max{ dist(Ũ0, U0), dist(Ṽ0, V0) } ≤ max{ ‖HV0‖, ‖HᵀU0‖ } / Δ ≤ ‖H‖ / Δ

where the numerator max{ ‖HV0‖, ‖HᵀU0‖ } captures two-sided interactions

Spectral methods 2-38
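A quick numerical check of Wedin's bound on a random rectangular instance (a sketch; sizes and noise level are arbitrary). Here σ_{r+1}(M) = 0, so with a = σr(M) one may take Δ = σr(M) − ‖H‖ whenever ‖H‖ < σr(M):

```python
import numpy as np

rng = np.random.default_rng(8)
n1, n2, r = 60, 40, 2
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))  # rank r
H = 0.02 * rng.standard_normal((n1, n2))
M_tilde = M + H

def top_r_subspaces(S, r):
    U, s, Vt = np.linalg.svd(S)
    return U[:, :r], Vt[:r].T

U0, V0 = top_r_subspaces(M, r)
U0t, V0t = top_r_subspaces(M_tilde, r)
dist_U = np.linalg.norm(U0 @ U0.T - U0t @ U0t.T, 2)
dist_V = np.linalg.norm(V0 @ V0.T - V0t @ V0t.T, 2)

delta = np.linalg.svd(M, compute_uv=False)[r - 1] - np.linalg.norm(H, 2)
print(max(dist_U, dist_V), np.linalg.norm(H, 2) / delta)  # dist <= bound
```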


Example: low-rank matrix completion

(figure: a user × movie rating matrix with most entries missing, shown as '?')

• Netflix challenge: Netflix provides highly incomplete ratings from 0.5 million users for about 17,770 movies

• How to predict unseen user ratings for movies?

Spectral methods 2-39


Example: low-rank matrix completion

In general, we cannot infer missing ratings

X ? ? ? X ?
? ? X X ? ?
X ? ? X ? ?
? ? X ? ? X
X ? ? ? ? ?
? X ? ? X ?
? ? X X ? ?

— this is an underdetermined system (more unknowns than observations)

Spectral methods 2-40


Example: low-rank matrix completion

... unless rating matrix has other structure


A few factors explain most of the data −→ low-rank approximation

How to exploit (approx.) low-rank structure in prediction?

Spectral methods 2-41



Model for low-rank matrix completion

(figure: a partially observed rating matrix; X = observed entry, ? = missing entry)

figure credit: Candes

• consider a low-rank matrix M

• each entry Mi,j is observed independently with prob. p

• goal: fill in missing entries

Spectral methods 2-42


Spectral estimate for matrix completion

1. set M̃ ∈ Rⁿˣⁿ as

M̃i,j = { (1/p) · Mi,j, if Mi,j is observed; 0, else }

◦ rationale for rescaling: E[M̃] = M

2. compute the rank-r SVD ŨΣ̃Ṽᵀ of M̃, and return (Ũ, Σ̃, Ṽ)

Spectral methods 2-43
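A minimal sketch of this spectral estimate: inverse-probability rescaling followed by a rank-r truncated SVD (the planted low-rank M, as well as n, r, p, are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(6)
n, r, p = 1000, 2, 0.25
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n)) / n  # rank r

observed = rng.random((n, n)) < p           # each entry seen with prob. p
M_tilde = np.where(observed, M / p, 0.0)    # rationale: E[M_tilde] = M

U, s, Vt = np.linalg.svd(M_tilde)
M_hat = (U[:, :r] * s[:r]) @ Vt[:r]         # rank-r SVD of M_tilde

rel_err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
print(f"relative Frobenius error: {rel_err:.3f}")   # shrinks as p grows
```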


Statistical accuracy of spectral estimate

Let's analyze a simple case where M = uvᵀ with

u = ū/‖ū‖2,   v = v̄/‖v̄‖2,   ū, v̄ ∼ N(0, In)

From Wedin's theorem: if p ≫ log³n / n, then with high prob.,

max{ dist(ũ, u), dist(ṽ, v) } ≤ ‖M̃ − M‖ / ( σ1(M) − ‖M̃ − M‖ )

≍ ‖M̃ − M‖   (controlled by matrix Bernstein)

≪ 1   (nearly accurate estimates)   (2.9)

Spectral methods 2-44


Sample complexity

For rank-1 matrix completion,

p ≫ log³n / n ⟹ nearly accurate estimates

The sample complexity needed for obtaining reliable spectral estimates is

n²p ≍ n log³n   (optimal up to log factor)

Spectral methods 2-45

Proof of (2.9)

Write M̃ − M = ∑i,j Xi,j, where Xi,j = ( M̃i,j − Mi,j ) · eiejᵀ

• First, ‖Xi,j‖ ≤ (1/p) · maxi,j |Mi,j| ≲ log n / (pn) =: B (check)

• Next, E[Xi,jXi,jᵀ] = Var(M̃i,j) · eieiᵀ and hence

E[∑i,j Xi,jXi,jᵀ] ⪯ { maxi,j Var(M̃i,j) } · n · I ⪯ { (n/p) · maxi,j M²i,j } · I

⟹ ‖E[∑i,j Xi,jXi,jᵀ]‖ ≤ (n/p) · maxi,j M²i,j ≲ log²n / (np) (check)

Similar bounds hold for ‖E[∑i,j Xi,jᵀXi,j]‖. Therefore,

v := max{ ‖E[∑i,j Xi,jXi,jᵀ]‖, ‖E[∑i,j Xi,jᵀXi,j]‖ } ≲ log²n / (np)

• Apply the matrix Bernstein inequality to yield: if p ≫ log³n / n, then

‖M̃ − M‖ ≲ √(v log n) + B log n ≪ 1

Spectral methods 2-46

Extension: eigen-space for asymmetric transition matrices


Eigen-decomposition for asymmetric matrices

Eigen-decomposition for asymmetric matrices is much trickier; for example:

1. both eigenvalues and eigenvectors might be complex-valued

2. eigenvectors might not be orthogonal to each other

This lecture focuses on a special case: probability transition matrices

Spectral methods 2-48


Probability transition matrices

(figure: a Markov chain on states 1, · · · , 6; edge labels include transition probabilities such as P1,2 and P2,1)

Consider a Markov chain {Xt}t≥0 with

• n states

• transition probabilities P{Xt+1 = j | Xt = i} = Pi,j

• transition matrix P = [Pi,j]1≤i,j≤n

• stationary distribution π = [π1, · · · , πn] (with π1 + · · · + πn = 1), which is the leading left eigenvector of P:

πP = π

• {Xt}t≥0 is said to be reversible if πiPi,j = πjPj,i for all i, j

Spectral methods 2-49
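As a quick illustration, the stationary distribution can be computed as the leading left eigenvector of P; the 3-state transition matrix below is a hypothetical example:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# left eigenvectors of P are right eigenvectors of P^T
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmax(np.real(w))])   # eigenvalue 1 is the largest
pi /= pi.sum()                              # normalize to a distribution
print(pi, pi @ P)                           # check: pi P = pi
```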


Eigenvector perturbation for transition matrices

Define ‖a‖π := √( π1a1² + · · · + πnan² )

Theorem 2.8 (Chen, Fan, Ma, Wang '17)

Suppose P and P̃ are transition matrices with stationary distributions π and π̃, respectively. Assume P induces a reversible Markov chain. If 1 > max{ λ2(P), −λn(P) } + ‖P̃ − P‖π, then

‖π̃ − π‖π ≤ ‖π(P̃ − P)‖π / ( 1 − max{ λ2(P), −λn(P) } − ‖P̃ − P‖π )

where 1 − max{ λ2(P), −λn(P) } is the spectral gap and ‖P̃ − P‖π the size of the perturbation

• P̃ does not need to induce a reversible Markov chain

Spectral methods 2-50


Example: ranking from pairwise comparisons

pairwise comparisons for ranking tennis players
figure credit: Bozoki, Csato, Temesi

Spectral methods 2-51


Bradley-Terry-Luce (logistic) model

(figure: items ordered by rank i, each with preference score wi)


• n items to be ranked

• assign a latent score {wi}1≤i≤n to each item, so that

item i ≻ item j if wi > wj

• each pair of items (i, j) is compared independently:

P{ item j beats item i } = wj / (wi + wj)

yi,j = { 1, with prob. wj/(wi + wj); 0, else },   independently across pairs

Spectral methods 2-52



Spectral ranking method

• construct a probability transition matrix P̃ obeying

P̃i,j = { (1/(2n)) · yi,j, if i ≠ j;  1 − ∑_{l: l≠i} P̃i,l, if i = j }

• return the score estimate as the leading left eigenvector π̃ of P̃

— closely related to PageRank!

Spectral methods 2-53


Rationale behind spectral method

E[P̃i,j] = (1/(2n)) · wj / (wi + wj),   i ≠ j

• P := E[P̃] obeys

wi Pi,j = wj Pj,i   (detailed balance)

• thus, the stationary distribution π of P obeys

π = (1/∑l wl) · w   (reveals the true scores)

Spectral methods 2-54
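Putting the last two slides together, here is an end-to-end sketch of spectral ranking under the BTL model: simulate one comparison per pair, form P̃, and read off its leading left eigenvector (n, the score distribution, and the seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
w = rng.uniform(1.0, 2.0, n)                # latent scores, max w_i/w_j <= 2

probs = w[None, :] / (w[:, None] + w[None, :])   # P{item j beats item i}
iu = np.triu_indices(n, k=1)
y = np.zeros((n, n))
y[iu] = (rng.random(iu[0].size) < probs[iu]).astype(float)
y.T[iu] = 1.0 - y[iu]                       # same comparison, seen from (j, i)

P_tilde = y / (2 * n)                       # off-diagonal entries
np.fill_diagonal(P_tilde, 1.0 - P_tilde.sum(axis=1))

vals, vecs = np.linalg.eig(P_tilde.T)       # left eigenvectors of P_tilde
pi_hat = np.real(vecs[:, np.argmax(np.real(vals))])
pi_hat /= pi_hat.sum()

pi_true = w / w.sum()
err = np.linalg.norm(pi_hat - pi_true) / np.linalg.norm(pi_true)
print(f"relative l2 error: {err:.3f}")      # small, consistent with ~1/sqrt(n)
```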


Statistical guarantees for spectral ranking

— Negahban, Oh, Shah ’16, Chen, Fan, Ma, Wang ’17

Suppose maxi,j wi/wj ≲ 1. Then with high prob.,

‖π̃ − π‖2 / ‖π‖2 ≍ ‖π̃ − π‖π / ‖π‖π ≲ 1/√n → 0   (nearly perfect estimate)

• consequence of Theorem 2.8 and the matrix Bernstein inequality (exercise)

Spectral methods 2-55


Reference

[1] "The rotation of eigenvectors by a perturbation," C. Davis, W. Kahan, SIAM Journal on Numerical Analysis, 1970.

[2] "Perturbation bounds in connection with singular value decomposition," P. Wedin, BIT Numerical Mathematics, 1972.

[3] "Inference, estimation, and information processing," EE 378B lecture notes, A. Montanari, Stanford University.

[4] "COMS 4772 lecture notes," D. Hsu, Columbia University.

[5] "High-dimensional statistics: a non-asymptotic viewpoint," M. Wainwright, Cambridge University Press, 2019.

[6] "Principal component analysis for big data," J. Fan, Q. Sun, W. Zhou, Z. Zhu, arXiv:1801.01602, 2018.

Spectral methods 2-56

Reference

[7] "Community detection and stochastic block models," E. Abbe, Foundations and Trends in Communications and Information Theory, 2018.

[8] "Consistency thresholds for the planted bisection model," E. Mossel, J. Neeman, A. Sly, ACM Symposium on Theory of Computing, 2015.

[9] "Matrix completion from a few entries," R. Keshavan, A. Montanari, S. Oh, IEEE Transactions on Information Theory, 2010.

[10] "The PageRank citation ranking: bringing order to the web," L. Page, S. Brin, R. Motwani, T. Winograd, 1999.

[11] "Rank centrality: ranking from pairwise comparisons," S. Negahban, S. Oh, D. Shah, Operations Research, 2016.

[12] "Spectral method and regularized MLE are both optimal for top-K ranking," Y. Chen, J. Fan, C. Ma, K. Wang, accepted to Annals of Statistics, 2017.

Spectral methods 2-57