Two numerical graph algorithms

Post on 11-Jun-2015


Two matrix computations for numerical graph problems:

PageRank and Network Alignment

David F. Gleich

Sandia National Labs, Livermore, CA

IBM Almaden Seminar, San Jose, CA

January 17th, 2011

In collaboration with Andrew Gray (UBC), Chen Greif (UBC), Tracy Lau (UBC/IBM?), Mohsen Bayati (Stanford), Ying Wang (Stanford), Margot Gerritsen (Stanford), and Amin Saberi (Stanford)

Supported by the Library of Congress and a Microsoft Live Labs Fellowship

David F. Gleich (Sandia) IBM Almaden 1 / 47

Sketch of talk

• two algorithms: inner-outer and belief propagation
• two problems: PageRank and network alignment
• big graphs for both
• iterative matrix computations for both
• multi-core parallel results (inner-outer only)

standard flow: problem → algorithm → theory (hopefully) → empirical results, except "fun" results first

some open questions at end


A PageRank algorithm

Instead of the power method,

x(k+1) = αPx(k) + (1− α)v.

Use an outer iteration

(I − βP)x(k+1) = (α − β)Px(k) + (1 − α)v ≡ f.

with the inner iteration

y(j+1) = βPy(j) + f.

It’s faster!

Web Data, α = 0.99

Nodes 105,896,555; Edges 3,783,733,648

Power Method: 964 its, 5.15 hrs.
Inner-Outer: 857 its, 4.45 hrs.

Network-Alignment Data, α = 0.95

Nodes 4,219,893,141; Edges 91,886,357,440

Power Method: 271 its, 54.6 hrs.
Inner-Outer: 188 its, 36.2 hrs.

Codes and data available.

Note Web data is uk-2006 from UNIMI’s (Univ. Milano) DSI group.


Network Alignment

[Figure: two graphs A and B connected by candidate links L; a matched pair of edges forms a "square"]

A is about 200,000 vertices; B is about 300,000 vertices; L has around 5,000,000 edges. A 5-million-variable integer QP, solved to ∼90% of optimality in minutes.

Codes and data available.

DEMO


PageRank (Slide 5 of 47)

PageRank

PageRank Algorithms

Inner-outer Performance

Network Alignment Motivation

Network alignment

Network Alignment Algorithms

Results

Conclusion

PageRank is a ...
... modified Markov chain,
... damped random walk on a graph,
... pinball game on the reverse web, or
... random surfer model.

Proposed by Brin and Page in 1998, but similar ideas from earlier... (Sebastiano Vigna is working on tracing the history; the current history dates to 1949)

Langville and Meyer (2006) is a good general reference; Berkhin (2005) has lots of goodies; and Des Higham called it pinball.


The PageRank Random Surfer

important pages ↔ highly probable to visit

[Figure: the 6-node example graph]

P =
1/6 1/2  0   0   0  0
1/6  0   0  1/3  0  0
1/6 1/2  0  1/3  0  0
1/6  0  1/2  0   0  0
1/6  0  1/2 1/3  0  1
1/6  0   0   0   1  0

1. follow out-edges uniformly with probability α, and

2. randomly jump according to v with probability 1 − α; we'll assume v = (1/n)e.

Induces a Markov chain model

[αP + (1 − α)ve^T] x(α) = x(α)

or the linear system

(I − αP) x(α) = (1 − α)v

But it’s just a model.

Note I'm omitting important details about dangling nodes; I'll mention them a bit later.
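The random-surfer iteration above can be checked numerically. Here is a minimal sketch (mine, not the talk's code) that builds the 6-node example matrix and runs the power iteration with the uniform teleportation vector v = e/6:

```python
import numpy as np

# Column-stochastic transition matrix P of the 6-node example
# (entry P[j, i] is the probability of stepping from node i to node j).
P = np.array([
    [1/6, 1/2, 0,   0,   0, 0],
    [1/6, 0,   0,   1/3, 0, 0],
    [1/6, 1/2, 0,   1/3, 0, 0],
    [1/6, 0,   1/2, 0,   0, 0],
    [1/6, 0,   1/2, 1/3, 0, 1],
    [1/6, 0,   0,   0,   1, 0],
])

def pagerank_power(P, alpha=0.85, tol=1e-12, max_iter=10000):
    """Iterate x <- alpha*P*x + (1-alpha)*v until the 1-norm change is < tol."""
    n = P.shape[0]
    v = np.full(n, 1.0 / n)   # uniform teleportation
    x = v.copy()
    for _ in range(max_iter):
        x_new = alpha * (P @ x) + (1 - alpha) * v
        if np.abs(x_new - x).sum() < tol:
            return x_new
        x = x_new
    return x

x = pagerank_power(P)
```

The result is a probability vector, and node 5 (which collects the 5 → 6 → 5 cycle) ends up with a large share of the mass.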


What is α?

Our regime
• α ≥ 0.85; otherwise the power method is fast.
• P available only for mat-vec; otherwise custom techniques are possible.

Author and α:
Brin and Page (1998): 0.85
Najork et al. (2007): 0.85
Litvak et al. (2006): 0.5
Katz (1953): 0.5
Experiment (2009): 0.63 ≈ √(0.85 · 0.5)
Algorithms (...): ≥ 0.85

[Figure: density of α estimated from browser data (raw α), fit by InfBeta(3.2, 2.0, 1.9e−05, 0.0019), over α ∈ [0, 1]]

Constantine, Flaxman, Gleich, and Gunawardana, Tracking the Random Surfer, WWW 2010.
Constantine and Gleich, Random Alpha PageRank, Internet Math.


PageRank Algorithms (Slide 9 of 47)


PageRank formulations and theory

[Diagram: how a graph or web graph becomes a substochastic matrix, which yields PseudoRank and the strongly preferential, weakly preferential, and sink-preferential PageRank formulations; these connect to eigensystems, linear systems, other transformations, and theory]

v  teleportation vector
P̄  substochastic matrix (for algorithms)
d  dangling-node indicator vector (d = e − P̄^T e)

P̄ + vd^T → P  strongly preferential PageRank
P̄ + ud^T → P  weakly preferential PageRank (u ≠ v)

P  PageRank stochastic matrix (for theory)
(I − αP)x = (1 − α)v  PageRank linear system


Motivation
Why another PageRank algorithm?

An ideal algorithm is
1. reliable
2. fast over a range of α's → use Matlab's "\"
3. efficient for big problems → use Gauss-Seidel or a custom Richardson method
4. uses only mat-vec products → use the inner-outer iteration
5. uses only 2 vectors of memory → use the power method

(The methods range from fancy at the top to simple at the bottom.)


Simple algorithms

The power method
For Ax = λx, the iteration

x(k+1) = Ax(k) / ‖Ax(k)‖

computes the largest eigenpair. The PageRank Markov chain eigenvector problem is

[αP + (1 − α)ve^T] x = x.

If e^T x(0) = 1 and x(0) ≥ 0,

x(k+1) = αPx(k) + (1 − α)v e^T x(k),  where e^T x(k) = 1.

The Richardson method
For Ax = b, the iteration

x(k+1) = x(k) + ω(b − Ax(k))

(the parenthesized term is the residual) computes x. The PageRank linear system is

(I − αP)x = (1 − α)v.

For ω = 1,

x(k+1) = αPx(k) + (1 − α)v,

and the Richardson iteration is the power method.


Inner-Outer

Note PageRank is easier when α is smaller

Thus Solve PageRank with itself using β < α!

Outer: (I − βP)x(k+1) = (α − β)Px(k) + (1 − α)v ≡ f(k)

Inner: y(0) = x(k);  y(j+1) = βPy(j) + f(k)

A new parameter? What is β? Use 0.5.

How many inner iterations? Until a residual of 10^−2.

Gleich, Gray, Greif, and Lau, SISC 2010.

Inner-Outer algorithm

Input: P, v, α, τ, (β = 0.5, η = 10^−2)
Output: x
1: x ← v
2: y ← Px
3: while ‖αy + (1 − α)v − x‖₁ ≥ τ
4:   f ← (α − β)y + (1 − α)v
5:   repeat
6:     x ← f + βy
7:     y ← Px
8:   until ‖f + βy − x‖₁ < η
9: end while
10: x ← αy + (1 − α)v

• uses only three vectors of memory

Convergence?
• if 0 ≤ β ≤ α, with "exact" iteration
• but also (small theorem) with any η!

Parameters?
• β = 0.5, η = 10^−2 is often faster than the power method (or just a titch slower)

Note The inner loop checks its condition after doing one iteration, so an inexact iteration is always at least as good as one step of the power method.
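The pseudocode above translates almost line for line into a dense NumPy sketch (my variable names; a real implementation would use a sparse matvec and handle dangling nodes):

```python
import numpy as np

def pagerank_inner_outer(P, v, alpha=0.85, tol=1e-10, beta=0.5, eta=1e-2,
                         max_outer=10000):
    """Inner-outer PageRank: solve (I - alpha*P)x = (1-alpha)v by repeatedly
    solving the easier system (I - beta*P)x = (alpha-beta)Px + (1-alpha)v,
    with beta < alpha, each time only to the loose inner tolerance eta."""
    x = v.copy()
    y = P @ x
    for _ in range(max_outer):
        if np.abs(alpha * y + (1 - alpha) * v - x).sum() < tol:
            break
        f = (alpha - beta) * y + (1 - alpha) * v
        while True:  # inner Richardson sweep; condition checked after one step
            x = f + beta * y
            y = P @ x
            if np.abs(f + beta * y - x).sum() < eta:
                break
    return alpha * y + (1 - alpha) * v
```

On a small column-stochastic matrix this agrees with a direct solve of (I − αP)x = (1 − α)v to the requested tolerance.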


Inner-Outer Parameters

Question: what parameters should we pick?

[Figure: total multiplications vs. β (for η = 10^−1 through 10^−5) and vs. η (for β = 0.1, 0.3, 0.5, 0.7), each compared against the power method; in-2004, α = 0.99]

α = 0.99, in-2004 graph (1.3M nodes, 16.9M edges)

Just use β = 0.5 and η = 10^−2!

Note Many similar plots appear in my thesis.


The Competition

Our requirement: only Px is available!
• Quadratic extrapolation (Kamvar, Haveliwala, et al.)
• Aggregation/disaggregation (Langville and Meyer; Stewart)
• Permutations/strong components (Del Corso, Gulli, and Romani; Langville and Meyer)
• Krylov methods (Gleich, Zhukov, Berkhin; Del Corso, Gulli, and Romani)
• Padé-type extrapolation (Brezinski and Redivo-Zaglia)
• Arnoldi methods (Greif and Golub)
• Gauss-Seidel (Arasu, Novak, Tomkins, and Tomlin)


Inner-outer Performance (Slide 17 of 47)


Datasets

name          size         nonzeros        avg nz/row
ubc-cs-2006   51,681       673,010         13.0
ubc-2006      339,147      4,203,811       12.4
eu-2005       862,664      19,235,140      22.3
in-2004       1,382,908    16,917,053      12.2
wb-edu        9,845,725    57,156,537      5.8
arabic-2005   22,744,080   639,999,458     28.1
sk-2005       50,636,154   1,949,412,601   38.5
uk-2007       105,896,555  3,738,733,648   35.3


One example

[Figure: residual vs. multiplication count for the power method and inner-outer on wb-edu, at α = 0.85 and α = 0.99]

τ = 10^−7, β = 0.5, η = 10^−2; wb-edu graph (9.8M nodes, 57.2M edges)


Advantage Inner-Outer
α = 0.99, β = 0.5, η = 10^−2

tol.   graph        work (mults.)            time (secs.)
                    power  in/out  gain      power    in/out   gain
10^−3  ubc-cs-2006  226    141     37.6%     1.9      1.2      35.2%
       ubc          242    141     41.7%     13.6     8.3      38.4%
       in-2004      232    129     44.4%     51.1     30.4     40.5%
       eu-2005      149    150     −0.7%     26.9     28.3     −5.3%
       wb-edu       221    130     41.2%     291.2    184.6    36.6%
       arabic-2005  213    139     34.7%     779.2    502.5    35.5%
       sk-2005      156    144     7.7%      1718.2   1595.9   7.1%
       uk-2007      145    125     13.8%     2802.0   2359.3   15.8%
10^−5  ubc-cs-2006  574    432     24.7%     4.7      3.6      22.9%
       ubc          676    484     28.4%     37.7     27.8     26.2%
       in-2004      657    428     34.9%     144.3    97.5     32.4%
       eu-2005      499    476     4.6%      89.3     87.4     2.1%
       wb-edu       647    417     35.5%     850.6    572.0    32.8%
       arabic-2005  638    466     27.0%     2333.5   1670.0   28.4%
       sk-2005      523    460     12.0%     5729.0   5077.1   11.4%
       uk-2007      531    463     12.8%     10225.8  8661.9   15.3%
10^−7  ubc-cs-2006  986    815     17.3%     8.0      6.8      15.4%
       ubc          1121   856     23.6%     62.5     49.0     21.6%
       in-2004      1108   795     28.2%     243.1    179.8    26.0%
       eu-2005      896    814     9.2%      159.9    148.6    7.1%
       wb-edu       1096   777     29.1%     1442.9   1059.0   26.6%
       arabic-2005  1083   843     22.2%     3958.8   3012.9   23.9%
       sk-2005      951    828     12.9%     10393.3  9122.9   12.2%
       uk-2007      964    857     11.1%     18559.2  16016.7  13.7%


Parallelization

Parallel Px:
  xi = x[i] / degree(i);
  for (j in edges of i) { atomic(y[j] += xi); }
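In serial form (and without the atomics) the push-style product above looks like this sketch, where the graph is given as out-adjacency lists and P[j, i] = 1/degree(i) for each edge i → j (my naming, not the talk's code):

```python
import numpy as np

def push_px(adj, x):
    """Compute y = P x by pushing x[i]/degree(i) along each out-edge i -> j.
    In the parallel version, each y[j] += xi update is an atomic add."""
    y = np.zeros_like(x, dtype=float)
    for i, out_edges in enumerate(adj):
        if not out_edges:
            continue  # dangling node: mass handled separately in practice
        xi = x[i] / len(out_edges)
        for j in out_edges:
            y[j] += xi
    return y
```

The atomic adds are needed because different source nodes i can push into the same target entry y[j] concurrently.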

[Figure: speedup relative to the best 1-processor time vs. number of processors (1-8), for the power method and inner-outer at tolerances 10^−3, 10^−5, and 10^−7, against the linear-speedup line]


Network Alignment Motivation (Slide 22 of 47)



Alignment and overlap: The goal

[Figure: aligning Wikipedia categories with Library of Congress subject headings, e.g. Health organizations, Psychiatric hospitals, Health, Mental health, Educational psychology; a matching between graphs A and B through candidate links L that completes a "square" is better than one that does not]

Maximize squares/overlap in a 1-1 matching

Find a good mapping to investigate similarity!


Network alignment (Slide 26 of 47)


Integrating Matching and Overlap: A QP

Squares produce overlap → a bonus for pairs x_i and x_j → ∑ x_i x_j

[Figure: a square between graphs A and B through L]

Variables, data
x_e edge-indicator variable; w_e edge weight; S_ij indicates a square between edges i and j; for e ∈ L, e = (t, t′)

Problem

maximize ∑_{e ∈ L} w_e x_e + ∑_{i,j ∈ S} x_i x_j
subject to x is a matching

↔  maximize_x w^T x + ½ x^T S x, subject to Ax ≤ e, x ∈ {0,1}
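To make the QP concrete, a tiny sketch (hypothetical helper names, mine) that evaluates the objective w^T x + ½ x^T S x and checks the matching constraint Ax ≤ e for a 0/1 vector x:

```python
import numpy as np

def qp_objective(x, w, S):
    """Matching weight plus half the square count turned on by x."""
    x = np.asarray(x, dtype=float)
    return w @ x + 0.5 * x @ (S @ x)

def is_matching(x, A):
    """Feasibility check Ax <= e for the matching constraint."""
    return bool(np.all(A @ np.asarray(x, dtype=float) <= 1.0 + 1e-12))
```

For instance, with two candidate edges that form one square, S = [[0,1],[1,0]] and w = [0.5, 0.5], choosing x = [1, 1] gives 0.5 + 0.5 + ½·2 = 2.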


An example with overlap

[Figure: example problem with A-side nodes 1-6, B-side nodes 1′-5′, and weighted candidate links (weights 0.1 through 1.0)]

Edge order: (2,2′), (2,1′), (2,3′), (2,4′), (1,2′), (1,1′), (3,2′), (3,3′), (4,2′), (4,4′), (5,5′), (6,1′)

S =
0 0 0 0 0 1 0 1 0 1 1 1
0 0 0 0 1 0 1 0 1 0 0 0
0 0 0 0 1 0 1 0 1 0 0 0
0 0 0 0 1 0 1 0 1 0 0 0
0 1 1 1 0 0 0 0 0 0 0 1
1 0 0 0 0 0 0 0 0 0 0 0
0 1 1 1 0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0 0 0
0 1 1 1 0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0 0 0
1 0 0 0 1 0 0 0 0 0 0 0

w = (0.6, 0.9, 0.3, 0.1, 0.9, 0.6, 0.3, 0.5, 0.1, 0.4, 0.5, 1.0)^T

A =
1 1 1 1 0 0 0 0 0 0 0 0
0 0 0 0 1 1 0 0 0 0 0 0
0 0 0 0 0 0 1 1 0 0 0 0
0 0 0 0 0 0 0 0 1 1 0 0
0 0 0 0 0 0 0 0 0 0 1 0
0 0 0 0 0 0 0 0 0 0 0 1
1 0 0 0 1 0 1 0 1 0 0 0
0 1 0 0 0 1 0 0 0 0 0 1
0 0 1 0 0 0 0 1 0 0 0 0
0 0 0 1 0 0 0 0 0 1 0 0
0 0 0 0 0 0 0 0 0 0 1 0


Network alignment

NETWORK ALIGNMENT

maximize αw^T x + (β/2) x^T S x
subject to Ax ≤ e, x ∈ {0,1}

History
• QUADRATIC ASSIGNMENT
• MAXIMUM COMMON SUBGRAPH
• PATTERN RECOGNITION
• ONTOLOGY MATCHING
• BIOINFORMATICS

Sparse problems
A sparse L is often ignored (a few exceptions). Our paper tackles that case explicitly. We do large problems, too.

Conte et al., Thirty years of graph matching, 2004; Melnik et al., Similarity flooding, 2004; Blondel et al., SIREV 2004; Singh et al., RECOMB 2007; Klau, BMC Bioinformatics 10:S59, 2009.


Network Alignment Algorithms (Slide 30 of 47)


Algorithms

1. LP: convert to an LP, relax, solve (skipped)
2. TIGHTLP: improve the LP (skipped)
3. ISORANK: use a PageRank heuristic (Singh et al. 2007)
4. BP: max-product belief propagation for the LP
5. TIGHTBP: BP for the TIGHTLP (skipped)
6. MR: sub-gradient descent on the TIGHTLP (Klau 2009; skipped)

Note Not discussed: an early heuristic, Flannick et al., Genome Research 16:1169-1181, 2006; an independent BP algorithm, Bradde et al., arXiv:0905.1893, 2009.

Singh et al., RECOMB 2007; Klau, 2009.

IsoRank

maximize αw^T x + (β/2) x^T S x
subject to 0 ≤ Ax ≤ e, x ∈ {0,1}

Solve PageRank on S and w!

1. Normalize S to a stochastic P
2. Normalize w to a stochastic v
3. Compute power iterations, rounding at each step
4. Output the best solution

• Need to evaluate a range of PageRank α's
• Designed for a complete bipartite L
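A dense sketch of the four steps (my normalization choices; the real method also rounds the iterate to a matching every step and keeps the best rounding):

```python
import numpy as np

def isorank_scores(S, w, alpha=0.95, n_iter=100):
    """IsoRank heuristic: PageRank on the squares matrix S, with
    teleportation given by the normalized match weights w."""
    col_sums = S.sum(axis=0)
    # Column-stochastic normalization; zero columns are left empty here,
    # which leaks probability mass (dangling-column handling omitted).
    P = S / np.where(col_sums > 0, col_sums, 1.0)
    v = w / w.sum()
    x = v.copy()
    for _ in range(n_iter):
        x = alpha * (P @ x) + (1 - alpha) * v
        # (Round x to a matching here and remember the best one.)
    return x
```

The output is a score per candidate link; the matching is recovered by rounding, which is the part the sketch leaves out.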

Singh et al., RECOMB 2007; Ninove, Ph.D. thesis, Louvain, 2008.

Inner-outer for this problem?

Only on the cores of the two graphs.

Dataset        Size           Non-Zeros
LCSH-2         59,849         227,464
WC-3           70,509         403,960
Product Graph  4,219,893,141  91,886,357,440

α = 0.95, w from text similarity

Inner-Outer  188 mat-vecs  36.2 hours
Power        271 mat-vecs  54.6 hours

Caveat: I’m ignoring all the details ofactually using this technique.


Belief propagation: Our algorithm

Summary
• Construct a probability model where the most likely state is the solution!
• Locally update information
• Like a generalized dynamic program
• It works
• Most likely, it won't converge

History
• BP used for computing marginal probabilities and maximum a posteriori probability
• Wildly successful at solving satisfiability problems
• Convergent algorithm for max-weight matching

Bayati et al. 2005

Variables and functions
• max-product over function nodes
• variables have state 0 or 1
• function nodes compute a product
• messages are the belief (local objective) about a node for a state

M_{i→j}{x_i = s} = ∏_{j′ ∈ N(i)\j} M_{j′→i}{x_i = s}

Variable i tells function j what it thinks about being in state s. This is just the product of what all the other functions tell i about being in state s.

M_{j→i}{x_i = s} = max_{y: all possible choices for the variables i′ ∈ N(j)} ƒ_j(y) ∏_{i′ ∈ N(j)\i} M_{i′→j}{x_{i′} = y_{i′}}

Function j tells variable i what it thinks about i being in state s. This means we have to locally maximize ƒ_j among all possible choices. Note y_i = s always (too cumbersome to include in the notation).


NetAlign factor graph: Loopy BP

[Figure: factor graph for a small problem with A = {1, 2} and B = {1′, 2′, 3′}: variable nodes 11′, 12′, 22′, 23′, and 11′22′; function nodes ƒ1, ƒ2, g1′, g2′, g3′, and h11′22′]

Note It's pretty hairy to put all the stuff I should put here on a single slide. Most of it is in the paper. The rest is just "turning the crank" with standard tricks in BP algorithms.


Get tropical

In the max-plus sense.


Belief propagation: A view

A: m × n,  A = [A_r; A_c],  x: n × 1

(A ⊡ x)_i ≡ max_j a_{ij} x_j,  i = 1, …, m

bound_{a,b}(z) ≡ min(b, max(a, z)) =
  a,  z < a
  z,  a ≤ z ≤ b
  b,  z > b

NETALIGNBP ALGORITHM

y(0) = 0, z(0) = 0, S(0) = 0, β̃ = β/2
while t = 1, … do
  d = bound_{0,β̃}(S^{(t−1)T} + β̃S) · e
  y(t) = αw − bound_{0,∞}[(A_r^T A_r − I) ⊡ z(t−1)] + d
  z(t) = αw − bound_{0,∞}[(A_c^T A_c − I) ⊡ y(t−1)] + d
  S(t) = (Y(t) + Z(t) − αW − D) · S − bound_{0,β̃}(S^{(t−1)T} + β̃S)
end while

Note α = 1, β = 2, damping γ = 0.99, with max-weight-matching rounding, gives 15,214 overlap and 56,361 weight in 10 mins.
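The two primitives behind the update can be written out directly (a sketch; I read ⊡ as the max-times product from the slide and bound as the clamp):

```python
import numpy as np

def bound(a, b, z):
    """bound_{a,b}(z) = min(b, max(a, z)): clamp z into [a, b], elementwise."""
    return np.minimum(b, np.maximum(a, z))

def maxtimes(A, x):
    """The max-times analogue of A @ x: result_i = max_j A[i, j] * x[j]."""
    return (np.asarray(A, dtype=float) * np.asarray(x, dtype=float)).max(axis=1)
```

Replacing the sum in a matvec by a max is what makes the messages behave like a dynamic program rather than an average.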


Results (Slide 39 of 47)


Synthetic experiments: BP does well!

[Figure: rounded objective values and fraction of correct matches vs. expected degree of noise in L (p · n), comparing MR-upper, MR, BP, BPSC, and IsoRank]


Biological data: A close tie

[Figure: overlap vs. weight for BPSC, BP, IsoRank, and MR on two problems: dmela-scere (max weight 671.551, overlap upper bound 381) and Mus M.-Homo S. (max weight 2733, overlap upper bound 1087)]

Problem         |VA|  |EA|   |VB|  |EB|   |EL|
dmela-scere     9459  25636  5696  31261  34582
Mus M.-Homo S.  3247  2793   9695  32890  15810


Real dataset

[Figure: overlap vs. weight for BPSC, BP, IsoRank, and MR on lcsh2wiki (max weight 60119.8, overlap upper bound 17608)]

Problem |VA| |EA| |VB| |EB| |EL|

lcsh2wiki 297,266 248,230 205,948 382,353 4,971,629


Matching results: A little too hot!

LCSH                               WC
Science fiction television series  Science fiction television programs
Turing test                        Turing test
Machine learning                   Machine learning
Hot tubs                           Hot dog


Foreign subject headings
• The US uses LCSH for subject headings (342k vertices, 258k edges).
• France uses Rameau for subject headings (155k vertices, 156k edges).
• Generate L by automatic translation and text matching.
• Used Google's automatic translation service (translate.google.com).
• Produces 22,195,304 possible links based on text.

            cardinality  overlap  correct
Manual      54,259  39,749
MWM         125,609  17,134  29,133  50.54%
NetAlignBP  121,316  46,534  32,467  56.32%
NetAlignMR  119,120  45,977  25,086  43.52%
Upper       50,753

Note NetAlignBP with α = 1, β = 2, γ = 0.99 for 100 iterations; NetAlignMR with α = 0, β = 1 for 1000 iterations.


Conclusion (Slide 45 of 47)


Philosophy

Why matrix computations?
• Simple, iterative methods
• "Easy" to code
• "Easy" to parallelize
• "Often" apply to graph problems


Summary and Future ideas

Inner-outer iterations for PageRank
• Robust analysis
• Good for general graphs
• Can combine with other techniques
• Works for Gauss-Seidel
• Works for non-stationary iterations

Future work
• Gauss-Seidel performance?
• OPEN: asymptotic performance of inner-outer?
• Dynamic β and η?

BP algorithms for network alignment
• Fast and scalable
• Good results on biology PPI networks
• Reasonable results with Rameau to LCSH

Future work
• No vertex label information for matches?
• Are "overlap" scores significant?
• Are LCSH and Wikipedia really similar?
• OPEN: an approximation algorithm?


PAPER 1  stanford.edu/~dgleich/publications/2009/gleich-2009-inner-outer.html
         SIAM J. Scientific Computing; Google "inner outer gleich"

CODE     stanford.edu/~dgleich/publications/2009/innout
         Google "innout gleich"

PAPER 2  arxiv.org/abs/0907.3338
         ICDM 2009; Google "network alignment gleich"

CODE     stanford.edu/~dgleich/publications/2009/netalign
         Google "netalign gleich"
