
Lower Bounds for NNS and Metric Expansion

Rina Panigrahy, Kunal Talwar, Udi Wieder

Microsoft Research SVC

Nearest Neighbor Search

• Given n points in a metric space, preprocess them into a small data structure
• Given a query point q, quickly retrieve the point closest to q
• Many applications
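As a baseline for what these lower bounds are measured against, here is a minimal brute-force sketch (my own illustration in Python; the Hamming metric is an assumed stand-in for the abstract metric space). It uses no preprocessing and pays n distance computations per query:

def hamming(x, y):
    # Stand-in metric; any metric d(., .) would do here.
    return sum(a != b for a, b in zip(x, y))

def nearest_neighbor(points, q):
    # Exact NNS by linear scan: no preprocessing, O(n) work per query.
    return min(points, key=lambda p: hamming(p, q))

def near_neighbor(points, q, r):
    # Decision version: return any point within distance r of q, or None.
    return next((p for p in points if hamming(p, q) <= r), None)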

Decision Version. Given search radius r

• Find a point within distance r of the query point
• Relation to approximate NNS:
– If the second neighbor is at distance cr
– Then the r-near neighbor is also a c-approximate NN

(Figure: the query ball of radius r inside the ball of radius cr.)
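A worked one-liner for why the decision version suffices (my own unpacking of the slide's claim): if an r-near neighbor p exists and every data point other than the nearest neighbor x* lies at distance at least cr, then

\[
d(q,p) \le r < c\,r \le d(q,x) \quad \forall\, x \neq x^{*}
\;\;\Longrightarrow\;\; p = x^{*},
\]

so p is the exact, hence in particular a c-approximate, nearest neighbor; lower bounds for the decision version therefore transfer to c-approximate NNS.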

Cell Probe Model

Preprocess into a data structure with
– m words
– w bits per word

Query algorithm gets charged t if it probes t words of the data structure
– All computation is free

Study the tradeoff between m, w, and t. In this talk, s = mw denotes the total space.
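To make the accounting concrete, here is a small sketch of the cost model (my own illustration, not from the talk): a table of m words of w bits, where only probes are charged.

class CellProbeTable:
    """m words of w bits each; a query is charged only for probes."""

    def __init__(self, m, w):
        self.m, self.w = m, w
        self.cells = [0] * m
        self.probes = 0  # t = number of cells read while answering a query

    def write(self, i, value):
        # Preprocessing writes are free in the cell probe model.
        assert 0 <= value < 2 ** self.w
        self.cells[i] = value

    def probe(self, i):
        # Each probe costs one unit; all other computation is free.
        self.probes += 1
        return self.cells[i]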

Many different lower bounds

(Table: metric space, approximation, randomized?, reference.)
• Exact, randomized: PT[06], BR[02]
• Deterministic: PT[06], Liu[04]
• Randomized: AIP[06]
• Randomized: PTW[08]
• Deterministic: ACP[08]
• Space bounds of the form n·exp(ε³d)

Lower bounds from Expansion

Show a unified approach for proving cell probe lower bounds for near neighbor and similar problems.

Show that all these lower bounds stem from the same combinatorial property of the metric space:

Expansion: |points near A| / |A| (and show some new lower bounds)
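The quantity itself is simple to compute for any one set; a minimal sketch (mine, with the graph represented as an adjacency dict):

def neighborhood(graph, A):
    # graph: dict mapping each vertex to an iterable of its neighbors.
    return {v for u in A for v in graph[u]}

def vertex_expansion(graph, A):
    # |N(A)| / |A| for one concrete set A; the graph parameter Φ(G)
    # takes the minimum of this ratio over all sufficiently small sets A.
    return len(neighborhood(graph, A)) / len(A)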

Graphical Nearest Neighbor

• Convert the metric space to a graph
• Place an edge between nodes that are within distance r
• Return a neighbor of the query. From now on, r = 1.

Graphical Nearest Neighbor

• Assume uniform degree
• Use a random data set
• Assume that w.h.p. the n balls are disjoint
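A small sketch of that setup (my own; the hypercube {0,1}^d with r = 1 and the sizes below are illustrative assumptions): build the graph over the whole space and plant a random data set.

import itertools, random

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

d, r = 8, 1
V = list(itertools.product([0, 1], repeat=d))      # the whole metric space
# Place an edge between nodes within distance r (r = 1: the hypercube graph).
graph = {v: [u for u in V if 0 < hamming(u, v) <= r] for v in V}

# A random data set of n points; for these parameters the radius-r balls
# around the data points are disjoint with high probability.
n = 8
data = random.sample(V, n)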

Deterministic Bounds via Expansion

• Theorem: a deterministic data structure for GNS with space s and t probes must satisfy (st/n)^t ≥ Φ(G)

Example Application: (st/n)^t ≥ Φ(G)

• Gives a space lower bound of the form n·exp(ε²d)
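Unpacking that (my own arithmetic, assuming the relevant hypercube instance has vertex expansion Φ(G) ≈ exp(ε²d), which is what makes the slide's bound come out): rearranging (st/n)^t ≥ Φ(G) gives

\[
s \;\ge\; \frac{n}{t}\,\Phi(G)^{1/t},
\qquad\text{so for } t = 1:\quad
s \;\ge\; n\,\Phi(G) \;\approx\; n\cdot e^{\epsilon^{2} d}.
\]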

Proof Idea when t = 1: Shattering

(st/n)^t ≥ Φ(G)

• F : V → [m] partitions V into m regions
• Split large regions
• A random ball is shattered into many parts: about Φ(G)
• So there is Φ(G)-fold replication in space
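A quick experiment that makes the shattering visible (my own toy; a uniformly random cell map F stands in for the data structure's map, and all parameters are illustrative):

import itertools, random

d, r, m = 10, 2, 64
V = list(itertools.product([0, 1], repeat=d))
F = {v: random.randrange(m) for v in V}            # F : V -> [m]

def ball(center, radius):
    return [v for v in V
            if sum(a != b for a, b in zip(v, center)) <= radius]

center = random.choice(V)
parts = {F[v] for v in ball(center, r)}            # regions the ball touches
print(f"a radius-{r} ball is shattered into {len(parts)} of the {m} regions")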

Proof Idea when t = 1

• The query point determines which cell in the table is read
• Select a small fraction of the cells such that it is likely their regions contain a quarter of the data set points
• Combining these yields (st/n)^t ≥ Φ(G) for t = 1

Generalizing for larger t

• Select a fraction of each table such that the same property holds
• Continue as before
– This handles non-adaptive algorithms
• Adaptive algorithms depend upon the content of the selected cells
– Only a subexponential number of effective algorithms
– Apply a union bound over them

Randomized Bounds

• So far we assumed the algorithm is correct on every query point
– What if only a fraction of the query points are good?
• Need to relax the definition of vertex expansion

Randomized Bounds

• Robust Expansion

(Figure: a set A and its neighborhood N(A).)

• N(A) captures all edges from A
• Expansion = |N(A)| / |A|
• Robust expansion: capture only ¾ of the edges from A

Robust Expansion

• Small-set vertex expansion
• In other words: we can cover all the edges incident on A with a set of size |N(A)|
• We can cover ¾ of the edges incident on A with a set of size Φr · |A|
– Robust expansion is at least the edge expansion
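A brute-force rendering of that definition (mine): greedily find a small vertex set covering a ¾ fraction of the edges incident on A. Greedy only upper-bounds the true minimum cover, so the returned ratio upper-bounds the robust expansion of this particular A.

def robust_expansion_estimate(graph, A, delta=0.75):
    # All edges incident on A, recorded as (u, v) with u in A.
    edges = [(u, v) for u in A for v in graph[u]]
    need = delta * len(edges)
    covered, B = set(), set()
    while len(covered) < need:
        # Pick the vertex that covers the most still-uncovered edges.
        counts = {}
        for (u, v) in edges:
            if (u, v) not in covered:
                counts[v] = counts.get(v, 0) + 1
        best = max(counts, key=counts.get)
        B.add(best)
        covered.update((u, v) for (u, v) in edges if v == best)
    return len(B) / len(A)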

Bound for Randomized Data Structure

• Theorem: if the input distribution is weakly independent, then a randomized data structure that answers GNS queries with space s and t probes must satisfy the analogous bounds, with Φ(G) replaced by the robust expansion Φr

Proof Idea when t = 1: Shattering

• Most of a random ball is shattered into many parts: about Φr
• So there is Φr-fold replication in space

Generalizing for larger t

• Sample a 1/Φr^(1/t) fraction from each table
• For a random ball, a good part survives in all tables
• The union bound for adaptive algorithms is trickier

Applications

• We know how to calculate the robust expansion of graphs derived from several metrics:
– one case previously known
– one case new
– one case under a natural input distribution
• For other metrics of interest, the robust expansion is not known

General Upper Bound

• Say G is a Cayley graph
• Take a set A achieving the robust expansion
• Use random translations of A to define the access function
• For random inputs, the success probability is constant
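A toy rendering of that idea (entirely my own parameters; the set A below is a Hamming ball rather than an expansion-optimal set, and this is a sketch of the access pattern, not the paper's construction): on the Cayley graph of (Z/2Z)^d, each random shift s defines a cell holding the data points in the translate A ⊕ s, and a query probes exactly the cells of the translates containing it.

import random

d, k = 12, 32                      # dimension and number of random shifts

def xor(u, v):
    return tuple(a ^ b for a, b in zip(u, v))

def in_A(v, radius=2):
    # A = a Hamming ball around the origin (a stand-in for a set with
    # good robust expansion).
    return sum(v) <= radius

shifts = [tuple(random.getrandbits(1) for _ in range(d)) for _ in range(k)]

def build(points):
    # Cell i stores the data points lying in the translate A xor shifts[i].
    table = [[] for _ in range(k)]
    for p in points:
        for i, s in enumerate(shifts):
            if in_A(xor(p, s)):
                table[i].append(p)
    return table

def query(table, q):
    # Access function: probe only the cells whose translate contains q.
    for i, s in enumerate(shifts):
        if in_A(xor(q, s)):
            yield from table[i]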

Conclusions and Open Problems

• Unified approach to NNS cell probe lower bounds
– often characterized by expansion
– average case with natural distributions
• Higher lower bounds?
– Improve the dependency on the approximation (very hard)
– Dynamic NNS; tight bounds for special cases are shown in the paper



Randomized Bounds

• We also need to relax the definitions of vertex expansion and independence
• The data set is weakly independent if, for a random query point, it holds that (with constant probability) no data point other than its planted near neighbor is close to it


Proof Idea

• Can we plug the new definitions into the old proof?
– Conceptually: yes!
– Actually… well, no
• Dependencies everywhere: the set of good neighbors of a data point depends upon the rest of the data set

• Solving this is the technical crux of the paper
