
Page 1:

The Equivalence of Sampling and Searching

Scott Aaronson, MIT

Page 2:

In complexity theory, we love at least four types of problems. Given an input x ∈ {0,1}^n…

Languages / Decision Problems. Decide if x ∈ L or x ∉ L

Promise Problems. Decide if x ∈ Π_YES or x ∈ Π_NO

Search Problems. Output an element of a (nonempty) set A_x ⊆ {0,1}^m, with probability 1 − ε, in poly(n, 1/ε) time

Sampling Problems. Sample from a probability distribution D_x over m-bit strings, with error ε in variation distance, in poly(n, 1/ε) time
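To make the four templates concrete, here is a minimal, hypothetical Python sketch of what one instance of each type might look like. The parity-based toy problems and all function names are illustrative assumptions, not part of the talk.

```python
import random

# Toy instances of the four problem types, for a single input x in {0,1}^n.
# The parity-based examples are made up purely for illustration.

def decide_language(x: str) -> bool:
    """Decision problem: is x in L?  Here L = strings with an even number of 1s."""
    return x.count("1") % 2 == 0

def decide_promise(x: str) -> bool:
    """Promise problem: x is promised to be all-0s or all-1s; decide which."""
    assert x == "0" * len(x) or x == "1" * len(x), "input violates the promise"
    return x[0] == "1"

def search(x: str, eps: float) -> str:
    """Search problem: output some element of a nonempty set A_x.
    Here A_x = {m-bit strings with the same parity as x}, m = len(x)."""
    y = ["0"] * len(x)
    if x.count("1") % 2 == 1:
        y[0] = "1"            # any fixed element of A_x will do
    return "".join(y)

def sample(x: str, eps: float) -> str:
    """Sampling problem: output a string distributed (eps-)close in variation
    distance to D_x.  Here D_x = uniform over the same set A_x as above."""
    m, target = len(x), x.count("1") % 2
    while True:               # rejection sampling; exact, so eps is unused
        y = "".join(random.choice("01") for _ in range(m))
        if y.count("1") % 2 == target:
            return y
```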

Page 3:

Suppose we want to know whether quantum computers are stronger than classical computers (to pick a random example of a complexity question). Then which formal question do we “really” mean to ask?

BPP vs. BQP?

PromiseBPP vs. PromiseBQP?

FBPP vs. FBQP?

SampBPP vs. SampBQP?

Page 4:

Easy Implications

SampBPP = SampBQP
⇒ FBPP = FBQP
⇒ PromiseBPP = PromiseBQP
⇒ BPP = BQP

Crucial question: Can these implications be reversed?

We show that at least one of them can:

FBPP = FBQP ⇒ SampBPP = SampBQP

Page 5:

Application to Linear Optics

[A.-Arkhipov, STOC’11] study a rudimentary type of quantum computer based entirely on linear optics: identical, non-interacting photons passing through a network of beamsplitters

Our model doesn’t seem to be universal for quantum computing (or even classical computing)—but it can solve sampling problems that we give evidence are hard classically

Using today’s result, we automatically also get a search problem solvable with linear optics that ought to be hard classically

Page 6:

But the QC stuff is just one application of a much more general result…

Informal Statement:

Let S = {D_x}_x be any sampling problem.

Then there exists a search problem R_S = {A_x}_x that’s equivalent to S, in the following sense:

For any “reasonable” complexity class C (BPP, BQP, BPPSPACE, etc.),

R_S ∈ FC  ⇔  S ∈ SampC

Page 7:

Intuition

Suppose our sampling problem is to sample uniformly from a set A ⊆ {0,1}^n

First stab at an “equivalent” search problem: output any element of A

That clearly doesn’t work—finding an A element could be much easier than sampling a random element!

Better idea: output an element y ∈ A whose Kolmogorov complexity K(y) is close to log2 |A|
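Since K is uncomputable, the toy Python sketch below uses zlib compression as a crude, purely heuristic stand-in for Kolmogorov complexity, with a made-up set A (the n-bit strings whose first bit is 0). It only illustrates the gap the slide points at: a canonical search answer is extremely compressible, while a near-uniform sample from A is not.

```python
import os
import zlib

# K(y) is uncomputable; as a rough, purely illustrative proxy we use the length
# of the zlib-compressed string.  The set A = {n-bit strings starting with 0}
# is a made-up example (|A| = 2^(n-1)).

n = 4096

def compressed_len(bits: str) -> int:
    """Heuristic stand-in for K(y): length of y under a generic compressor."""
    return len(zlib.compress(bits.encode(), 9))

# "First stab" search answer: any element of A, e.g. the all-zeros string.
trivial = "0" * n

# Sampling answer: a (near-)uniform element of A.
random_bits = bin(int.from_bytes(os.urandom(n // 8), "big"))[2:].zfill(n)
uniform = "0" + random_bits[1:]

print("proxy K(trivial element):", compressed_len(trivial), "bytes")   # tiny
print("proxy K(random element): ", compressed_len(uniform), "bytes")   # ~ n/8
print("log2|A| =", n - 1, "bits =", (n - 1) // 8, "bytes")
```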

Page 8:

Clearly, if we can sample a random y ∈ A, then with high probability K(y) ≈ log2 |A|

But conversely, if a randomized machine M outputs a y with K(y) ≈ log2 |A|, it can only do so by sampling y almost-uniformly from A. For otherwise, M would yield a succinct description of y, contrary to assumption! (See the worked bound below.)

Technical part: Generalize to nonuniform distributions

Requires notion of a universal randomness test from algorithmic information theory
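A worked version of the “succinct description” step, in my own paraphrase and using the conditional form of the Kolmogorentropy Lemma stated two slides later: if M outputs y with probability q_y, then

```latex
% Assumes amsmath.
\begin{align*}
  K(y) &\le K(M) + \log_2 \frac{1}{q_y} + O(1)
    && \text{(apply the Kolmogorentropy Lemma to $M$'s output distribution)} \\
  \Longrightarrow\quad
  q_y &\le 2^{\,K(M) + O(1) - K(y)}
       \;\le\; \frac{2^{\,K(M) + O(1)}}{|A|}
    && \text{whenever } K(y) \ge \log_2 |A|.
\end{align*}
```

So no single output can receive much more than its “uniform share” 1/|A| of probability mass, up to a constant depending on M. Turning this max-probability bound into closeness to the uniform distribution, and then into the nonuniform statement, is exactly the technical part mentioned above.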

Page 9:

Comments

Our “reduction” from sampling to search is non-black-box: it requires the assumption that we have a Turing machine to solve R_S!

Our result provides an extremely natural application of Kolmogorov complexity to “standard” complexity: one that doesn’t just amount to a counting argument

If we just wanted a search problem at least as hard as S, that would be easy: Kolmogorov complexity only comes in because we need R_S to be equivalent to S

Page 10:

Kolmogorov Review

K(y | x): Prefix-free Kolmogorov complexity of y, conditioned on x

Kolmogorentropy Lemma: Let D = {p_y} be a distribution, and let y be in its support. Then

K(y) ≤ K(D) + log2(1/p_y) + O(1),

where K(D) is the length of the shortest program to sample from D. Same holds if we replace K(y) by K(y|x) and K(D) by K(D|x).
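The log2(1/p_y) term has a concrete coding interpretation: given a description of D, each y in its support can be singled out with about log2(1/p_y) extra bits. Below is a small, self-contained Python illustration using Shannon’s classical prefix code; the example distribution is made up, and this is only the intuition behind the lemma, not its proof (which goes through the coding theorem for prefix-free complexity).

```python
from math import ceil, log2

# A minimal Shannon-code construction: given (a description of) D, each y in
# its support gets a prefix-free codeword of ceil(log2(1/p_y)) bits -- the
# source of the log2(1/p_y) term in the Kolmogorentropy Lemma.
# The distribution below is made up for illustration.

D = {"00": 0.5, "01": 0.25, "10": 0.125, "11": 0.125}

def shannon_code(dist):
    """Assign each y a codeword: the first ceil(log2(1/p_y)) bits of the
    binary expansion of the cumulative probability mass before y."""
    code, cumulative = {}, 0.0
    for y, p in sorted(dist.items(), key=lambda kv: -kv[1]):
        length = ceil(log2(1 / p))
        frac, bits = cumulative, []
        for _ in range(length):
            frac *= 2
            bits.append("1" if frac >= 1 else "0")
            frac -= int(frac)
        code[y] = "".join(bits)
        cumulative += p
    return code

for y, cw in shannon_code(D).items():
    print(f"{y} -> {cw}  (len {len(cw)}, log2(1/p_y) = {log2(1 / D[y]):.2f})")
```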

Page 11:

Constructing the Search Problem

We’re given a sampling problem S = {D_x}_x, where on input x ∈ {0,1}^n, ε > 0, the goal is to sample an m-bit string from a distribution C that’s ε-close to D = D_x, in poly(n, 1/ε) time. Let

p_y := Pr_D[y],   N := a suitable tuple length, polynomial in n and 1/ε,

A_x := { Y = ⟨y_1, …, y_N⟩ ∈ ({0,1}^m)^N : K(Y | x) ≥ log2 1/(p_{y_1} ⋯ p_{y_N}) − O(log 1/ε) }.

Then the search problem R_S is this: on input x ∈ {0,1}^n, ε > 0, output an N-tuple Y = ⟨y_1, …, y_N⟩ ∈ A_x, with probability 1 − ε, in poly(n, 1/ε) time
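Spelled out as a predicate, with the uncomputable and unspecified ingredients passed in as parameters (K_cond stands in for K(· | x), p for Pr_D[·], and slack for the additive slack term, whose exact value is not pinned down here), membership in A_x reads as follows; the function name and signature are illustrative only.

```python
from math import log2
from typing import Callable, Sequence

def in_A_x(Y: Sequence[str], x: str,
           K_cond: Callable[[Sequence[str], str], float],
           p: Callable[[str], float],
           slack: float) -> bool:
    """Is the tuple Y = (y_1, ..., y_N) a valid answer to R_S on input x?

    K_cond is a stand-in for the (uncomputable) conditional Kolmogorov
    complexity K(Y | x); p(y) is Pr_{D_x}[y]; slack is the additive slack
    in the definition of A_x."""
    surprise = sum(log2(1 / p(y)) for y in Y)   # log2 of 1/(p_{y_1} ... p_{y_N})
    return K_cond(Y, x) >= surprise - slack
```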

Page 12:

Equivalence Proof

Lemma: Let C be any distribution over {0,1}^m such that ‖C − D_x‖ ≤ ε. Then

Pr_{Y ~ C^N} [ Y ∈ A_x ] ≥ 1 − O(ε).

In other words, any algorithm that solves the sampling problem also solves the search problem w.h.p.

Proof: Counting argument.
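Here is one way the counting step can go in the exact case C = D_x, writing the slack in the definition of A_x as c; this is a sketch under the reconstruction above, and handling a merely ε-close C is where the remaining care goes.

```latex
% Assumes amsmath.  Kraft's inequality for prefix-free complexity gives
% \sum_Y 2^{-K(Y \mid x)} \le 1.
\begin{align*}
  \Pr_{Y \sim D_x^{N}}\!\Bigl[\, K(Y \mid x) < \log_2 \tfrac{1}{\Pr[Y]} - c \Bigr]
    &= \sum_{Y \,:\, \Pr[Y] \,<\, 2^{-K(Y \mid x) - c}} \Pr[Y] \\
    &< \sum_{Y} 2^{-K(Y \mid x) - c} \;\le\; 2^{-c},
\end{align*}
```

so taking c on the order of log2(1/ε) makes the failure probability O(ε).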

Page 13:

Lemma: Given a probabilistic Turing machine B, suppose

Pr[ B(x, ε) ∈ A_x ] ≥ 1 − ε.

Let C be the distribution over m-bit strings obtained by running B(x, ε), then picking one of its N outputs y_1, …, y_N randomly. Then there exists a constant Q_B such that

‖C − D_x‖ ≤ √(Q_B / N).

Proof Sketch: Use the Kolmogorentropy Lemma to show B(x, ε)’s output distribution has small KL-divergence from D^N. Similar to the Parallel Repetition Theorem, this implies C has small KL-divergence from D. By Pinsker’s Inequality, this implies ‖C − D_x‖ is small.

In other words: if B solves the search problem w.h.p., then it also solves the sampling problem
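A sketch of how that chain might go quantitatively, under the reconstruction above and ignoring the ε failure probability and additive constants: write Q for B(x, ε)’s distribution over N-tuples, q_Y for its probabilities, and slack for the slack in A_x.

```latex
% Assumes amsmath and amssymb.  Here D^N(Y) = p_{y_1} \cdots p_{y_N}.
\begin{align*}
  \log_2 \frac{q_Y}{D^N(Y)}
    &\le K(B) + \mathrm{slack} + O(1)
    && \text{for } Y \in A_x, \text{ by the Kolmogorentropy Lemma applied to } Q, \\
  \mathrm{KL}\bigl(Q \,\|\, D^N\bigr)
    &\lesssim K(B) + \mathrm{slack}
    && \text{(summing; $Q$ puts mass $\ge 1-\varepsilon$ on $A_x$)}, \\
  \mathrm{KL}\bigl(C \,\|\, D\bigr)
    &\le \tfrac{1}{N}\,\mathrm{KL}\bigl(Q \,\|\, D^N\bigr)
    && \text{(chain rule + convexity, as in parallel repetition)}, \\
  \|C - D_x\|
    &\le \sqrt{\tfrac{1}{2}\,\mathrm{KL}\bigl(C \,\|\, D\bigr)}
     \;\lesssim\; \sqrt{\frac{K(B) + \mathrm{slack}}{N}}
    && \text{(Pinsker's Inequality)}.
\end{align*}
```

Since K(B) is a constant depending only on B and N grows with 1/ε, this is a bound of the √(Q_B/N) form claimed in the lemma.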

Page 14:

Wrapping Up

Theorem:

Let O be any oracle that, given x, 0^{1/ε}, and a random string r, outputs a sample from a distribution C such that ‖C − D_x‖ ≤ ε. Then R_S ∈ FBPP^O.

Let B be any probabilistic Turing machine that, given x, 0^{1/ε}, outputs a Y ∈ A_x, with probability 1 − ε. Then S ∈ SampBPP^B.
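In pseudocode, the two directions of the theorem are almost one-liners. The following hypothetical Python sketch just fixes the interfaces: sampler plays the role of the oracle O (one ε-close sample per call), searcher plays the role of B (one tuple in A_x per call), and N is whatever tuple length the construction prescribes for the given ε.

```python
import random
from typing import Callable, List

def solve_R_S(x: str, eps: float, N: int,
              sampler: Callable[[str, float], str]) -> List[str]:
    """R_S in FBPP^O: draw N independent samples from the oracle and
    output the tuple (this is exactly the event bounded by the first lemma)."""
    return [sampler(x, eps) for _ in range(N)]

def solve_S(x: str, eps: float,
            searcher: Callable[[str, float], List[str]]) -> str:
    """S in SampBPP^B: run the search machine once and output a uniformly
    random coordinate of its tuple -- the distribution C from the second lemma."""
    Y = searcher(x, eps)
    return random.choice(Y)
```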

Page 15:

Application to Quantum Complexity

Suppose FBPP=FBQP.

Let S ∈ SampBQP.

Then R_S ∈ FBQP [R_S ≤ S reduction]

R_S ∈ FBPP [by hypothesis]

S ∈ SampBPP. [S ≤ R_S reduction]

Therefore SampBPP=SampBQP.

Page 16:

Open Problems

The converse direction: Given a search problem, can we construct an equivalent sampling problem?

Can we show there’s no black-box equivalence between search and sampling problems? (I.e., that our use of Kolmogorov complexity was necessary?)

What if we want the search problem to be checkable?

Can redo the proof with space-bounded Kolmogorov complexity to put the search problem in PSPACE, but it seems hard to do better

More equivalence theorems: ideally, involving decision and promise problems?