
Yang Liu Shengyu Zhang

The Chinese University of Hong Kong

Fast quantum algorithms for Least Squares Regression and Statistic Leverage Scores

• Part I. Linear regression
  – Output a “quantum sketch” of the solution.

• Part II. Computing leverage scores and matrix coherence.
  – Output the target numbers.

Part I: Linear regression

• Solve an overdetermined linear system Ax ≈ b,
  where A ∈ R^{n×p}, x ∈ R^p, b ∈ R^n, with n > p.

• Goal: compute x* = argmin_x ||Ax - b||_2.
  – Least Squares Regression (LSR)

Closed-form solution

• Closed-form solution known: x* = A^+ b, which equals (A^T A)^{-1} A^T b when A has full column rank (illustrated in the sketch below).

  – A^+: Moore-Penrose pseudo-inverse of A.
  – If the SVD of A is A = Σ_i σ_i u_i v_i^T with σ_1 ≥ σ_2 ≥ ... ≥ 0, then A^+ = Σ_{i: σ_i > 0} σ_i^{-1} v_i u_i^T.

• Classical complexity: polynomial in n and p (e.g. O(np^2) via the SVD). Prohibitively slow for big matrices A.
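
A small classical illustration of this closed form (a NumPy sketch on a synthetic matrix and vector, not part of the original slides): the SVD-based pseudo-inverse solution agrees with the library least-squares solver.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 8, 3                      # overdetermined: n > p
    A = rng.standard_normal((n, p))
    b = rng.standard_normal(n)

    # Closed form: x* = A^+ b, with A^+ built from the SVD A = U diag(sigma) V^T
    U, sigma, Vt = np.linalg.svd(A, full_matrices=False)
    A_pinv = Vt.T @ np.diag(1.0 / sigma) @ U.T
    x_star = A_pinv @ b

    # Cross-check against the library least-squares solver
    x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
    assert np.allclose(x_star, x_ref)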

Relaxations

• Relaxation:
  – Approximate: output some x' ≈ x* instead of x* exactly.
  – Important special case: sparse and low-rank A: fast classical sketching algorithms *1,2 (toy sketch-and-solve example below), where
    • s = # non-zero entries in each row/column of A.

• Quantum speedup? Even writing down the solution takes linear time.

*1. K. Clarkson, D. Woodruff. STOC, 2013.
*2. J. Nelson, H. Nguyen. FOCS, 2013.
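
To make the sketch-and-solve relaxation concrete, here is a minimal classical example; for simplicity it uses a dense Gaussian sketch on synthetic data rather than the sparse embeddings of *1,2, so it illustrates the relaxation, not those specific algorithms.

    import numpy as np

    rng = np.random.default_rng(1)
    n, p, m = 10000, 20, 400          # sketch dimension m << n
    A = rng.standard_normal((n, p))
    b = rng.standard_normal(n)

    # Sketch: S has i.i.d. Gaussian entries; solve the much smaller system (SA)x ~ Sb
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    x_exact, *_  = np.linalg.lstsq(A, b, rcond=None)

    # The sketched residual is close to the optimal one (with high probability)
    print(np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b))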

Quantum sketch

• Similar issue as solving a linear system Ax = b for full-rank A.
  – Closed-form solution: x = A^{-1} b.

• [HHL09] *1: outputs the quantum state |x⟩ in time Õ(log(n) s^2 κ^2 / ε).

• Condition number κ = σ_max / σ_min, where the σ_i are A's singular values.

• s: sparsity, the max # of non-zero entries in any row.
• |x⟩ is proportional to the solution: |x⟩ = Σ_i x_i |i⟩ / ||x||.

*1. A. Harrow, A. Hassidim, S. Lloyd, PRL, 2009.

Controversy

• Useless? Can't read out the individual solution variables x_i from |x⟩.

• Useful? As an intermediate step, e.g. when some global information about x is needed.
  – x^T M x can be obtained from |x⟩ by the SWAP test (toy simulation below).

• Classically also achievable in polylog(n) time? Impossible unless BPP = BQP.
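
A classical toy simulation of the SWAP test, under the assumption that the two real unit vectors are stored classically (in the quantum setting they would only be available as states): the ancilla of a SWAP test reads 0 with probability (1 + |⟨x|y⟩|^2)/2, so sampling that coin and inverting estimates the overlap without ever reading individual entries.

    import numpy as np

    rng = np.random.default_rng(2)

    def swap_test_overlap(x, y, shots=20000):
        """Estimate |<x|y>|^2 the way a SWAP test would: the ancilla reads 0
        with probability (1 + |<x|y>|^2) / 2, so sample that coin and invert."""
        x = x / np.linalg.norm(x)
        y = y / np.linalg.norm(y)
        p0 = 0.5 * (1.0 + abs(np.dot(x, y)) ** 2)
        zeros = rng.binomial(shots, p0)
        return 2.0 * zeros / shots - 1.0

    x = rng.standard_normal(16)
    y = rng.standard_normal(16)
    est = swap_test_overlap(x, y)
    true = abs(np.dot(x / np.linalg.norm(x), y / np.linalg.norm(y))) ** 2
    print(est, true)   # the estimate concentrates around the true overlap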

LSR results

• Back to the overdetermined system: min_x ||Ax - b||.
• [WBL12] *1: outputs |x*⟩ in time Õ(log(n) s^3 κ^6 / ε).
• Ours:
  – Same approximation, with better dependence on s and κ.
  – Simpler algorithm.
  – Can also estimate the norm ||x*||, which is used for, e.g., turning quantities computed from the normalized state |x*⟩ back into quantities about x* itself.
  – Extensions: Ridge Regression, Truncated SVD.

*1. N. Wiebe, D. Braun, S. Lloyd. PRL, 2012.

Our algorithm for LSR

• Input: Hermitian A ∈ R^{n×n}, b ∈ R^n. Assume A = Σ_{j=1}^k λ_j |v_j⟩⟨v_j| with |λ_j| ∈ [1/κ, 1], and the rest of the eigenvalues λ_j are 0.
  – The non-Hermitian case reduces to the Hermitian case.

• Output: |x'⟩ with || |x'⟩ - |x*⟩ || ≤ ε, and an estimate of ||x*||.
• Note: Write b as b = Σ_j β_j |v_j⟩; then the desirable output is

  |x*⟩ ∝ Σ_{j: λ_j ≠ 0} (β_j / λ_j) |v_j⟩.

Algorithm

• |b⟩ = Σ_j β_j |v_j⟩, where β_j = ⟨v_j | b⟩.

• Phase estimation: Σ_j β_j |v_j⟩ |λ_j⟩.

• Σ_j β_j |v_j⟩ |λ_j⟩ ( sqrt(1 - c^2/λ_j^2) |0⟩ + (c/λ_j) |1⟩ )    // attach an ancilla qubit, rotate it toward |1⟩ by an amount ∝ 1/λ_j if λ_j ≠ 0 (c a normalizing constant)

• Measure the ancilla, keep outcome 1: ∝ Σ_{j: λ_j ≠ 0} (β_j / λ_j) |v_j⟩ |λ_j⟩    // “select” the λ_j ≠ 0 component

• Undo the phase estimation: ∝ Σ_{j: λ_j ≠ 0} (β_j / λ_j) |v_j⟩, which is just |x*⟩ ∝ |A^+ b⟩.

Tool: the Phase Estimation quantum algorithm, which outputs (an estimate of) the eigenvalue λ_j for a given eigenvector |v_j⟩. A classical analogue of the whole pipeline is sketched below.
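
The following NumPy sketch is a classical analogue of what the steps above compute, not a simulation of the quantum circuit: expand b in the eigenbasis of a synthetic Hermitian A, keep only components with |λ_j| ≥ 1/κ, weight each by 1/λ_j, and renormalize; the result is the vector the output state is proportional to. The matrix, the value of κ, and the variable names are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(3)
    n, kappa = 16, 10.0

    # A random real symmetric (Hermitian) A and a unit vector b
    M = rng.standard_normal((n, n))
    A = (M + M.T) / 2
    b = rng.standard_normal(n)
    b /= np.linalg.norm(b)

    lam, V = np.linalg.eigh(A)        # A = V diag(lam) V^T
    beta = V.T @ b                    # b = sum_j beta_j v_j

    # Keep only eigencomponents with |lambda_j| >= 1/kappa, weight each by 1/lambda_j
    keep = np.abs(lam) >= 1.0 / kappa
    inv_lam = np.zeros_like(lam)
    inv_lam[keep] = 1.0 / lam[keep]

    x = V @ (beta * inv_lam)          # proportional to A^+ b on the kept spectrum
    x_state = x / np.linalg.norm(x)   # the normalized "quantum sketch" of x*
    print(x_state)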

Extension 1: Ridge regression

• What about ill-conditioned (i.e. large κ) input?
• Two classical solutions: ridge regression and truncated SVD.
• Ridge regression: min_x ||Ax - b||^2 + λ ||x||^2, for a regularization parameter λ > 0.

  – Closed-form solution: x* = (A^T A + λI)^{-1} A^T b (checked in the sketch below).
  – Previous algorithms: classical ones for sparse and low-rank A.

• Ours: a quantum algorithm that outputs |x*⟩.
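
A minimal classical check of the ridge closed form above (synthetic data; n, p, and the weight lam are arbitrary illustrative choices): the normal-equation solution matches the least-squares solution of the standard augmented system.

    import numpy as np

    rng = np.random.default_rng(4)
    n, p, lam = 50, 5, 0.1
    A = rng.standard_normal((n, p))
    b = rng.standard_normal(n)

    # Closed-form ridge solution: x* = (A^T A + lam I)^{-1} A^T b
    x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ b)

    # Equivalently, minimize ||Ax - b||^2 + lam ||x||^2 via an augmented least-squares system
    A_aug = np.vstack([A, np.sqrt(lam) * np.eye(p)])
    b_aug = np.concatenate([b, np.zeros(p)])
    x_check, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    assert np.allclose(x_ridge, x_check)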

Extension 2: Truncated SVD

• Goal: compute x* = A_k^+ b, where A_k is A with all but the k largest singular values truncated to 0.

• Ours: a quantum algorithm that outputs |x*⟩ = |A_k^+ b⟩ (classical analogue sketched below).
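
A classical NumPy sketch of the truncated-SVD solution for an illustrative cutoff k (matrix and dimensions are synthetic): only the k largest singular values are inverted, the rest are zeroed.

    import numpy as np

    rng = np.random.default_rng(5)
    n, p, k = 40, 6, 3
    A = rng.standard_normal((n, p))
    b = rng.standard_normal(n)

    U, sigma, Vt = np.linalg.svd(A, full_matrices=False)   # sigma sorted in decreasing order

    # Pseudo-inverse of A_k: invert only the k largest singular values, zero the rest
    inv_sigma = np.zeros_like(sigma)
    inv_sigma[:k] = 1.0 / sigma[:k]
    x_trunc = Vt.T @ (inv_sigma * (U.T @ b))
    print(x_trunc)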

Part II: Statistic leverage scores

• A ∈ R^{n×p} has SVD A = UΣV^T. The i-th leverage score is ℓ_i = ||U_{(i)}||^2.
  – U_{(i)}: the i-th row of U.

• Matrix coherence: c = max_i ℓ_i.
• The leverage score ℓ_i measures the importance of row i (small example below).
  – A well-studied measure.
  – Very useful in large-scale data analysis, matrix algorithms, outlier detection, low-rank matrix approximation, etc. *1

*1. M. Mahoney, Randomized Algorithms for Matrices and Data, Foundations & Trends in Machine Learning, 2010.
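
A direct NumPy computation of these definitions on a synthetic matrix, with two standard sanity checks (the scores sum to rank(A) and equal the diagonal of the projection "hat" matrix):

    import numpy as np

    rng = np.random.default_rng(6)
    n, p = 100, 4
    A = rng.standard_normal((n, p))

    U, _, _ = np.linalg.svd(A, full_matrices=False)   # n x rank, orthonormal columns
    leverage = np.sum(U**2, axis=1)                   # l_i = ||i-th row of U||^2
    coherence = leverage.max()                        # matrix coherence

    # Sanity checks: scores sum to rank(A), and l_i = (A (A^T A)^{-1} A^T)_{ii}
    assert np.isclose(leverage.sum(), np.linalg.matrix_rank(A))
    H = A @ np.linalg.solve(A.T @ A, A.T)             # the "hat" projection matrix
    assert np.allclose(leverage, np.diag(H))
    print(coherence)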

Computing leverage scores

• A classical algorithm *1 finds (approximations of) all ℓ_i.
• No better algorithm is known even for finding only the coherence c.
• Our quantum algorithms, for sparse A, are faster for:

  – finding each ℓ_i,
  – finding all ℓ_i,
  – finding c.

*1. P. Drineas, M. Magdon-Ismail, M. Mahoney, D. Woodruff. J. MLR, 2012.

Algorithm for leverage scores

• Input: rank-k Hermitian A ∈ R^{n×n}, row index i, precision ε.
  – A = Σ_{j=1}^k λ_j |v_j⟩⟨v_j| with |λ_j| ∈ [1/κ, 1].

• Output: an estimate of ℓ_i.

• Key Lemma: If |e_i⟩ = Σ_j α_j |v_j⟩ (the i-th standard basis vector expanded in the eigenbasis of A), then ℓ_i = Σ_{j: λ_j ≠ 0} |α_j|^2.    // ℓ_i is the squared length of the projection of e_i onto the column space of A

Algorithm

• Σ_j α_j |v_j⟩ |λ_j⟩ |b_j⟩, where |b_j⟩ = |1⟩ if λ_j ≠ 0 and |b_j⟩ = |0⟩ otherwise.    // phase estimation on |e_i⟩, then rotate the last qubit to |1⟩ if λ_j ≠ 0

• Estimate the probability of observing 1 when measuring the last qubit.

• Pr[observe 1] = Σ_{j: λ_j ≠ 0} |α_j|^2 = ℓ_i, the target (verified classically in the sketch below).
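
A classical check of the Key Lemma and of this probability, on a synthetic rank-k Hermitian matrix (all parameter choices are illustrative): summing |α_j|^2 over the non-zero eigenvalues of A reproduces ℓ_i computed from the SVD.

    import numpy as np

    rng = np.random.default_rng(7)
    n, k, i = 12, 5, 3                      # rank-k Hermitian A, target row i

    # Build a rank-k symmetric A = sum_j lambda_j v_j v_j^T
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    lam = np.zeros(n)
    lam[:k] = rng.uniform(0.5, 1.0, size=k)
    A = (Q * lam) @ Q.T

    # Expand e_i in the eigenbasis and keep only non-zero-eigenvalue components
    w, V = np.linalg.eigh(A)
    alpha = V.T @ np.eye(n)[i]              # alpha_j = <v_j | e_i>
    prob_one = np.sum(alpha[np.abs(w) > 1e-9] ** 2)   # Pr[last qubit reads 1]

    # Compare with the leverage score l_i = ||i-th row of U||^2 from the SVD of A
    U, s, _ = np.linalg.svd(A)
    l_i = np.sum(U[i, :k] ** 2)             # first k columns span col(A)
    assert np.isclose(prob_one, l_i)
    print(prob_one)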

Summary

• We give efficient quantum algorithms for two canonical problems on sparse inputs:
  – Least squares regression
  – Statistical leverage scores

• The problems are linear-algebraic, not group-, number-, or polynomial-theoretic.