

CS648 : Randomized Algorithms
Semester II, 2014-15, CSE, IIT Kanpur
Assignment - 3 (due on 7th February 6:00 PM)

Note: Be very rigorous in providing any mathematical detail in support of your arguments. Also mention any Lemma/Theorem you use.

1. Two-dimensional Pattern Matching

(a) First we shall discuss a different fingerprinting technique to solve the one-dimensional pattern matching problem. The idea is to map any bit string s into a 2 × 2 matrix M(s), as follows.

i. For the empty string ε, M(ε) = [1 0; 0 1].

ii. M(0) = [1 0; 1 1].

iii. M(1) = [1 1; 0 1].

iv. For non-empty strings x and y, M(xy) = M(x) × M(y).

Show that this fingerprint function has the following properties.

i. M(x) is well defined for all x ∈ {0, 1}*.

ii. M(x) = M(y) ⇒ x = y.

iii. For x ∈ {0, 1}^n, the entries in M(x) are bounded by the Fibonacci number F_n.

By considering the matrices M(x) modulo a suitable prime p, show how you would perform efficient randomized pattern matching in 1-dimension. Explain how you would implement this as a "real-time" algorithm.
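A minimal sketch of the 1-dimensional matcher is given below. It is not the requested solution, only an illustration: the prime p is fixed here for simplicity (the analysis would draw a random prime of Θ(log n) bits), and all function and variable names are ours. The key point it demonstrates is that both M(0) and M(1) have determinant 1, so they are invertible mod p, which lets the window fingerprint slide in O(1) per text bit.

```python
# Illustrative sketch: randomized 1-D pattern matching with 2x2 matrix
# fingerprints taken modulo a prime p. Assumes len(text) >= len(pattern).

p = 10**9 + 7  # assumption: a fixed large prime, for illustration only

M0 = ((1, 0), (1, 1))          # M(0)
M1 = ((1, 1), (0, 1))          # M(1)
M0_inv = ((1, 0), (p - 1, 1))  # M(0)^{-1} mod p (det = 1, so invertible)
M1_inv = ((1, p - 1), (0, 1))  # M(1)^{-1} mod p

def mat_mul(A, B):
    """2x2 matrix product modulo p."""
    return (
        ((A[0][0]*B[0][0] + A[0][1]*B[1][0]) % p,
         (A[0][0]*B[0][1] + A[0][1]*B[1][1]) % p),
        ((A[1][0]*B[0][0] + A[1][1]*B[1][0]) % p,
         (A[1][0]*B[0][1] + A[1][1]*B[1][1]) % p),
    )

def fingerprint(s):
    """M(s) mod p, built left to right via M(xy) = M(x) * M(y)."""
    F = ((1, 0), (0, 1))  # M(empty string) = identity
    for b in s:
        F = mat_mul(F, M1 if b == '1' else M0)
    return F

def match_positions(text, pattern):
    """Report candidate occurrences of pattern in text in O(n + m) time.

    The window fingerprint is updated in O(1) per text bit: append the
    new bit on the right, then strip the oldest bit on the left using
    the inverse matrix -- this is what makes a real-time version possible.
    """
    m = len(pattern)
    target = fingerprint(pattern)
    F = fingerprint(text[:m])
    hits = []
    for i in range(len(text) - m + 1):
        if F == target:
            hits.append(i)  # equal fingerprints => probable match
        if i + m < len(text):
            F = mat_mul(F, M1 if text[i+m] == '1' else M0)          # push new right bit
            F = mat_mul(M1_inv if text[i] == '1' else M0_inv, F)    # pop old left bit
    return hits
```

For example, `match_positions("0110101", "101")` reports the two occurrences at positions 2 and 4.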

(b) Consider the two-dimensional version of the pattern matching problem. The text is an n × n matrix X, and the pattern is an m × m matrix Y. A pattern match occurs if Y appears as a (contiguous) sub-matrix of X. Take inspiration from the randomized algorithm of part (a) above to design a randomized algorithm for this 2-dimensional pattern matching. The running time should be O(n^2 + m^2) and the error probability should be inverse polynomial in terms of n and m.

The hint is: how to convert 2-dimensional pattern matching to 1-dimensional pattern matching. This hint will be expanded in the doubt clearing session this weekend. Till then, keep pondering over it.

2. This question deals with the Chernoff bound.

(a) How well did you understand the proof of the Chernoff bound?
Consider a collection X_1, ..., X_n of n independent geometrically distributed random variables with mean 2. Let X = Σ_{i=1}^{n} X_i and δ > 0.

i. Derive a bound on P(X ≥ (1 + δ)(2n)) by applying the Chernoff bound to a sequence of (1 + δ)(2n) fair coin tosses.

ii. Directly derive a Chernoff-like bound on P(X ≥ (1 + δ)(2n)).

iii. Which bound is better?
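As a sanity check (not part of the assignment), the quantity being bounded can be simulated. Each X_i is geometric with mean 2, so X = X_1 + ... + X_n is exactly the number of fair coin tosses needed to see n heads, which is the observation behind part (i). All names below are ours.

```python
# Monte Carlo illustration of the tail P(X >= (1 + delta) * 2n), where
# X is a sum of n independent Geometric(1/2) variables (support 1, 2, ...).

import random

def sample_X(n, rng):
    """Toss a fair coin until n heads appear; return the number of tosses."""
    tosses = heads = 0
    while heads < n:
        tosses += 1
        if rng.random() < 0.5:
            heads += 1
    return tosses

def tail_estimate(n, delta, trials, seed=0):
    """Monte Carlo estimate of P(X >= (1 + delta) * 2n)."""
    rng = random.Random(seed)
    threshold = (1 + delta) * 2 * n
    hits = sum(sample_X(n, rng) >= threshold for _ in range(trials))
    return hits / trials
```

Any correct bound from parts (i) and (ii) should dominate these empirical tail estimates, and comparing the two bounds against such estimates is one way to see which is tighter.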


(b) Estimating the bias of a coin
Suppose you are given a biased coin that has P[HEADS] = p ≥ a, for some fixed a. This is all that you know about p. Devise a procedure for estimating p by a value p̂ such that you can guarantee that

P[|p − p̂| > εp] < δ

for any choice of the constants 0 < a, ε, δ < 1. Let N be the number of times you need to flip the biased coin to obtain this estimate. What is the smallest value of N for which you can still give this guarantee?
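An experimental sketch, not the requested derivation: the natural estimator is the empirical frequency of heads over N flips, and the code below measures how often the bad event |p − p̂| > εp occurs. The value of N is deliberately left as a free parameter, since finding the smallest valid N is precisely what the problem asks for; all names here are ours.

```python
# Empirical-frequency estimator for the bias p, and a harness that
# measures the failure probability P[|p - p_hat| > eps * p] by repetition.

import random

def estimate_bias(p, N, seed=0):
    """Flip a p-biased coin N times and return the empirical frequency."""
    rng = random.Random(seed)
    heads = sum(rng.random() < p for _ in range(N))
    return heads / N

def failure_rate(p, N, eps, trials=1000):
    """Fraction of independent runs in which |p - p_hat| > eps * p."""
    bad = 0
    for t in range(trials):
        p_hat = estimate_bias(p, N, seed=t)
        if abs(p - p_hat) > eps * p:
            bad += 1
    return bad / trials
```

Plotting `failure_rate` against N for fixed p, ε gives a feel for how fast the guarantee kicks in, which can then be compared with the Chernoff-style value of N derived in the solution.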
