# Introduction to Algorithms

Post on 02-Jan-2016




Introduction to Algorithms

Jiafen Liu, Sept. 2013

Today's task: develop more asymptotic notations, and learn how to solve recurrences.

Θ-notation

Math: Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }. Engineering: drop low-order terms; ignore leading constants.
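As a quick numeric sanity check of this definition (my own illustrative sketch, not from the lecture), take f(n) = 3n² + 5n; the constants c1 = 3, c2 = 4, n0 = 5 witness f(n) = Θ(n²):

```python
# Hypothetical sanity check of the Theta definition for f(n) = 3n^2 + 5n.
# Claim: c1 = 3, c2 = 4, n0 = 5 witness f(n) = Theta(n^2).
def f(n):
    return 3 * n * n + 5 * n

def g(n):
    return n * n

c1, c2, n0 = 3, 4, 5

# Check 0 <= c1*g(n) <= f(n) <= c2*g(n) over a large range of n >= n0.
ok = all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 5000))
print(ok)  # True
```

The upper inequality 3n² + 5n ≤ 4n² is exactly 5n ≤ n², which is why n0 = 5 is the right threshold here.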

O-notation. Another asymptotic symbol, used to indicate upper bounds. Math: f(n) = O(g(n)) means there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.

Ex: 2n² = O(n³) (c = 1, n0 = 2). Here "=" denotes a one-way equality.

Set definition of O-notation

Math: O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.

Looks better? Ex: 2n² ∈ O(n³). O-notation corresponds roughly to "less than or equal to".

Usage of O-notation. Macro substitution: a set in a formula represents an anonymous function in that set, provided O-notation occurs only on the right-hand side of the formula. Ex: f(n) = n³ + O(n²) means f(n) = n³ + h(n) for some h(n) ∈ O(n²); here we can think of h(n) as an error term.

If O-notation occurs on the left-hand side of a formula, as in n² + O(n) = O(n²), it means: for any f(n) ∈ O(n), there exists some h(n) ∈ O(n²) such that n² + f(n) = h(n). O-notation is an upper-bound notation; it makes no sense to say f(n) is at least O(n²).

Ω-notation. How can we express a lower bound? Math: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

Ex: √n = Ω(lg n) (c = 1, n0 = 16).

Θ-notation

O-notation is like ≤, Ω-notation is like ≥, and Θ-notation is like =. Now we can give another definition of Θ-notation: Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).

We also call Θ-notation a tight bound.

A Theorem on Asymptotic Notations. Theorem 3.1: For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Ex: n²/2 − 3n = Θ(n²); how can we prove that?

Two More Strict Notations. We have just said that O-notation and Ω-notation are like ≤ and ≥. o-notation and ω-notation are like < and >.

Math: o(g(n)) = { f(n) : for any constant c > 0, there exists n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }. Difference with O-notation: this inequality must hold for all c, instead of just for one.

What does "for any constant c" mean? With o-notation we mean that no matter what constant we put in front of g(n), f(n) will still be less than c·g(n) for sufficiently large n, no matter how small c is. Ex: 2n² = o(n³) (n0 = 2/c).

A counterexample: n²/2 = o(n²) does not hold, because the inequality n²/2 < c·n² fails for every c ≤ 1/2.
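The example 2n² = o(n³) can be checked numerically; the sketch below (mine, not the lecture's) verifies that the inequality holds past the threshold n0 = 2/c for several shrinking values of c:

```python
# For ANY c > 0, we have 2n^2 < c*n^3 once n > 2/c,
# so n0 = 2/c witnesses 2n^2 = o(n^3).
def holds_past_threshold(c, window=1000):
    n0 = int(2 / c) + 1  # just past the threshold 2/c
    return all(2 * n * n < c * n ** 3 for n in range(n0, n0 + window))

# The smaller c gets, the larger n0 must be, but the bound always kicks in.
print(all(holds_past_threshold(c) for c in (1, 0.5, 0.1, 0.01)))  # True
```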

ω-notation is defined symmetrically, as a strict lower bound. Math: ω(g(n)) = { f(n) : for any constant c > 0, there exists n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.

Difference with Ω-notation: this inequality must hold for all c, instead of just for one.

Solving recurrences. The analysis of merge sort in Lecture 1 required us to solve a recurrence, and Lecture 3 will apply recurrences to divide-and-conquer algorithms. We often omit inessential details while solving recurrences: n is assumed to be an integer, because the input size typically is one, and we ignore boundary conditions for convenience.

Substitution method. The most general method. 1. Guess the form of the solution. 2. Verify by induction. 3. Solve for constants. The substitution method can be used to determine upper or lower bounds of a recurrence.

Example of Substitution. EXAMPLE: T(n) = 4T(n/2) + n (n ≥ 1). (Assume that T(1) = Θ(1).) Can you guess its time complexity? Guess O(n³). (Prove O and Ω separately.) We will prove T(n) ≤ cn³ by induction. First, we assume that T(k) ≤ ck³ for k < n; in particular T(k) ≤ ck³ holds for k = n/2 by assumption.

Substituting the hypothesis into the recurrence:

T(n) = 4T(n/2) + n ≤ 4c(n/2)³ + n = (c/2)n³ + n = cn³ − ((c/2)n³ − n) ≤ cn³.

All we need is (c/2)n³ − n ≥ 0, and this holds as long as c ≥ 2 for n ≥ 1.

Base Case in Induction. We must also handle the initial conditions, that is, ground the induction with base cases. Base: T(n) = Θ(1) for all n < n0, where n0 is a suitable constant. For 1 ≤ n < n0, we have Θ(1) ≤ cn³ if we pick c big enough.

Here we are!

BUT this bound is not tight !
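To see numerically why the n³ bound is loose, we can evaluate the recurrence exactly for powers of 2. This is my own check, assuming the base case T(1) = 1; unrolling the recurrence gives the closed form T(n) = 2n² − n, so the true growth is quadratic:

```python
from functools import lru_cache

# Evaluate T(n) = 4T(n/2) + n exactly for powers of 2,
# with the assumed base case T(1) = 1.
@lru_cache(maxsize=None)
def T(n):
    if n == 1:
        return 1
    return 4 * T(n // 2) + n

# For every power of 2, the closed form T(n) = 2n^2 - n holds,
# so T(n) = Theta(n^2): strictly better than the O(n^3) guess.
for k in range(20):
    n = 2 ** k
    assert T(n) == 2 * n * n - n
print(T(1024))  # 2096128
```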

A tighter upper bound? We shall prove that T(n) = O(n²). Assume that T(k) ≤ ck² for k < n.

Substituting gives T(n) = 4T(n/2) + n ≤ 4c(n/2)² + n = cn² + n. Anybody can tell me why this fails? For the last step we now need n ≤ 0, but that is impossible for n ≥ 1.

A tighter upper bound! IDEA: Strengthen the inductive hypothesis by subtracting a low-order term. Inductive hypothesis: T(k) ≤ c1·k² − c2·k for k < n.

Substituting: T(n) = 4T(n/2) + n ≤ 4(c1(n/2)² − c2(n/2)) + n = c1·n² − 2c2·n + n = c1·n² − c2·n − (c2 − 1)n. Now we need (c2 − 1)n ≥ 0, which holds if c2 ≥ 1.

For the Base Case. We have now proved that the inductive step works for any value of c1, provided c2 ≥ 1. Base: we need T(1) ≤ c1 − c2. T(1) is some constant, so we choose c1 sufficiently larger than c2, with c2 at least 1. To handle the initial conditions, just pick c1 big enough with respect to c2.

About the Substitution method. We have worked out upper bounds; the lower bounds are similar. Try it yourself. Shortcoming: we had to know the answer in order to verify it, which is a bit of a pain. It would be nicer to just figure out the answer by some procedure, and that is what the next two techniques do.

Recursion-tree method. A recursion tree models the costs (time) of a recursive execution of an algorithm. The recursion-tree method can be unreliable, just like any method that uses ellipses (…). It promotes intuition, however, and is good for generating guesses for the substitution method.

Example of recursion tree. Solve T(n) = T(n/4) + T(n/2) + n²:


How many leaves? We just need an upper bound.

Summing level by level, the costs form a geometric series n² + (5/16)n² + (5/16)²n² + ⋯, since each node's children contribute (n/4)² + (n/2)² = (5/16)n². Recall 1 + 1/2 + 1/4 + 1/8 + ⋯ = 2; because 5/16 < 1/2, the total is less than 2n². So T(n) = Θ(n²).
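The 2n² estimate from the tree can be cross-checked numerically. In this sketch of mine, floor division stands in for n/4 and n/2, and a base case T(n) = 1 for n < 2 is assumed:

```python
from functools import lru_cache

# T(n) = T(n/4) + T(n/2) + n^2, with an assumed base case T(n) = 1 for n < 2
# and floor division standing in for n/4 and n/2.
@lru_cache(maxsize=None)
def T(n):
    if n < 2:
        return 1
    return T(n // 4) + T(n // 2) + n * n

# The recursion-tree analysis predicts n^2 <= T(n) <= 2n^2.
assert all(n * n <= T(n) <= 2 * n * n for n in range(2, 5000))
print(T(4096) / 4096 ** 2)  # ratio stays below 2
```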

The Master Method. It looks like an application of the recursion-tree method, but more precise. The sad part about the master method is that it is pretty restrictive: it only applies to recurrences of the form T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is asymptotically positive. a·T(n/b) means every subproblem you recurse on should be of the same size.

Three Common Cases. Compare f(n) with n^(log_b a):

Case 1: f(n) = O(n^(log_b a − ε)) for some constant ε > 0. Then T(n) = Θ(n^(log_b a)).

Case 2: f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0. Then T(n) = Θ(n^(log_b a) lg^(k+1) n).

Case 3: f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1. Then T(n) = Θ(f(n)).

Examples.

Ex: T(n) = 4T(n/2) + n. Here a = 4, b = 2, so n^(log_b a) = n², and f(n) = n = O(n^(2−ε)) for ε = 1. Case 1 gives T(n) = Θ(n²).

Ex: T(n) = 4T(n/2) + n². Here a = 4, b = 2, and f(n) = n² = Θ(n² lg^0 n). Case 2 (k = 0) gives T(n) = Θ(n² lg n).

Ex: T(n) = 4T(n/2) + n³. Here a = 4, b = 2, and f(n) = n³ = Ω(n^(2+ε)) for ε = 1; the regularity condition holds, since 4(n/2)³ = n³/2 ≤ (1/2)n³. Case 3 gives T(n) = Θ(n³).
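For driving functions of the form f(n) = n^k, the three cases reduce to comparing k with log_b a, which is easy to mechanize. The helper below is hypothetical (not from the lecture) and handles only this polynomial special case:

```python
import math

def master_poly(a, b, k):
    """Classify T(n) = a*T(n/b) + n^k by the master theorem.

    Polynomial f(n) = n^k only; assumes a >= 1 and b > 1. For such f the
    regularity condition of Case 3 holds automatically.
    """
    e = math.log(a, b)  # the critical exponent log_b(a)
    if math.isclose(k, e):
        return f"Theta(n^{e:g} lg n)"  # Case 2 (k = 0 variant): levels equal
    if k < e:
        return f"Theta(n^{e:g})"       # Case 1: leaves dominate
    return f"Theta(n^{k:g})"           # Case 3: root dominates

# The lecture's three examples (a = 4, b = 2, so log_b a = 2):
print(master_poly(4, 2, 1))  # Theta(n^2)
print(master_poly(4, 2, 2))  # Theta(n^2 lg n)
print(master_poly(4, 2, 3))  # Theta(n^3)
```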

Homework: read Chapters 3 and 4 to be prepared for applications of recurrences.

Proof of Master Method. Height = log_b n. #(leaves) = a^(log_b n) = n^(log_b a).

CASE 1: The weight increases geometrically from the root to the leaves. The leaves hold a constant fraction of the total weight.

CASE 2 (k = 0): The weight is approximately the same on each of the log_b n levels.

CASE 3: The weight decreases geometrically from the root to the leaves. The root holds a constant fraction of the total weight.
