Analysis of Algorithms & Orders of Growth
Rosen 6th ed., §3.1-3.3



Analysis of Algorithms

• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
  – To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.).
• What do we mean by running time analysis?
  – Determine how the running time increases as the size of the problem increases.

Example: Searching

• Problem of searching an ordered list.
  – Given a list L of n elements that are sorted into a definite order (e.g., numeric, alphabetical),
  – and given a particular element x,
  – determine whether x appears in the list, and if so, return its index (position) in the list.

Search alg. #1: Linear Search

procedure linear search(x: integer, a1, a2, …, an: distinct integers)
i := 1
while (i ≤ n and x ≠ ai)
    i := i + 1
if i ≤ n then location := i
else location := 0
return location {index or 0 if not found}
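For concreteness, here is a minimal C sketch of the same algorithm (an illustration, not code from Rosen; it uses zero-based indexing and returns -1 rather than 0 when x is not found):

#include <stdio.h>

/* Linear search: return the 0-based index of x in a[0..n-1], or -1 if absent. */
int linear_search(const int a[], int n, int x) {
    int i = 0;
    while (i < n && a[i] != x)     /* same loop condition as the pseudocode */
        i++;
    return (i < n) ? i : -1;       /* found at i, or the list was exhausted */
}

int main(void) {
    int a[] = {2, 5, 7, 11, 13, 17};
    printf("%d\n", linear_search(a, 6, 11));   /* prints 3 */
    printf("%d\n", linear_search(a, 6, 4));    /* prints -1 */
    return 0;
}

In the worst case (x absent or in the last position) the loop body runs n times, which is the "up to n iterations" bound discussed below.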

Search alg. #2: Binary Search

• Basic idea: On each step, look at the middle element of the remaining list to eliminate half of it, and quickly zero in on the desired element.

[Diagram: the middle element splits the list into a left part (elements < x) and a right part (elements > x)]

Search alg. #2: Binary Search

procedure binary search(x: integer, a1, a2, …, an: distinct integers)
i := 1 {left endpoint of search interval}
j := n {right endpoint of search interval}
while i < j begin {while interval has > 1 item}
    m := ⌊(i + j)/2⌋ {midpoint}
    if x > am then i := m + 1 else j := m
end
if x = ai then location := i else location := 0
return location
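A corresponding C sketch (again an illustration, not from the text; zero-based indices, -1 for "not found") is shown below. Each pass of the loop halves the search interval, which is where the log₂ n bound on the next slide comes from:

#include <stdio.h>

/* Binary search on a sorted array a[0..n-1], mirroring the pseudocode above:
   shrink the interval [i, j] to a single index, then test that one element. */
int binary_search(const int a[], int n, int x) {
    int i = 0, j = n - 1;            /* left and right endpoints */
    while (i < j) {                  /* while the interval has more than one item */
        int m = (i + j) / 2;         /* midpoint, rounded down */
        if (x > a[m])
            i = m + 1;               /* x can only lie to the right of m */
        else
            j = m;                   /* x, if present, lies at or left of m */
    }
    return (n > 0 && a[i] == x) ? i : -1;
}

int main(void) {
    int a[] = {2, 5, 7, 11, 13, 17};
    printf("%d\n", binary_search(a, 6, 13));   /* prints 4 */
    printf("%d\n", binary_search(a, 6, 4));    /* prints -1 */
    return 0;
}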

Is Binary Search more efficient?

• Number of iterations:
  – For a list of n elements, Binary Search can execute at most log₂ n times!
  – Linear Search, on the other hand, can execute up to n times!

Average Number of Iterations

Length      Linear Search      Binary Search
10          5.5                2.9
100         50.5               5.8
1,000       500.5              9.0
10,000      5,000.5            12.0
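As a rough check of these numbers (my arithmetic, not part of the original table): the linear-search column is the average position of x, namely (n + 1)/2, which gives exactly 5.5, 50.5, 500.5, and 5,000.5; the binary-search column grows roughly like log₂ n, and the worst case is about ⌈log₂ n⌉ probes:

⌈log₂ 10⌉ = 4,   ⌈log₂ 100⌉ = 7,   ⌈log₂ 1,000⌉ = 10,   ⌈log₂ 10,000⌉ = 14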

Is Binary Search more efficient?

• Number of computations per iteration:
  – Binary Search does more computations than Linear Search per iteration.
• Overall:
  – If the number of components is small (say, less than 20), then Linear Search is faster.
  – If the number of components is large, then Binary Search is faster.

How do we analyze algorithms?

• We need to define a number of objective measures.

(1) Compare execution times?
    Not good: times are specific to a particular computer!

(2) Count the number of statements executed?
    Not good: the number of statements varies with the programming language as well as with the style of the individual programmer.

Example (# of statements)

Algorithm 1                  Algorithm 2

arr[0] = 0;                  for (i = 0; i < N; i++)
arr[1] = 0;                      arr[i] = 0;
arr[2] = 0;
...
arr[N-1] = 0;

How do we analyze algorithms?

(3) Express running time as a function of the input size n (i.e., f(n)).
  – To compare two algorithms with running times f(n) and g(n), we need a rough measure of how fast a function grows.
  – Such an analysis is independent of machine time, programming style, etc.

Computing running time

• Associate a "cost" with each statement and find the "total cost" by finding the total number of times each statement is executed.
• Express running time in terms of the size of the problem.

Algorithm 1        Cost      Algorithm 2                  Cost
arr[0] = 0;        c1        for (i = 0; i < N; i++)      c2
arr[1] = 0;        c1            arr[i] = 0;              c1
arr[2] = 0;        c1
...
arr[N-1] = 0;      c1
-----------                  -------------
c1 + c1 + ... + c1 = c1 x N  (N+1) x c2 + N x c1 = (c2 + c1) x N + c2

Computing running time (cont.)

                                 Cost
sum = 0;                         c1
for (i = 0; i < N; i++)          c2
    for (j = 0; j < N; j++)      c2
        sum += arr[i][j];        c3
------------
c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N x N
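Expanding and collecting terms (a small worked step, not on the original slide), the N² terms dominate:

c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N x N = (c2 + c3) x N² + 2 x c2 x N + (c1 + c2)

so the running time grows quadratically in N, which is the O(n²) conclusion drawn on a later slide.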

Comparing Functions Using Rate of Growth

• Consider the example of buying elephants and goldfish:
  Cost: cost_of_elephants + cost_of_goldfish
  Cost ~ cost_of_elephants (approximation)
• The low-order terms in a function are relatively insignificant for large n:
  n⁴ + 100n² + 10n + 50 ~ n⁴
  i.e., n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.

Rate of Growth ≡ Asymptotic Analysis

• Using rate of growth as a measure to compare different functions implies comparing them asymptotically.
• If f(x) is faster growing than g(x), then f(x) always eventually becomes larger than g(x) in the limit (for large enough values of x).

Example

• Suppose you are designing a web site to process user data (e.g., financial records).
• Suppose program A takes fA(n) = 30n + 8 microseconds to process any n records, while program B takes fB(n) = n² + 1 microseconds to process the n records.
• Which program would you choose, knowing you'll want to support millions of users? (See the worked comparison below.)
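One quick way to decide (my arithmetic, not on the original slide) is to find where fB overtakes fA:

fB(n) > fA(n)  ⟺  n² + 1 > 30n + 8  ⟺  n² - 30n - 7 > 0,

which holds for n ≥ 31 (the positive root is about 30.2). So program B is faster only for n ≤ 30; for millions of records program A's linear growth wins decisively: at n = 10⁶, fA(n) ≈ 3 x 10⁷ µs ≈ 30 seconds, while fB(n) ≈ 10¹² µs, roughly 11.6 days.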

Visualizing Orders of Growth

• On a graph, as you go to the right, a faster-growing function eventually becomes larger...

[Graph: value of function vs. increasing n, with fB(n) = n² + 1 eventually rising above fA(n) = 30n + 8]

Big-O Notation

• We say fA(n) = 30n + 8 is order n, or O(n). It is, at most, roughly proportional to n.
• fB(n) = n² + 1 is order n², or O(n²). It is, at most, roughly proportional to n².
• In general, an O(n²) algorithm will be slower than an O(n) algorithm.
• Warning: an O(n²) function will grow faster than an O(n) function.

More Examples …

• We say that n⁴ + 100n² + 10n + 50 is of the order of n⁴, or O(n⁴).
• We say that 10n³ + 2n² is O(n³).
• We say that n³ - n² is O(n³).
• We say that 10 is O(1).
• We say that 1273 is O(1).


Big-O Visualization

Computing running time

Algorithm 1        Cost      Algorithm 2                  Cost
arr[0] = 0;        c1        for (i = 0; i < N; i++)      c2
arr[1] = 0;        c1            arr[i] = 0;              c1
arr[2] = 0;        c1
...
arr[N-1] = 0;      c1
-----------                  -------------
c1 + c1 + ... + c1 = c1 x N  (N+1) x c2 + N x c1 = (c2 + c1) x N + c2

Both totals grow linearly in N: O(n)

Computing running time (cont.)

                                 Cost
sum = 0;                         c1
for (i = 0; i < N; i++)          c2
    for (j = 0; j < N; j++)      c2
        sum += arr[i][j];        c3
------------
c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N x N  →  O(n²)

Running time of various statements

[Table (not recoverable from the transcript): running times of simple statements, while-loops, and for-loops]

Examples

i = 0;
while (i < N) {
    X = X + Y;              // O(1)
    result = mystery(X);    // O(N), just an example...
    i++;                    // O(1)
}

• The body of the while loop: O(N)
• The loop is executed: N times

N x O(N) = O(N²)

Examples (cont'd)

if (i < j)
    for (i = 0; i < N; i++)    // O(N)
        X = X + i;
else
    X = 0;                     // O(1)

Max( O(N), O(1) ) = O(N)

Asymptotic Notation

• O notation: asymptotic "less than":
  – f(n) = O(g(n)) implies: f(n) "≤" g(n)
• Ω notation: asymptotic "greater than":
  – f(n) = Ω(g(n)) implies: f(n) "≥" g(n)
• Θ notation: asymptotic "equality":
  – f(n) = Θ(g(n)) implies: f(n) "=" g(n)

Definition: O(g), at most order g

Let f, g be functions R → R.

• We say that "f is at most order g" if:
  ∃c,k: f(x) ≤ cg(x), ∀x > k
  – "Beyond some point k, function f is at most a constant c times g (i.e., proportional to g)."
• "f is at most order g", or "f is O(g)", or "f = O(g)" all just mean that f ∈ O(g).
• Sometimes the phrase "at most" is omitted.

Big-O Visualization

[Graph: f(x) staying below cg(x) for all x to the right of k]

Points about the definition

• Note that f is O(g) as long as any values of c and k exist that satisfy the definition.
• But: the particular c, k values that make the statement true are not unique: any larger value of c and/or k will also work.
• You are not required to find the smallest c and k values that work. (Indeed, in some cases, there may be no smallest values!)

However, you should prove that the values you choose do work.

"Big-O" Proof Examples

• Show that 30n + 8 is O(n).
  – Show ∃c,k: 30n + 8 ≤ cn, ∀n > k.
  – Let c = 31, k = 8. Assume n > k = 8. Then cn = 31n = 30n + n > 30n + 8, so 30n + 8 < cn.
• Show that n² + 1 is O(n²).
  – Show ∃c,k: n² + 1 ≤ cn², ∀n > k.
  – Let c = 2, k = 1. Assume n > 1. Then cn² = 2n² = n² + n² > n² + 1, so n² + 1 < cn².

Big-O example, graphically

• Note 30n + 8 isn't less than n anywhere (n > 0).
• It isn't even less than 31n everywhere.
• But it is less than 31n everywhere to the right of n = 8.

[Graph: value of function vs. increasing n, showing cn = 31n above 30n + 8 ∈ O(n) for all n > k = 8]


Common orders of magnitude


Order-of-Growth in Expressions

• "O(f)" can be used as a term in an arithmetic expression.
  E.g.: we can write "x² + x + 1" as "x² + O(x)", meaning "x² plus some function that is O(x)".
• Formally, you can think of any such expression as denoting a set of functions:
  "x² + O(x)" ≡ {g | ∃f ∈ O(x): g(x) = x² + f(x)}

Useful Facts about Big O

• Constants: ∀c > 0, O(cf) = O(f + c) = O(fc) = O(f)
• Sums:
  – If g ∈ O(f) and h ∈ O(f), then g + h ∈ O(f).
  – If g ∈ O(f1) and h ∈ O(f2), then g + h ∈ O(f1 + f2) = O(max(f1, f2)).   (Very useful!)
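For instance (an illustration of the sum rule, not from the slide): if g(n) = 5n², so g ∈ O(n²), and h(n) = 3n log n, so h ∈ O(n log n), then

g + h ∈ O(n² + n log n) = O(max(n², n log n)) = O(n²).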

More Big-O facts

• Products: If g ∈ O(f1) and h ∈ O(f2), then gh ∈ O(f1 f2).
• Big O, as a relation, is transitive: f ∈ O(g) ∧ g ∈ O(h) → f ∈ O(h).

More Big O facts

• ∀ f, g & constants a, b ∈ R, with b ≥ 0:
  – af = O(f)                (e.g. 3x² = O(x²))
  – f + O(f) = O(f)          (e.g. x² + x = O(x²))
  – |f|^(1-b) = O(f)         (e.g. x⁻¹ = O(x))
  – (log_b |f|)^a = O(f)     (e.g. log x = O(x))
  – g = O(fg)                (e.g. x = O(x log x))
  – fg ∉ O(g)                (e.g. x log x ∉ O(x))
  – a = O(f)                 (e.g. 3 = O(x))

Definition: Ω(g), at least order g

Let f, g be any functions R → R.

• We say that "f is at least order g", written Ω(g), if:
  ∃c,k: f(x) ≥ cg(x), ∀x > k
  – "Beyond some point k, function f is at least a constant c times g (i.e., proportional to g)."
  – Often, one deals only with positive functions and can ignore absolute value symbols.
• "f is at least order g", or "f is Ω(g)", or "f = Ω(g)" all just mean that f ∈ Ω(g).

Big-Ω Visualization

Definition: Θ(g), exactly order g

• If f ∈ O(g) and g ∈ O(f), then we say "g and f are of the same order" or "f is (exactly) order g", and write f ∈ Θ(g).
• Another equivalent definition:
  ∃c1,c2,k: c1·g(x) ≤ f(x) ≤ c2·g(x), ∀x > k
• "Everywhere beyond some point k, f(x) lies in between two multiples of g(x)."
• Θ(g) = O(g) ∩ Ω(g)   (i.e., f ∈ Θ(g) iff f ∈ O(g) and f ∈ Ω(g))

Big-Θ Visualization

Rules for Ω

• Mostly like rules for O(), except:
  ∀ f, g > 0 & constants a, b ∈ R, with b > 0:
  – af ∈ Ω(f)                      Same as with O.
  – f ∉ Ω(fg) unless g = Ω(1)      Unlike O.
  – |f|^(1-b) ∉ Ω(f), and          Unlike with O.
  – (log_b |f|)^c ∉ Ω(f).          Unlike with O.
• The functions in the latter two cases we say are strictly of lower order than Ω(f).

Example

• Determine whether the sum 1 + 2 + … + n is Θ(n²).
• Quick solution:
  1 + 2 + … + n = n(n+1)/2 = (n² + n)/2 = Θ(n²)

Other Order-of-Growth Relations

• o(g) = {f | ∀c ∃k: f(x) < cg(x), ∀x > k}
  "The functions that are strictly lower order than g."   o(g) ⊂ O(g) - Θ(g).
• ω(g) = {f | ∀c ∃k: cg(x) < f(x), ∀x > k}
  "The functions that are strictly higher order than g."  ω(g) ⊂ Ω(g) - Θ(g).

Relations Between the Relations

• Subset relations between order-of-growth sets.

[Diagram: within the set of all functions R → R, o(f) ⊂ O(f), ω(f) ⊂ Ω(f), and Θ(f) = O(f) ∩ Ω(f), with f itself lying in Θ(f)]

Strict Ordering of Functions

• Temporarily let's write f ≺ g to mean f ∈ o(g), and f ~ g to mean f ∈ Θ(g).
• Note that f ≺ g ⟺ lim (x→∞) f(x)/g(x) = 0.
• Let k > 1. Then the following are true:
  1 ≺ log log n ≺ log n ~ log_k n ≺ log^k n ≺ n^(1/k) ≺ n ≺ n log n ≺ n^k ≺ k^n ≺ n! ≺ n^n ≺ …


Common orders of magnitude

Review: Orders of Growth

Definitions of order-of-growth sets, g: R → R:
• O(g) ≡ {f | ∃c,k: f(x) ≤ cg(x), ∀x > k}
• o(g) ≡ {f | ∀c ∃k: f(x) < cg(x), ∀x > k}
• Ω(g) ≡ {f | ∃c,k: f(x) ≥ cg(x), ∀x > k}
• ω(g) ≡ {f | ∀c ∃k: f(x) > cg(x), ∀x > k}
• Θ(g) ≡ {f | ∃c1,c2,k: c1·g(x) ≤ f(x) ≤ c2·g(x), ∀x > k}

Algorithmic and Problem Complexity

Rosen 6th ed., §3.3

Algorithmic Complexity

• The algorithmic complexity of a computation is some measure of how difficult it is to perform the computation.
• It measures some aspect of the cost of computation (in a general sense of cost).

Problem Complexity

• The complexity of a computational problem or task is the complexity of the algorithm with the lowest order of growth of complexity for solving that problem or performing that task.
• E.g., the problem of searching an ordered list has at most logarithmic time complexity. (Complexity is O(log n).)

Tractable vs. Intractable Problems

• A problem or algorithm with at most polynomial time complexity is considered tractable (or feasible). P is the set of all tractable problems.
• A problem or algorithm that has more than polynomial complexity is considered intractable (or infeasible).
• Note:
  – n^1,000,000 is technically tractable, but really impossible.
  – n^(log log log n) is technically intractable, but easy.
  – Such cases are rare, though.

Dealing with Intractable Problems

• Many times, a problem is intractable only for a small number of input cases that do not arise in practice very often.
  – Average running time is a better measure of problem complexity in this case.
  – Find approximate solutions instead of exact solutions.

Unsolvable problems

• It can be shown that there exist problems that no algorithm can solve.
• Turing discovered in the 1930s that there are problems unsolvable by any algorithm.
• Example: the halting problem (see page 176)
  – Given an arbitrary algorithm and its input, will that algorithm eventually halt, or will it continue forever in an "infinite loop"?

NP and NP-complete

• NP is the set of problems for which there exists a tractable algorithm for checking solutions to see if they are correct.
• NP-complete is a class of problems with the property that if any one of them can be solved by a polynomial worst-case algorithm, then all of them can be solved by polynomial worst-case algorithms.
  – Satisfiability problem: find an assignment of truth values that makes a compound proposition true.

P vs. NP

• We know P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper (i.e., that P ≠ NP rather than P = NP).
• It is generally accepted that no NP-complete problem can be solved in polynomial time.
• Whoever first proves it will be famous!

Questions

• Find the best big-O notation to describe the complexity of the following algorithms:
  – A binary search of n elements
  – A linear search to find the smallest number in a list of n numbers
  – An algorithm that lists all ways to put the numbers 1, 2, 3, …, n in a row

Questions (cont'd)

  – An algorithm that prints all bit strings of length n
  – An iterative algorithm to compute n!
  – An algorithm that finds the average of n numbers by adding them and dividing by n
  – An algorithm that prints all subsets of size three of the set {1, 2, 3, …, n}
  – The best-case analysis of a linear search of a list of size n

Questions (cont'd)

  – The worst-case analysis of a linear search of a list of size n
  – The number of print statements in the following:
      while n > 1 {
          print "hello"
          n = n / 2
      }

Questions (cont'd)

  – The number of print statements in the following:
      for (i = 1; i ≤ n; i++)
          for (j = 1; j ≤ n; j++)
              print "hello"
  – The number of print statements in the following:
      for (i = 1; i ≤ n; i++)
          for (j = 1; j ≤ i; j++)
              print "hello"