
1

Computer Algorithms, Lecture 3

Asymptotic Notation

Some of these slides are courtesy of D. Plaisted, UNC and M. Nicolescu, UNR

2

Mathematical Model of a Computer

• Our algorithms
  – Mentally execute them on the computer model
  – Evaluate their running time

• Algorithms consist of operations
  – What time should we assign to each of the operations used in our model of a computer?

• What should the input data be?
  – Execution time depends on the input

• How does our model relate to real computers?
  – If the model is very different from real computers, then we will not be able to generalize our approach to real-life problems

3

Review

• Define basic terms
• Present a mathematical model
• Analysis strategy
• Examples of algorithms and their analysis
• Model limitations
  – How well will our conclusions relate to real problems?

4

What is a Problem

• Finding the shortest path on a map
• Sorting an input array of real numbers
• Finding a word's definition in a dictionary

• An input is supplied and an output is generated

• Def: a problem is a specification of what the valid inputs are and what constitutes an acceptable output for each valid input

• Input instance: a valid input
• Instance size
  – Formal: the number of bits needed to represent an input instance
  – Informal (but more useful for our course): any parameter that roughly grows with the formal notion of size
• Instance size ~ time
• Algorithm evaluation/comparison ~ time for the same instance size

5

What is an Algorithm

• Def: an algorithm is an abstract computational procedure which
  – Takes value(s) as input
  – Produces value(s) as output

• A program ~ an expression of an algorithm
  – Program: concrete
  – Algorithm: abstract

• In this class we are not interested in the detailed syntax of a particular programming language

6

Computer Model

• RAM: random access machine
  – Memory
  – Processor

• Memory: think of an array where each location is assigned an address

• Processor: all instructions execute in one step
  – Arithmetic and logical operations (A = B + C: one instruction)
  – Jumps and conditional jumps (if A > B then goto: one instruction)
  – Array operations (A = B[i]: one instruction)
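To make the unit-cost assumption concrete, here is a minimal Python sketch (not part of the original slides) that charges one step for each arithmetic operation, comparison, and array access while summing an array. The charging scheme is an illustrative assumption, not the course's official cost model.

```python
# Hypothetical sketch: counting unit-cost RAM operations for a simple loop.
# Each arithmetic op, comparison, and array access is charged one step.

def sum_array(A):
    steps = 0
    total = 0
    steps += 1            # assignment: total = 0
    i = 0
    steps += 1            # assignment: i = 0
    while i < len(A):
        steps += 1        # loop test: i < len(A)
        total = total + A[i]
        steps += 2        # array access A[i] plus one addition
        i = i + 1
        steps += 1        # increment of i
    steps += 1            # final failed loop test
    return total, steps

if __name__ == "__main__":
    for n in (10, 100, 1000):
        _, steps = sum_array(list(range(n)))
        print(n, steps)   # step count grows linearly in n under this cost model
```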

7

General Analysis Strategy

• Def: T_A(n) – the maximum time taken by algorithm A to solve any instance of size n

  This is not a number, this is a function

• Conservative definition (worst-case scenario)
• Functional form of T(n)
  – Linear
  – Polynomial (quadratic, cubic)
  – …

  We are interested in the shape of the function

• Often we are interested in estimating T(n) (lower bound/upper bound)

• We are interested in large n (n → ∞)

8

Running Time

Insertion sort running time (a counting sketch follows below):

• Best case: the input array is already in the correct order
• Worst case: the input array is in reverse order
• Average case

• Best case: a linear function (an + b)
• Worst case: a quadratic function (an² + bn + c)
• a, b, c – constants
• For large n, the constants a, b, c can be disregarded
• Best case: Θ(n)
• Worst case: Θ(n²)
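The following Python sketch (not from the original slides) counts the key comparisons insertion sort makes on an already-sorted input versus a reverse-sorted input; the counts grow roughly linearly and quadratically, matching the Θ(n) and Θ(n²) claims above.

```python
# A minimal sketch: count key comparisons of insertion sort on a sorted
# input (best case) and a reversed input (worst case).

def insertion_sort_count(A):
    A = list(A)
    comparisons = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0:
            comparisons += 1          # compare key with A[i]
            if A[i] <= key:
                break
            A[i + 1] = A[i]           # shift element one position right
            i -= 1
        A[i + 1] = key
    return A, comparisons

if __name__ == "__main__":
    for n in (100, 200, 400):
        _, best = insertion_sort_count(range(n))            # already sorted
        _, worst = insertion_sort_count(range(n, 0, -1))    # reverse order
        print(f"n={n:4d}  best={best}  worst={worst}")
```

Doubling n roughly doubles the best-case count and roughly quadruples the worst-case count, which is the linear-versus-quadratic behavior described above.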

9

Growth of Functions

• The running time of an algorithm grows with the size of the input

• The time complexity of an algorithm is a function of the input size

10

Algorithm Analysis

• The amount of resources used by the algorithm
  – Space
  – Computational time

• Running time:
  – The number of primitive operations (steps) executed before termination

• Order of growth
  – The leading term of the formula
  – Expresses the behavior of the function toward infinity

11

Asymptotic Notations

• A way to describe the behavior of functions in the limit

  – How we indicate the running times of algorithms

  – Describes the running time of an algorithm as n grows to ∞

• O-notation: asymptotic "less than": f(n) "≤" g(n)

• Ω-notation: asymptotic "greater than": f(n) "≥" g(n)

• Θ-notation: asymptotic "equality": f(n) "=" g(n)

12

Asymptotic Complexity

• Running time of an algorithm as a function of input size n, for large n

• Expressed using only the highest-order term in the expression for the exact running time
  – Instead of the exact running time, we say Θ(n²)

• Describes the behavior of the function in the limit
• Written using asymptotic notation (a small numeric illustration follows below)
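As a small illustration (the coefficients below are made up, not from the slides), the ratio of an exact running time to its highest-order term approaches a constant, which is why the lower-order terms can be dropped.

```python
# Illustrative sketch with made-up coefficients: the exact running time
# 3n^2 + 10n + 50 behaves like its leading term n^2 for large n.

def exact_time(n):
    return 3 * n**2 + 10 * n + 50

for n in (10, 100, 1000, 10_000):
    # The ratio settles toward the leading coefficient 3,
    # so the function is Theta(n^2).
    print(n, exact_time(n) / n**2)
```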

13

Asymptotic Notation

• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers
  – Ex: f(n) = Θ(n²)
  – Describes how f(n) grows in comparison to n²

• Each notation defines a set of functions
• The notations describe different rate-of-growth relations between the defining function and the defined set of functions

14

Little-o Notation (optional)

For a given function g(n), the set little-o:

o(g(n)) = {f(n): for any constant c > 0, there exists a constant n0 > 0 such that for all n ≥ n0, we have 0 ≤ f(n) < c·g(n)}

f(n) becomes insignificant relative to g(n) as n approaches infinity:

lim [f(n) / g(n)] = 0 as n → ∞

g(n) is an upper bound for f(n) that is not asymptotically tight.

We will compare this with big-O later.

15

Little-ω Notation (optional)

For a given function g(n), the set little-omega:

ω(g(n)) = {f(n): for any constant c > 0, there exists a constant n0 > 0 such that for all n ≥ n0, we have 0 ≤ c·g(n) < f(n)}

f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:

lim [f(n) / g(n)] = ∞ as n → ∞

g(n) is a lower bound for f(n) that is not asymptotically tight.

16

Big O-notation

For a function g(n), we define O(g(n)), big-O of g(n), as the set:

O(g(n)) = {f(n): there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n)}

g(n) is an asymptotic upper bound for f(n).

Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).


17

Example

• f(n) = 100n², g(n) = n⁴: the following table shows that g(n) grows faster than f(n) when n > 10. We say f is big-O of g.

  n      f(n)          g(n)
  5      2,500         625
  10     10,000        10,000
  50     250,000       6,250,000
  100    1,000,000     100,000,000
  150    2,250,000     506,250,000

100n² ≤ c·n⁴ holds with c = 1 for all n ≥ 10 (i.e., n0 = 10)

18

Examples

– 2n² = O(n³): 2n² ≤ c·n³ ⇒ 2 ≤ c·n ⇒ holds for c = 1 and n0 = 2

– n² = O(n²): n² ≤ c·n² ⇒ c ≥ 1 ⇒ holds for c = 1 and n0 = 1

– 1000n² + 1000n = O(n²): 1000n² + 1000n ≤ 1000n² + 1000n² = 2000n² ⇒ holds for c = 2000 and n0 = 1

– n = O(n²): n ≤ c·n² ⇒ c·n ≥ 1 ⇒ holds for c = 1 and n0 = 1

Recall: O(g(n)) = {f(n): there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n)}
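A quick Python sanity check (not a proof, and not from the original slides) of the (c, n0) witnesses listed above; it simply tests the inequality f(n) ≤ c·g(n) over a finite range of n.

```python
# Numeric sanity check (not a proof) of the (c, n0) witnesses above:
# test whether f(n) <= c * g(n) holds for every n from n0 up to a cutoff.

def witness_holds(f, g, c, n0, n_max=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

print(witness_holds(lambda n: 2 * n**2, lambda n: n**3, c=1, n0=2))                  # 2n^2 = O(n^3)
print(witness_holds(lambda n: n**2, lambda n: n**2, c=1, n0=1))                      # n^2 = O(n^2)
print(witness_holds(lambda n: 1000 * n**2 + 1000 * n, lambda n: n**2, c=2000, n0=1)) # 1000n^2 + 1000n = O(n^2)
print(witness_holds(lambda n: n, lambda n: n**2, c=1, n0=1))                         # n = O(n^2)
```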

19

Ω-notation

For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:

Ω(g(n)) = {f(n): there exist positive constants c and n0 such that for all n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}

g(n) is an asymptotic lower bound for f(n).

Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).


20

Examples

– 5n² = Ω(n): we need c, n0 such that 0 ≤ c·n ≤ 5n² for all n ≥ n0; c·n ≤ 5n² holds for c = 1 and n0 = 1

– n = Ω(2n), n³ = Ω(n²), n = Ω(log n)

– 100n + 5 ≠ Ω(n²): there are no c, n0 such that 0 ≤ c·n² ≤ 100n + 5 for all n ≥ n0:

  100n + 5 ≤ 100n + 5n = 105n (for n ≥ 1)

  c·n² ≤ 105n ⇒ n·(c·n − 105) ≤ 0

  Since n is positive, c·n − 105 ≤ 0 ⇒ n ≤ 105/c

  Contradiction: n cannot be bounded above by a constant
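A numeric illustration (not a proof, not from the slides) of the contradiction above: for any fixed c > 0, c·n² eventually overtakes 100n + 5, so the Ω inequality fails for all sufficiently large n.

```python
# Illustration (not a proof) of why 100n + 5 != Omega(n^2): for any fixed
# c > 0, c*n^2 eventually exceeds 100n + 5, so 0 <= c*n^2 <= 100n + 5
# cannot hold for all n beyond some n0.

for c in (1.0, 0.1, 0.01):
    n = 1
    while c * n**2 <= 100 * n + 5:
        n += 1
    print(f"c = {c}: c*n^2 exceeds 100n + 5 from n = {n} onward")
```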

21

Θ-notation

For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = {f(n): there exist positive constants c1, c2, and n0 such that for all n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

g(n) is an asymptotically tight bound for f(n).

Intuitively: the set of all functions that have the same rate of growth as g(n).

f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)); Θ(g(n)) ⊆ O(g(n))
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)); Θ(g(n)) ⊆ Ω(g(n))

22

Θ-notation

For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:

Θ(g(n)) = {f(n): there exist positive constants c1, c2, and n0 such that for all n ≥ n0, we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

Technically, f(n) ∈ Θ(g(n)). Older usage: f(n) = Θ(g(n)). I'll accept either…

f(n) and g(n) are assumed nonnegative for large n.

23

Examples

– 6n³ ≠ Θ(n²): we would need c1·n² ≤ 6n³ ≤ c2·n²,

  but the upper bound only holds for n ≤ c2/6

– n ≠ Θ(log n): we would need c1·log n ≤ n ≤ c2·log n,

  which requires c2 ≥ n/log n for all n ≥ n0 – impossible, since n/log n is unbounded
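A quick numeric illustration (not from the slides) of the second example: the ratio n / lg n keeps growing, so no constant c2 can satisfy n ≤ c2·lg n for all large n.

```python
# Numeric illustration (not a proof): n / lg(n) is unbounded, so no
# constant c2 satisfies n <= c2 * lg(n) for all large n; hence n != Theta(lg n).
import math

for n in (10, 10**3, 10**6, 10**9):
    print(n, n / math.log2(n))   # the ratio keeps growing without bound
```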

24

More on Asymptotic Notations

• There is no unique pair of values c and n0 for proving an asymptotic bound

• Prove that 100n + 5 = O(n²)

  – 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5

    ⇒ n0 = 5 and c = 101 is a solution

  – 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1

    ⇒ n0 = 1 and c = 105 is also a solution

• We must find SOME constants c and n0 that satisfy the asymptotic relation (a quick check of both witness pairs follows below)
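Here is a small Python check (not from the slides, and only over a finite range, so not a proof) that both witness pairs above satisfy 100n + 5 ≤ c·n².

```python
# Sanity check (finite range only, not a proof) of the two witness pairs
# above for 100n + 5 = O(n^2).

def witness_ok(c, n0, n_max=10_000):
    return all(100 * n + 5 <= c * n**2 for n in range(n0, n_max))

print(witness_ok(c=101, n0=5))   # True: first witness pair
print(witness_ok(c=105, n0=1))   # True: second witness pair
```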

25

Relations Between Θ, O, Ω

26

Comparisons of Functions

• Theorem: f(n) = Θ(g(n)) ⇔ f(n) = O(g(n)) and f(n) = Ω(g(n))

• Transitivity:
  – f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
  – Same for O and Ω

• Reflexivity:
  – f(n) = Θ(f(n))
  – Same for O and Ω

• Symmetry:
  – f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))

• Transpose symmetry:
  – f(n) = O(g(n)) if and only if g(n) = Ω(f(n))

27

Properties Summary

• Transitivity
  f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
  f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
  f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
  f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
  f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))

• Reflexivity
  f(n) = Θ(f(n))
  f(n) = O(f(n))
  f(n) = Ω(f(n))

28

Properties Summary

• Symmetry

  f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

• Complementarity (transpose symmetry)

  f(n) = O(g(n)) iff g(n) = Ω(f(n))

  f(n) = o(g(n)) iff g(n) = ω(f(n))

29

Asymptotic Notations in Equations

• On the right-hand side
  – Θ(n²) stands for some anonymous function in the set Θ(n²)
  – 2n² + 3n + 1 = 2n² + Θ(n) means:
    there exists a function f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n)

• On the left-hand side
  – 2n² + Θ(n) = Θ(n²)
  – No matter how the anonymous function is chosen on the left-hand side, there is a way to choose the anonymous function on the right-hand side to make the equation valid.

30

Comparison of Functions

  f(n) vs g(n)       a vs b

  f(n) = O(g(n))     a ≤ b
  f(n) = Ω(g(n))     a ≥ b
  f(n) = Θ(g(n))     a = b
  f(n) = o(g(n))     a < b
  f(n) = ω(g(n))     a > b

31

Limits

• lim [f(n) / g(n)] = 0 as n → ∞  ⇒  f(n) ∈ o(g(n))

• lim [f(n) / g(n)] < ∞ as n → ∞  ⇒  f(n) ∈ O(g(n))

• 0 < lim [f(n) / g(n)] < ∞ as n → ∞  ⇒  f(n) ∈ Θ(g(n))

• 0 < lim [f(n) / g(n)] as n → ∞  ⇒  f(n) ∈ Ω(g(n))

• lim [f(n) / g(n)] = ∞ as n → ∞  ⇒  f(n) ∈ ω(g(n))

• lim [f(n) / g(n)] undefined as n → ∞  ⇒  can't say
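The following Python sketch (not from the slides) applies these limit rules heuristically by evaluating f(n)/g(n) at a few large values of n; it only suggests the limiting behavior and is not a substitute for computing the limit.

```python
# Heuristic sketch of the limit rules above: print f(n)/g(n) at a few
# large n and eyeball the trend (toward 0, a positive constant, or infinity).
import math

def ratios(f, g, ns=(10**3, 10**6, 10**9)):
    return ["%.4g" % (f(n) / g(n)) for n in ns]

print(ratios(lambda n: n, lambda n: n**2))                 # -> 0        : n in o(n^2)
print(ratios(lambda n: 5 * n**2, lambda n: n**2))          # -> 5        : 5n^2 in Theta(n^2)
print(ratios(lambda n: n**2, lambda n: n * math.log2(n)))  # -> infinity : n^2 in omega(n lg n)
```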

32

Exercise

Express each function in column A in asymptotic notation using the corresponding function in column B.

• A = 5n² + 100n,  B = 3n² + 2
  A ∈ Θ(n²) and n² ∈ Θ(B), so A ∈ Θ(B)

• A = log₃(n²),  B = log₂(n³)
  log_b a = log_c a / log_c b, so A = 2·lg n / lg 3 and B = 3·lg n; A/B = 2/(3·lg 3), so A ∈ Θ(B)

• A = n^(lg 4),  B = 3^(lg n)
  a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞, so A ∈ Ω(B)

• A = lg² n,  B = n^(1/2)
  lim [lg^a n / n^b] = 0 as n → ∞ (here a = 2 and b = 1/2), so A ∈ O(B)
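As a rough numerical cross-check (not from the slides), the sketch below prints A(n)/B(n) for growing n for each exercise pair; a ratio that levels off suggests Θ, one that shrinks toward 0 suggests O (in fact o), and one that grows without bound suggests Ω (in fact ω).

```python
# Heuristic check of the exercise answers: watch how A(n)/B(n) evolves.
import math

pairs = [
    ("5n^2 + 100n  vs  3n^2 + 2",
     lambda n: 5 * n**2 + 100 * n, lambda n: 3 * n**2 + 2),
    ("log3(n^2)    vs  log2(n^3)",
     lambda n: math.log(n**2, 3), lambda n: math.log(n**3, 2)),
    ("n^(lg 4)     vs  3^(lg n)",
     lambda n: n ** math.log2(4), lambda n: 3 ** math.log2(n)),
    ("lg^2 n       vs  sqrt(n)",
     lambda n: math.log2(n) ** 2, lambda n: math.sqrt(n)),
]

for name, A, B in pairs:
    trend = ["%.4g" % (A(n) / B(n)) for n in (10**2, 10**4, 10**6)]
    print(f"{name}: A/B = {trend}")
```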

33

Example

• Insertion sort takes Θ(n²) time in the worst case.

• Any sorting algorithm must look at each input item, so sorting takes Ω(n) time.

34

Insertion Sort Running Time

General form, where t_j is the number of times the while-loop test is executed for j = 2, …, n:

T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·Σ_{j=2..n} t_j + c6·Σ_{j=2..n} (t_j − 1) + c7·Σ_{j=2..n} (t_j − 1) + c8·(n − 1)

Best case (the array is already sorted, t_j = 1):

T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·(n − 1) + c8·(n − 1)
     = (c1 + c2 + c4 + c5 + c8)·n − (c2 + c4 + c5 + c8) = Θ(n)

Worst case (the array is in reverse order, t_j = j):

T(n) = c1·n + c2·(n − 1) + c4·(n − 1) + c5·(n(n + 1)/2 − 1) + c6·(n(n − 1)/2) + c7·(n(n − 1)/2) + c8·(n − 1)
     = (c5/2 + c6/2 + c7/2)·n² + (c1 + c2 + c4 + c5/2 − c6/2 − c7/2 + c8)·n − (c2 + c4 + c5 + c8) = Θ(n²)
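To see the shapes of these two formulas, here is a small Python sketch (not from the slides) that evaluates them under the simplifying assumption that every constant c1, …, c8 equals 1; the best case grows linearly and the worst case quadratically.

```python
# Sketch assuming unit costs c1 = c2 = c4 = c5 = c6 = c7 = c8 = 1.

def best_case(n):
    # t_j = 1: T(n) = c1*n + (c2 + c4 + c5 + c8)*(n - 1)
    return n + 4 * (n - 1)

def worst_case(n):
    # t_j = j: the summations become n(n+1)/2 - 1 and n(n-1)/2
    return (n + 3 * (n - 1)               # c1*n + (c2 + c4 + c8)*(n - 1)
            + (n * (n + 1) // 2 - 1)      # c5 * sum of t_j
            + 2 * (n * (n - 1) // 2))     # (c6 + c7) * sum of (t_j - 1)

for n in (10, 100, 1000):
    print(n, best_case(n), worst_case(n))   # linear vs quadratic growth
```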

35

Reading Assignment

• Chapter 4 of CLRS.