Asymptotic Notations

Presentation on Time and Space Complexity, Average and worst case analysis, and Asymptotic Notations Presented By – Mr. Rishabh Soni Guided By – Mr. M.A. Rizvi


Post on 14-Jun-2015


DESCRIPTION

An overview of why we need asymptotic analysis, and a description of the various asymptotic notations and their properties.

TRANSCRIPT

Page 1: Asymptotic Notations

Presentation on Time and Space Complexity,

Average and worst case analysis, and Asymptotic Notations

Presented By – Mr. Rishabh Soni
Guided By – Mr. M.A. Rizvi

Page 2: Asymptotic Notations

Roadmap
• Algorithmic Complexity
• Time and space complexity
• Need for Complexity Analysis
• Average and worst case analysis
• Why worst case analysis?
• The importance of Asymptotics
• Asymptotic Notations – Θ, O, Ω, o, ω
• Relations between Θ, O, Ω
• Comparison of functions

Page 3: Asymptotic Notations

Algorithmic Complexity

Algorithmic complexity is a very important topic in computer science. Knowing the complexity of algorithms allows you to answer questions such as:
• How long will a program run on an input?
• How much space will it take?
• Is the problem solvable?

These are important bases of comparison between different algorithms. An understanding of algorithmic complexity provides programmers with insight into the efficiency of their code. Complexity is also important to several theoretical areas in computer science, including algorithms, data structures, and complexity theory.

Page 4: Asymptotic Notations

Time and Space Complexity

Time Complexity – The time complexity of an algorithm is the amount of time it requires to execute. It is measured in terms of the number of operations rather than computer time, because computer time depends on the hardware, processor, etc. A general ordering of common growth rates is O(1) < O(log n) < O(n) < O(n log n) < O(n^c) < O(c^n) < O(n!) < O(n^n), where c is some constant greater than 1.

Big-O Notation : Examples of Algorithms

O(1) : Push, Pop, Enqueue (if there is a tail reference), Dequeue, accessing an array element

O(log n) : Binary search

O(n) : Linear search

O(n log n) : Heap sort, Quick sort (average case), Merge sort

O(n^2) : Selection sort, Insertion sort, Bubble sort

O(n^3) : Matrix multiplication (naive)

O(2^n) : Towers of Hanoi
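The gap between O(n) and O(log n) in the list above can be made concrete by counting comparisons. This is a small sketch of my own (not part of the slides), with the counting bolted on purely for illustration:

```python
# Count comparisons made by linear search (O(n)) and binary search (O(log n))
# on the same sorted list, to show how differently the two grow with n.

def linear_search(arr, target):
    """Return (index, comparisons) by scanning left to right."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

def binary_search(arr, target):
    """Return (index, comparisons); requires arr to be sorted."""
    comparisons = 0
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, comparisons
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1_000_000))
_, lin = linear_search(data, 999_999)   # worst case: target at the end
_, log = binary_search(data, 999_999)
print(lin, log)  # about n comparisons versus about log2(n) comparisons
```

On a million sorted elements, the linear scan examines all million entries in its worst case while the binary search needs only about twenty probes, which is the whole point of the ordering O(log n) < O(n).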

Page 5: Asymptotic Notations

Space Complexity –

The space complexity of an algorithm is the amount of memory it needs to run to completion.

Space complexity can be defined as the amount of computer memory required during program execution, as a function of the input size.

One difference between space complexity and time complexity is that space can be reused, whereas time cannot.
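As an illustration of my own (not from the slides), two ways of summing 1..n can have the same O(n) time complexity but different space complexity:

```python
# Both functions compute 1 + 2 + ... + n in O(n) time, but the recursive
# version keeps n stack frames alive at once (O(n) auxiliary space), while
# the iterative version reuses the same two variables (O(1) auxiliary space).

def sum_recursive(n):
    """O(n) space: one stack frame per pending call until n reaches 0."""
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

def sum_iterative(n):
    """O(1) space: only `total` and the loop variable are stored, and reused."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

print(sum_recursive(100), sum_iterative(100))  # both print 5050
```

The iterative version "reuses" its memory on every loop iteration, which is exactly the sense in which space, unlike time, can be reused.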

Page 6: Asymptotic Notations

Complexity: Why Bother?

• Estimation/Prediction: When you write or run a program, you need to be able to predict its requirements.

• Usual requirements:
- execution time
- memory space

• Quantities to estimate:
- execution time -> time complexity
- memory space -> space complexity

• It is pointless to run a program that requires:
- 64 TB of RAM on a desktop machine
- 10,000 years to run

• You do not want to wait for an hour:
- for the result of your query on Google
- when you are checking your bank account online
- when you are opening a picture file in Photoshop

It is important to write efficient algorithms.

Page 7: Asymptotic Notations

Average and Worst case Analysis

Worst-case complexity: The worst-case complexity is the complexity of an algorithm when the input is the worst possible with respect to complexity.

Average-case complexity: The average-case complexity is the complexity of an algorithm averaged over all possible inputs (assuming a uniform distribution over the inputs).

[Figure: running times from 1 ms to 5 ms for inputs A through G, marking the worst case (slowest input), the best case (fastest input), and the average case somewhere in between.]

Page 8: Asymptotic Notations

Why Worst Case Analysis?

Worst-case running time: the longest running time for any input of size n. We usually concentrate on finding only the worst-case running time, for the following reasons:

• The worst-case running time of an algorithm gives an upper bound on the running time for any input. Knowing it provides a guarantee that the algorithm will never take any longer.

• For some algorithms, the worst case occurs fairly often. For example, in searching a database for a particular piece of information, the searching algorithm’s worst case will often occur when the information is not present in the database.

• The “average case” is often roughly as bad as the worst case.
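The database-search point above is easy to demonstrate. The sketch below is my own illustration (the probe-counting helper is hypothetical, not something from the slides):

```python
# For linear search, the worst case is a key that is absent: the scan cannot
# stop early and must examine every record before giving up.

def count_probes(records, key):
    """Return how many records are examined before the search stops."""
    probes = 0
    for value in records:
        probes += 1
        if value == key:
            break
    return probes

records = list(range(1000))
print(count_probes(records, 0))    # best case: the very first record, 1 probe
print(count_probes(records, -1))   # worst case (key absent): all 1000 probes
```

Every unsuccessful lookup pays the full O(n) cost, so if misses are common the worst case really does "occur fairly often".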

Page 9: Asymptotic Notations

The Importance of Asymptotics

• Asymptotic notation has many important benefits, which might not be immediately obvious.

• An algorithm with an asymptotically slower running time (for example, one that is O(n^2)) is beaten in the long run by an algorithm with an asymptotically faster running time (for example, one that is O(n log n)), even if the constant factor for the faster algorithm is worse.

Running Time    Maximum Problem Size (n)
                1 second    1 minute    1 hour
400n            2,500       150,000     9,000,000
20n⌈log n⌉      4,096       166,666     7,826,087
2n^2            707         5,477       42,426
n^4             31          88          244
2^n             19          25          31
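Numbers like those in the table can be reproduced mechanically. This is my own sketch (the cost unit of one microsecond per operation is an assumption, chosen so that the results match the table's first column):

```python
# For a monotone cost function (in microseconds), find the largest problem
# size n whose cost fits within a time budget, by doubling an upper bound
# and then binary-searching the crossover point.

def max_problem_size(cost, budget):
    """Largest n with cost(n) <= budget; assumes cost is non-decreasing."""
    if cost(1) > budget:
        return 0
    hi = 1
    while cost(hi * 2) <= budget:   # double until the budget is exceeded
        hi *= 2
    lo, hi = hi, hi * 2             # now cost(lo) <= budget < cost(hi)
    while hi - lo > 1:              # binary search for the exact crossover
        mid = (lo + hi) // 2
        if cost(mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

second = 1_000_000  # one second, expressed in microseconds
print(max_problem_size(lambda n: 400 * n, second))    # 2500, as in the table
print(max_problem_size(lambda n: 2 * n * n, second))  # 707
print(max_problem_size(lambda n: 2 ** n, second))     # 19
```

Rerunning the same three cost functions with `budget = 3600 * second` reproduces the one-hour column, which is how such tables are typically generated.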

Page 10: Asymptotic Notations

Asymptotic Analysis

• Goal: to simplify the analysis of running time by getting rid of "details" that may be affected by the specific implementation and hardware, such as "rounding": 1,000,001 ≈ 1,000,000 and 3n^2 ≈ n^2.

• Capturing the essence: how the running time of an algorithm increases with the size of the input in the limit. Asymptotically more efficient algorithms are best for all but small inputs.

Page 11: Asymptotic Notations

Asymptotic Notations

• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers.
  – Example: f(n) = Θ(n^2).
  – Describes how f(n) grows in comparison to n^2.
• Each notation defines a set of functions; in practice they are used to compare the sizes of two functions.
• Asymptotic notation is useful because it allows us to concentrate on the main factor determining a function's growth.

Page 12: Asymptotic Notations

Θ-notation

• Θ-notation bounds a function to within constant factors.
• Definition: For a given function g(n), we denote by Θ(g(n)) the set of functions
  Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }.
• Explanation: We write f(n) = Θ(g(n)) if there exist positive constants n0, c1, and c2 such that to the right of n0, the value of f(n) always lies between c1 g(n) and c2 g(n) inclusive.
• We say that g(n) is an asymptotically tight bound for f(n).
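The definition can be checked numerically. As an illustration of my own (the particular f, g, and constants are chosen for this sketch, not taken from the slides), f(n) = 3n^2 + 2n is Θ(n^2) with witnesses c1 = 3, c2 = 4, n0 = 2:

```python
# Verify the Θ sandwich c1*g(n) <= f(n) <= c2*g(n) for all n >= n0
# over a large range, for f(n) = 3n^2 + 2n and g(n) = n^2.

def f(n):
    return 3 * n * n + 2 * n

def g(n):
    return n * n

c1, c2, n0 = 3, 4, 2  # 2n <= n^2 holds exactly when n >= 2
ok = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print(ok)  # True: g(n) = n^2 is an asymptotically tight bound for f(n)
```

Note that the sandwich fails at n = 1 (f(1) = 5 > 4 = c2·g(1)), which is precisely why the definition only demands the inequality "to the right of n0".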

Page 13: Asymptotic Notations

O-notation

• We use O-notation to give an upper bound on a function, to within a constant factor.

• Definition: For a given function g(n), we denote by O(g(n)) the set of functions
  O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0 }.
• Explanation: We write f(n) = O(g(n)) if there exist positive constants n0 and c such that at and to the right of n0, the value of f(n) always lies on or below c g(n).

Page 14: Asymptotic Notations

Ω-notation

• Ω-notation provides an asymptotic lower bound on a function.

• Definition: For a given function g(n), we denote by Ω(g(n)) the set of functions
  Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0 }.

• Explanation: We write f(n) = Ω(g(n)) if there exist positive constants n0 and c such that at and to the right of n0, the value of f(n) always lies on or above c g(n).

Page 15: Asymptotic Notations

Relations Between Θ, O, Ω

Page 16: Asymptotic Notations

o-notation

• The upper bound provided by O-notation may or may not be asymptotically tight. We use o-notation to denote an upper bound that is not asymptotically tight.

• We formally define o(g(n)) as the set
  o(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0 }.

• In o-notation, the function f(n) becomes insignificant relative to g(n) as n approaches infinity; that is,
  lim (n→∞) f(n) / g(n) = 0.
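The limit characterization is easy to observe numerically. In this sketch of mine (the functions are my choice, not the slides'), f(n) = 100n and g(n) = n^2, so f(n) = o(g(n)) even though the constant factor 100 makes f larger for small n:

```python
# Watch the ratio f(n)/g(n) = 100n / n^2 = 100/n shrink toward 0 as n grows,
# which is exactly the condition for f(n) = o(g(n)).

ratios = [100 * n / (n * n) for n in (10, 100, 1_000, 10_000)]
print(ratios)  # [10.0, 1.0, 0.1, 0.01] -- heading to 0
```

Contrast this with f(n) = 2n^2 and g(n) = n^2: there the ratio is constantly 2, so f(n) = O(g(n)) holds but f(n) = o(g(n)) does not.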

Page 17: Asymptotic Notations

ω-notation

• We use ω-notation to denote a lower bound that is not asymptotically tight.

• Formal definition:
  ω(g(n)) = { f(n) : for any positive constant c > 0, there exists a constant n0 > 0 such that 0 ≤ c g(n) < f(n) for all n ≥ n0 }.

• The relation f(n) = ω(g(n)) implies that
  lim (n→∞) f(n) / g(n) = ∞.

Page 18: Asymptotic Notations

Relations Between Θ, O, Ω

• Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
• That is, Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
• In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.
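The theorem describes exactly how tight bounds are established in practice: exhibit an O witness and an Ω witness separately. A numeric sketch of my own (functions and constants chosen for illustration):

```python
# h(n) = 2n^2 + n is O(n^2) with witness c = 3 (since n <= n^2 for n >= 1)
# and Ω(n^2) with witness c = 2 (since 2n^2 <= 2n^2 + n always), so by the
# theorem h(n) = Θ(n^2).

def h(n):
    return 2 * n * n + n

def sq(n):
    return n * n

is_upper = all(h(n) <= 3 * sq(n) for n in range(1, 5_000))  # the O(n^2) half
is_lower = all(2 * sq(n) <= h(n) for n in range(1, 5_000))  # the Ω(n^2) half
print(is_upper and is_lower)  # True: both halves hold, hence Θ(n^2)
```

Proving the two one-sided bounds independently is usually easier than finding the c1, c2 sandwich of the Θ definition in one step, which is the practical content of the theorem.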

Page 19: Asymptotic Notations

Comparison of Functions

Comparing the growth of f and g is analogous to comparing two numbers a and b:

f(n) = O(g(n))  ≈  a ≤ b
f(n) = Ω(g(n))  ≈  a ≥ b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b

Page 20: Asymptotic Notations


Properties

Transitivity

f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))

Reflexivity

f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))

Page 21: Asymptotic Notations


Properties

Symmetry

f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

Complementarity

f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))

Page 23: Asymptotic Notations

Thank You