Java Bootcamp - McGill University
TRANSCRIPT
CSUS Help Desk is hosting a
JAVA BOOTCAMP
Thursday, September 17th, from 5:30pm to 7:30pm
This bootcamp is aimed at programmers who don't know the particularities of Java. There will be an overview of the syntactic and semantic differences between Java and other programming languages, with a focus on OOP (including polymorphism and inheritance).
The Zoom link is: https://mcgill.zoom.us/j/92101531362
COMP251: Running time analysis and the Big O notation
Jérôme Waldispühl
School of Computer Science, McGill University
Based on slides from M. Langer and M. Blanchette
Outline
• Motivations
• The Big O notation
  o Definition
  o Examples
  o Rules
• Big Omega and Big Theta
• Applications
Measuring the running "time"
• Goal: Analyze an algorithm written in pseudocode and describe its running time
  - Without having to write code
  - In a way that is independent of the computer used
• To achieve that, we need to:
  - Make simplifying assumptions about the running time of each basic (primitive) operation
  - Study how the number of primitive operations depends on the size of the problem solved
Primitive Operations

A primitive operation is a simple computer operation that can be performed in time that is always the same, independent of the size of the bigger problem solved (we say: constant time).

- Assigning a value to a variable: x ← 1 (Tassign)
- Calling a method: Expos.addWin() (Tcall)
  Note: this doesn't include the time to execute the method's body
- Returning from a method: return x; (Treturn)
- Arithmetic operations on primitive types: x + y, r*3.1416, x/y, etc. (Tarith)
- Comparisons on primitive types: x == y (Tcomp)
- Conditionals: if (...) then ... else ... (Tcond)
- Indexing into an array: A[i] (Tindex)
- Following an object reference: Expos.losses (Tref)

Note: Multiplying two large integers is not a primitive operation, because the running time depends on the size of the numbers multiplied.
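To make the cost labels concrete, here is a small runnable Java example (written for these notes, not from the slides) where each statement is annotated with the primitive operations it performs:

public class PrimitiveOps {
    public static void main(String[] args) {
        int[] A = {4, 7, 1};       // array creation itself is not a primitive operation
        int x = 1;                 // Tassign: assigning a value to a variable
        int y = x + 3;             // Tarith + Tassign: arithmetic, then assignment
        boolean eq = (x == y);     // Tcomp + Tassign: comparison on primitive types
        int a0 = A[0];             // Tindex + Tassign: indexing into an array
        if (eq) {                  // Tcond: conditional
            x = a0;                // Tassign
        }
        System.out.println(x);     // Tcall (plus whatever time println itself takes)
    }
}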
FindMin analysis

Algorithm findMin(A, start, stop)
Input: Array A, indices start & stop
Output: Index of the smallest element of A[start:stop]

Running time of each line:

minvalue ← A[start]                  // Tindex + Tassign
minindex ← start                     // Tassign
index ← start + 1                    // Tarith + Tassign
while ( index <= stop ) do {         // Tcomp + Tcond
    if (A[index] < minvalue)         // Tindex + Tcomp + Tcond
    then {
        minvalue ← A[index]          // Tindex + Tassign
        minindex ← index             // Tassign
    }
    index ← index + 1                // Tassign + Tarith
}
return minindex                      // Treturn

The loop body is repeated stop − start times; the loop condition is evaluated once more for the last check of the loop (Tcomp + Tcond).
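A direct Java translation of this pseudocode might look as follows (a minimal sketch; as in the slides, stop is an inclusive index):

/** Returns the index of the smallest element of A[start..stop] (stop inclusive). */
public static int findMin(int[] A, int start, int stop) {
    int minvalue = A[start];
    int minindex = start;
    int index = start + 1;
    while (index <= stop) {
        if (A[index] < minvalue) {
            minvalue = A[index];
            minindex = index;
        }
        index = index + 1;
    }
    return minindex;
}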
Worst case running time
• Running time depends on n = stop − start + 1
  - But it also depends on the content of the array!
• What kind of array of n elements will give the worst running time for findMin?
  Example: 5 4 3 2 1 0 (a decreasing array: the then-branch is executed at every iteration)
• The best running time?
  Example: 0 1 2 3 4 5 (an increasing array: the then-branch is never executed)
More assumptions
• Counting each type of primitive operation is tedious
• The running times of the different operations are roughly comparable:
  Tassign ≈ Tcomp ≈ Tarith ≈ ... ≈ Tindex = 1 primitive operation
• We are only interested in the number of primitive operations performed

The worst-case running time for findMin becomes:

    T(n) = 8 + 10·(n − 1) = 10n − 2

(8 operations outside the loop: 5 for the three initializations, 2 for the last loop check, 1 for the return; in the worst case, each of the n − 1 iterations costs 10 operations.)
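As a sanity check, one can instrument a Java version of findMin with a counter that is incremented according to this cost model (a sketch written for these notes; the worst case is triggered with a decreasing array):

public class CountOps {
    static long ops = 0;  // running count of primitive operations

    static int findMinCounted(int[] A, int start, int stop) {
        int minvalue = A[start]; ops += 2;       // Tindex + Tassign
        int minindex = start;    ops += 1;       // Tassign
        int index = start + 1;   ops += 2;       // Tarith + Tassign
        while (true) {
            ops += 2;                            // Tcomp + Tcond (loop check)
            if (!(index <= stop)) break;
            ops += 3;                            // Tindex + Tcomp + Tcond (if)
            if (A[index] < minvalue) {
                minvalue = A[index]; ops += 2;   // Tindex + Tassign
                minindex = index;    ops += 1;   // Tassign
            }
            index = index + 1; ops += 2;         // Tassign + Tarith
        }
        ops += 1;                                // Treturn
        return minindex;
    }

    public static void main(String[] args) {
        int n = 10;
        int[] A = new int[n];
        for (int i = 0; i < n; i++) A[i] = n - i;   // decreasing array: worst case
        findMinCounted(A, 0, n - 1);
        System.out.println("counted: " + ops + ", formula 10n - 2: " + (10 * n - 2));
        // Both print 98.
    }
}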
Selection Sort

Algorithm SelectionSort(A, n)
Input: an array A of n elements
Output: the array is sorted

Primitive operations (worst case):

i ← 0                                // 1
while (i < n) do {                   // 2
    minindex ← findMin(A, i, n-1)    // 3 + TFindMin(n-1-i+1) = 3 + (10(n-i) - 2)
    t ← A[minindex]                  // 2
    A[minindex] ← A[i]               // 3
    A[i] ← t                         // 2
    i ← i + 1                        // 2
}                                    // 2 (last check of loop condition)
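In Java, the same algorithm can be written by reusing the findMin method sketched earlier (again a minimal sketch, sorting in place):

/** Sorts the first n elements of A in place by repeatedly selecting the minimum. */
public static void selectionSort(int[] A, int n) {
    int i = 0;
    while (i < n) {
        int minindex = findMin(A, i, n - 1);  // index of smallest element of A[i..n-1]
        int t = A[minindex];                  // swap A[i] and A[minindex]
        A[minindex] = A[i];
        A[i] = t;
        i = i + 1;
    }
}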
Selection Sort: adding it up

Total: T(n) = 1 + ( Σ_{i=0}^{n-1} [12 + 10(n − i)] ) + 2
            = 3 + 12n + 10 Σ_{i=0}^{n-1} (n − i)
            = 3 + 12n + 10 (Σ_{i=0}^{n-1} n) − 10 (Σ_{i=0}^{n-1} i)
            = 3 + 12n + 10·n·n − 10·((n − 1)·n)/2
            = 3 + 12n + 10n² − 5n² + 5n
            = 5n² + 17n + 3
More simplifications
We have: T(n) = 5n² + 17n + 3
Simplification #1: When n is large, T(n) ≈ 5n²
Simplification #2: When n is large, T(n) grows approximately like n²
We will write: T(n) is O(n²) ("T(n) is big O of n squared")
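One way to see Simplification #2 (a standard observation, not spelled out on the slides): dividing T(n) by n² shows that the lower-order terms vanish as n grows,

    T(n)/n² = (5n² + 17n + 3)/n² = 5 + 17/n + 3/n²  →  5  as n → ∞

so for large n, T(n) behaves like a constant times n².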
Asymptotic behavior
Towards a formal definition of big O
Let t(n) be a function that describes the time it takes for some algorithm on input size n.
We would like to express how t(n) grows with n as n becomes large, i.e., its asymptotic behavior.
Unlike with limits, we want to say that t(n) grows like certain simpler functions, such as 1, log₂ n, n, n², 2ⁿ, ...
Preliminary Definition
Let t(n) and g(n) be two functions, where n ≥ 0. We say t(n) is asymptotically bounded above by g(n) if there exists n₀ such that,

    for all n ≥ n₀,  t(n) ≤ g(n)

WARNING: This is not yet a formal definition!
Example
Claim: 5n + 70 is asymptotically bounded above by 6n.
Proof: (State definition) We want to show there exists an n₀ such that, for all n ≥ n₀, 5n + 70 ≤ 6n.

    5n + 70 ≤ 6n  ⟺  70 ≤ n

Thus, we can use n₀ = 70.
(The symbol "⟺" means "if and only if", i.e., logical equivalence.)
Example
Choosing a function and constants
[Figure: three plots, (A), (B), (C)]
Motivation
We would like to express formally how some function t(n) grows with n, as n becomes large.
We would like to compare the function t(n) with simpler functions g(n), such as 1, log₂ n, n, n², 2ⁿ, ...
Formal Definition
Let t(n) and g(n) be two functions, where n ≥ 0.
We say t(n) is O(g(n)) if there exist two positive constants n₀ and c such that, for all n ≥ n₀,

    t(n) ≤ c·g(n)

Note: g(n) will be a simple function, but this is not required in the definition.

Intuition
"t(n) is O(g(n))" if and only if there exists a point n₀ beyond which t(n) is less than some fixed constant times g(n).
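Spelled out with explicit quantifiers, the same definition reads:

    t(n) is O(g(n))  ⟺  ∃ c > 0, ∃ n₀ ≥ 0 such that ∀ n ≥ n₀ : t(n) ≤ c·g(n)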
Example (1)
Claim: 5 " ๐ + 70 ๐๐ ๐(๐)
Proof(s)Claim: 5 2 ๐ + 70 ๐๐ ๐(๐)
Proof 1: 5 2 ๐ + 70 โค 5 2 ๐ + 70 2 ๐ = 75 2 ๐, ๐๐ ๐ โฅ 1Thus, take ๐ = 75 and ๐# = 1.
Proof 2: 5 2 ๐ + 70 โค 5 2 ๐ + 6 2 ๐ = 11 2 ๐, ๐๐ ๐ โฅ 12Thus, take ๐ = 11 and ๐# = 12.
Proof 3: 5 2 ๐ + 70 โค 5 2 ๐ + ๐ = 6 2 ๐, ๐๐ ๐ โฅ 70Thus, take ๐ = 6 and ๐# = 70.
All these proofs are correct and show that 5 2 ๐ + 70 is ๐(๐)
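These (c, n₀) pairs can also be spot-checked numerically (a finite check for illustration, not a proof; the upper limit of 1000 is arbitrary):

public class BigOCheck {
    public static void main(String[] args) {
        int[][] pairs = { {75, 1}, {11, 12}, {6, 70} };  // (c, n0) from the proofs above
        for (int[] p : pairs) {
            int c = p[0], n0 = p[1];
            boolean holds = true;
            for (int n = n0; n <= 1000; n++) {           // finite range only
                if (5 * n + 70 > c * n) { holds = false; break; }
            }
            System.out.println("c=" + c + ", n0=" + n0 + " holds up to n=1000: " + holds);
        }
    }
}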
Visualization
[Figure: three plots, (A), (B), (C)]
Example (2)
Claim: 8·n² − 17·n + 46 is O(n²).
Proof 1: 8n² − 17n + 46 ≤ 8n² + 46n², if n ≥ 1
                        ≤ 54n²
Thus, we can take c = 54 and n₀ = 1.
Proof 2: 8n² − 17n + 46 ≤ 8n², if n ≥ 3. Thus, we can take c = 8 and n₀ = 3.
What does O(1) mean?
We say t(n) is O(1) if there exist two positive constants n₀ and c such that, for all n ≥ n₀,

    t(n) ≤ c

So, it just means that t(n) is bounded.
Tips
Never write O(3n), O(5 log₂ n), etc.
Instead, write O(n), O(log₂ n), etc.
Why? The point of the big O notation is to avoid dealing with constant factors. Writing O(3n) is technically correct, but we don't do it...
Other considerations
• n₀ and c are not uniquely defined. For a given n₀ and c that satisfy the definition, we can increase one or both and again satisfy the definition. There is no "best" choice of constants.
• However, we generally want a "tight" upper bound (asymptotically), because it gives us more information (Note: this is not the same as a smaller n₀ or c). For instance, a t(n) that is O(n) is also O(n²) and O(2ⁿ), but O(n) is more informative.
Growth of functions
[Figure: growth of common functions (from stackoverflow)]
Tip: It is helpful to memorize the relationship between basic functions.
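One quick way to build that intuition is to print the basic functions side by side and watch how fast each column grows (a throwaway sketch; the sample values of n are arbitrary):

public class Growth {
    public static void main(String[] args) {
        System.out.printf("%6s %10s %12s %12s %22s%n",
                "n", "log2(n)", "n*log2(n)", "n^2", "2^n");
        for (int n : new int[] {2, 4, 8, 16, 32, 64}) {
            double log2n = Math.log(n) / Math.log(2);    // log base 2 via change of base
            System.out.printf("%6d %10.1f %12.1f %12d %22.0f%n",
                    n, log2n, n * log2n, n * n, Math.pow(2, n));
        }
    }
}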
Practical meaning of big O...
[Table of running times not transcribed.] If the unit is in seconds, this would come to ~10¹¹ years...
Constant Factor rule
Suppose f(n) is O(g(n)) and a is a positive constant. Then, a·f(n) is also O(g(n)).
Proof: By definition, if f(n) is O(g(n)) then there exist two positive constants n₀ and c such that, for all n ≥ n₀,

    f(n) ≤ c·g(n)

Thus, a·f(n) ≤ a·c·g(n).
We use the constant a·c to show that a·f(n) is O(g(n)).
Sum rule
Suppose f₁(n) is O(g(n)) and f₂(n) is O(g(n)). Then, f₁(n) + f₂(n) is O(g(n)).
Proof: Let c₁, n₁ and c₂, n₂ be constants such that

    f₁(n) ≤ c₁·g(n), for all n ≥ n₁
    f₂(n) ≤ c₂·g(n), for all n ≥ n₂

So, f₁(n) + f₂(n) ≤ (c₁ + c₂)·g(n), for all n ≥ max(n₁, n₂).
We can use the constants c₁ + c₂ and max(n₁, n₂) to satisfy the definition.
Generalized Sum rule
Suppose f₁(n) is O(g₁(n)) and f₂(n) is O(g₂(n)).
Then, f₁(n) + f₂(n) is O(g₁(n) + g₂(n)).
Proof: Exercise...
Product Rule
Suppose f₁(n) is O(g₁(n)) and f₂(n) is O(g₂(n)).
Then, f₁(n)·f₂(n) is O(g₁(n)·g₂(n)).
Proof: Let c₁, n₁ and c₂, n₂ be constants such that

    f₁(n) ≤ c₁·g₁(n), for all n ≥ n₁
    f₂(n) ≤ c₂·g₂(n), for all n ≥ n₂

So, f₁(n)·f₂(n) ≤ (c₁·c₂)·(g₁(n)·g₂(n)), for all n ≥ max(n₁, n₂).
We can use the constants c₁·c₂ and max(n₁, n₂) to satisfy the definition.
Transitivity Rule
Suppose f(n) is O(g(n)) and g(n) is O(h(n)).
Then, f(n) is O(h(n)).
Proof: Let c₁, n₁ and c₂, n₂ be constants such that

    f(n) ≤ c₁·g(n), for all n ≥ n₁
    g(n) ≤ c₂·h(n), for all n ≥ n₂

So, f(n) ≤ (c₁·c₂)·h(n), for all n ≥ max(n₁, n₂).
We can use the constants c₁·c₂ and max(n₁, n₂) to satisfy the definition.
Notations
If t(n) is O(g(n)), we often write t(n) ∈ O(g(n)). That is, t(n) is a member of the set of functions that are O(g(n)).
For n sufficiently large we have: 1 < log₂ n < n < n·log₂ n < ...
And we write: O(1) ⊂ O(log₂ n) ⊂ O(n) ⊂ O(n·log₂ n) ⊂ ...
The Big Omega notation (Ω)
Let t(n) and g(n) be two functions with n ≥ 0.
We say t(n) is Ω(g(n)) if there exist two positive constants n₀ and c such that, for all n ≥ n₀,

    t(n) ≥ c·g(n)

Note: This is the opposite of the big O notation. The function g is now used as a "lower bound".
Example
Claim: "("&$)!
is ฮฉ(๐!).
Proof: We show first that "("&$)!
โฅ "!
(.
โ 2๐(๐ โ 1) โฅ ๐!
โ ๐! โฅ 2๐
โ ๐ โฅ 2
Thus, we take ๐ = $(
and ๐# = 2.
(Exercise: Prove that it also works with ๐ = $) and ๐# = 3.
Intuition
And... big Theta!
Let t(n) and g(n) be two functions, where n ≥ 0.
We say t(n) is Θ(g(n)) if there exist three positive constants n₀, c₁, c₂ such that, for all n ≥ n₀,

    c₁·g(n) ≤ t(n) ≤ c₂·g(n)

Note: if t(n) is Θ(g(n)), then it is also O(g(n)) and Ω(g(n)).
Example
Let t(n) = 4 + 17·log₂ n + 3n + 9n·log₂ n + n(n − 1)/2
Claim: t(n) is Θ(n²)
Proof: For n ≥ 2,

    n²/4 ≤ t(n) ≤ (4 + 17 + 3 + 9 + 1/2)·n²

The lower bound holds because t(n) ≥ n(n − 1)/2 ≥ n²/4 for n ≥ 2 (see the previous example); the upper bound holds because each term is at most its coefficient times n² (using log₂ n ≤ n and n(n − 1)/2 ≤ n²/2).
Big vs. little
(from geeksforgeeks.org)
Big O (resp. big Ω) denotes a tight upper (resp. lower) bound, while little o (resp. little ω) denotes a loose upper (resp. lower) bound.
Back to running time analysis
The time it takes for an algorithm to run depends on:
• constant factors (often implementation-dependent)
• the size n of the input
• the values of the input, including arguments if applicable...
Q: What are the best and worst cases?
Example (Binary Search)
Best case: The value is exactly in the middle of the array.
→ Ω(1)
Worst case: You recursively search until you reach an array of size 1 (Note: it does not matter whether you find the key or not).
→ O(log₂ n)
(from Wikipedia)
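For reference, an iterative Java version of binary search (a standard formulation written for these notes, not taken from the slides):

/** Returns an index of key in the sorted array A, or -1 if key is absent. */
public static int binarySearch(int[] A, int key) {
    int lo = 0, hi = A.length - 1;
    while (lo <= hi) {                  // each iteration halves the search range,
        int mid = lo + (hi - lo) / 2;   // so there are at most ~log2(n) iterations
        if (A[mid] == key) {
            return mid;                 // best case: key found on the first probe
        } else if (A[mid] < key) {
            lo = mid + 1;               // key can only be in the right half
        } else {
            hi = mid - 1;               // key can only be in the left half
        }
    }
    return -1;                          // worst case: range shrank to empty
}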