Amortized Analysis (umsl.edu/~adhikarib/cs4130-fall2017/slides/09 - Amortized Analysis.pdf)

Amortized Analysis Course: CS 5130 - Advanced Data Structures and Algorithms Instructor: Dr. Badri Adhikari


TRANSCRIPT

Page 1: Amortized Analysis

Amortized Analysis

Course: CS 5130 - Advanced Data Structures and Algorithms Instructor: Dr. Badri Adhikari

Page 2: Amortized Analysis

Three common techniques

When analyzing a given algorithm's time complexity (or any other resources), looking at the worst-case run time per operation can be too pessimistic.

While certain operations for a given algorithm may have a significant cost in resources, other operations may not be as costly.

Amortized analysis considers both the costly and less costly operations together over the whole series of operations of the algorithm.

Three most common techniques used in amortized analysis:

(a) Aggregate analysis
(b) Accounting method
(c) Potential method

Page 3: Amortized Analysis

Example 1 - Stack operations

Two fundamental stack operations:

● PUSH(S, x) pushes object x onto stack S.
● POP(S) pops the top of stack S and returns the popped object. Calling POP on an empty stack generates an error.

Both PUSH and POP run in O(1) time. Therefore, the total cost of n PUSH and POP operations is n, and the running time is Θ(n).

Page 4: Amortized Analysis

MULTIPOP (another stack operation)

MULTIPOP(S, k) removes the top k objects of stack S. It pops the entire stack if the stack contains fewer than k objects.

What is the running time of MULTIPOP?

=> cost of MULTIPOP = min(s, k), where s is the number of objects in the stack

i.e. the number of times the while loop executes!

(Figure: an example stack shown initially, after MULTIPOP(S,4), and after MULTIPOP(S,7).)
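The three stack operations can be sketched in Python (a minimal illustration; the list-based Stack class is my own choice, not from the slides):

```python
class Stack:
    def __init__(self):
        self.items = []

    def push(self, x):
        # PUSH(S, x): O(1)
        self.items.append(x)

    def pop(self):
        # POP(S): O(1); popping an empty stack is an error
        if not self.items:
            raise IndexError("pop from empty stack")
        return self.items.pop()

    def multipop(self, k):
        # MULTIPOP(S, k): the while loop runs min(s, k) times,
        # where s is the current number of objects in the stack
        popped = []
        while self.items and k > 0:
            popped.append(self.items.pop())
            k -= 1
        return popped
```

For example, pushing ten objects and then calling multipop(4) followed by multipop(7) mirrors the figure: the first call pops 4 objects, and the second empties the remaining 6.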

Page 5: Amortized Analysis

Sequence of n PUSH, POP, and MULTIPOP operations

On an initially empty stack, say we have a sequence of n PUSH, POP, and MULTIPOP operations.

What is the cost?

The worst-case cost of a MULTIPOP operation in the sequence is O(n), since the stack size is at most n.

The worst-case time of any stack operation is therefore O(n), and hence the sequence of n operations costs O(n²). This is because we may have O(n) MULTIPOP operations costing O(n) each.

This analysis is correct, but the O(n²) running time is not tight.

Page 6: Amortized Analysis

Aggregate analysis

When we have a sequence of n PUSH, POP, and MULTIPOP operations, can all the MULTIPOPs take O(n) time?

Consider this - we can pop each object from the stack at most once for each time we have pushed it onto the stack.

Therefore, the number of times that POP can be called on a nonempty stack, including calls within MULTIPOP, is at most the number of PUSH operations, which is at most n.

For any value of n, any sequence of n PUSH, POP, and MULTIPOP operations takes a total of O(n) time.

The average cost of an operation is O(n)/n = O(1). In aggregate analysis, we assign the amortized cost of each operation to be the average cost.

All three operations have amortized cost of O(1).
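As a sanity check on the aggregate bound, a hypothetical experiment (my own illustration, not from the slides) can run a random sequence of n operations and confirm that the total number of pops, including those performed inside MULTIPOP, never exceeds the number of pushes, so the total cost is at most 2n:

```python
import random

def run_sequence(n, seed=0):
    """Run n random PUSH / POP / MULTIPOP operations on an initially
    empty stack and return (total pushes, total pops)."""
    rng = random.Random(seed)
    stack, pushes, pops = [], 0, 0
    for _ in range(n):
        op = rng.choice(["push", "pop", "multipop"])
        if op == "push":
            stack.append(1)
            pushes += 1
        elif op == "pop" and stack:
            stack.pop()
            pops += 1
        elif op == "multipop":
            k = rng.randint(1, 5)
            while stack and k > 0:   # cost = min(s, k)
                stack.pop()
                pops += 1
                k -= 1
    return pushes, pops

pushes, pops = run_sequence(1000)
```

Each object is popped at most once per push, so pops ≤ pushes ≤ n for any sequence, matching the O(n) aggregate bound.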

Page 7: Amortized Analysis

Aggregate analysis ≠ Average-case analysis

Average-case analysis averages over a distribution of inputs (e.g., sequences with many PUSHes, sequences with many POPs, etc.).

Example of average-case analysis: a sorting algorithm takes O(n²) time when the input is reverse-sorted and O(n) when the input is already sorted. So, on average, the algorithm takes O((n² + n)/2) time. Is this correct? Only if both kinds of input are equally likely; average-case analysis depends on an assumed probability distribution over the inputs.

Aggregate analysis does not consider probabilities; i.e., it does not probabilistically average how many times PUSH or MULTIPOP might occur.

Aggregate analysis guarantees the average performance of each operation in the worst case.

Page 8: Amortized Analysis

Example 2 - Incrementing a binary counter

We would like to implement a k-bit binary counter that counts upwards from 0.

We can use an array of bits as the counter: A[0..k-1], with A.length = k, where A[0] holds the lowest-order bit and A[k-1] holds the highest-order bit.

//Increments the counter by decimal 1
INCREMENT(A){
    i = 0
    while i < A.length and A[i] == 1
        A[i] = 0
        i = i + 1
    if i < A.length
        A[i] = 1
}

Page 9: Amortized Analysis

A cursory analysis

In the worst case, all bits of A may contain 1s, so that we need to flip all k bits.

A single execution of INCREMENT(A) takes time Θ(k) in the worst case.

Thus, a sequence of n INCREMENT operations on an initially zero counter takes time O(nk) in the worst case.

Once again, this analysis is correct, but not tight.

Page 10: Amortized Analysis

Aggregate analysis

Do all the bits flip each time INCREMENT is called?

A[0] does flip each time.

A[1] flips only every other time: a sequence of n INCREMENT operations on an initially zero counter causes A[1] to flip ⌊n/2⌋ times.

A[2] flips ⌊n/4⌋ times.

In general, A[i] flips ⌊n/2^i⌋ times.

Total number of flips = Σ_{i=0}^{k-1} ⌊n/2^i⌋ < n · Σ_{i=0}^{∞} 1/2^i = 2n

Therefore, worst-case running time on an initially zero counter is O(n). The amortized (average) cost per operation is O(n)/n = O(1).
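A small experiment (my own illustration, not from the slides) can confirm the 2n bound by counting the actual bit flips over n increments:

```python
def increment(A):
    """CLRS-style INCREMENT on a bit array (A[0] = lowest-order bit).
    Returns the number of bits flipped by this call."""
    flips, i = 0, 0
    while i < len(A) and A[i] == 1:
        A[i] = 0                  # flip a trailing 1 back to 0
        flips += 1
        i += 1
    if i < len(A):
        A[i] = 1                  # flip the lowest 0 up to 1
        flips += 1
    return flips

k, n = 16, 1000
A = [0] * k
total_flips = sum(increment(A) for _ in range(n))
```

After n = 1000 increments, the counter holds the value 1000 and the total number of flips stays below 2n = 2000, in line with the aggregate bound.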

Page 11: Amortized Analysis

Example 3 - Dijkstra’s algorithm

Cursory analysis of the running time:

The for loop can execute RELAX at most E times per iteration of the while loop, because a vertex may have at most E incident edges (if the graph is dense). So the running time is O(V·E).

Aggregate analysis:

Total number of RELAX operations = E_v1 + E_v2 + … + E_vV = E, where E_vi is the number of edges leaving vertex vi.

On average, the for loop executes E/V RELAX statements per vertex, so the cost of the RELAX operations per vertex is O(E/V).

Total running time of the RELAX operations = O(E/V) · V = O(E).

Intuitively: in total, RELAX is executed E times (once for each of the edges we have).

DIJKSTRA(G, w, s)
  INITIALIZE-SINGLE-SOURCE(G, s)
  S = ∅
  Q = G.V
  while Q ≠ ∅
    u = EXTRACT-MIN(Q)
    S = S ∪ {u}
    for each vertex v ∈ G.Adj[u]
      RELAX(u, v, w)
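The aggregate count of RELAX calls can be checked with a sketch of Dijkstra's algorithm (the heap-based queue with lazy deletion and the example graph are my own illustration, not from the slides):

```python
import heapq

def dijkstra(adj, s):
    """Dijkstra's algorithm over an adjacency list {u: [(v, w), ...]}.
    Returns (distances, number of RELAX executions)."""
    d = {v: float("inf") for v in adj}
    d[s] = 0
    relax_calls = 0
    visited = set()
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)      # EXTRACT-MIN(Q)
        if u in visited:               # skip stale heap entries
            continue
        visited.add(u)
        for v, w in adj[u]:            # for each vertex v in G.Adj[u]
            relax_calls += 1           # RELAX(u, v, w)
            if d[u] + w < d[v]:
                d[v] = d[u] + w
                heapq.heappush(pq, (d[v], v))
    return d, relax_calls

adj = {                                # directed example graph, 5 edges
    "s": [("a", 1), ("b", 4)],
    "a": [("b", 2), ("c", 6)],
    "b": [("c", 3)],
    "c": [],
}
d, relax_calls = dijkstra(adj, "s")
```

Each vertex's adjacency list is scanned exactly once, so relax_calls equals the number of edges E (here 5), matching the aggregate O(E) count.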

Page 12: Amortized Analysis

Summary

Oftentimes our analysis of the running time of an algorithm may be correct, but not tight.

Aggregate analysis is one of the amortized-analysis methods used to calculate a tighter running time.

If, through aggregate analysis, we conclude that an algorithm takes O(n²) time, then no matter what the input data is, the algorithm will never take more than O(n²) time; i.e., aggregate analysis is not average-case analysis.

In the stack operations example, when we have a sequence of n PUSH, POP, and MULTIPOP operations, the time needed per operation is O(1) and not O(n).

In the binary counter increment example, an operation takes O(1) time and not O(k) time.