
Upload: derrick-payne

Post on 11-Jan-2016



Page 1: Greedy Algorithms Intuition: At each step, make the choice that is locally optimal. Does the sequence of locally optimal choices lead to a globally optimal solution?

Greedy Algorithms
• Intuition: At each step, make the choice that is locally optimal.
• Does the sequence of locally optimal choices lead to a globally optimal solution?
– Depends on the problem
– Sometimes guarantees only an approximate solution
• Examples:
– Shortest paths: Dijkstra
– Minimum spanning trees: Prim and Kruskal
– Compression: Huffman coding
– Memory allocation: First fit, Best fit

Page 2

Greedy Method
• Greedy algorithms are typically used to solve optimization problems. Most of these problems have n inputs and require us to obtain a subset that satisfies some constraints. Any subset that satisfies these constraints is called a feasible solution. We are required to find a feasible solution that either minimizes or maximizes a given objective function. In the most common situation we have:
• C: a set (or list) of candidates;
• S: the set of candidates that have already been used;
• feasible(): a function that checks whether a set is a feasible solution;
• solution(): a function that checks whether a set provides a solution;
• select(): a function for choosing the most promising candidate;
• an objective function that we are trying to optimize.

Page 3

The Generic Procedure
1. function greedy(C: set): set;
2. begin
3.   S := ∅; /* S is the set in which we construct the solution */
4.   while (not solution(S) and C ≠ ∅) do
5.   begin
6.     x := select(C);
7.     C := C - {x};
8.     if feasible(S ∪ {x}) then S := S ∪ {x};
9.   end;
10.  if solution(S) then return(S) else return(∅);
11. end;

• The selection function uses the objective function to choose the most promising candidate from C.
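The generic procedure above can be sketched in Python. This is a minimal sketch, not code from the slides; the coin-paying demo at the bottom is a hypothetical instantiation of C, solution(), feasible(), and select() (coins are (index, value) pairs so duplicates can live in a set).

```python
def greedy(candidates, solution, feasible, select):
    """Generic greedy schema: repeatedly pick the most promising
    candidate and keep it only if the result stays feasible."""
    C = set(candidates)
    S = set()
    while not solution(S) and C:
        x = select(C)          # most promising remaining candidate
        C.remove(x)
        if feasible(S | {x}):
            S = S | {x}
    return S if solution(S) else set()

# Hypothetical demo: pay 36 with coins given as (index, value) pairs.
coins = {(i, v) for i, v in enumerate([25, 10, 10, 5, 1, 1, 1, 1, 1])}
total = lambda S: sum(v for _, v in S)
S = greedy(coins,
           solution=lambda S: total(S) == 36,
           feasible=lambda S: total(S) <= 36,
           select=lambda C: max(C, key=lambda c: c[1]))  # largest coin first
print(sorted(v for _, v in S))  # [1, 10, 25]
```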

Page 4

Example: Coin Change
• We want to give change to a customer using the smallest possible number of coins (of units 1, 5, 10 and 25, respectively).
• The greedy algorithm will always find the optimal solution in this case.
• If 12-unit coins are added, it will not necessarily find the optimal solution.
  E.g., 15 = (12, 1, 1, 1), but (10, 5) is optimal.
• The greedy method might even fail to find a solution despite the fact that one exists. (Consider coins of 2, 3, and 5 units.)
  E.g., 6 = 5 + ?, while (3, 3) is optimal.
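A minimal sketch of greedy change-making that reproduces the three cases above (the coin sets and amounts are the slide's own examples; the amount 63 is an arbitrary extra check):

```python
def greedy_change(amount, coins):
    """Greedy change-making: repeatedly take the largest coin that fits."""
    result = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            result.append(c)
    return result if amount == 0 else None  # None: greedy found no exact change

print(greedy_change(63, [25, 10, 5, 1]))      # [25, 25, 10, 1, 1, 1] -- optimal here
print(greedy_change(15, [25, 12, 10, 5, 1]))  # [12, 1, 1, 1] -- but (10, 5) is better
print(greedy_change(6, [5, 3, 2]))            # None -- although 3 + 3 works
```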

Page 5

The Knapsack Problems
• 0-1 knapsack problem: A thief robbing a store finds n items; the ith item has weight ci and is worth vi dollars (ci and vi are integers). If the thief can carry at most B weight in his knapsack, which items should he take to make the load most profitable?
• Fractional knapsack problem: same as above, except that the thief can take fractions of items (e.g., an item might be gold dust).
• Integer knapsack problem: same as the 0-1 knapsack problem, except that the number of copies of each item is unlimited.

What is the input size of the problem? About n log B bits.

Page 6

The Knapsack Problems
• Optimal substructure property: consider the most valuable load weighing at most B pounds in the 0-1 problem. If we remove item j from this load, the remaining load must be the most valuable load weighing at most B - cj that can be taken from the n - 1 original items excluding j.
• In the fractional problem, if we remove w pounds of item j, the remaining load must be the most valuable load weighing at most B - w that can be taken from the n - 1 original items plus cj - w pounds of item j.

Page 7

The Knapsack Problems
• The Integer Knapsack Problem

  Maximize Σ(i=1..n) vi·xi, where vi ≥ 0 and the xi are nonnegative integers,
  subject to Σ(i=1..n) ci·xi ≤ B, where ci ≥ 0 and B > 0.

• The 0-1 Knapsack Problem: same as the integer knapsack except that the values of the xi's are restricted to 0 or 1.

• The Fractional Knapsack Problem

  Maximize Σ(i=1..n) vi·xi, where vi ≥ 0 and 1 ≥ xi ≥ 0,
  subject to Σ(i=1..n) ci·xi ≤ B, where ci ≥ 0 and B > 0.

Page 8

The Knapsack Problems
• Let f(k, a) = max{ Σ(i=1..k) vi·xi : Σ(i=1..k) ci·xi ≤ a }, 0 ≤ k ≤ n, 0 ≤ a ≤ B.
• For integer knapsack problems, we have
  f(k, a) = max{f(k - 1, a), f(k, a - ck) + vk}, and
• For 0-1 knapsack problems, we have
  f(k, a) = max{f(k - 1, a), f(k - 1, a - ck) + vk}.
• What we want to compute is f(n, B), which depends recursively on at most nB previous terms: f(k, a), 0 ≤ k ≤ n, 0 ≤ a ≤ B.
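The two recurrences can be tabulated bottom-up. A sketch follows; the item values, weights, and capacity in the demo are arbitrary illustrative numbers, not from the slides.

```python
def knapsack_01(values, weights, B):
    """0-1 knapsack via f(k, a) = max(f(k-1, a), f(k-1, a - c_k) + v_k)."""
    n = len(values)
    f = [[0] * (B + 1) for _ in range(n + 1)]
    for k in range(1, n + 1):
        v, c = values[k - 1], weights[k - 1]
        for a in range(B + 1):
            f[k][a] = f[k - 1][a]            # skip item k
            if c <= a:                        # or take it (once)
                f[k][a] = max(f[k][a], f[k - 1][a - c] + v)
    return f[n][B]

def knapsack_int(values, weights, B):
    """Integer knapsack via f(k, a) = max(f(k-1, a), f(k, a - c_k) + v_k)."""
    n = len(values)
    f = [[0] * (B + 1) for _ in range(n + 1)]
    for k in range(1, n + 1):
        v, c = values[k - 1], weights[k - 1]
        for a in range(B + 1):
            f[k][a] = f[k - 1][a]
            if c <= a:                        # item k may be reused: row k, not k-1
                f[k][a] = max(f[k][a], f[k][a - c] + v)
    return f[n][B]

print(knapsack_01([60, 100, 120], [10, 20, 30], 50))   # 220
print(knapsack_int([60, 100, 120], [10, 20, 30], 50))  # 300
```

Note how the only difference between the two recurrences (row k - 1 versus row k) is exactly the "one copy versus unlimited copies" distinction.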

Page 9

The Knapsack Problems
• For the fractional knapsack problem, a greedy approach can solve it: rearrange the objects so that

  v1/c1 ≥ v2/c2 ≥ ... ≥ vn/cn.

  Then, for items i = 1 to n, take as much of item i as there is while not exceeding the weight limit B.
• Running time is O(n log n).
• Remark: Dynamic programming is not applicable to the fractional knapsack problem (why?), while the greedy method may fail to find optimal solutions for the integer/0-1 knapsack problems.
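A sketch of this greedy procedure. The demo data (values $60/$100/$120, weights 10/20/30, B = 50) is an assumed example chosen so that the optimum is $240, matching the figure on the next page.

```python
def fractional_knapsack(values, weights, B):
    """Greedy fractional knapsack: sort by value/weight ratio,
    then take as much of each item as still fits."""
    items = sorted(zip(values, weights), key=lambda t: t[0] / t[1], reverse=True)
    total = 0.0
    for v, c in items:
        take = min(c, B)        # whole item if it fits, else the remaining capacity
        total += v * take / c
        B -= take
        if B == 0:
            break
    return total

print(fractional_knapsack([60, 100, 120], [10, 20, 30], 50))  # 240.0
```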

Page 10

Greedy does not work for the 0-1 problem!

Fractional Knapsack: the optimal solution is $240.

Page 11

Job Scheduling
• We want to schedule n given jobs. Job i requires running time ti.
• Find the best order to execute the jobs to minimize the average completion time.
• Sample input: (J1, 10), (J2, 4), (J3, 5), (J4, 12), (J5, 7)
• One possible schedule: J3, J2, J1, J5, J4.
  Total completion time = 5 + 9 + 19 + 26 + 38 = 97
• Optimal schedule: J2, J3, J5, J1, J4.
  Total completion time = 4 + 9 + 16 + 26 + 38 = 93
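The two totals above can be checked with a short sketch; the job names and times are the slide's sample input.

```python
def total_completion_time(order, times):
    """Sum of completion times when jobs run in the given order."""
    t, total = 0, 0
    for j in order:
        t += times[j]    # completion time of job j
        total += t
    return total

times = {"J1": 10, "J2": 4, "J3": 5, "J4": 12, "J5": 7}
print(total_completion_time(["J3", "J2", "J1", "J5", "J4"], times))  # 97
spt = sorted(times, key=times.get)  # shortest running time first
print(spt, total_completion_time(spt, times))  # ['J2', 'J3', 'J5', 'J1', 'J4'] 93
```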

Page 12

Greedy Scheduling Algorithm
• Schedule the job with the smallest running time first, i.e., schedule jobs in increasing order of running times.
• Correctness: Suppose the scheduling order is i1, ..., in. If t(ij) > t(ij+1), then swap jobs ij and ij+1:
– Completion times for jobs before ij and ij+1 do not change
– Completion time of job ij increases by t(ij+1)
– Completion time of job ij+1 decreases by t(ij)
– So the total completion time decreases by t(ij) - t(ij+1)
• Thus, the optimal completion time can be achieved only if the list i1, ..., in is sorted.

Page 13

Multiprocessor Case
• Suppose that jobs can be scheduled on k processors.
• Same intuition: since running times of earlier jobs contribute to completion times of later jobs, schedule shorter jobs first.
• Algorithm:
– Sort jobs in increasing order of running times: i1, ..., in.
– Schedule i1 on processor P1, i2 on P2, ..., ik on Pk, then ik+1 on P1 again, and so on in a cycle.
• Sample input: (J1, 10), (J2, 4), (J3, 5), (J4, 12), (J5, 7) and 2 processors
• Solution: J2, J5, J4 on P1; J3, J1 on P2.
  Total completion time: (4 + 11 + 23) + (5 + 15) = 58
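The round-robin rule above can be sketched as follows; the demo reuses the slide's sample input and reproduces the total of 58.

```python
def schedule_round_robin(jobs, k):
    """Sort jobs by running time, deal them to k processors in a cycle;
    returns the per-processor job lists and the total completion time."""
    order = sorted(jobs, key=jobs.get)          # shortest job first
    procs = [order[p::k] for p in range(k)]     # processor p gets every k-th job
    total = 0
    for assigned in procs:
        t = 0
        for j in assigned:
            t += jobs[j]
            total += t
    return procs, total

jobs = {"J1": 10, "J2": 4, "J3": 5, "J4": 12, "J5": 7}
procs, total = schedule_round_robin(jobs, 2)
print(procs, total)  # [['J2', 'J5', 'J4'], ['J3', 'J1']] 58
```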

Page 14

Final Completion Time
• Suppose we want to minimize the maximum completion time instead of the total (or average) completion time
– makes sense only in the multiprocessor case
• Scheduling shorter jobs first does not seem to have any advantage
• In fact, the previous greedy algorithm does not give an optimal solution
• Optimal solution: J1, J2, J3 on P1; J4, J5 on P2. Final completion time = 19
• In fact, no greedy strategy works. We may be forced to try out all possible ways to split the jobs among the processors. This is an NP-complete problem!

Page 15

More on Scheduling
• We have looked at a very simple variant
• In practice, there are many complications:
– Jobs are not known a priori, but arrive in real-time
– Different jobs have different priorities
– It may be OK to preempt jobs
– Jobs have different resource requirements, not just processing time
• A hot research topic: scheduling for multimedia applications

Page 16

Minimum Spanning Trees (MST)
• A tree is a connected graph with no cycles.
• A spanning tree of a connected graph G is a subgraph of G which has the same set of vertices as G and is a tree.
• A minimum spanning tree of a weighted graph G is the spanning tree of G whose edges sum to minimum weight. (If G is not connected, we can talk about a minimum spanning forest.)
• There can be more than one minimum spanning tree in a graph (consider a graph with identical edge weights).
• The minimum spanning tree problem has a long history; the first algorithm dates back at least to 1926!

Page 17

Minimum Spanning Trees (MST)
• Minimum spanning trees are always taught in algorithms courses since (1) the problem arises in many applications, (2) it is an important example where greedy algorithms always give the optimal answer, and (3) clever data structures are necessary to make the algorithms efficient.
• A set of edges is a solution if it constitutes a spanning tree, and it is feasible if it does not include a cycle. A feasible set of edges is promising if it can be completed to form an optimal solution.
• An edge touches a set if exactly one end of the edge is in it.

Page 18

Lemma
• Let G = (V, E) be a connected undirected graph where the length of each edge is given. Let V' ⊆ V and E' ⊆ E be a promising set of edges such that no edge in E' touches V'. Let e be the shortest edge that touches V'. Then E' ∪ {e} is promising.

Page 19

Kruskal's Algorithm
• Idea: use the union and find algorithms.

1. T = ∅;
2. Sort all edges by weight.
3. Make each node a singleton set.
4. For all e = (u, v) ∈ E in sorted order do:
5. If find(u) ≠ find(v) then add e to T and union(u, v)
   (else discard e).
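The steps above can be sketched with a simple union-find; this is a minimal version (path compression only, no union by rank), and the 4-vertex edge list at the bottom is a hypothetical example.

```python
def kruskal(n, edges):
    """Kruskal's MST on vertices 0..n-1; edges are (weight, u, v) triples."""
    parent = list(range(n))

    def find(x):                      # find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    T = []
    for w, u, v in sorted(edges):     # edges in increasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # endpoints in different components
            parent[ru] = rv           # union
            T.append((u, v, w))
    return T

# Hypothetical 4-vertex example.
edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
T = kruskal(4, edges)
print(T, sum(w for _, _, w in T))  # MST has n - 1 = 3 edges, total weight 7
```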

Page 20

Implementation

Page 21

Example


Page 22

Analysis of Kruskal's Algorithm
• Let m = |E|.
• O(m log m) = O(m log n) to sort the edges.
• O(n) to initialize the n sets.
• The repeat-loop is executed at most m times.
• In the worst case, O((2m + n - 1) log* n) for all the find and union operations, since there are at most 2m find operations and n - 1 union operations.
• At worst, O(m) for the remaining operations.
• Total time complexity: O(m log n).

Page 23

Exercises

• Prove that Kruskal's algorithm works correctly. The proof, which uses the lemma on the previous page, is by induction on the number of edges selected so far.

• What happens if, by mistake, we run the algorithm on a graph that is not connected?

• What is the complexity of the algorithm if the list of edges is replaced by an adjacency matrix?

Page 24

Example of Kruskal's Algorithm

Page 25

Example of Kruskal's Algorithm

Page 26

Example of Kruskal's Algorithm

Page 27

Prim's Algorithm
Select an arbitrary vertex to start.
While (there are fringe vertices)
  select the minimum-weight edge between the tree and the fringe
  add the selected edge and vertex to the tree
• The main loop of the algorithm is executed n - 1 times; each iteration takes O(n) time. Thus Prim's algorithm takes O(n²) time.
• Compare the above two algorithms according to the density of the graph G = (V, E).
• What happens to the above two algorithms if we allow edges with negative lengths?
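An O(n²) sketch of the loop above on an adjacency matrix; near[w] holds the cheapest edge weight from fringe vertex w into the growing tree, and the 4-vertex matrix is a hypothetical example.

```python
import math

def prim(n, D, start=0):
    """Prim's MST in O(n^2) on an n x n adjacency matrix D (math.inf = no edge)."""
    in_tree = [False] * n
    near = [math.inf] * n
    near[start] = 0
    parent = [-1] * n
    total = 0
    for _ in range(n):
        # select the non-tree vertex with the minimum-weight edge to the tree
        v = min((w for w in range(n) if not in_tree[w]), key=lambda w: near[w])
        in_tree[v] = True
        total += near[v]
        for w in range(n):               # update the fringe
            if not in_tree[w] and D[v][w] < near[w]:
                near[w] = D[v][w]
                parent[w] = v
    return total, parent

INF = math.inf
D = [[INF, 1, 3, INF],
     [1, INF, 2, INF],
     [3, 2, INF, 4],
     [INF, INF, 4, INF]]
print(prim(4, D))  # (7, [-1, 0, 1, 2])
```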

Page 28

Prim's Algorithm

Page 29

Example

Page 30

Example of Prim's Algorithm

Page 31

Example of Prim's Algorithm

Page 32

Correctness

Page 33

Single-Source Shortest Paths (SSSP)

• The Problem: Given an n-vertex weighted graph G = (V, E) and a vertex v in V, find the shortest paths from v to all other vertices.

• Edges in G may have positive, zero or negative weights, but there is no cycle of negative weight.

D[i, j] = ∞ if i and j are not connected; otherwise, the weight of edge (i, j).

Page 34

D&C approach
1. procedure SP(i, j, d);
2. if i ≠ j
3. then begin
4.   d := D[i, j];
5.   for k := 1 to n do d := min(d, SP(i, k) + SP(k, j));
6. end
7. else d := 0;

• Needs EXPONENTIAL time.

Page 35

Floyd's algorithm: DP approach
• Idea: let d(i, j, k) be the length of the shortest path from i to j without passing through nodes numbered > k.
1. for k := 1 to n do
2.   for i := 1 to n do
3.     for j := 1 to n do
4.       D[i, j] := min(D[i, j], D[i, k] + D[k, j]);

• Dynamic programming approach.
• Takes O(n³) time.
• Solves the all-pairs shortest path problem.
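The triple loop can be sketched directly; the 4-vertex matrix below is a hypothetical example (math.inf for missing edges, 0 on the diagonal).

```python
import math

def floyd(D):
    """Floyd-Warshall all-pairs shortest paths; modifies D in place."""
    n = len(D)
    for k in range(n):          # allow node k as an intermediate
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

INF = math.inf
D = [[0, 3, INF, 7],
     [8, 0, 2, INF],
     [5, INF, 0, 1],
     [2, INF, INF, 0]]
floyd(D)
print(D[0][3])  # 6: the path 0 -> 1 -> 2 -> 3 beats the direct edge of 7
```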

Page 36

Dijkstra's algorithm: Greedy approach
1. C := {2, 3, ..., n};
2. for i := 2 to n do near[i] := D[1, i];
3. repeat n-2 times
4.   v := some element of C minimizing near[v];
5.   C := C - {v};
6.   for each w in C do near[w] := min(near[w], near[v] + D[v, w]);

• Takes O(n²) time.
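An O(n²) sketch of the same greedy selection, returning distances from the source; the 4-vertex adjacency matrix is a hypothetical example.

```python
import math

def dijkstra(D, s=0):
    """Dijkstra's single-source shortest paths in O(n^2) on an
    adjacency matrix D (math.inf = no edge); no negative weights allowed."""
    n = len(D)
    dist = [math.inf] * n
    dist[s] = 0
    visited = [False] * n
    for _ in range(n):
        # greedy step: settle the closest unvisited vertex
        v = min((w for w in range(n) if not visited[w]), key=lambda w: dist[w])
        visited[v] = True
        for w in range(n):
            if not visited[w] and dist[v] + D[v][w] < dist[w]:
                dist[w] = dist[v] + D[v][w]
    return dist

INF = math.inf
D = [[0, 3, INF, 7],
     [8, 0, 2, INF],
     [5, INF, 0, 1],
     [2, INF, INF, 0]]
print(dijkstra(D))  # [0, 3, 5, 6]
```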

Page 37

Example of Dijkstra's Algorithm

Page 38

Example of Dijkstra's Algorithm

Page 39

Dijkstra's Shortest-Path Algorithm
1. Algorithm SHORTEST-PATH(u)
2. begin
3. for i = 1 to n do
4.   near[i] = D[u, i];
5.   P[i] = u;
6. V' = V - {u};
7. near[u] = 0;
8. while (V' is not empty) do
9.   Select v such that near[v] = min{near[w]: w ∈ V'};
10.  V' = V' - {v};
11.  for w ∈ V' do

Page 40

Algorithm Continued
12.   if (near[w] > near[v] + D[v, w])
13.   then near[w] = near[v] + D[v, w];
14.        P[w] = v; (* P[w] is the parent of w *)
15. for w ∈ V do
16.   (* print the shortest path from w to u *)
17.   print w;
18.   q = w;
19.   while (q ≠ u) do
20.     q = P[q]; print q;
21.   print u;
22. end.

Page 41

Greedy Heuristics

• Often used in situations where we can (or must) accept an approximate solution instead of an exact optimal solution.

• Graph Coloring Problem: Given G = (V, E), use as few colors as possible to color the nodes in V so that adjacent nodes have different colors.

Page 42

Greedy Solutions
• Algorithm 1:
1. Arrange the colors according to some order.
2. For each node v in V, find the smallest color which has not yet been used to paint any neighbor of v, and paint v with this color.
• Algorithm 2:
1. Choose a color and an arbitrary starting node, and then consider each other node in turn, painting it with this color if possible.
2. When no further nodes can be painted, choose a new color and a new starting node that has not yet been painted. Then repeat step 1 until every node is painted.
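Algorithm 1 can be sketched in a few lines. The bipartite graph in the demo is a hypothetical example (a crown-graph-style construction, not the figure from the slides) whose visiting order forces greedy to use 3 colors even though 2 suffice.

```python
def greedy_coloring(adj):
    """Algorithm 1: visit nodes in the given order; give each node the
    smallest color not used by an already-colored neighbor."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# Bipartite: every a_i is adjacent to every b_j with j != i, so coloring
# all a's one color and all b's another uses only 2 colors.
adj = {
    "a1": ["b2", "b3"], "b1": ["a2", "a3"],
    "a2": ["b1", "b3"], "b2": ["a1", "a3"],
    "a3": ["b1", "b2"], "b3": ["a1", "a2"],
}
coloring = greedy_coloring(adj)
print(coloring, len(set(coloring.values())))  # this visiting order needs 3 colors
```

Larger crown graphs of this shape make the gap arbitrarily bad, which is exactly the point of the example on the next pages.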

Page 43

Example of Graph Coloring

Page 44

Example of Graph Coloring

Page 45

Example of Graph Coloring

Page 46

Example of Graph Coloring
• Greedy: 5 colors
  Optimal: 2 colors
• The greedy approach may find an optimal solution in some cases, but it may also give an arbitrarily bad answer.

Page 47

Traveling Salesperson Problem
• The Problem: Find, in an undirected graph with weights on the edges, a tour (a simple cycle that includes all the vertices) with the minimum sum of edge weights.
• Algorithm:
1. Choose the edge with minimum weight.
2. Accept the edge (considered together with the already selected edges) if it
– does not cause a vertex to have degree 3 or more, and
– does not form a cycle, unless the number of selected edges equals the number of vertices of the graph.
3. Repeat the above steps until n edges have been selected.

Page 48

Example
• Edges are considered in increasing order of weight:

  weight:   3      4      5      6      7      8      9     10     11     12     15     25
  edge:   (1,2)  (3,5)  (4,5)  (2,3)  (1,5)  (2,5)  (3,4)  (1,3)  (1,4)  (2,4)  (4,6)  (1,6)

• Thus, the solution is 1-2-3-5-4-6-1 with length 58.
• The optimal solution is 1-2-3-6-4-5-1 with length 56.

Page 49

Matroids
• A pair (S, I) where S is a nonempty, finite set, and I is a family of subsets of S such that
1. ∅ ∈ I;
2. If J ∈ I and I ⊆ J, then I ∈ I (hereditary property);
3. If I, J ∈ I and |I| < |J|, then there exists an x ∈ J - I such that I ∪ {x} ∈ I (exchange property).

• Elements of I: independent sets
• Subsets of S not in I: dependent sets
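The three axioms can be checked by brute force on a small graphic matroid (S = edges, I = forests). This is an illustrative sketch with a hypothetical 4-node graph containing one triangle; it is only feasible because the graph is tiny.

```python
from itertools import combinations

def is_forest(edges):
    """Acyclicity check via union-find; edges are (u, v) pairs."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:            # endpoints already connected: a cycle
            return False
        parent[ru] = rv
    return True

# S = edges of a small graph with a triangle; I = all forests.
S = [(0, 1), (1, 2), (0, 2), (2, 3)]
Isets = [frozenset(fs) for r in range(len(S) + 1)
         for fs in combinations(S, r) if is_forest(fs)]

assert frozenset() in Isets                          # axiom 1
assert all(frozenset(sub) in Isets                   # axiom 2 (hereditary)
           for J in Isets for r in range(len(J))
           for sub in combinations(J, r))
assert all(any(A | {x} in Isets for x in B - A)      # axiom 3 (exchange)
           for A in Isets for B in Isets if len(A) < len(B))
print("graphic matroid axioms hold on this graph")
```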

Page 50

Examples
• Graphic matroid: G = (V, E) a connected undirected graph
– S := E
– I := set of forests (= acyclic subgraphs) in G
• Claim: (S, I) is a matroid.
• Proof:
1. The graph (V, ∅) is a forest.
2. Any subset of a forest is a forest.
3. Let I and J be forests with |I| < |J|:

Page 51

Example Continued
• Subclaim 1: There exist two nodes u and v that are connected in J but not in I.
• Proof: Assume by contradiction that whenever two nodes are connected in J, they are also connected in I. It follows that the connected components of J can be spanned with at most |I| edges. Since |J| > |I|, J must contain a cycle, which is a contradiction.

Page 52

Example Continued
• Subclaim 2: There exists an edge (x, y) on the path in J from u to v such that x and y belong to different connected components of I.
• Proof: Assume by contradiction that for every edge (x, y) on the path from u to v, x and y belong to the same connected component of I. Then u and v belong to the same connected component of I, a contradiction.

Thus I ∪ {(x, y)} is a forest.
• Thus, (S, I) is a matroid.