
Greedy Algorithms

Zhengjin, Central South University


Review: Dynamic Programming

Summary of the basic idea:

Optimal substructure: an optimal solution to the problem consists of optimal solutions to subproblems.

Overlapping subproblems: few subproblems in total, but many recurring instances of each.

Solve bottom-up, building a table of solved subproblems that are then used to solve larger ones.

Variations: the "table" could be 3-dimensional, triangular, a tree, etc.


Greedy Algorithms

A greedy algorithm always makes the choice that looks best at the moment.

The hope: a locally optimal choice will lead to a globally optimal solution.

For some problems, this works well. Dynamic programming can be overkill; greedy algorithms tend to be easier to code.


Review: The Knapsack Problem

The famous knapsack problem: A thief breaks into a museum. Fabulous paintings, sculptures, and jewels are everywhere. The thief has a good eye for the value of these objects, and knows that each will fetch hundreds or thousands of dollars on the clandestine art collector's market. But the thief has brought only a single knapsack to the scene of the robbery, and can take away only what he can carry. Which items should the thief take to maximize the haul?


Review: The Knapsack Problem

More formally, the 0-1 knapsack problem:

The thief must choose among n items, where the i-th item is worth v_i dollars and weighs w_i pounds.

Carrying at most W pounds, maximize the total value. Note: assume v_i, w_i, and W are all integers, and each item must be taken or left in its entirety.

A variation, the fractional knapsack problem: the thief can take fractions of items.


Solving the Knapsack Problem

The optimal solution to the fractional knapsack problem can be found with a greedy algorithm. How?

The optimal solution to the 0-1 problem cannot be found with the same greedy strategy.

Greedy strategy: take items in order of value per pound (dollars/pound).
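A minimal sketch of this greedy strategy in Python (the function name and item representation are illustrative, not from the slides): sort items by value per pound and take as much as possible of each, splitting the one item that no longer fits entirely.

def fractional_knapsack(items, capacity):
    # items: list of (value, weight) pairs; capacity: W, the weight limit.
    # Greedy: take items in order of value per pound, taking only a fraction
    # of the first item that no longer fits in its entirety.
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)      # whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total

# Example: values 60, 100, 120 dollars; weights 10, 20, 30 pounds; W = 50.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))   # 240.0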


The Knapsack Problem: Greedy vs. Dynamic

The fractional problem can be solved greedily.

The 0-1 problem cannot be solved with a greedy approach. As you have seen, however, it can be solved with dynamic programming.
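A small illustration of why the ratio-greedy strategy fails for the 0-1 problem (the instance is the standard textbook counterexample, assumed here rather than taken from the slides):

# Items as (weight, value); ratios are 6, 5, 4 dollars per pound.
items = [(10, 60), (20, 100), (30, 120)]
W = 50

# Ratio-greedy: take whole items in order of value/weight while they fit.
taken_value = 0
remaining = W
for w, v in sorted(items, key=lambda it: it[1] / it[0], reverse=True):
    if w <= remaining:
        remaining -= w
        taken_value += v
print(taken_value)   # 160: the items of weight 10 and 20

# The 0-1 optimum is 220 (items of weight 20 and 30), so greedy is not optimal here.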


Change Making Problem

How do we make 63 cents of change using coins of denominations 25, 10, 5, and 1 so that the total number of coins is smallest?

The idea: make the locally best choice at each step. Is the resulting solution optimal?
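A minimal sketch of this greedy idea (the function name greedy_change is hypothetical): repeatedly take the largest coin that still fits.

def greedy_change(denominations, amount):
    # Greedy change making: always take the largest coin not exceeding
    # the remaining amount.
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            amount -= d
            coins.append(d)
    return coins

print(greedy_change([25, 10, 5, 1], 63))   # [25, 25, 10, 1, 1, 1] -> 6 coins, optimal here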


Greedy Algorithms

A greedy algorithm makes a locally optimal choice in the hope that this choice will lead to a globally optimal solution.

The choice made at each step must be:

Feasible: it satisfies the problem's constraints.

Locally optimal: it is the best local choice among all feasible choices.

Irrevocable: once made, the choice cannot be changed on subsequent steps.

Do greedy algorithms always yield optimal solutions? No. Example: the change-making problem with a denomination set of 11, 5, and 1, making 15 cents of change: greedy takes 11 + 1 + 1 + 1 + 1 (5 coins), whereas 5 + 5 + 5 uses only 3 coins (see the snippet below).
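Using the hypothetical greedy_change sketch from the change-making slide, this outcome can be checked directly:

print(greedy_change([11, 5, 1], 15))   # [11, 1, 1, 1, 1] -> 5 coins
# The optimal answer uses only 3 coins: [5, 5, 5], so greedy is not optimal here.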


Applications of the Greedy Strategy

Optimal solutions: change making (for suitable denominations), Minimum Spanning Tree (MST), single-source shortest paths, Huffman codes.

Approximations: Traveling Salesman Problem (TSP), knapsack problem, other optimization problems.


Minimum Spanning Tree (MST)

Spanning tree of a connected graph G: a connected acyclic subgraph (tree) of G that includes all of G's vertices.

Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight.

Example:

[Figure: a small weighted graph and its minimum spanning tree.]


Prim's MST Algorithm

Start with a tree T0 consisting of one vertex.

"Grow" the tree one vertex/edge at a time: construct a series of expanding subtrees T1, T2, ..., Tn-1. At each stage, construct Ti+1 from Ti by adding the minimum-weight edge connecting a vertex in the tree (Ti) to one not yet in the tree, chosen from the "fringe" edges (this is the "greedy" step!).

Or (another way to understand it): expand each tree Ti in a greedy manner by attaching to it the nearest vertex not in that tree (a vertex not in the tree connected to a vertex in the tree by an edge of smallest weight).

The algorithm stops when all vertices are included.


Examples

[Figure: two example weighted graphs on vertices a, b, c, d, e used to trace Prim's algorithm.]

Fringe edges: one vertex is in Ti and the other is not. Unseen edges: both vertices are not in Ti.


The Key Point

Notation: T is the expanding subtree; Q holds the remaining vertices.

At each stage, the key point of expanding the current subtree T is to determine which vertex in Q is the nearest vertex.

Q can be thought of as a priority queue. The key (priority) of each vertex, key[v], is the minimum weight of an edge from v to a vertex in T; key[v] is ∞ if v is not connected to any vertex in T.

The major operation is to find and delete the nearest vertex (the v for which key[v] is smallest among all vertices in Q).

Remove the nearest vertex v from Q and add it, together with the corresponding edge, to T.

When that happens, the keys of v's neighbors may change and must be updated.


ALGORITHM MST-PRIM(G, w, r)   // w: weight; r: root, the starting vertex

1.  for each u ∈ V[G]
2.      do key[u] ← ∞
3.         P[u] ← NULL            // P[u]: the parent of u
4.  key[r] ← 0
5.  Q ← V[G]                      // now the priority queue Q has been built
6.  while Q ≠ ∅
7.      do u ← Extract-Min(Q)     // remove the nearest vertex from Q
8.         for each v ∈ Adj[u]    // update the key of each of u's adjacent vertices
9.             do if v ∈ Q and w(u, v) < key[v]
10.                then P[v] ← u
11.                     key[v] ← w(u, v)
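A runnable Python sketch of the same idea (the graph representation and example weights are assumptions, not from the slides); it uses a min-heap with lazy deletion of stale entries in place of an explicit decrease-key on Q:

import heapq

def prim_mst(graph, root):
    # Prim's algorithm on an undirected weighted graph given as an adjacency
    # dict: graph[u] = [(v, weight), ...]. Returns the list of MST edges.
    in_tree = {root}
    mst_edges = []
    # Heap entries are (weight, tree_vertex, fringe_vertex).
    fringe = [(w, root, v) for v, w in graph[root]]
    heapq.heapify(fringe)
    while fringe and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(fringe)
        if v in in_tree:
            continue                      # stale (lazily deleted) entry
        in_tree.add(v)
        mst_edges.append((u, v, w))
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(fringe, (wx, v, x))
    return mst_edges

# Example usage on a small graph (weights chosen for illustration):
g = {
    'a': [('b', 3), ('c', 5), ('e', 6)],
    'b': [('a', 3), ('c', 1), ('d', 4)],
    'c': [('a', 5), ('b', 1), ('d', 6), ('e', 2)],
    'd': [('b', 4), ('c', 6), ('e', 8)],
    'e': [('a', 6), ('c', 2), ('d', 8)],
}
print(prim_mst(g, 'a'))   # e.g. [('a','b',3), ('b','c',1), ('c','e',2), ('b','d',4)]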


Notes about Prim's Algorithm

A priority queue is needed for locating the nearest vertex.

Using an unordered array to store the priority queue: efficiency Θ(n²).

Using a min-heap to store the priority queue: for a graph with n vertices and m edges, the efficiency is (n + m) log n = O(m log n), since there are n stages (min-heap deletions) and m edge considerations (min-heap key decreases), and each key decrease/deletion on the min-heap costs O(log n).


Another Greedy Algorithm for MST: Kruskal's

Edges are initially sorted by increasing weight.

Start with an empty forest and "grow" the MST one edge at a time; intermediate stages usually hold a forest of trees (not connected).

At each stage, add the minimum-weight edge among those not yet used that does not create a cycle. The added edge may: expand an existing tree, combine two existing trees into a single tree, or create a new tree.

We need an efficient way of detecting/avoiding cycles. The algorithm stops when all vertices are included.


Kruskal's Algorithm

ALGORITHM Kruskal(G)
// Input:  a weighted connected graph G = <V, E>
// Output: ET, the set of edges composing a minimum spanning tree of G

1. Sort E in nondecreasing order of the edge weights: w(ei1) <= ... <= w(ei|E|)
2. ET ← ∅; ecounter ← 0          // initialize the set of tree edges and its size
3. k ← 0
4. while ecounter < |V| - 1 do
       k ← k + 1
       if ET ∪ {eik} is acyclic
           ET ← ET ∪ {eik}; ecounter ← ecounter + 1
5. return ET

P314-P317 (UNION-FIND ALGORITHM)
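A runnable Python sketch of Kruskal's algorithm with a union-find structure (the names, graph representation, and example weights are illustrative, not from the textbook pages cited above):

def kruskal_mst(vertices, edges):
    # Kruskal's algorithm with a simple union-find (disjoint-set) structure.
    # vertices: iterable of vertex names; edges: list of (weight, u, v) tuples.
    parent = {v: v for v in vertices}
    rank = {v: 0 for v in vertices}

    def find(x):                       # FIND with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(x, y):                   # UNION by rank; False if already in the same tree
        rx, ry = find(x), find(y)
        if rx == ry:
            return False
        if rank[rx] < rank[ry]:
            rx, ry = ry, rx
        parent[ry] = rx
        if rank[rx] == rank[ry]:
            rank[rx] += 1
        return True

    mst = []
    for w, u, v in sorted(edges):      # edges in nondecreasing order of weight
        if union(u, v):                # acyclic iff the endpoints are in different trees
            mst.append((u, v, w))
            if len(mst) == len(parent) - 1:
                break
    return mst

# Example usage (weights chosen for illustration):
edges = [(3, 'a', 'b'), (5, 'a', 'c'), (6, 'a', 'e'), (1, 'b', 'c'),
         (4, 'b', 'd'), (6, 'c', 'd'), (2, 'c', 'e'), (8, 'd', 'e')]
print(kruskal_mst('abcde', edges))   # [('b','c',1), ('c','e',2), ('a','b',3), ('b','d',4)]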


Efficiency of Kruskal's Algorithm

For a graph with n vertices and m edges: O(n + m log n), if an efficient UNION-FIND algorithm is used.

SORT: O(m log m); FIND: O(m log n); UNION: O(n). So the efficiency of Kruskal's algorithm is O(n + m log n).


Minimum Spanning Tree - Summary

Is Prim's algorithm greedy? Why? Is Kruskal's algorithm greedy? Why?


Shortest Paths: Dijkstra's Algorithm

Shortest path problems:

All-pairs shortest paths (Floyd's algorithm).

Single-source shortest paths problem (Dijkstra's algorithm): given a weighted graph G, find the shortest paths from a source vertex s to each of the other vertices.

[Figure: example weighted graph on vertices a, b, c, d, e used to trace Dijkstra's algorithm.]


Prim's and Dijkstra's Algorithms

They generate different kinds of spanning trees. Prim's: a minimum spanning tree. Dijkstra's: a spanning tree rooted at a given source s, such that the distance from s to every other vertex is the shortest.

Different greedy strategies. Prim's: always choose the vertex closest to the tree in the priority queue Q to add to the expanding tree VT. Dijkstra's: always choose the vertex closest to the source in the priority queue Q to add to the expanding tree VT.

Different labels for each vertex. Prim's: the parent vertex and the distance from the tree to the vertex. Dijkstra's: the parent vertex and the distance from the source to the vertex.


Dijkstra's Algorithm

ALGORITHM Dijkstra(G, s)
// Input:  a weighted connected graph G = <V, E> and a source vertex s
// Output: the length d_v of a shortest path from s to v and its penultimate vertex p_v,
//         for every vertex v in V

Initialize(Q)                        // initialize the vertex priority queue
for every vertex v in V do
    d_v ← ∞; p_v ← null              // p_v: the parent of v
    Insert(Q, v, d_v)                // initialize the vertex's priority in the priority queue
d_s ← 0; Decrease(Q, s, d_s)         // update the priority of s with d_s, making it the minimum
VT ← ∅
for i ← 0 to |V| - 1 do              // produce |V| - 1 edges for the tree
    u* ← DeleteMin(Q)                // delete the minimum-priority element
    VT ← VT ∪ {u*}                   // expand the tree, choosing the locally best vertex
    for every vertex u in V - VT that is adjacent to u* do
        if d_u* + w(u*, u) < d_u
            d_u ← d_u* + w(u*, u); p_u ← u*
            Decrease(Q, u, d_u)
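A runnable Python sketch of the same algorithm (the graph representation and example weights are assumptions, not from the slides); like the Prim sketch earlier, it uses a min-heap with stale-entry skipping rather than a Decrease operation:

import heapq

def dijkstra(graph, source):
    # Dijkstra's algorithm on a graph with nonnegative edge weights, given as an
    # adjacency dict: graph[u] = [(v, weight), ...]. Returns (dist, parent) maps.
    dist = {v: float('inf') for v in graph}
    parent = {v: None for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue                   # stale entry; a shorter path was already found
        done.add(u)
        for v, w in graph[u]:
            if d + w < dist[v]:        # relax edge (u, v)
                dist[v] = d + w
                parent[v] = u
                heapq.heappush(heap, (dist[v], v))
    return dist, parent

# Example usage (weights chosen for illustration):
g = {
    'a': [('b', 3), ('d', 7)],
    'b': [('a', 3), ('c', 4), ('d', 2)],
    'c': [('b', 4), ('d', 5), ('e', 6)],
    'd': [('a', 7), ('b', 2), ('c', 5), ('e', 4)],
    'e': [('c', 6), ('d', 4)],
}
dist, parent = dijkstra(g, 'a')
print(dist)    # shortest distances from 'a'
print(parent)  # penultimate vertex on each shortest path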


Notes on Dijkstra's Algorithm

Doesn't work with negative weights. Can you give a counterexample?

Applicable to both undirected and directed graphs.

Efficiency: using an unordered array to store the priority queue, Θ(n²); using a min-heap to store the priority queue, O(m log n).


Summary

The greedy technique suggests constructing a solution to an optimization problem through a sequence of steps, each expanding the partially constructed solution obtained so far, until a complete solution to the problem is reached. On each step, the choice made must be feasible, locally optimal, and irrevocable.

Prim's algorithm is a greedy algorithm for constructing a minimum spanning tree of a weighted connected graph. It works by attaching to a previously constructed subtree a vertex closest to the vertices already in the tree.


Summary

Kruskal's algorithm is another greedy algorithm for the minimum spanning tree problem. It constructs a minimum spanning tree by selecting edges in increasing order of their weights, provided that the inclusion doesn't create a cycle.

Dijkstra's algorithm solves the single-source shortest-path problem of finding shortest paths from a given vertex (the source) to all the other vertices of a weighted graph or digraph. It works like Prim's algorithm but compares path lengths rather than edge lengths. Dijkstra's algorithm always yields a correct solution for a graph with nonnegative weights.


Homework 5(a)

Exercise 9.1: 2, 6.a
Exercise 9.2: 5
Exercise 9.3: 8