Greedy 2
CLRS 16.1-16.2
Greedy Algorithms
We have previously discussed how to speed up optimization problems using the technique of dynamic programming. The problem must have the optimal substructure property: the optimal solution to the problem contains within it optimal solutions to smaller subproblems. Typically the number of different subproblems is polynomial, but the recursive algorithm implementing the recursive formulation of the problem is exponential. This is because of overlapping calls to the same subproblem. Idea: If the same subproblem is solved several times, use a table to store the result of a subproblem the first time it is computed and never compute it again. Alternatively, we can think about filling up a table of subproblem solutions from the bottom. Another technique that uses the same feature of optimal substructure, but is simpler and often less powerful (and less well-defined), is greediness. Like in the case of dynamic programming, we will introduce greedy algorithms via an example.
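As a reminder, here is a minimal memoization sketch in Python; the Fibonacci example and the dictionary used as the table are illustrative assumptions, not part of these notes.

    # Memoization sketch: store each subproblem result in a table the first
    # time it is computed, and never compute it again.
    def fib(n, table=None):
        if table is None:
            table = {}
        if n in table:                     # subproblem already solved: reuse it
            return table[n]
        table[n] = n if n < 2 else fib(n - 1, table) + fib(n - 2, table)
        return table[n]

    print(fib(40))                         # fast, despite the exponential naive recursion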
1.1 Activity Selection
Problem: Given a set A = {A1, A2, ..., An} of n activities with start and finish times (si, fi), 1 ≤ i ≤ n, select a maximal set S of non-overlapping activities. One can think of the problem as corresponding to scheduling the maximal number of classes (given their start and finish times) in one classroom.

Dynamic programming solution: O(n^3) -- read the textbook.

Greedy solution:
- Sort the activities by finish time (let A1, A2, ..., An denote the sorted sequence)
- Pick the first activity A1
- Remove all activities with start time before the finish time of A1
- Recursively solve the problem on the remaining activities
Program:
    Sort A by finish time
    S = {A1}
    j = 1
    FOR i = 2 TO n DO
        IF si ≥ fj THEN
            S = S ∪ {Ai}
            j = i
        FI
    OD

Example: 11 activities sorted by finish time: (1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11), (8, 12), (2, 13), (12, 14)
[Figure: the 11 activities drawn on a time axis from 0 to 14; the greedy algorithm selects (1, 4), (5, 7), (8, 11), and (12, 14).]
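A runnable version of the pseudocode, sketched in Python; the function name activity_selection and the list-of-pairs input format are assumptions for illustration.

    # Greedy activity selection: sort by finish time, then repeatedly take
    # the first activity whose start time is at or after the last finish time.
    def activity_selection(activities):
        # activities: non-empty list of (start, finish) pairs
        ordered = sorted(activities, key=lambda a: a[1])   # sort by finish time
        selected = [ordered[0]]
        last_finish = ordered[0][1]
        for start, finish in ordered[1:]:
            if start >= last_finish:         # does not overlap the last selected activity
                selected.append((start, finish))
                last_finish = finish
        return selected

    # The 11 activities from the example above:
    A = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9),
         (6, 10), (8, 11), (8, 12), (2, 13), (12, 14)]
    print(activity_selection(A))             # [(1, 4), (5, 7), (8, 11), (12, 14)]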
Is the algorithm correct? The output is a set of non-overlapping activities, but is it the largest possible?

Proof of correctness: Given activities A = {A1, A2, ..., An} ordered by finish time, there is an optimal solution containing A1:
- Suppose S ⊆ A is an optimal solution
- If A1 ∈ S, we are done
- If A1 ∉ S:
  - Let the first activity in S be Ak
  - Make a new solution S \ {Ak} ∪ {A1} by removing Ak and using A1 instead
  - This is a valid solution, since f1 ≤ fk means A1 finishes before any remaining activity of S starts, and it has maximal size (|S \ {Ak} ∪ {A1}| = |S|)

If S is an optimal solution for A containing A1, then S' = S \ {A1} is an optimal solution for A' = {Ai ∈ A : si ≥ f1} (i.e., after choosing A1 the problem reduces to finding an optimal solution for the activities not overlapping with A1):
- Suppose we had a solution S'' to A' with |S''| > |S'| = |S| − 1
- Then S'' ∪ {A1} would be a solution to A
- Contradiction, since we would have |S'' ∪ {A1}| > |S|

Correctness follows by induction on the size of S.
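As a sanity check (not part of the proof), a small brute-force comparison in Python confirms that the greedy answer has maximum size on random instances; it reuses the activity_selection sketch above, and the helper name brute_force_size and the random instance generation are assumptions for illustration.

    # Compare the greedy answer with an exhaustive search over all subsets
    # on small random instances.
    from itertools import combinations
    import random

    def brute_force_size(activities):
        # Largest number of pairwise non-overlapping activities, by exhaustive search.
        for r in range(len(activities), 0, -1):
            for subset in combinations(activities, r):
                ordered = sorted(subset, key=lambda a: a[1])
                if all(ordered[i][1] <= ordered[i + 1][0] for i in range(len(ordered) - 1)):
                    return r
        return 0

    for _ in range(100):
        acts = [(s, s + random.randint(1, 5)) for s in random.sample(range(20), 8)]
        assert len(activity_selection(acts)) == brute_force_size(acts)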
Example: 0-1 knapsack problem: Given n items, with item i being worth $vi and having weight wi pounds, fill a knapsack of capacity W pounds with maximal value.

Fractional knapsack problem: As the 0-1 knapsack problem, but we can take fractions of items.

The problems appear very similar, but only the fractional knapsack problem can be solved greedily:
- Compute the value per pound vi/wi of each item
- Sort the items by value per pound
- Fill the knapsack greedily (take objects in this order)

This takes O(n log n) time, and it is easy to show that the solution is optimal.
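A minimal sketch of the fractional greedy in Python; the function name fractional_knapsack and the (value, weight) input format are assumptions for illustration.

    # Fractional knapsack: take items in order of decreasing value per pound,
    # taking a fraction of the last item if it does not fit entirely.
    def fractional_knapsack(items, capacity):
        # items: list of (value, weight) pairs; capacity: weight limit in pounds
        total = 0.0
        for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
            if capacity <= 0:
                break
            take = min(weight, capacity)       # take as much of this item as fits
            total += value * take / weight
            capacity -= take
        return total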
Example showing that the 0-1 knapsack problem cannot be solved greedily:

[Figure: knapsack counterexample with items of weight 10, 20, and 30 pounds; the optimal 0-1 solution (the items of weight 20 and 30) is worth $220, while greedy by value per pound (the items of weight 10 and 20) achieves only $160.]
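For concreteness, the counterexample can be run in Python, reusing the fractional_knapsack sketch above; the capacity (50 pounds) and the item values ($60, $100, $120) are assumptions taken from the textbook's version of the figure.

    # 0-1 greedy by value per pound vs. the optimal 0-1 choice
    # (items as (value, weight); numbers assumed from the CLRS example).
    items = [(60, 10), (100, 20), (120, 30)]
    capacity = 50

    # Greedy: take whole items in order of value per pound while they fit.
    value, room = 0, capacity
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if w <= room:
            value, room = value + v, room - w
    print(value)                               # 160 (items of weight 10 and 20)

    # The optimal 0-1 choice is the items of weight 20 and 30: 100 + 120 = 220.
    print(fractional_knapsack(items, capacity))  # 240.0 for the fractional version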