06 Dynamic Programming I
Dynamic programming history
Etymology.
・Dynamic programming = planning over time.
・Secretary of Defense was hostile to mathematical research.
・Bellman sought an impressive name to avoid confrontation.
Dynamic programming applications
Areas.
・Bioinformatics.
・Control theory.
・Information theory.
・Operations research.
・Computer science: theory, graphics, AI, compilers, systems, ….
Some famous dynamic programming algorithms.
・Unix diff for comparing two files.
・Viterbi for hidden Markov models.
・Smith–Waterman for genetic sequence alignment.
・Bellman–Ford for shortest path routing in networks.
SECTION 6.1-6.2
Weighted interval scheduling
Goal. Find a maximum-weight subset of mutually compatible jobs.
・Job j starts at sj, finishes at fj, and has weight vj > 0.
・Two jobs are compatible if they don't overlap.

[figure: jobs a–h on a timeline, time 0–11]
Earliest-finish-time first algorithm
Recall. The greedy algorithm (earliest finish time first) is optimal for unweighted interval scheduling, but it can fail badly once weights are added:

[figure: job a (weight = 1) finishes first, so greedy selects a and blocks the overlapping job b (weight = 999); time 0–11]
Weighted interval scheduling

Convention. Jobs are in ascending order of finish time: f1 ≤ f2 ≤ … ≤ fn.
Def. p(j) = largest index i < j such that job i is compatible with j (p(j) = 0 if no such job exists).

[figure: eight jobs on a timeline, time 0–11]
Dynamic programming: binary choice

Notation. OPT(j) = value of optimal solution to the problem consisting of jobs 1, 2, …, j.

Case 1. OPT(j) selects job j.
・Cannot use incompatible jobs { p(j) + 1, p(j) + 2, …, j – 1 }.
・Must include optimal solution to problem consisting of remaining compatible jobs 1, 2, …, p(j).

Case 2. OPT(j) does not select job j.
・Must include optimal solution to problem consisting of remaining jobs 1, 2, …, j – 1.

OPT(j) = 0                                      if j = 0
         max { vj + OPT(p(j)), OPT(j – 1) }     otherwise
Weighted interval scheduling: brute force
Compute-Opt(j)
  IF (j = 0)
    RETURN 0.
  ELSE
    RETURN max(v[j] + Compute-Opt(p[j]), Compute-Opt(j – 1)).
Weighted interval scheduling: brute force
Ex. Number of recursive calls for family of "layered" instances grows like
Fibonacci sequence.
[figure: recursion tree for Compute-Opt(5) on a layered instance; each node for j spawns subtrees for j – 1 and p(j)]
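The blow-up can be observed directly. A minimal sketch, assuming a hypothetical "layered" instance in which p(j) = j – 2 for every job:

```python
# Count the recursive calls made by the brute-force Compute-Opt on a
# layered instance where p(j) = j - 2, so the two subcalls overlap heavily.

def calls(j):
    """Number of Compute-Opt invocations triggered by a call on job j."""
    if j <= 0:
        return 1                      # base case: one call returning 0
    p = max(j - 2, 0)                 # layered instance: p(j) = j - 2
    return 1 + calls(p) + calls(j - 1)

counts = [calls(j) for j in range(12)]
# counts satisfies T(j) = T(j-1) + T(j-2) + 1, i.e. Fibonacci-like growth
```

The recurrence on the call counts is exactly the Fibonacci recurrence plus one, which is why the number of calls grows exponentially in n.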
Weighted interval scheduling: memoization
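The memoized pseudocode did not survive extraction; a sketch of the idea in Python, caching each subproblem so it is solved only once (the job data below is illustrative):

```python
from functools import lru_cache

# Jobs sorted by finish time; arrays are 1-indexed with a dummy entry at 0.
# p[j] = largest i < j such that job i is compatible with job j (0 if none).
v = [0, 2, 3, 4]        # weights of jobs 1..3
p = [0, 0, 0, 1]        # p(1) = 0, p(2) = 0, p(3) = 1

@lru_cache(maxsize=None)
def m_compute_opt(j):
    """Binary-choice recurrence with memoization: O(n) distinct subproblems."""
    if j == 0:
        return 0
    return max(v[j] + m_compute_opt(p[j]), m_compute_opt(j - 1))
```

With the cache, each of the n + 1 subproblems is evaluated once, so the exponential recursion tree collapses to linear work after sorting.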
Weighted interval scheduling: running time

Claim. The memoized version takes O(n log n) time.
・Sorting by finish time and computing p(1), …, p(n) take O(n log n).
・Each of the O(n) subproblems is then solved once, in O(1) time per subproblem.
Weighted interval scheduling: finding a solution
Find-Solution(j)
  IF (j = 0)
    RETURN ∅.
  ELSE IF (v[j] + M[p[j]] > M[j – 1])
    RETURN { j } ∪ Find-Solution(p[j]).
  ELSE
    RETURN Find-Solution(j – 1).
Weighted interval scheduling: bottom-up
Sort jobs by finish time so that f1 ≤ f2 ≤ … ≤ fn.
Compute p(1), p(2), …, p(n).

M[0] ← 0.
FOR j = 1 TO n
  M[j] ← max { vj + M[p(j)], M[j – 1] }.
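A runnable version of the bottom-up algorithm together with the Find-Solution traceback; the interval data in the usage example is illustrative:

```python
from bisect import bisect_right

def weighted_interval_scheduling(jobs):
    """jobs: list of (start, finish, weight). Returns (OPT value, chosen jobs)."""
    jobs = sorted(jobs, key=lambda t: t[1])            # sort by finish time
    n = len(jobs)
    finish = [f for _, f, _ in jobs]
    # p[j] = number of jobs finishing no later than job j starts (1-indexed)
    p = [0] + [bisect_right(finish, s) for s, _, _ in jobs]
    v = [0] + [w for _, _, w in jobs]

    M = [0] * (n + 1)
    for j in range(1, n + 1):                          # bottom-up recurrence
        M[j] = max(v[j] + M[p[j]], M[j - 1])

    # Traceback (Find-Solution): replay the binary choice backwards.
    chosen, j = [], n
    while j > 0:
        if v[j] + M[p[j]] > M[j - 1]:
            chosen.append(jobs[j - 1])
            j = p[j]
        else:
            j -= 1
    return M[n], chosen[::-1]
```

Because p(·) is found by binary search over the sorted finish times, the whole algorithm runs in O(n log n).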
6. DYNAMIC PROGRAMMING I
SECTION 6.3
Least squares
Given n points in the plane (x1, y1), (x2, y2), …, (xn, yn) with x1 < x2 < ... < xn, find a line y = ax + b that minimizes the sum of the squared errors
SSE = Σi (yi – a xi – b)².
Solution. Calculus ⇒ minimum error is achieved when
a = (n Σi xi yi – (Σi xi)(Σi yi)) / (n Σi xi² – (Σi xi)²),  b = (Σi yi – a Σi xi) / n.

[figure: a line fit through n points]
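For a single line, the minimizing slope and intercept have the well-known closed form, which a few lines of Python can check (the sample points are illustrative):

```python
def best_fit_line(pts):
    """Least-squares line y = a*x + b minimizing the sum of squared errors."""
    n = len(pts)
    sx  = sum(x for x, _ in pts)
    sy  = sum(y for _, y in pts)
    sxy = sum(x * y for x, y in pts)
    sxx = sum(x * x for x, _ in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # closed-form slope
    b = (sy - a * sx) / n                           # closed-form intercept
    return a, b
```

Fitting points that lie exactly on y = 2x + 1 recovers a = 2, b = 1 with zero error.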
Segmented least squares
Given n points in the plane: (x1, y1), (x2, y2) , …, (xn, yn) with x1 < x2 < ... < xn and a
constant c > 0, find a sequence of lines that minimizes f (x) = E + c L:
・E = the sum of the sums of the squared errors in each segment.
・L = the number of lines.
Dynamic programming: multiway choice

Notation.
・OPT(j) = minimum cost for points p1, p2, …, pj.
・e(i, j) = minimum sum of squared errors for points pi, pi+1, …, pj.

To compute OPT(j):
・Last segment uses points pi, pi+1, …, pj for some i.
・Cost = e(i, j) + c + OPT(i – 1).   [optimal substructure property; proof via exchange argument]

OPT(j) = 0                                           if j = 0
         min 1 ≤ i ≤ j { e(i, j) + c + OPT(i – 1) }  otherwise
Segmented least squares algorithm
M[0] ← 0.
FOR j = 1 TO n
  M[j] ← min 1 ≤ i ≤ j { e(i, j) + c + M[i – 1] }.
RETURN M[n].
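A direct implementation of the recurrence, computing e(i, j) from the least-squares closed form for each pair (the collinear test instance is illustrative):

```python
def segmented_least_squares(pts, c):
    """pts sorted by x-coordinate; returns the minimum of E + c*L over segmentations."""
    n = len(pts)

    def sse(i, j):
        """Least-squares error e(i, j) for points pts[i..j], 0-indexed inclusive."""
        seg = pts[i:j + 1]
        m = len(seg)
        sx  = sum(x for x, _ in seg)
        sy  = sum(y for _, y in seg)
        sxy = sum(x * y for x, y in seg)
        sxx = sum(x * x for x, _ in seg)
        denom = m * sxx - sx * sx
        if denom == 0:                       # single point: zero error
            return 0.0
        a = (m * sxy - sx * sy) / denom
        b = (sy - a * sx) / m
        return sum((y - a * x - b) ** 2 for x, y in seg)

    M = [0.0] * (n + 1)                      # M[j] = OPT over the first j points
    for j in range(1, n + 1):
        M[j] = min(sse(i - 1, j - 1) + c + M[i - 1] for i in range(1, j + 1))
    return M[n]
```

For collinear points, one segment has zero error, so the optimum is a single segment at cost c.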
Segmented least squares analysis
Theorem. The dynamic programming algorithm solves segmented least squares in O(n³) time.
Pf.
・Bottleneck = computing e(i, j) for O(n²) pairs.
・O(n) per pair using the closed-form least-squares formula. ▪
6. DYNAMIC PROGRAMMING I
SECTION 6.4
Knapsack problem

Goal. Pack the knapsack so as to maximize total value.
・There are n items: item i provides value vi > 0 and weighs wi > 0.
・The knapsack has weight limit W.

Knapsack instance (weight limit W = 11):

  i   vi   wi
  1    1    1
  2    6    2
  3   18    5
  4   22    6
  5   28    7

Def. OPT(i, w) = optimal value of the knapsack problem with items 1, …, i, subject to weight limit w.

OPT(i, w) = 0                                               if i = 0
            OPT(i – 1, w)                                   if wi > w
            max { OPT(i – 1, w), vi + OPT(i – 1, w – wi) }  otherwise
Knapsack problem: bottom-up
FOR w = 0 TO W
  M[0, w] ← 0.
FOR i = 1 TO n
  FOR w = 1 TO W
    IF (wi > w) M[i, w] ← M[i – 1, w].
    ELSE        M[i, w] ← max { M[i – 1, w], vi + M[i – 1, w – wi] }.
RETURN M[n, W].
Knapsack algorithm demo

  i   vi   wi
  1    1    1
  2    6    2
  3   18    5
  4   22    6
  5   28    7

OPT(i, w) = 0                                               if i = 0
            OPT(i – 1, w)                                   if wi > w
            max { OPT(i – 1, w), vi + OPT(i – 1, w – wi) }  otherwise

                              weight limit w
                     0  1  2  3  4  5  6  7  8  9 10 11
          {}         0  0  0  0  0  0  0  0  0  0  0  0
subset    {1}        0  1  1  1  1  1  1  1  1  1  1  1
of items  {1,2}      0  1  6  7  7  7  7  7  7  7  7  7
1, …, i   {1,2,3}    0  1  6  7  7 18 19 24 25 25 25 25
          {1,2,3,4}  0  1  6  7  7 18 22 24 28 29 29 40
          {1,2,3,4,5} 0 1  6  7  7 18 22 28 29 34 35 40
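The demo table can be reproduced with a direct bottom-up implementation of the recurrence, using the item data from the instance above:

```python
def knapsack(values, weights, W):
    """Bottom-up 0/1 knapsack; M[i][w] = OPT over items 1..i with weight limit w."""
    n = len(values)
    M = [[0] * (W + 1) for _ in range(n + 1)]     # row 0: no items, value 0
    for i in range(1, n + 1):
        vi, wi = values[i - 1], weights[i - 1]
        for w in range(1, W + 1):
            if wi > w:
                M[i][w] = M[i - 1][w]             # item i does not fit
            else:                                 # skip item i, or take it
                M[i][w] = max(M[i - 1][w], vi + M[i - 1][w - wi])
    return M

M = knapsack([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11)
```

The last entry M[5][11] = 40 corresponds to choosing items 3 and 4 (value 18 + 22, weight 5 + 6 = 11).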
SECTION 6.5
RNA secondary structure
[figure: an RNA strand folding back on itself, with complementary bases pairing up to form the secondary structure]

Free energy. Usual hypothesis is that an RNA molecule will form the
secondary structure with the minimum total free energy.
RNA secondary structure
Examples.

[figure: candidate secondary structures for the strand A U G U G G C C A A U G G G G C A U A G U U G G C C A U, with base pairs drawn between complementary bases]
RNA secondary structure: subproblems

First attempt. OPT(j) = maximum number of base pairs in a secondary structure of the substring b1 b2 … bj.
Difficulty. If bt pairs with bn, the noncrossing condition splits the problem into the two substrings b1 … bt–1 and bt+1 … bn–1, so subproblems on prefixes alone do not suffice.
Resolution. Use two-variable subproblems: OPT(i, j) = maximum number of base pairs in a secondary structure of the substring bi bi+1 … bj.
Dynamic programming over intervals

Case 1. If i ≥ j – 4: OPT(i, j) = 0 by the no-sharp-turns condition.
Case 2. Base bj is not involved in a pair: OPT(i, j) = OPT(i, j – 1).
Case 3. Base bj pairs with bt for some i ≤ t < j – 4: the noncrossing condition decouples the interval into two independent subproblems, contributing 1 + OPT(i, t – 1) + OPT(t + 1, j – 1).

Take OPT(i, j) to be the maximum over these cases.
Bottom-up dynamic programming over intervals

RNA(n, b1, …, bn)
  FOR k = 5 TO n – 1
    FOR i = 1 TO n – k
      j ← i + k.
      Compute M[i, j] using the formula.
  RETURN M[1, n].

[figure: the table M[i, j] is filled in order of increasing interval length k = j – i]
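A memoized sketch of the interval recurrence, assuming Watson–Crick pairs {A–U, C–G} and the no-sharp-turns condition (a pair (t, j) requires j – t > 4); the maximized quantity is the number of base pairs:

```python
from functools import lru_cache

def max_base_pairs(b):
    """OPT(i, j) = max base pairs in a secondary structure of b[i..j], 0-indexed."""
    pairs = {("A", "U"), ("U", "A"), ("C", "G"), ("G", "C")}

    @lru_cache(maxsize=None)
    def opt(i, j):
        if i >= j - 4:                        # no sharp turns
            return 0
        best = opt(i, j - 1)                  # case: b[j] unpaired
        for t in range(i, j - 4):             # case: b[j] pairs with b[t]
            if (b[t], b[j]) in pairs:         # noncrossing => two subintervals
                best = max(best, 1 + opt(i, t - 1) + opt(t + 1, j - 1))
        return best

    return opt(0, len(b) - 1)
```

On the strand AAAAAUUUUU the optimum is 3 nested pairs; a strand of length ≤ 5 admits no pair at all because of the sharp-turn restriction.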
Dynamic programming summary

Outline. Break up a problem into a polynomial number of subproblems such that:
・the solution to the original problem can be computed from solutions to the subproblems, and
・there is a natural ordering of subproblems from smallest to largest, with an easy-to-compute recurrence.

Techniques.
・Binary choice: weighted interval scheduling.
・Multiway choice: segmented least squares.
・Adding a new variable: knapsack problem.
・Dynamic programming over intervals: RNA secondary structure.