Unconstrained Optimization (Contd.) Constrained Optimization
Outline
Convex Optimization 1
Lecture 11
Unconstrained Minimization

minimize f(x)
subject to x ∈ R^n.
• Assumption 1:
• The function f is convex and continuously differentiable over R^n.
• The optimal value f* = inf_{x ∈ R^n} f(x) is finite.
• Theorem 1: Let Assumption 1 hold, and suppose that the gradients are bounded, i.e., ‖∇f(x)‖ ≤ L for all x ∈ R^n. Then, the gradient method with constant stepsize α generates a sequence {x_k} such that

lim inf_{k→∞} f(x_k) ≤ f* + αL²/2.

The proof rests on the basic iterate inequality: for any y ∈ R^n,

‖x_{k+1} − y‖² ≤ ‖x_k − y‖² − 2α_k (f(x_k) − f(y)) + α_k² ‖∇f(x_k)‖².
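As a concrete reference point, the iteration x_{k+1} = x_k − α∇f(x_k) can be sketched as follows. This is a minimal illustration; the test function f(x) = ‖x‖²/2, the starting point, and the stepsize are placeholder choices, not from the lecture.

```python
import numpy as np

def gradient_method(grad_f, x0, alpha, num_iters):
    """Run the constant-stepsize gradient iteration x_{k+1} = x_k - alpha * grad_f(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = x - alpha * grad_f(x)
    return x

# Example: f(x) = ||x||^2 / 2, so grad_f(x) = x and the minimizer is x* = 0.
x_final = gradient_method(lambda x: x, x0=[4.0, -2.0], alpha=0.1, num_iters=200)
```

Here each step contracts the distance to x* by the factor (1 − α), so the iterates approach the minimizer geometrically.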
• Q-approximation Lemma
For a continuously differentiable function f whose gradient is Lipschitz continuous with constant M, we have

f(y) ≤ f(x) + ∇f(x)^T (y − x) + (M/2) ‖y − x‖²  for all x, y ∈ R^n.
Applying the Q-approximation Lemma with x = x_k and y = x_{k+1} = x_k − α∇f(x_k) gives

f(x_{k+1}) ≤ f(x_k) − α‖∇f(x_k)‖² + (α²M/2) ‖∇f(x_k)‖²
          = f(x_k) − (α/2)(2 − αM) ‖∇f(x_k)‖².
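The per-step descent bound can be checked numerically on a simple quadratic. This is an illustrative sketch; the function f(x) = (M/2)x², the constant M = 4, and the stepsize α = 0.3 (satisfying 0 < α < 2/M) are assumed values.

```python
M = 4.0                      # Lipschitz constant of grad f for f(x) = (M/2) x^2
alpha = 0.3                  # stepsize with 0 < alpha < 2/M = 0.5
f = lambda x: 0.5 * M * x**2
grad = lambda x: M * x

x = 3.0
for _ in range(20):
    x_next = x - alpha * grad(x)
    # Descent bound: f(x+) <= f(x) - (alpha/2)(2 - alpha*M) ||grad f(x)||^2
    bound = f(x) - 0.5 * alpha * (2 - alpha * M) * grad(x) ** 2
    assert f(x_next) <= bound + 1e-12
    x = x_next
```

For a quadratic the Q-approximation bound holds with equality, so the assertion is tight up to rounding.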
∑_{k=1}^∞ ‖∇f(x_k)‖² < ∞.
lim_{k→∞} x_k = x*, where x* ∈ X*.
∑_{l=1}^∞ ‖∇f(x_l)‖² < ∞.
Therefore, adding α² ∑_{l=k+1}^∞ ‖∇f(x_l)‖² to both sides, we obtain

‖x_{k+1} − x̃‖² + α² ∑_{l=k+1}^∞ ‖∇f(x_l)‖² ≤ ‖x_k − x̃‖² + α² ∑_{l=k}^∞ ‖∇f(x_l)‖².
Define

u_k = ‖x_k − x̃‖² + α² ∑_{l=k}^∞ ‖∇f(x_l)‖².
Thus, the relation above becomes

u_{k+1} ≤ u_k,

which implies that {u_k} converges (it is nonincreasing and bounded below by 0). We already know that the tail sum ∑_{l=k}^∞ ‖∇f(x_l)‖² tends to 0 as k → ∞. Therefore, ‖x_k − x̃‖ converges, so {x_k} is bounded and a convergent subsequence exists.
Take a convergent subsequence {x_{s_k}} and let its limit point be x̄. We know that x̄ ∈ X*.
But we just showed that lim_{k→∞} ‖x_k − x̄‖ exists (apply the argument above with x̃ = x̄). Therefore, the limit must be 0, and {x_k} converges to x̄.
• Quick Summary:
• Without convexity: no convergence of the iterates, in general.
• With convexity: convergence of the iterates.
• Go back and check f(x) = 1/(1 + x²). It can't be convex.
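The suggested check can be done numerically: for the nonconvex f(x) = 1/(1 + x²), gradient descent drives the function values toward inf f = 0, yet the iterates run off to infinity rather than converge. The stepsize 0.5 and starting point 1.0 are illustrative choices.

```python
# f(x) = 1/(1+x^2) is not convex; inf f = 0 is approached only as |x| -> infinity.
f = lambda x: 1.0 / (1.0 + x**2)
grad = lambda x: -2.0 * x / (1.0 + x**2) ** 2

x = 1.0
for _ in range(100_000):
    x = x - 0.5 * grad(x)   # x_k increases without bound while f(x_k) -> 0
```

This matches the summary above: the function values approach the infimum, but without convexity (and without an attained minimizer) the iterates themselves do not converge.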
Rates of convergence
Using the basic iterate inequality with y = x*, together with strong convexity (f(x_k) − f* ≥ (m/2)‖x_k − x*‖²) and the Lipschitz gradient bound (‖∇f(x_k)‖ ≤ M‖x_k − x*‖), gives

‖x_{k+1} − x*‖² ≤ (1 − mα + α²M²) ‖x_k − x*‖²,

and therefore

‖x_{k+1} − x*‖² ≤ (1 − mα + α²M²)^{k+1} ‖x_0 − x*‖².

• Remark: The result gives geometric convergence when 0 < α < m/M².
• Geometric convergence is not bad. But how many functions are strongly convex?
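The geometric rate can be verified numerically on a strongly convex quadratic. A minimal sketch with assumed values: the matrix A, stepsize α, and starting point are illustrative, and the contraction factor 1 − mα + α²M² (for 0 < α < m/M²) comes from combining strong convexity with the Lipschitz gradient bound.

```python
import numpy as np

# Strongly convex quadratic f(x) = 0.5 x^T A x, with m = min eig(A), M = max eig(A).
A = np.diag([1.0, 4.0])
m, M = 1.0, 4.0
alpha = 0.05                            # 0 < alpha < m / M^2 = 0.0625
q = 1.0 - m * alpha + alpha**2 * M**2   # contraction factor (here q = 0.99)

x = np.array([3.0, -2.0])
dist0_sq = np.linalg.norm(x) ** 2
for k in range(1, 201):
    x = x - alpha * (A @ x)
    # Geometric decay: ||x_k - x*||^2 <= q^k ||x_0 - x*||^2 (here x* = 0)
    assert np.linalg.norm(x) ** 2 <= q**k * dist0_sq + 1e-12
```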
Constrained Minimization

minimize f(x)
subject to x ∈ X.

• Assumption 2:
• f is continuously differentiable and convex on an open set containing X.
• X is closed and convex.
• f* > −∞.
• Recall the projection properties:
• ‖P_X[x] − x‖ ≤ ‖y − x‖ for all y ∈ X (P_X[x] is the closest point of X to x).
• ‖P_X[x] − P_X[y]‖ ≤ ‖x − y‖ for all x, y ∈ R^n (nonexpansiveness).
For the projected gradient method x_{k+1} = P_X[x_k − α_k ∇f(x_k)], the basic iterate inequality holds for any y ∈ X:

‖x_{k+1} − y‖² ≤ ‖x_k − y‖² − 2α_k (f(x_k) − f(y)) + α_k² ‖∇f(x_k)‖².
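A minimal sketch of the projected gradient iteration x_{k+1} = P_X[x_k − α∇f(x_k)]. The quadratic objective, the box constraint set, and the stepsize below are illustrative choices, not from the lecture.

```python
import numpy as np

def projected_gradient(grad_f, project, x0, alpha, num_iters):
    """Run x_{k+1} = P_X[x_k - alpha * grad_f(x_k)] for a fixed number of steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x = project(x - alpha * grad_f(x))
    return x

# Example: minimize f(x) = 0.5 ||x - c||^2 over the box X = [0, 1]^2.
c = np.array([2.0, -1.0])
project = lambda x: np.clip(x, 0.0, 1.0)   # Euclidean projection onto the box
x_star = projected_gradient(lambda x: x - c, project, x0=[0.5, 0.5],
                            alpha=0.1, num_iters=200)
# The constrained minimizer is the projection of c onto X, i.e. (1, 0).
```

For a box, the Euclidean projection is just coordinate-wise clipping, which is why `np.clip` suffices here.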
lim inf_{k→∞} f(x_k) ≤ f* + αL²/2.

The rest of the argument is identical to the unconstrained case.
lim_{k→∞} x_k = x*, where x* ∈ X*.
Proofs