Lagrange New Qualif
$$\nabla h(x_1, x_2) = \left(\frac{\partial h}{\partial x_1}(x_1, x_2),\ \frac{\partial h}{\partial x_2}(x_1, x_2)\right) \neq (0, 0)$$
(this condition is called the constraint qualification at the point $(x_1, x_2)$).
Then there is a real number $\lambda$ such that $(x_1, x_2, \lambda)$ is a critical point of the Lagrangian function
$$L(x_1, x_2, \lambda) = f(x_1, x_2) - \lambda\,(h(x_1, x_2) - c),$$
that is,
$$\frac{\partial L}{\partial x_1} = 0, \qquad \frac{\partial L}{\partial x_2} = 0, \qquad \frac{\partial L}{\partial \lambda} = 0.$$
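As a quick illustration of these first-order conditions, here is a small sympy sketch; the concrete choices $f = x_1 x_2$, $h = x_1 + x_2$, $c = 2$ are illustrative only and are not taken from the notes.

import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)

# Illustrative choices of f, h, c (any smooth functions would do).
f = x1 * x2          # objective
h = x1 + x2          # constraint function
c = 2                # constraint level, h(x1, x2) = c

L = f - lam * (h - c)    # the Lagrangian

# The three first-order conditions dL/dx1 = dL/dx2 = dL/dlam = 0.
foc = [sp.diff(L, v) for v in (x1, x2, lam)]
print(sp.solve(foc, [x1, x2, lam], dict=True))
# -> [{lam: 1, x1: 1, x2: 1}]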
In other words, at the point $(x_1, x_2)$ the level set of $f$ and the constraint set $C$ have a common tangent, that is, their tangents have equal slopes, or, equivalently, parallel gradient vectors:
$$\nabla f(x_1, x_2) = \lambda\,\nabla h(x_1, x_2).$$
These two equations, together with the third one,
$$h(x_1, x_2) - c = 0,$$
form exactly the system $\partial L/\partial x_1 = 0$, $\partial L/\partial x_2 = 0$, $\partial L/\partial \lambda = 0$ from the Theorem.
This proof was based on the fact that the level curves of $f$ and of $h$ at $(x_1, x_2)$ are tangent and have equal slopes. We now present another version of the proof using the gradient vectors $\nabla f(x_1, x_2)$ and $\nabla h(x_1, x_2)$. Since the gradient is orthogonal to the level curve, the level curves of $f$ and $h$ at $(x_1, x_2)$ are tangent if and only if the gradients $\nabla f(x_1, x_2)$ and $\nabla h(x_1, x_2)$ line up at $(x_1, x_2)$, that is, the gradients are scalar multiples of each other:
$$\nabla f(x_1, x_2) = \lambda\,\nabla h(x_1, x_2)$$
(note that $\nabla f(x_1, x_2)$ and $\nabla h(x_1, x_2)$ can point in the same direction, in which case $\lambda > 0$, or in opposite directions, in which case $\lambda < 0$). This equality immediately implies the above conditions
$$\frac{\partial f}{\partial x_1}(x_1, x_2) - \lambda\,\frac{\partial h}{\partial x_1}(x_1, x_2) = 0,$$
$$\frac{\partial f}{\partial x_2}(x_1, x_2) - \lambda\,\frac{\partial h}{\partial x_2}(x_1, x_2) = 0.$$
One can read the condition
$$\nabla f(x_1, x_2) = \lambda\,\nabla h(x_1, x_2)$$
also in the following crazy manner: the vector $\nabla f(x_1, x_2)$ is a linear combination of the linearly independent (i.e. nonzero, and it is so because of the constraint qualification, isn't it?) vector $\nabla h(x_1, x_2)$.
Actually, the Theorem can be reformulated as follows: there exists a real number $\lambda$ such that $\nabla f(x_1, x_2) = \lambda\,\nabla h(x_1, x_2)$ and $h(x_1, x_2) = c$.
Remark 2. Note that $\nabla f(x_1, x_2)$ and $\nabla h(x_1, x_2)$ can point in the same direction, in which case $\lambda > 0$, or in opposite directions, in which case $\lambda < 0$.
Remark 3. It is seen from this proof that the necessary condition for $(x_1, x_2)$ to be a maximizer is the system of two equalities
$$\frac{\dfrac{\partial f}{\partial x_1}(x_1, x_2)}{\dfrac{\partial f}{\partial x_2}(x_1, x_2)} = \frac{\dfrac{\partial h}{\partial x_1}(x_1, x_2)}{\dfrac{\partial h}{\partial x_2}(x_1, x_2)}, \qquad h(x_1, x_2) - c = 0,$$
the equality of slopes and the constraint. The introduction of a Lagrange multiplier $\lambda$ as an additional variable looks artificial, but it makes it possible to apply to the constrained-extremum problem the same first-order condition that is used in the free-extremum problem (but for the more complicated function $L$). Note also that $\lambda$ has a certain economic meaning; we will see it later.
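The slope-equality form of Remark 3 can also be used directly, without introducing a multiplier. A small sympy sketch (the same illustrative $f$, $h$, $c$ as above, not from the notes) solves the cross-multiplied slope equation together with the constraint; cross-multiplying avoids dividing by a possibly vanishing partial derivative.

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1 * x2
h = x1 + x2
c = 2

# Equality of slopes in cross-multiplied form: f_x1 * h_x2 - f_x2 * h_x1 = 0,
# together with the constraint h - c = 0.
slope_eq = sp.diff(f, x1) * sp.diff(h, x2) - sp.diff(f, x2) * sp.diff(h, x1)
print(sp.solve([slope_eq, h - c], [x1, x2], dict=True))
# -> [{x1: 1, x2: 1}]  (the same point the Lagrangian conditions produce)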
Consider the problem of minimizing $f(x, y) = y$ subject to the constraint $y^3 = x^4$, that is $h(x, y) = y^3 - x^4$ and $c = 0$. Substituting $y = x^{4/3}$, it is easy to see that the minimizer of this one-variable minimization problem is the point $(x^*, y^*) = (0, 0)$.
We claim that there exists NO $\lambda$ for which $(x^*, y^*, \lambda) = (0, 0, \lambda)$ is a critical point of the Lagrangian
$$L(x, y, \lambda) = y - \lambda\,(y^3 - x^4).$$
Indeed, for our minimizer $(0, 0)$ there exists no $\lambda$ for which $(0, 0, \lambda)$ satisfies the second equation of the system
$$L_x(x, y, \lambda) = 4\lambda x^3 = 0; \qquad L_y(x, y, \lambda) = 1 - 3\lambda y^2 = 0; \qquad L_\lambda(x, y, \lambda) = -y^3 + x^4 = 0,$$
since at $(0, 0)$ the second equation becomes $1 = 0$. The reason is that the constraint qualification fails here: $\nabla h(0, 0) = (0, 0)$.
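A sympy check of this example (under the reconstruction above, $f(x, y) = y$ and $h(x, y) = y^3 - x^4$) confirms that the Lagrangian system has no solution at the minimizer $(0, 0)$ and that the constraint qualification fails there.

import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
L = y - lam * (y**3 - x**4)

foc = [sp.diff(L, v) for v in (x, y, lam)]
# No lam satisfies the system at x = y = 0: the second equation becomes 1 = 0.
print([eq.subs({x: 0, y: 0}) for eq in foc])    # -> [0, 1, 0]
print(sp.solve(foc, [x, y, lam], dict=True))     # -> [] (the system has no solution)
# The constraint qualification fails: grad h(0, 0) = (0, 0).
print([sp.diff(y**3 - x**4, v).subs({x: 0, y: 0}) for v in (x, y)])   # -> [0, 0]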
1.2.3 Strategy
Consider the problem: maximize $f(x, y) = 10 - x^2 - y^2$ subject to the constraint $x + y = 1$.
1. There is a naive way to solve this problem: just solve for $y$ from the constraint, $y = 1 - x$, substitute into the function $f$ to obtain
$$\varphi(x) = 10 - x^2 - (1 - x)^2,$$
and maximize this one-variable function. The solution gives the critical point $x = 1/2$, and $\varphi''(1/2) < 0$, so $(x = 1/2,\ y = 1/2)$ is a maximizer of $f$ subject to the constraint $x + y = 1$.
2. Now let us solve the problem using the Lagrange method. First we remark that the constraint qualification is satisfied: $h(x, y) = x + y$ has no critical points at all (its gradient is $(1, 1) \neq (0, 0)$ everywhere).
The Lagrangian here is
$$L(x, y, \lambda) = 10 - x^2 - y^2 - \lambda\,(x + y - 1).$$
The first-order conditions $-2x - \lambda = 0$, $-2y - \lambda = 0$, $x + y - 1 = 0$ give $x = 1/2$, $y = 1/2$, $\lambda = -1$, in agreement with the naive solution.
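Both strategies can be reproduced in a few lines of sympy; this is only a sketch of the computation described above, not part of the notes.

import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
f = 10 - x**2 - y**2

# 1. Naive substitution y = 1 - x, then maximize in one variable.
phi = f.subs(y, 1 - x)
crit = sp.solve(sp.diff(phi, x), x)                 # -> [1/2]
print(crit, sp.diff(phi, x, 2).subs(x, crit[0]))    # second derivative -4 < 0: a max

# 2. Lagrange: critical points of L = f - lam*(x + y - 1).
L = f - lam * (x + y - 1)
print(sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True))
# -> [{lam: -1, x: 1/2, y: 1/2}]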
The solution gives
$$x = 1,\ y = 0,\ \lambda = 2,\ f(1, 0) = 4; \qquad x = -1,\ y = 0,\ \lambda = -2,\ f(-1, 0) = -4.$$
Note that the constraint set is compact, so the function achieves its min and max. Thus $(-1, 0)$ is the minimizer and $(1, 0)$ is the maximizer.
Consider the problem: maximize $f(x_1, x_2) = x_1^2 x_2$ subject to $2x_1^2 + x_2^2 = 3$, with Lagrangian $L = x_1^2 x_2 - \lambda\,(2x_1^2 + x_2^2 - 3)$. Compute the partial derivatives:
$$\frac{\partial L}{\partial x_1} = 2x_1(x_2 - 2\lambda), \qquad \frac{\partial L}{\partial x_2} = x_1^2 - 2\lambda x_2, \qquad \frac{\partial L}{\partial \lambda} = -2x_1^2 - x_2^2 + 3.$$
Setting these equal to zero gives six candidates: $(0, \pm\sqrt{3})$, $(\pm 1, 1)$, $(\pm 1, -1)$. Note that the constraint set is compact, so the function achieves its min and max, and we must seek the max points among these six candidates. The computation shows that the maximum value $f = 1$ is attained at the points $(1, 1)$ and $(-1, 1)$.
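Assuming the reconstruction above (maximize $x_1^2 x_2$ subject to $2x_1^2 + x_2^2 = 3$), a short sympy sketch reproduces the six candidates and their values.

import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1**2 * x2
L = f - lam * (2*x1**2 + x2**2 - 3)

sols = sp.solve([sp.diff(L, v) for v in (x1, x2, lam)], [x1, x2, lam], dict=True)
for s in sols:
    print(s, '  f =', f.subs(s))
# Six real candidates: (0, +-sqrt(3)) with f = 0, (+-1, 1) with f = 1, (+-1, -1) with f = -1.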
Theorem 2. Let $(x^*(c), y^*(c))$ be the solution of the problem maximize $f(x, y)$ subject to $h(x, y) = c$, and let $\lambda^*(c)$ be the corresponding multiplier. Then
$$\frac{d\, f(x^*(c), y^*(c))}{dc} = \lambda^*(c).$$
Proof*. By the Lagrange theorem, for all $c$ we have
(1) $h(x^*(c), y^*(c)) = c$;
(2) $\dfrac{\partial f}{\partial x}(x^*(c), y^*(c)) = \lambda^*(c)\,\dfrac{\partial h}{\partial x}(x^*(c), y^*(c))$;
(3) $\dfrac{\partial f}{\partial y}(x^*(c), y^*(c)) = \lambda^*(c)\,\dfrac{\partial h}{\partial y}(x^*(c), y^*(c))$.
Differentiating (1) with respect to $c$ we obtain
$$\frac{\partial h}{\partial x}(x^*(c), y^*(c))\,\frac{dx^*(c)}{dc} + \frac{\partial h}{\partial y}(x^*(c), y^*(c))\,\frac{dy^*(c)}{dc} = 1.$$
Differentiating $f(x^*(c), y^*(c))$ with respect to $c$ and using (2), (3) and the last equality, we get
$$\frac{d}{dc} f(x^*(c), y^*(c)) = \frac{\partial f}{\partial x}\,\frac{dx^*}{dc} + \frac{\partial f}{\partial y}\,\frac{dy^*}{dc} = \lambda^*(c)\left(\frac{\partial h}{\partial x}\,\frac{dx^*}{dc} + \frac{\partial h}{\partial y}\,\frac{dy^*}{dc}\right) = \lambda^*(c).$$
In particular, if the constraint level changes from $c$ to $c'$, the new optimal value can be estimated as
$$f(x^*(c'), y^*(c')) \approx f(x^*, y^*) + \lambda^*\,(c' - c).$$
1. Maximize $f(x, y) = xy$ subject to the constraint $x + y = 20$.
(a) Write down the Lagrangian
$$L(x, y, \lambda) = xy - \lambda\,(x + y - 20).$$
(b) Find the partials:
$$L_x = y - \lambda; \qquad L_y = x - \lambda; \qquad L_\lambda = -x - y + 20.$$
(c) Solve the system
$$y - \lambda = 0, \qquad x - \lambda = 0, \qquad -x - y + 20 = 0 \quad\Longrightarrow\quad x = 10,\ y = 10,\ \lambda = 10.$$
(d) The maximal value is $f(10, 10) = 100$.
2. Now redo this problem, this time using the constraint $x + y = 21$. A similar solution, or
> LagrangeMultipliers(x*y, [x + y - 21], [x, y], output = detailed);
gives $x = 21/2$, $y = 21/2$, $\lambda = 21/2$, and the new maximal value is $f(21/2, 21/2) = 441/4 = 110.25$. So, increasing the constraint from 20 to 21, the maximal value increases by 10.25 (compare with the multiplier $\lambda = 10$ found in part 1).
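The notes use Maple's LagrangeMultipliers command here; the following sympy sketch performs the same computation with a symbolic constraint level c and compares the exact increase of the optimal value with the shadow-price estimate $\lambda = 10$.

import sympy as sp

x, y, lam, c = sp.symbols('x y lam c', real=True, positive=True)
L = x*y - lam * (x + y - c)

sol = sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True)[0]
value = (x*y).subs(sol)             # optimal value as a function of c: c**2/4
print(sol, value)                    # {x: c/2, y: c/2, lam: c/2}, c**2/4
print(value.subs(c, 21) - value.subs(c, 20))    # exact increase: 41/4 = 10.25
print(sol[lam].subs(c, 20))                     # shadow-price estimate: 10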
In other words, the NDCQ means that the gradient vectors $Dh_1(x^*), \ldots, Dh_m(x^*)$ are linearly independent in $\mathbb{R}^n$.
Theorem 3. Suppose $x^* = (x_1^*, \ldots, x_n^*) \in C_h$ is a local max or min of $f$ on $C_h$. Suppose also that the NDCQ is satisfied at $x^*$. Then there exists $\lambda^* = (\lambda_1^*, \ldots, \lambda_m^*) \in \mathbb{R}^m$ such that $(x^*, \lambda^*)$ is a critical point of the Lagrangian $L(x, \lambda)$, that is $\nabla L(x^*, \lambda^*) = 0$, in other words
$$\frac{\partial L}{\partial x_1}(x^*, \lambda^*) = 0,\ \ldots,\ \frac{\partial L}{\partial x_n}(x^*, \lambda^*) = 0,$$
$$\frac{\partial L}{\partial \lambda_1}(x^*, \lambda^*) = 0,\ \ldots,\ \frac{\partial L}{\partial \lambda_m}(x^*, \lambda^*) = 0.$$
The first $n$ of these equations can be written as
$$\nabla f(x^*) = \lambda_1^*\,\nabla h_1(x^*) + \ldots + \lambda_m^*\,\nabla h_m(x^*).$$
Note that this condition, together with the constraints, gives exactly $\nabla L(x^*, \lambda^*) = 0$.
Actually, the Theorem can be reformulated as follows: the vector $\nabla f(x^*)$ is a linear combination of the vectors
$$\nabla h_1(x^*),\ \ldots,\ \nabla h_m(x^*).$$
Consider the problem: minimize $f(x, y, z) = x^2 + y^2 + z^2$ subject to the constraints $x + y + z = 1$ and $y = x$.
1. Again the naive solution is possible in this case. From the second constraint we have $y = x$, and from the first constraint we have $z = 1 - x - y = 1 - 2x$. Substituting in $f$ we obtain a function of one variable, $6x^2 - 4x + 1$. Its minimizer is $x = 1/3$; thus for our problem we have the minimizer $(1/3, 1/3, 1/3)$.
2. Now the Lagrange method: the Lagrangian is
$$L(x, y, z, \lambda_1, \lambda_2) = x^2 + y^2 + z^2 - \lambda_1\,(x + y + z - 1) - \lambda_2\,(y - x).$$
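A sympy sketch of the Lagrange route for this two-constraint example (the names lambda1, lambda2 are just labels for the two multipliers):

import sympy as sp

x, y, z, l1, l2 = sp.symbols('x y z lambda1 lambda2', real=True)
L = x**2 + y**2 + z**2 - l1*(x + y + z - 1) - l2*(y - x)

foc = [sp.diff(L, v) for v in (x, y, z, l1, l2)]
print(sp.solve(foc, [x, y, z, l1, l2], dict=True))
# -> [{x: 1/3, y: 1/3, z: 1/3, lambda1: 2/3, lambda2: 0}]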
A calculation similar to the one used in the two-variable case shows that
$$\frac{\partial}{\partial a_j}\, f(x_1^*(a), \ldots, x_n^*(a)) = \lambda_j^*(a).$$
In other words, $\lambda_j^*$ measures how the optimal value is affected by a relaxation of the $j$-th constraint $a_j$.
The second-order condition allows one to check that this is a maximizer. The optimal value is $f(1, 1) = 1$.
1.3.1 Income Expansion Path
Back to the problem: maximize $f(x, y)$ subject to $h(x, y) = a$. The solution $(x^*, y^*)$ of this problem depends on $a$, so assume $x^* = x^*(a)$ and $y^* = y^*(a)$.
Let us try to write the equation of the income expansion path, the curve traced by $(x^*(a), y^*(a))$ as $a$ varies, for this problem.
For example, take $f(x, y) = xy$ and the constraint $x + 2y = a$. Then
$$L(x, y, \lambda) = xy - \lambda\,(x + 2y - a),$$
and the first-order conditions
$$L_x(x, y, \lambda) = y - \lambda = 0, \qquad L_y(x, y, \lambda) = x - 2\lambda = 0, \qquad L_\lambda(x, y, \lambda) = -(x + 2y - a) = 0$$
give $x = 2y$ and $4y = a$, that is $x^* = a/2$, $y^* = a/4$, $\lambda^* = a/4$. The income expansion path is therefore the line $x = 2y$.
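A sympy sketch of the same computation with symbolic income a; it returns the optimal bundle as a function of a and confirms that the income expansion path is the line $x = 2y$.

import sympy as sp

x, y, lam, a = sp.symbols('x y lam a', real=True, positive=True)
L = x*y - lam * (x + 2*y - a)

sol = sp.solve([sp.diff(L, v) for v in (x, y, lam)], [x, y, lam], dict=True)[0]
print(sol)                               # {x: a/2, y: a/4, lam: a/4}
print(sp.simplify(sol[x] - 2*sol[y]))    # 0: along the path, x = 2*y for every a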
Exercises
4. A manufacturing firm has budgeted $60,000 per month for labor and materials. If $x thousand is spent on labor and $y thousand is spent on materials, and if the monthly output (in units) is given by $N(x, y) = 4xy - 8x$, how should the $60,000 be allocated to labor and materials in order to maximize $N$? What is the maximum $N$?
where x is the number of units of labor and y is the number of units of capital
required to produce f (x, y) units of the product. Each unit of labor costs $50
and each unit of capital costs $100. If $500,000 has been budgeted for the
production, how should this amount be allocated between labor and capital
in order to maximize production? What is the maximum number of units
that can be produced?
(A) If $300,000 is budgeted for production of the product, determine
how this amount should be allocated to maximize production, and find the
maximum production.
(B) Find the marginal productivity of money in this case, and estimate the
increase in production if an additional $80,000 is budgeted for the production
of this product.
8. Find the maximum and minimum distance from the origin to the ellipse $x^2 + xy + y^2 = 3$. Then estimate the answer for the same problem for the ellipse $x^2 + xy + y^2 = 3.3$ using the shadow price.
10. The standard beverage can has a volume of 12 oz, or 21.66 in$^3$. What dimensions yield the minimum surface area? Find the minimum surface area.
11. Find the general expression (in terms of all the parameters) for the commodity bundle $(x_1, x_2)$ which maximizes the Cobb-Douglas utility function $U(x_1, x_2) = k x_1^a x_2^{1-a}$ on the budget set $p_1 x_1 + p_2 x_2 = I$.
12. Find the point closest to the origin in $\mathbb{R}^3$ that lies on both planes $3x + y + z = 5$ and $x + y + z = 1$.
15. Maximize the Cobb-Douglas utility function $U(x, y) = x^{0.5} y^{0.5}$ subject to the budget constraint $px + qy = I$.
$\min\ x^2 + y$ s.t. $x + y = a$.
Homework
Exercises 3, 5, 7, 10, 14.