Proofs of Norms
\[
\|x + h\| - \|x\| \le \|h\|
= \Big\| \sum_{i=1}^{n} h_i e_i \Big\|
\le \sum_{i=1}^{n} |h_i| \, \|e_i\|
\le \max_{1 \le i \le n} |h_i| \sum_{i=1}^{n} \|e_i\|
= M \|h\|_\infty ,
\]
It follows that
\[
M_1 \le \frac{\|y\|}{\|y\|_\infty} \le M_2 ,
\]
or, equivalently,
\[
M_1 \|y\|_\infty \le \|y\| \le M_2 \|y\|_\infty .
\]
Since the result is clearly true for y = 0, the proof is complete.
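As a concrete illustration (ours, and assuming the $\infty$-norm reading of the display above), take $\| \cdot \|$ to be the $1$-norm on $\mathbb{C}^n$; then one may take $M_1 = 1$ and $M_2 = n$. The short NumPy snippet below spot-checks this on a random vector.

    import numpy as np

    # A concrete instance (ours) of the equivalence just proved: for the 1-norm
    # on C^n one may take M_1 = 1 and M_2 = n, i.e.
    #     ||y||_inf <= ||y||_1 <= n * ||y||_inf   for all y.
    rng = np.random.default_rng(6)
    n = 5
    y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

    inf_norm = np.linalg.norm(y, np.inf)
    one_norm = np.linalg.norm(y, 1)
    print(inf_norm <= one_norm <= n * inf_norm)   # True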
Example 4.1.8. For any $A = [a_{i,j}] \in \mathbb{C}^{m \times n}$ define the matrix max norm via
\[
\|A\|_{\max} := \max_{1 \le i \le m} \max_{1 \le j \le n} |a_{i,j}| ,
\]
and the Frobenius norm via
\[
\|A\|_F := \Bigg( \sum_{i=1}^{m} \sum_{j=1}^{n} |a_{i,j}|^2 \Bigg)^{1/2} .
\]
These functions are bona fide norms, as the reader should confirm. The matrix
max norm is not sub-multiplicative; the Frobenius norm is sub-multiplicative.
It can also be shown that the Frobenius norm is consistent with respect to the
$\ell^2(\mathbb{C}^m)$ and $\ell^2(\mathbb{C}^n)$ norms. The reader should prove this.
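The claims in this example are easy to spot-check numerically. The NumPy sketch below (our own illustration, not part of the text) exhibits a counterexample to sub-multiplicativity of the max norm and checks consistency of the Frobenius norm on a random matrix; the helper names max_norm and fro_norm are ad hoc.

    import numpy as np

    def max_norm(A):
        return np.max(np.abs(A))          # matrix max norm

    def fro_norm(A):
        return np.linalg.norm(A, 'fro')   # Frobenius norm

    # The max norm is not sub-multiplicative: for the all-ones 2x2 matrix J,
    # ||J J||_max = 2 > 1 = ||J||_max * ||J||_max.
    J = np.ones((2, 2))
    print(max_norm(J @ J), max_norm(J) * max_norm(J))   # 2.0 vs 1.0

    # Consistency of the Frobenius norm with the l2 vector norms:
    # ||A x||_2 <= ||A||_F ||x||_2 for a random A and x.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 4)) + 1j * rng.standard_normal((5, 4))
    x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    print(np.linalg.norm(A @ x) <= fro_norm(A) * np.linalg.norm(x))  # True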
Definition 4.1.9. Suppose that $\| \cdot \|_{\mathbb{C}^m} \colon \mathbb{C}^m \to \mathbb{R}$ and $\| \cdot \|_{\mathbb{C}^n} \colon \mathbb{C}^n \to \mathbb{R}$ are
vector norms. The function $\| \cdot \| \colon \mathbb{C}^{m \times n} \to \mathbb{R}$ defined by
\[
\|A\| := \sup_{x \in \mathbb{C}^n_\star} \frac{\|A x\|_{\mathbb{C}^m}}{\|x\|_{\mathbb{C}^n}} , \quad \forall\, A \in \mathbb{C}^{m \times n} ,
\]
is called the induced matrix norm with respect to $\| \cdot \|_{\mathbb{C}^m}$ and $\| \cdot \|_{\mathbb{C}^n}$.
Whenever $n = m$, it is understood that the vector norms are the same.
We say that $\| \cdot \| \colon \mathbb{C}^{n \times n} \to \mathbb{R}$ is the induced matrix norm with respect to the
vector norm $\| \cdot \| \colon \mathbb{C}^n \to \mathbb{R}$ if and only if
\[
\|A\| := \sup_{x \in \mathbb{C}^n_\star} \frac{\|A x\|}{\|x\|} , \quad \forall\, A \in \mathbb{C}^{n \times n} .
\]
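Definition 4.1.9 can be illustrated numerically. The sketch below (ours) estimates the supremum in the definition for the 2-norm by sampling random nonzero vectors and compares the result with the induced 2-norm that NumPy computes directly.

    import numpy as np

    # Illustration of Definition 4.1.9 (our own sketch): estimate the induced
    # 2-norm of A by sampling the ratio ||Ax|| / ||x|| over random nonzero x,
    # and compare with the value NumPy computes directly.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3))

    ratios = []
    for _ in range(20000):
        x = rng.standard_normal(3)
        ratios.append(np.linalg.norm(A @ x) / np.linalg.norm(x))

    print(max(ratios))              # lower bound approaching sup ||Ax|| / ||x||
    print(np.linalg.norm(A, 2))     # induced 2-norm (largest singular value)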
Hence
\[
\frac{\|A w\|_{\ell^2(\mathbb{C}^m)}^2}{\|w\|_{\ell^2(\mathbb{C}^n)}^2} = 0 .
\]
As $B = A^H A$ is Hermitian, it has an orthonormal basis of eigenvectors
$S = \{ w_1, \ldots, w_n \} \subset \mathbb{C}^n$.
where $\max_{1 \le i \le n} \lambda_i =: \lambda_{\max}$. Since the far right hand side is independent of $x$,
\[
\|A\|_2 \le \max_{1 \le i \le n} \sqrt{\lambda_i} = \sqrt{\max_{1 \le i \le n} \lambda_i} .
\]
where
\[
S^{n-1}_{\mathbb{C}^n} = \{ x \in \mathbb{C}^n \mid \|x\|_{\mathbb{C}^n} = 1 \} .
\]
since the supremum (i.e., the least upper bound) is an upper bound. Hence,
and, similarly,
Therefore,
\[
\|A\| \le \frac{M_2}{C_1} \, \|A\|_2 ,
\]
and the previous theorem gives the desired result.
(Equivalent Definition:) Using non-negative homogeneity,
\[
\|A\| = \sup_{x \in \mathbb{C}^n_\star} \frac{\|A x\|_{\mathbb{C}^m}}{\|x\|_{\mathbb{C}^n}}
= \sup_{x \in \mathbb{C}^n_\star} \frac{1}{\|x\|_{\mathbb{C}^n}} \, \|A x\|_{\mathbb{C}^m}
= \sup_{x \in \mathbb{C}^n_\star} \left\| A \, \frac{x}{\|x\|_{\mathbb{C}^n}} \right\|_{\mathbb{C}^m}
= \sup_{w \in S^{n-1}_{\mathbb{C}^n}} \|A w\|_{\mathbb{C}^m} .
\]
Since
\[
\|A\| = \sup_{w \in S^{n-1}_{\mathbb{C}^n}} \|A w\|_{\mathbb{C}^m} ,
\]
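The characterization of $\|A\|_2$ through the eigenvalues of $A^H A$ sketched in the fragments above is easy to check numerically; the NumPy snippet below (ours) compares $\sqrt{\lambda_{\max}(A^H A)}$ with the directly computed induced 2-norm.

    import numpy as np

    # Check (numerically, on one random matrix) that the induced 2-norm of A
    # equals the square root of the largest eigenvalue of B = A^H A.
    rng = np.random.default_rng(2)
    A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

    B = A.conj().T @ A                      # Hermitian positive semi-definite
    lam = np.linalg.eigvalsh(B)             # real eigenvalues, ascending order
    print(np.sqrt(lam[-1]))                 # sqrt(lambda_max)
    print(np.linalg.norm(A, 2))             # induced 2-norm, same value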
4.2 Conditioning
Definition 4.2.1 (condition number). Suppose that $A \in \mathbb{C}^{n \times n}$ is invertible.
The condition number of $A$ with respect to the matrix norm $\| \cdot \| \colon \mathbb{C}^{n \times n} \to \mathbb{R}$
is
\[
\kappa(A) := \|A\| \, \|A^{-1}\| .
\]
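In code, Definition 4.2.1 reads as follows; this is our own illustration, which evaluates $\kappa(A)$ in the induced 1-, 2-, and $\infty$-norms and compares with NumPy's built-in routine.

    import numpy as np

    # Definition 4.2.1 in code (our illustration): kappa(A) = ||A|| * ||A^{-1}||,
    # evaluated in the induced 1-, 2-, and infinity-norms and compared with
    # NumPy's built-in condition number routine.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])
    Ainv = np.linalg.inv(A)

    for p in (1, 2, np.inf):
        kappa = np.linalg.norm(A, p) * np.linalg.norm(Ainv, p)
        print(p, kappa, np.linalg.cond(A, p))   # the two values agree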
Before we get on to the meaning and utility of the condition number, let
us get some elementary properties out of the way.
Proposition 4.2.2. Suppose that $\| \cdot \| \colon \mathbb{C}^{n \times n} \to \mathbb{R}$ is an induced matrix norm
with respect to the vector norm $\| \cdot \| \colon \mathbb{C}^n \to \mathbb{R}$, and $A \in \mathbb{C}^{n \times n}$ is invertible.
Then
\[
\|A\| \, \|A^{-1}\| =: \kappa(A) \ge 1 .
\]
Furthermore,
\[
\|A - B\| \ge \frac{1}{\|A^{-1}\|} ,
\]
for any $B \in \mathbb{C}^{n \times n}$ that is singular. Consequently,
\[
\frac{1}{\kappa(A)} \le \inf_{\det(B) = 0} \frac{\|A - B\|}{\|A\|} .
\]
Proof. Exercise.
There are some nice formulae for and estimates of the condition number
with respect to the induced matrix 2-norm:
Proposition 4.2.3. Suppose that $A \in \mathbb{C}^{n \times n}$ is invertible and $\| \cdot \|_2 \colon \mathbb{C}^{n \times n} \to \mathbb{R}$
is the induced matrix 2-norm.
1. Then, if the singular values of $A$ are $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_n > 0$,
\[
\kappa_2(A) := \|A\|_2 \, \|A^{-1}\|_2 = \frac{\sigma_1}{\sigma_n} .
\]
4.
\[
\frac{1}{\kappa_2(A)} = \inf_{\det(B) = 0} \frac{\|A - B\|_2}{\|A\|_2} .
\]
5. If, additionally, it is assumed that $A$ is Hermitian, then
\[
\kappa_2(A) = \frac{\max_{\lambda \in \sigma(A)} |\lambda|}{\min_{\lambda \in \sigma(A)} |\lambda|} .
\]
Proof. Exercise.
The first and last formulae give an easy geometric interpretation of the
condition number. It is the ratio of the maximal stretching to the minimal
stretching under the action of the matrix A.
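Items 1 and 5 of Proposition 4.2.3 are easy to verify numerically; the sketch below (ours) does so for a random matrix and for a Hermitian one built from it.

    import numpy as np

    # Illustration (ours) of items 1 and 5 of Proposition 4.2.3: kappa_2 equals
    # sigma_1 / sigma_n, and for a Hermitian matrix it also equals the ratio of
    # the largest to the smallest eigenvalue magnitude.
    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4))
    sigma = np.linalg.svd(A, compute_uv=False)      # singular values, descending
    print(sigma[0] / sigma[-1], np.linalg.cond(A, 2))

    H = A + A.T                                     # a real symmetric (Hermitian) matrix
    lam = np.linalg.eigvalsh(H)
    print(np.max(np.abs(lam)) / np.min(np.abs(lam)), np.linalg.cond(H, 2))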
Definition 4.2.4. Suppose that $A \in \mathbb{C}^{n \times n}$ is invertible and $b \in \mathbb{C}^n$ is given.
The residual vector with respect to $x_0 \in \mathbb{C}^n$ is defined as
\[
r = r(x_0) := b - A x_0 = A(x - x_0) ,
\]
where $x = A^{-1} b$, and the error vector is
\[
e = e(x_0) := x - x_0 .
\]
Consequently,
\[
A e = r .
\]
It often happens that we have obtained an approximate solution x 0 2 Cn .
We would like to have some measure of the error, but a direct measurement
of the error would require the exact solution x. The next best thing is the
residual, which is an indirect measurement of the error, as the last definition
suggests. The next theorem tells us how useful the residual is in determining
the relative size of the error.
Theorem 4.2.5. Suppose that $A \in \mathbb{C}^{n \times n}$ is invertible, $b \in \mathbb{C}^n_\star$ is given, and
$x = A^{-1} b$. Assume that $\| \cdot \| \colon \mathbb{C}^{n \times n} \to \mathbb{R}$ is the induced matrix norm with
respect to the vector norm $\| \cdot \| \colon \mathbb{C}^n \to \mathbb{R}$. Then
\[
\frac{1}{\kappa(A)} \, \frac{\|r\|}{\|b\|} \le \frac{\|e\|}{\|x\|} \le \kappa(A) \, \frac{\|r\|}{\|b\|} .
\]
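A numerical illustration (ours) of Theorem 4.2.5, using the nearly singular matrix that also appears in Problem 15 below: for an ill-conditioned matrix a tiny relative residual is compatible with a much larger relative error, yet the error always stays within the stated bounds.

    import numpy as np

    # Illustration (ours) of Theorem 4.2.5: a small relative residual can hide a
    # relative error larger by roughly a factor of kappa(A).
    A = np.array([[1.0000, 2.0000],
                  [1.0001, 2.0000]])
    x = np.array([1.0, 1.0])           # chosen exact solution
    b = A @ x

    x0 = x + np.array([2e-4, -1e-4])   # a perturbed "approximate solution"
    r = b - A @ x0                     # residual
    e = x - x0                         # error

    kappa = np.linalg.cond(A, 2)
    rel_res = np.linalg.norm(r) / np.linalg.norm(b)
    rel_err = np.linalg.norm(e) / np.linalg.norm(x)
    print(kappa)                                            # about 5e4
    print(rel_res, rel_err)                                 # tiny residual, larger error
    print(rel_res / kappa <= rel_err <= kappa * rel_res)    # True

The point of the experiment is that judging an approximate solution by its residual alone is only safe when the condition number is modest.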
The following argument shows that, if $M \in \mathbb{C}^{n \times n}$ satisfies $\|M\| < 1$ in an induced matrix norm, then $I_n - M$ is invertible, $\|(I_n - M)^{-1}\| \le 1/(1 - \|M\|)$, and $(I_n - M)^{-1} = \sum_{k=0}^{\infty} M^k$ (the Neumann series).
Proof. Using the reverse triangle inequality and consistency, since $\|M\| < 1$,
for any $x \in \mathbb{C}^n$,
\[
\|(I_n - M) x\| \ge \|x\| - \|M x\| \ge (1 - \|M\|) \|x\| .
\]
This inequality implies that, if $(I_n - M) x = 0$, then $x = 0$. Therefore, $I_n - M$
is invertible.
To obtain the norm estimate notice that
\[
1 = \|I_n\| = \|(I_n - M)(I_n - M)^{-1}\|
= \|(I_n - M)^{-1} - M (I_n - M)^{-1}\|
\ge \|(I_n - M)^{-1}\| - \|M\| \, \|(I_n - M)^{-1}\| ,
\]
where we have used the reverse triangle inequality for the matrix norm, which
holds just as it does for vector norms, and sub-multiplicativity. The upper bound on
the quantity $\|(I_n - M)^{-1}\|$ now follows.
Finally, consider
\[
S_\ell := \sum_{k=0}^{\ell} M^k .
\]
We will show that $S_\ell (I_n - M) \to I_n$ as $\ell \to \infty$. Indeed,
\[
S_\ell (I_n - M) = \sum_{k=0}^{\ell} M^k (I_n - M) = \sum_{k=0}^{\ell} M^k - \sum_{k=0}^{\ell} M^{k+1} = I_n - M^{\ell + 1} ,
\]
If $S \in \mathbb{C}^{n \times n}$ is invertible and $T \in \mathbb{C}^{n \times n}$ satisfies
\[
\|S^{-1}\| \, \|S - T\| < 1 ,
\]
then $T$ is invertible.
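The Neumann series argument above lends itself to a quick numerical check; the sketch below (ours) forms the partial sums $S_\ell$ for a matrix with $\|M\|_2 = 0.5$ and confirms both the convergence to $(I_n - M)^{-1}$ and the norm bound.

    import numpy as np

    # Numerical sketch (ours) of the Neumann series: for ||M|| < 1 the partial
    # sums S_l = sum_{k=0}^{l} M^k converge to (I - M)^{-1}, and
    # ||(I - M)^{-1}|| <= 1 / (1 - ||M||).
    rng = np.random.default_rng(4)
    M = rng.standard_normal((3, 3))
    M *= 0.5 / np.linalg.norm(M, 2)            # rescale so that ||M||_2 = 0.5

    I = np.eye(3)
    exact = np.linalg.inv(I - M)

    S = np.zeros((3, 3))
    Mk = np.eye(3)
    for k in range(60):                        # partial sum S_59
        S += Mk
        Mk = Mk @ M
    print(np.linalg.norm(S - exact, 2))        # essentially zero
    print(np.linalg.norm(exact, 2) <= 1 / (1 - np.linalg.norm(M, 2)))   # True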
implies
\[
\|(I_n - M)^{-1}\| \le \frac{1}{1 - \|M\|} \le \frac{1}{1 - \|A^{-1}\| \, \|\Delta A\|} .
\]
Now
\[
\Delta x = (A + \Delta A)^{-1} b - A^{-1} b
= (I_n - M)^{-1} A^{-1} b - A^{-1} b
= (I_n - M)^{-1} \big( A^{-1} b - (I_n - M) A^{-1} b \big)
= (I_n - M)^{-1} M A^{-1} b
= (I_n - M)^{-1} M x .
\]
Consequently,
\[
\|\Delta x\| \le \|(I_n - M)^{-1}\| \cdot \|M\| \cdot \|x\|
\le \frac{\|A^{-1}\| \, \|\Delta A\|}{1 - \|A^{-1}\| \, \|\Delta A\|} \, \|x\|
= \frac{\kappa(A) \frac{\|\Delta A\|}{\|A\|}}{1 - \kappa(A) \frac{\|\Delta A\|}{\|A\|}} \, \|x\| .
\]
To end the discussion, let us see what happens when we perturb both A
and b.
\[
\|M A^{-1} b\| = \|M x\| \le \|M\| \, \|x\| \le \|A^{-1}\| \cdot \|\Delta A\| \cdot \|x\| = \kappa(A) \, \frac{\|\Delta A\|}{\|A\|} \, \|x\|
\]
and
\[
\|A^{-1} \Delta b\| \le \|A^{-1}\| \, \|\Delta b\| \le \kappa(A) \, \frac{\|\Delta b\|}{\|A x\|} \, \|x\| = \kappa(A) \, \frac{\|\Delta b\|}{\|b\|} \, \|x\| .
\]
The previous three inequalities, when combined, yield
\[
\|\Delta x\| \le \frac{\kappa(A)}{1 - \kappa(A) \frac{\|\Delta A\|}{\|A\|}} \left( \frac{\|\Delta b\|}{\|b\|} + \frac{\|\Delta A\|}{\|A\|} \right) \|x\| ,
\]
as we intended to show.
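The combined perturbation bound just derived can be checked numerically; the sketch below (ours) perturbs both $A$ and $b$ by small random amounts, solves the perturbed system, and compares the actual change in the solution with the bound.

    import numpy as np

    # Numerical check (ours) of the combined perturbation bound: perturb both A
    # and b, solve the perturbed system, and compare the actual change in x with
    # the bound involving kappa(A).
    rng = np.random.default_rng(5)
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    x = np.array([1.0, -2.0])
    b = A @ x

    dA = 1e-3 * rng.standard_normal((2, 2))   # small perturbation of A
    db = 1e-3 * rng.standard_normal(2)        # small perturbation of b
    x_pert = np.linalg.solve(A + dA, b + db)
    dx = x_pert - x

    kappa = np.linalg.cond(A, 2)
    rA = np.linalg.norm(dA, 2) / np.linalg.norm(A, 2)
    rb = np.linalg.norm(db) / np.linalg.norm(b)
    bound = kappa / (1.0 - kappa * rA) * (rb + rA) * np.linalg.norm(x)
    print(np.linalg.norm(dx) <= bound)        # True, provided kappa * rA < 1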
4.3 Problems
1. Suppose that $\| \cdot \| \colon \mathbb{C}^{m \times n} \to \mathbb{R}$ is the induced norm with respect to the
vector norms $\| \cdot \|_{\mathbb{C}^m}$ and $\| \cdot \|_{\mathbb{C}^n}$, and $A \in \mathbb{C}^{m \times n}$. Prove that the function
$\|A (\, \cdot \,)\|_{\mathbb{C}^m} \colon \mathbb{C}^n \to \mathbb{R}$ is uniformly continuous. Use this fact to prove that
there is a vector $x \in S^{n-1}_{\mathbb{C}^n}$ such that
\[
\|A\| = \|A x\|_{\mathbb{C}^m} .
\]
(b) Show that for any invertible matrix $A$, the upper bound for $\frac{\|\Delta x\|}{\|x\|}$
(a)
\[
\frac{1}{\sqrt{n}} \, \|A\|_2 \le \|A\|_1 \le \sqrt{n} \, \|A\|_2 .
\]
(b) If $A$ is nonsingular,
\[
\frac{1}{n} \le \frac{\kappa_1(A)}{\kappa_2(A)} \le n .
\]
13. Suppose that $\| \cdot \| \colon \mathbb{C}^{n \times n} \to \mathbb{R}$ is the induced norm with respect to
the vector norm $\| \cdot \| \colon \mathbb{C}^n \to \mathbb{R}$. Suppose that $A, B \in \mathbb{C}^{n \times n}$, where $A$ is
non-singular and $B$ is singular. Prove that
\[
\frac{1}{\kappa(A)} \le \frac{\|A - B\|}{\|A\|} .
\]
15. Suppose
\[
A = \begin{bmatrix} 1.0000 & 2.0000 \\ 1.0001 & 2.0000 \end{bmatrix} .
\]
16. Suppose
\[
A = \begin{bmatrix} a & b \\ b & a \end{bmatrix} ,
\]
where $a$ and $b$ are real numbers. Show that the subordinate matrix norms
satisfy $\|A\|_1 = \|A\|_2 = \|A\|_\infty$.
18. Suppose that $\| \cdot \| \colon \mathbb{C}^{n \times n} \to \mathbb{R}$ is the induced norm with respect to the
vector norm $\| \cdot \| \colon \mathbb{C}^n \to \mathbb{R}$. Show that if $\lambda$ is an eigenvalue of $A^H A$,
where $A \in \mathbb{C}^{n \times n}$, then
\[
0 \le \lambda \le \|A^H\| \, \|A\| .
\]
21. Suppose that $A \in \mathbb{C}^{n \times n}$ is invertible. Use the results of problems 18 and
19 to show that
\[
\kappa_2(A) \le \sqrt{\kappa_1(A) \, \kappa_\infty(A)} .
\]