Math 40 Lecture 6: Linear Independence Continued and Matrix Operations
DAGAN KARP
In this lecture we continue our study of linear independence, and then discuss basic matrix operations. Recall that we ended last time with the following theorem, which characterizes linear independence in terms of a matrix of column vectors.

Theorem 1. Let $v_1, \ldots, v_k$ be column vectors in $\mathbb{R}^n$ and let $A$ be the $n \times k$ matrix $A = (\, v_1 \mid v_2 \mid \cdots \mid v_k \,)$. Then $v_1, \ldots, v_k$ are linearly dependent if and only if the homogeneous linear system with augmented matrix $(A \mid 0)$ has a nontrivial solution.

Theorem 2. Let $A$ be an $m \times n$ matrix, and consider the homogeneous linear system with augmented matrix $(A \mid 0)$. If $m < n$, then the system has infinitely many solutions. (More columns than rows means infinitely many solutions for a homogeneous system.)

Proof. Any homogeneous system is consistent, since $(x_1, \ldots, x_n) = (0, \ldots, 0)$ is always a solution. Therefore, by the Rank Theorem,
$$\#\text{ free variables} = n - \operatorname{rank}(A) \geq n - m > 0.$$
Therefore the system is consistent with at least one free variable. Thus the system has infinitely many solutions.

Corollary 3. Any set of $k$ vectors in $\mathbb{R}^n$ is linearly dependent if $k > n$.

Proof. Apply the two previous theorems: the $n \times k$ matrix of columns has more columns than rows, so the homogeneous system with augmented matrix $(A \mid 0)$ has a nontrivial solution.

We may also characterize linear independence in terms of the associated matrix of rows.

Theorem 4. Let $v_1, \ldots, v_m$ be row vectors in $\mathbb{R}^n$, and let $A$ be the $m \times n$ matrix
$$A = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_m \end{pmatrix}.$$
Then $v_1, \ldots, v_m$ are linearly dependent if and only if $\operatorname{rank}(A) < m$.

Proof. The rank of $A$ is less than $m$ if and only if a sequence of elementary row operations results in a zero row. But a sequence of elementary row operations merely replaces rows by linear combinations of rows. Therefore $\operatorname{rank}(A) < m$ if and only if one row is a linear combination of the others, i.e., the rows are linearly dependent.
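To see Theorems 1, 4, and Corollary 3 computationally, here is a minimal NumPy sketch (not part of the original notes; the vectors and the helper name are illustrative): it tests linear dependence by comparing the rank of the matrix of row vectors to the number of vectors.

```python
import numpy as np

# Illustrative sketch of Theorem 4: vectors are linearly dependent
# exactly when the rank of the matrix they form is less than their count.
def linearly_dependent(vectors):
    """vectors: a list of equal-length 1-D arrays (vectors in R^n)."""
    A = np.array(vectors, dtype=float)   # one vector per row
    return np.linalg.matrix_rank(A) < len(vectors)

# Any 4 vectors in R^3 must be dependent (Corollary 3: k > n).
vs = [np.array([1, 2, 3]),
      np.array([0, 1, 7]),
      np.array([1, 0, 0]),
      np.array([5, 5, 5])]
print(linearly_dependent(vs))   # True

# Two of the standard basis vectors of R^3 are independent.
print(linearly_dependent([np.array([1, 0, 0]), np.array([0, 1, 0])]))  # False
```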
Date: January 29, 2012. These are lecture notes for HMC Math 40: Introduction to Linear Algebra and roughly follow our course text Linear Algebra by David Poole.
Example 5. Recall our old friend, the linear system
$$\begin{aligned} x + y - 2z &= 0 \\ 2x + 2y - 3z &= 1 \\ 3x + 3y + z &= 7. \end{aligned}$$
We know this system is consistent if the columns of its augmented matrix are linearly dependent. This augmented matrix is
$$A = \begin{pmatrix} 1 & 1 & -2 & 0 \\ 2 & 2 & -3 & 1 \\ 3 & 3 & 1 & 7 \end{pmatrix}.$$
We also know that these four column vectors are dependent if and only if the matrix of rows
$$\begin{pmatrix} 1 & 2 & 3 \\ 1 & 2 & 3 \\ -2 & -3 & 1 \\ 0 & 1 & 7 \end{pmatrix}$$
has rank less than 4. But the first two rows of this matrix are equal, so row reduction produces a zero row, and the rank is less than 4. Therefore its rows are not linearly independent. Therefore the columns of $A$ are linearly dependent. Therefore our original linear system is consistent.

Remark 6. Above we used two closely related matrices. If $v_1, \ldots, v_m$ are vectors in $\mathbb{R}^n$, then we can form the $n \times m$ matrix of columns or the $m \times n$ matrix of rows. The $i$th column of one is the $i$th row of the other, and vice versa. This flipping of a matrix is important enough to have a special name.

Definition 7. Let $A = (a_{ij})$ be an $m \times n$ matrix. The transpose of $A$ is the $n \times m$ matrix $A^T$ defined by $(A^T)_{ij} = a_{ji}$. In other words, the number in row $i$ and column $j$ of $A^T$ is the number found in row $j$, column $i$ of $A$.

Example 8. We compute:
$$\begin{pmatrix} a & b \\ c & d \end{pmatrix}^T = \begin{pmatrix} a & c \\ b & d \end{pmatrix}, \qquad (1, 0)^T = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$
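As a quick computational check (an illustrative sketch, not from the notes), NumPy's `.T` attribute computes the transpose, and we can verify the defining property $(A^T)_{ij} = a_{ji}$ on the augmented matrix from Example 5:

```python
import numpy as np

# Illustrative check of Definition 7: (A^T)_{ij} = a_{ji}.
A = np.array([[1, 1, -2, 0],
              [2, 2, -3, 1],
              [3, 3,  1, 7]])    # the 3x4 augmented matrix from Example 5

AT = A.T                         # NumPy transpose: the 4x3 matrix of rows
assert AT.shape == (4, 3)
for i in range(AT.shape[0]):
    for j in range(AT.shape[1]):
        assert AT[i, j] == A[j, i]

print(np.linalg.matrix_rank(AT))  # 2, which is less than 4: rows are dependent
```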
Definition 9. A matrix $A$ is symmetric if $A = A^T$.

Remark 10. If $A$ is symmetric, then $A$ has the same number of rows and columns.

Definition 11. Let $A = (a_{ij})$ be an $m \times n$ matrix.
(1) $A$ is square if $m = n$.
(2) $A$ is diagonal if $a_{ij} = 0$ whenever $i \neq j$.
(3) $A$ is scalar if $A$ is diagonal and all of its diagonal entries equal some scalar $c$; that is, $a_{ij} = c$ when $i = j$ and $a_{ij} = 0$ otherwise.
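To make Definitions 9 and 11 concrete, here is a small NumPy sketch (illustrative, not part of the notes; the helper names are my own) that checks each property directly:

```python
import numpy as np

# Illustrative helpers (names are my own) for Definitions 9 and 11.
def is_symmetric(A):
    return A.shape[0] == A.shape[1] and np.array_equal(A, A.T)

def is_diagonal(A):
    # Compare A to the matrix built from its own diagonal entries;
    # equality means every off-diagonal entry is 0.
    return np.array_equal(A, np.diag(np.diag(A)))

def is_scalar_matrix(A):
    return is_diagonal(A) and len(set(np.diag(A))) == 1

S = np.array([[1, 2], [2, 5]])   # symmetric: S equals its transpose
D = np.diag([1, 2, 3])           # diagonal but not scalar
C = 4 * np.eye(3)                # scalar: 4 on the diagonal, 0 elsewhere

print(is_symmetric(S), is_diagonal(D), is_scalar_matrix(C))  # True True True
```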
Example 12. The identity matrix $I$ is the square diagonal matrix with all 1s on the diagonal:
$$I = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}.$$

Definition 13. Let $A = (a_{ij})$ and $B = (b_{ij})$ both be $m \times n$ matrices. The sum of $A$ and $B$ is defined by $A + B = C$, where $C = (c_{ij})$ is defined by $c_{ij} = a_{ij} + b_{ij}$.

Definition 14. Let $A = (a_{ij})$ be an $m \times n$ matrix and let $B = (b_{jk})$ be an $n \times l$ matrix. Then the product $A \cdot B$ is the $m \times l$ matrix $A \cdot B = C$, where $C = (c_{ik})$ is given by
$$c_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}.$$
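Definition 14 translates directly into code. The sketch below (illustrative, not from the notes; the function name is my own) implements the product formula with explicit loops and checks it, along with the entrywise sum of Definition 13, against NumPy's built-in operations:

```python
import numpy as np

# Illustrative sketch of Definition 14: the product entry
# c_{ik} = sum over j of a_{ij} * b_{jk}, checked against NumPy.
def mat_mul(A, B):
    m, n = A.shape
    n2, l = B.shape
    assert n == n2, "inner dimensions must agree"
    C = np.zeros((m, l))
    for i in range(m):
        for k in range(l):
            C[i, k] = sum(A[i, j] * B[j, k] for j in range(n))
    return C

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])     # 2x3
B = np.array([[1., 0.],
              [0., 1.],
              [2., 2.]])         # 3x2

assert np.allclose(mat_mul(A, B), A @ B)   # product agrees with NumPy
assert np.array_equal(A + A, 2 * A)        # entrywise sum (Definition 13)
print(mat_mul(A, B))                       # [[ 7.  8.] [16. 17.]]
```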