Test 3 MAT 1341C, March 31, 2012
(1) (6 pts) Are the following statements true or false? Answer with T for true and F for false.
(a) One calls vectors $X_1, X_2, \dots, X_k$ linearly independent if the following condition holds: Whenever all scalars $s_1, \dots, s_k$ are zero, then $s_1X_1 + s_2X_2 + \cdots + s_kX_k = 0$.
My answer:
(b) Any linearly independent subset of $\mathbb{R}^n$ is orthogonal.
My answer:
(c) Every set of 3 vectors in $\mathbb{R}^n$ is a spanning set of a three-dimensional subspace of $\mathbb{R}^n$.
My answer:
(d) If $A$ is an $n \times n$ matrix whose columns span $\mathbb{R}^n$, then $A^T$ is invertible.
My answer:
(e) Any 3 vectors in $\mathbb{R}^2$ will be linearly dependent.
My answer:
(f) If $X_1, X_2$ are linearly independent and $Y$ is not a linear combination of $X_1, X_2$, then $\{X_1, X_2, Y\}$ can be linearly dependent.
My answer:
Solution: (a) This is false. The definition of linear independence is the opposite: Whenever $s_1X_1 + s_2X_2 + \cdots + s_kX_k = 0$, then all scalars $s_1, \dots, s_k$ are zero.
(b) This is false. For example, $\begin{pmatrix}1\\0\end{pmatrix}$ and $\begin{pmatrix}1\\2\end{pmatrix}$ are linearly independent in $\mathbb{R}^2$, but they are not orthogonal since their dot product is $1 \neq 0$.
(2) (b) (1 pt) Let $U$ be a subspace of $\mathbb{R}^n$ with $\dim U = 4$. Then $U^{\perp\perp} = \underline{\qquad}$ and $\dim U^{\perp\perp} = \underline{\qquad}$.
(c) (1 pt) Suppose $X_1, X_2, X_3$ is a basis of a subspace $U$ in $\mathbb{R}^5$. Then the orthogonal basis $F_1, F_2, F_3$ of $U$ obtained from $X_1, X_2, X_3$ by the Gram-Schmidt algorithm is ... (give the formula)
(d) (3 pts) Let $\alpha$, $\beta$, $\gamma$ be the third last, second last and last digit of your student number. Which one(s) of the following subsets of $\mathbb{R}^3$ is (are) a basis of $\mathbb{R}^3$?
$$B_1 = \left\{ \begin{pmatrix}1\\1\\0\end{pmatrix}, \begin{pmatrix}2\\2\\0\end{pmatrix}, \begin{pmatrix}0\\0\\\alpha\end{pmatrix} \right\}, \quad B_2 = \left\{ \begin{pmatrix}0\\1\\0\end{pmatrix}, \begin{pmatrix}2\\\beta\\0\end{pmatrix}, \begin{pmatrix}0\\0\\1\end{pmatrix} \right\}, \quad B_3 = \left\{ \begin{pmatrix}1\\2\\0\end{pmatrix}, \begin{pmatrix}0\\1\\2\end{pmatrix}, \begin{pmatrix}1\\1\\1\end{pmatrix}, \begin{pmatrix}1\\0\\\gamma\end{pmatrix} \right\}$$
No justification required.
My answer:
Solution: (a) Since the homogeneous linear system $AX = 0$ has 5 variables and 3 basic solutions, the rank of the matrix $A$ is 2. But the rank of $A$ equals the dimension of the row space of $A$. Thus $\dim\operatorname{row}(A) = 5 - 3 = 2$.
(b) By Theorem 3 in 4.6,
$$\dim U^{\perp\perp} = \dim U = 4,$$
since $U^{\perp\perp} = U$.
Marking: 1 point for each correct answer
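The rank count in (a) can be checked mechanically. Below is a minimal sympy sketch; the matrix A is a made-up example with 5 variables whose system $AX = 0$ happens to have 3 basic solutions, not the matrix from the test.

from sympy import Matrix

# Hypothetical coefficient matrix: AX = 0 has 5 variables (columns).
A = Matrix([[1, 0, 2, 3, 4],
            [0, 1, 5, 6, 7]])

basic_solutions = A.nullspace()   # basis of the solution space of AX = 0
print(len(basic_solutions))       # 3 basic solutions
print(A.rank())                   # rank(A) = 5 - 3 = 2 = dim row(A)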
(c) By Theorem 8 in 4.5 we have
$$F_1 = X_1, \qquad F_2 = X_2 - \frac{X_2 \cdot F_1}{\|F_1\|^2}\,F_1, \qquad F_3 = X_3 - \frac{X_3 \cdot F_1}{\|F_1\|^2}\,F_1 - \frac{X_3 \cdot F_2}{\|F_2\|^2}\,F_2.$$
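These formulas are easy to verify numerically. The sketch below applies them with sympy to three made-up linearly independent vectors in $\mathbb{R}^5$ (any choice of $X_1, X_2, X_3$ would do); the resulting $F_1, F_2, F_3$ are pairwise orthogonal.

from sympy import Matrix

# Made-up basis X1, X2, X3 of a subspace U of R^5.
X1 = Matrix([1, 0, 1, 0, 1])
X2 = Matrix([1, 1, 0, 0, 0])
X3 = Matrix([0, 1, 1, 1, 0])

F1 = X1
F2 = X2 - (X2.dot(F1) / F1.dot(F1)) * F1
F3 = X3 - (X3.dot(F1) / F1.dot(F1)) * F1 - (X3.dot(F2) / F2.dot(F2)) * F2

# All pairwise dot products are 0, so {F1, F2, F3} is orthogonal.
print(F1.dot(F2), F1.dot(F3), F2.dot(F3))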
(d) $B_1$ is not a basis: the second vector is a scalar multiple of the first, whence the set is not linearly independent. Alternative solution: If we put the vectors as columns of a matrix, we get $\det(B_1) = 0$, so $B_1$ is not a basis of $\mathbb{R}^3$ by the Invertible Matrix Theorem.
$B_2$ is a basis, since the matrix formed out of the vectors in the set $B_2$ is invertible; for example, its determinant is not zero (expanding along the first column):
$$\begin{vmatrix} 0 & 2 & 0 \\ 1 & \beta & 0 \\ 0 & 0 & 1 \end{vmatrix} = -\begin{vmatrix} 2 & 0 \\ 0 & 1 \end{vmatrix} = -2.$$
Finally, $B_3$ contains more than 3 vectors and is therefore not a basis of $\mathbb{R}^3$.
Marking: 1 point for each correct answer.
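The determinant test in (d) is easy to automate. Here is a small sympy sketch, with symbols standing in for the student-number digits in the sets written above; it shows that the conclusions do not depend on the digits.

from sympy import Matrix, symbols

alpha, beta = symbols('alpha beta')   # stand-ins for the student-number digits

B1 = Matrix([[1, 2, 0],
             [1, 2, 0],
             [0, 0, alpha]])          # columns are the vectors of B_1
B2 = Matrix([[0, 2, 0],
             [1, beta, 0],
             [0, 0, 1]])              # columns are the vectors of B_2

print(B1.det())   # 0 for every digit alpha  -> B_1 is never a basis
print(B2.det())   # -2, independent of beta  -> B_2 is always a basis
# B_3 has 4 vectors in R^3, so it cannot be a basis whatever the digits are.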
(3) (6 pts) Let $U = \operatorname{Span}\{X_1, X_2, X_3, X_4\}$ where the vectors $X_1, \dots, X_4 \in \mathbb{R}^4$ are
$$X_1 = \begin{pmatrix}1\\2\\1\\1\end{pmatrix}, \quad X_2 = \begin{pmatrix}-1\\2\\0\\0\end{pmatrix}, \quad X_3 = \begin{pmatrix}2\\1\\-1\\2\end{pmatrix}, \quad X_4 = \begin{pmatrix}3\\1\\1\\2\end{pmatrix}.$$
Find a basis of $U$ and $\dim U$.
Solution: We write the vectors $X_1, X_2, X_3, X_4$ into a matrix $A$ with $X_i$ as the $i$-th row:
$$A = \begin{pmatrix} 1 & 2 & 1 & 1 \\ -1 & 2 & 0 & 0 \\ 2 & 1 & -1 & 2 \\ 3 & 1 & 1 & 2 \end{pmatrix}
\xrightarrow{\substack{R_2 + R_1 \\ R_3 - 2R_1 \\ R_4 - 3R_1}}
\begin{pmatrix} 1 & 2 & 1 & 1 \\ 0 & 4 & 1 & 1 \\ 0 & -3 & -3 & 0 \\ 0 & -5 & -2 & -1 \end{pmatrix}
\xrightarrow{-\frac{1}{3}R_3}
\begin{pmatrix} 1 & 2 & 1 & 1 \\ 0 & 4 & 1 & 1 \\ 0 & 1 & 1 & 0 \\ 0 & -5 & -2 & -1 \end{pmatrix}$$
$$\xrightarrow{\substack{R_1 - 2R_3 \\ R_2 - 4R_3 \\ R_4 + 5R_3}}
\begin{pmatrix} 1 & 0 & -1 & 1 \\ 0 & 0 & -3 & 1 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 3 & -1 \end{pmatrix}
\xrightarrow{\text{switch } R_2 \text{ and } R_3}
\begin{pmatrix} 1 & 0 & -1 & 1 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & -3 & 1 \\ 0 & 0 & 3 & -1 \end{pmatrix}
\xrightarrow{\substack{R_4 + R_3 \\ -R_3}}
\begin{pmatrix} 1 & 0 & -1 & 1 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 3 & -1 \\ 0 & 0 & 0 & 0 \end{pmatrix} = R.$$
Since $U = \operatorname{row}(A)$, the nonzero rows of $R$ form a basis of $U$; that is, the following is a basis:
$$\begin{pmatrix}1\\0\\-1\\1\end{pmatrix}, \quad \begin{pmatrix}0\\1\\1\\0\end{pmatrix}, \quad \begin{pmatrix}0\\0\\3\\-1\end{pmatrix}.$$
So $\dim U = 3$. Note the basis is not unique. Alternative solution: Interpret $U$ as the column space of a matrix, and apply the Rank Theorem.
Marking: 2 points for correct method, 2 points for correct row reduction, 1 point for correct
basis, 1 point for correct dimension.
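The row reduction can be reproduced in one step with sympy's rref. A minimal sketch, assuming the vectors as given in the statement above:

from sympy import Matrix

# Rows of A are X1, X2, X3, X4.
A = Matrix([[ 1, 2,  1, 1],
            [-1, 2,  0, 0],
            [ 2, 1, -1, 2],
            [ 3, 1,  1, 2]])

R, pivots = A.rref()   # reduced row echelon form and pivot columns
print(A.rank())        # 3 = dim U
print(R)               # its nonzero rows are another valid basis of U = row(A)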
(4) (6 pts) Let $X_1 = \begin{pmatrix}1 & 1 & 1\end{pmatrix}^T$ and $X_2 = \begin{pmatrix}1 & -1 & 0\end{pmatrix}^T$.
(a) (2 pts) Show that $\{X_1, X_2\}$ is an orthogonal set. Is it also orthonormal?
(b) (4 pts) Let $U = \operatorname{Span}\{X_1, X_2\}$. Find the point in $U$ closest to $X = \begin{pmatrix}1 & 2 & \alpha\end{pmatrix}^T$, where $\alpha$ is the third last digit of your student number.
Solution: (a) The dot product $X_1 \cdot X_2 = 1 - 1 + 0 = 0$, whence $\{X_1, X_2\}$ is an orthogonal set. But it is not an orthonormal set because $\|X_1\| = \sqrt{3} \neq 1$.
(b) The point in $U$ closest to $X = \begin{pmatrix}1 & 2 & \alpha\end{pmatrix}^T$ is given by
$$Z = \operatorname{proj}_U(X) = \frac{X \cdot X_1}{\|X_1\|^2}\,X_1 + \frac{X \cdot X_2}{\|X_2\|^2}\,X_2.$$
But $X \cdot X_1 = 3 + \alpha$; $X \cdot X_2 = -1$; $\|X_1\|^2 = 3$, and $\|X_2\|^2 = 2$. Consequently
$$Z = \frac{3+\alpha}{3}\,X_1 - \frac{1}{2}\,X_2 = \begin{pmatrix} \frac{\alpha}{3} + \frac{1}{2} & \frac{\alpha}{3} + \frac{3}{2} & \frac{\alpha}{3} + 1 \end{pmatrix}^T.$$
Marking: (b) 2 points for realizing that $Z = \operatorname{proj}_U(X)$ and writing down the correct formula for $\operatorname{proj}_U(X)$. 2 points for calculating $Z$.
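The projection can also be computed symbolically. A short sympy sketch, with a symbol alpha standing in for the third-last digit:

from sympy import Matrix, symbols, simplify

alpha = symbols('alpha')

X1 = Matrix([1, 1, 1])
X2 = Matrix([1, -1, 0])
X  = Matrix([1, 2, alpha])

# Z = proj_U(X) for the orthogonal set {X1, X2}.
Z = (X.dot(X1) / X1.dot(X1)) * X1 + (X.dot(X2) / X2.dot(X2)) * X2
print(simplify(Z.T))   # Matrix([[alpha/3 + 1/2, alpha/3 + 3/2, alpha/3 + 1]])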
(5) (a) (3 pts) Let $F[0, 9]$ be the vector space of functions $f : [0, 9] \to \mathbb{R}$. In the following definition replace $\beta$ with the second last digit of your student number. Show that
$$U = \{f \in F[0, 9] : f(\beta) = 0\}$$
is a subspace of $F[0, 9]$.
(b) (3 pts) Let $s, c, h$ be the functions in $F[\mathbb{R}]$ defined by
$$s(x) = \sin^2(x), \qquad c(x) = \cos^2(x), \qquad h(x) = x.$$
Is $h \in \operatorname{Span}\{s, c\}$?
In both parts you must include all details to get full marks.
Solution: (a) We check the 3 conditions of the subspace test:
(1) $0_{F[0,9]} \in U$: Indeed, the zero vector of $F[0, 9]$ is the zero function, which is zero at ALL points of $[0, 9]$; in particular it vanishes at the point $\beta \in [0, 9]$.
(2) $U$ is closed under addition: Given any two elements $f, g$ in $U$, we have $f(\beta) = 0$ and $g(\beta) = 0$. Now, using the definition of addition in $F[0, 9]$, we get
$$(f + g)(\beta) = f(\beta) + g(\beta) = 0 + 0 = 0.$$
Hence $f + g \in U$.
(3) $U$ is closed under scalar multiplication: Again, let $f \in U$, so $f(\beta) = 0$ by definition of $U$. But then
$$(cf)(\beta) = c\,f(\beta) = c \cdot 0 = 0 \quad \text{for any scalar } c,$$
so $cf \in U$.
Now (1)-(3) together show that $U$ is a subspace.
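The three checks can be illustrated concretely by modelling elements of $F[0, 9]$ as Python callables; a small sketch, with 7 as a sample value of the digit $\beta$ and two made-up functions that vanish there:

beta = 7
f = lambda x: x - beta            # f(beta) = 0, so f is in U
g = lambda x: (x - beta) * x**2   # g(beta) = 0, so g is in U

add  = lambda f, g: (lambda x: f(x) + g(x))   # pointwise addition in F[0, 9]
smul = lambda c, f: (lambda x: c * f(x))      # pointwise scalar multiplication

print((lambda x: 0)(beta))    # 0 -> the zero function is in U
print(add(f, g)(beta))        # 0 -> f + g is again in U
print(smul(5, f)(beta))       # 0 -> 5*f is again in U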
(b) Suppose $h \in \operatorname{Span}\{c, s\}$. Then there exist scalars $a, b$ such that $h = ac + bs$. This means that
$$x = h(x) = a\cos^2(x) + b\sin^2(x) \qquad (*)$$
holds for all $x \in \mathbb{R}$. We evaluate $(*)$ for $x = 0$ and get $0 = h(0) = a$, since $\cos^2(0) = 1$ and $\sin^2(0) = 0$. Next, we evaluate $(*)$ for $x = \pi/2$ and get $\pi/2 = h(\pi/2) = b$, since $\cos^2(\pi/2) = 0$ and $\sin^2(\pi/2) = 1$. Thus $x = \frac{\pi}{2}\sin^2(x)$ would have to hold for all $x \in \mathbb{R}$, which is not true: for example, for $x = \pi$ we get $\pi = h(\pi) = \frac{\pi}{2}\sin^2(\pi) = 0$, a contradiction. Therefore $h \notin \operatorname{Span}\{c, s\}$.
Another argument: $c$ and $s$ are bounded functions, hence so is every linear combination of them. However, $h$ is unbounded.
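The contradiction in (b) can be replayed symbolically. A small sympy sketch: solve for $a, b$ from the conditions at $x = 0$ and $x = \pi/2$, then test the result at $x = \pi$.

from sympy import symbols, sin, cos, pi, Eq, solve

a, b, x = symbols('a b x')
expr = a * cos(x)**2 + b * sin(x)**2   # a generic element of Span{c, s}

# Force expr = h(x) = x at x = 0 and at x = pi/2.
sol = solve([Eq(expr.subs(x, 0), 0), Eq(expr.subs(x, pi/2), pi/2)], [a, b])
print(sol)                                    # {a: 0, b: pi/2}

# The same coefficients fail at x = pi, so no a, b works for every x.
print(expr.subs(sol).subs(x, pi), '!=', pi)   # 0 != pi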
(6) (2 bonus points) Let $v_1$ and $v_2$ be elements of a vector space $V$. Show that $\operatorname{Span}\{v_1, v_2\}$ is a subspace of $V$. Recall that $\operatorname{Span}\{v_1, v_2\}$ is the set of all linear combinations of $v_1, v_2$. In other words, $\operatorname{Span}\{v_1, v_2\} = \{s_1v_1 + s_2v_2 : s_1, s_2 \in \mathbb{R}\}$.
Solution: This has been proven in class twice.
Let $K = \operatorname{Span}\{v_1, v_2\}$. We just have to prove the three conditions defining a subspace.
(1) $0 \in K$ because $0 = 0v_1 + 0v_2$.
(2) Let $u_1$ and $u_2$ be elements of $K$. Then $u_1 = s_1v_1 + s_2v_2$ for suitable scalars $s_1, s_2 \in \mathbb{R}$ and $u_2 = t_1v_1 + t_2v_2$ for scalars $t_1, t_2 \in \mathbb{R}$. Consequently
$$u_1 + u_2 = (s_1 + t_1)v_1 + (s_2 + t_2)v_2 = av_1 + bv_2,$$
where $a = s_1 + t_1 \in \mathbb{R}$ and $b = s_2 + t_2 \in \mathbb{R}$. Thus $u_1 + u_2$ is also in $K$.
(3) Let $u = s_1v_1 + s_2v_2 \in K$ and $a \in \mathbb{R}$. Then $au = (as_1)v_1 + (as_2)v_2$ is in $K$.
Altogether, this proves that $K$ is a subspace of $V$.