Isomorphisms Math 130 Linear Algebra
Of course, the identity function IV : V → V is an isomorphism.

After we introduce linear transformations (which is what homomorphisms of vector spaces are called), we'll have another way to describe isomorphisms.

You can prove various properties of vector space isomorphisms from this definition. Since the structure of vector spaces is defined in terms of addition and scalar multiplication, if T preserves them, it will preserve structure defined in terms of them. For instance, T preserves 0, negation, subtraction, and linear combinations.

Theorem 2. If T : V → W is an isomorphism of vector spaces, then its inverse T⁻¹ : W → V is also an isomorphism.

Proof. Since T is a bijection, T⁻¹ exists as a function W → V. We have to show that T⁻¹ preserves addition and scalar multiplication.

First, we'll do addition. Let w and x be elements of W. We have to show that

    T⁻¹(w + x) = T⁻¹(w) + T⁻¹(x).

We'll show that by simplifying it to logically equivalent statements until we reach one which we know is true. Since T and T⁻¹ are inverse functions, that equation holds if and only if

    w + x = T(T⁻¹(w) + T⁻¹(x)).

Since T is an isomorphism, it preserves addition, so we can rewrite that as

    w + x = T(T⁻¹(w)) + T(T⁻¹(x)),

which simplifies to w + x = w + x, which is true.

Scalar multiplication is left to you. Show that T⁻¹(cw) = cT⁻¹(w).  q.e.d.

We'll omit the proof of the next theorem.

Theorem 3. If S : V → W and T : W → X are both isomorphisms of vector spaces, then so is their composition (T ◦ S) : V → X.

Example 4. Consider P3, the vector space of polynomials over R of degree 3 or less. Define T : P3 → R⁴ by T(a1 x³ + a2 x² + a3 x + a4) = (a1, a2, a3, a4). It just associates to a polynomial its 4-tuple of coefficients, starting with the coefficient of x³ and going down in degree. This T preserves addition and scalar multiplication, it is one-to-one, and it is onto. (Those statements are easy to verify.)

This is not the only isomorphism P3 → R⁴. A cubic polynomial is determined by its values at any four points. The association of f(x) to the 4-tuple (f(1), f(2), f(3), f(4)) is also an isomorphism.

Theorem 5. If T : V → W is an isomorphism, then T carries linearly independent sets to linearly independent sets, spanning sets to spanning sets, and bases to bases.

Proof. For the first statement, let S be a set of linearly independent vectors in V. We'll show that its image T(S) is a set of linearly independent vectors in W. If 0 were a nontrivial linear combination of vectors in T(S), then an application of T⁻¹ would yield a nontrivial linear combination of vectors in S, but there is none since S is independent. Therefore, T(S) is linearly independent.

For the second statement, let S be a set that spans V, and let w be any vector in W. Then T⁻¹(w) is a linear combination of vectors in S. Apply T to that linear combination to see that w is a linear combination of vectors in T(S). Therefore, T(S) spans W.

Since T carries both independent and spanning sets from V to W, it carries bases to bases.  q.e.d.

More generally, any property of vector spaces defined in terms of the structure of vector spaces (addition and scalar multiplication) is preserved by isomorphisms.

Coordinates with respect to a basis determine an isomorphism. One of the main uses of a basis β = (b1, b2, . . . , bn) for a vector space V over a field is to impose coordinates on V. Each vector v in V is a unique linear combination of the basis vectors

    v = v1 b1 + v2 b2 + · · · + vn bn.
The coefficients are used as coordinates for v with respect to the basis β, written as a column vector

    [v]β = (v1, v2, . . . , vn).
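The coordinate map v ↦ [v]β can be made concrete in a small case. Below is a minimal Python sketch (the basis ((1, 1), (1, -1)) of R² and the helper name `coordinates` are illustrative choices, not from the text) that finds [v]β by solving the 2×2 system v = v1 b1 + v2 b2 with Cramer's rule:

```python
# A minimal sketch: coordinates of v in R^2 with respect to a
# basis beta = (b1, b2), found by Cramer's rule on v = v1*b1 + v2*b2.

def coordinates(v, b1, b2):
    """Return (v1, v2) with v = v1*b1 + v2*b2, assuming b1, b2 independent."""
    det = b1[0]*b2[1] - b2[0]*b1[1]        # determinant of the matrix [b1 b2]
    v1 = (v[0]*b2[1] - b2[0]*v[1]) / det   # Cramer's rule, first column swapped
    v2 = (b1[0]*v[1] - v[0]*b1[1]) / det   # Cramer's rule, second column swapped
    return (v1, v2)

beta = ((1, 1), (1, -1))
v = (5, 1)
v1, v2 = coordinates(v, *beta)

# Check that v1*b1 + v2*b2 reproduces v.
assert (v1*beta[0][0] + v2*beta[1][0], v1*beta[0][1] + v2*beta[1][1]) == v
```

Because the basis vectors are independent, the determinant is nonzero and the coordinates are unique, which is exactly the uniqueness claimed above.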
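The two isomorphisms P3 → R⁴ of Example 4 can also be checked on concrete polynomials. A short Python sketch (helper names are hypothetical), representing a cubic by its coefficient list:

```python
# Hypothetical sketch: a cubic a1*x^3 + a2*x^2 + a3*x + a4 in P3 is
# represented by its coefficient list [a1, a2, a3, a4].

def T(p):
    # Coefficient isomorphism of Example 4: P3 -> R^4.
    return tuple(p)

def E(p):
    # Evaluation isomorphism: f |-> (f(1), f(2), f(3), f(4)).
    a1, a2, a3, a4 = p
    return tuple(a1*x**3 + a2*x**2 + a3*x + a4 for x in (1, 2, 3, 4))

def add(p, q):
    return [a + b for a, b in zip(p, q)]

def scale(c, p):
    return [c * a for a in p]

p = [1, 0, -2, 5]   # x^3 - 2x + 5
q = [0, 3, 1, -1]   # 3x^2 + x - 1

# Both maps preserve addition and scalar multiplication.
for F in (T, E):
    assert F(add(p, q)) == tuple(a + b for a, b in zip(F(p), F(q)))
    assert F(scale(7, p)) == tuple(7 * a for a in F(p))
```

That E is one-to-one reflects the fact that a nonzero cubic has at most three roots, so it cannot vanish at all four of the points 1, 2, 3, 4.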