LinAlg2 Midterm Notes


1 Linear Algebra II

1.1 Vector spaces


Denition Vector space
Consider a eld F of scalars and a set V with elements called vectors.
We dene two operations on V :
• ⊕ : V × V → V Vector addition
• ⊙ : F × V → V Scalar multiplication
V is a vector space over F if and only if these axioms hold:
1. There exists an element 0 (the zero vector) in V such that 0 ⊕ u = u for all u  V .
2. Vector addition is commutative: u ⊕ v = v ⊕ u
3. Vector addition is associative: u ⊕ (v ⊕ w) = (u ⊕ v) ⊕ w
4. For each u  V , there exists z  V (the additive inverse) such that u ⊕ z = 0
5. For all u  V , 1 ⊙ u = u
6. For all a, b  F and all u  V , a ⊙ (b ⊙ u) = (ab) ⊙ u
7. For all c  F and all u, v  V , c ⊙ (u ⊕ v) = (c ⊙ u) ⊕ (c ⊙ v)
8. For all a, b  F and all u  V , (a + b) ⊙ u = (a ⊙ u) ⊕ (b ⊙ u)

Theorem
Let V be a vector space and let u ∈ V. Then,
1. The zero vector 0 is unique.
2. 0 ⊙ u = 0 (the scalar zero times u gives the zero vector)
3. The additive inverse z of u is unique.
4. (−1) ⊙ u = z
Alternative notation: additive inverse −u, zero vector 0, vector addition u + v, scalar multiplication cu.
Denition Subspace
Let V be a vector space over F . U ⊆ V is a subspace of V if
U is a vector space over F with the same vector addition and scalar multiplication operations as V .
Only axioms 1 and 4 have to be checked to see if U is a subspace.
Theorem
Let V be a vector space over F and U ⊆ V . Then U is a subspace of V if and only if:
1. U is nonempty
2. u + v ∈ U for all u, v ∈ U
3. cu ∈ U for all c ∈ F and u ∈ U

1.1.1 Linear combinations


Denition
Linear combination
V be a vector space over F.
For given c1 , c2 ,    , c2  F and v1 , v2    , vr  V , we say that an expression of the form c1 v1 + c2 v2 +
   + cr vr is a linear combination of v1 , v2    , vr .
Denition Span
The set of all linear combinations of v1 , v2 ,    , vr  V is called the span of v1 , v2 ,    , vr .
We denote it by span(v1 , v2    , vr ) = c1 v1 + c2 v2 +    + cr vr  c1 , c2    , cr  F

Theorem
Let V be a vector space and v1, v2, ..., vr ∈ V. Then span(v1, v2, ..., vr) is a subspace of V.
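
For vectors in F^n, membership in a span can be checked numerically by solving a linear system whose columns are the spanning vectors. A minimal NumPy sketch (the vectors v1, v2, b are illustrative choices, not from the notes):

import numpy as np

# Spanning vectors of a subspace of R^3 (illustrative choice)
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([v1, v2])             # columns span the subspace

b = np.array([2.0, 3.0, 5.0])             # candidate vector (here b = 2 v1 + 3 v2)

# Least-squares solution of A c = b; b lies in the span iff A c reproduces b
c, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ c, b), c)           # True [2. 3.]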

Denition Spanning set


We say that {v1, v2, ..., vr} is a spanning set for V (or that v1, v2, ..., vr span V)
if every vector in V can be written as a linear combination of v1, v2, ..., vr.

Theorem
1. If v1, v2, ..., vr span V and one of these vectors is a linear combination of the others,
then the remaining r − 1 vectors span V.
2. One of the vectors v1, v2, ..., vr is a linear combination of the others if and only if
there exist scalars c1, c2, ..., cr ∈ F, not all zero, such that c1 v1 + c2 v2 + ... + cr vr = 0

Denition Linear dependence and independence


The vectors v1, v2, ..., vr are linearly dependent if there exist scalars c1, c2, ..., cr ∈ F,
not all zero, such that c1 v1 + c2 v2 + ... + cr vr = 0. Otherwise, they are linearly independent.

Theorem
Let v1, v2, ..., vr ∈ V be vectors. Every vector in span(v1, v2, ..., vr) can be written uniquely
as a linear combination of v1, v2, ..., vr if and only if v1, v2, ..., vr are linearly independent.
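
For vectors in F^n, linear (in)dependence can be tested by stacking the vectors as columns of a matrix and comparing its rank with the number of vectors. A NumPy sketch with an illustrative, deliberately dependent family:

import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2                              # deliberately a linear combination of v1, v2

A = np.column_stack([v1, v2, v3])
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)                        # False: some nontrivial combination gives 0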

1.1.2 Subspace operations


Theorem
If U, W are subspaces of a vector space V, then U ∩ W is a subspace of V.
This is not necessarily true for the union U ∪ W.
Denition Sum of subspaces
The sum of two subspaces U and W of V is dened as U + W = u + w  u  U and w  W 

Theorem
If U, W are subspaces of a vector space V , then U + W is a subspace of V .

Denition Direct sum of subspaces


The sum of two subspaces U, W is called a direct sum if U ∩ W = {0}.
Every vector in a direct sum U + W can be uniquely expressed as a sum u + w with u ∈ U and w ∈ W.

1.2 Bases
Denition Basis
Let V be a vector space over F . The vectors v1 , v2 ,    , vn form a basis of V if
1. v1 , v2 ,    , vn are linearly independent
2. span(v1 , v2 ,    , vn ) = V
Bases are not unique.
Lemma Replacement lemma
Let V be a nonzero vector space over F and let r be a positive integer.

Suppose that u1, u2, ..., ur span V. Let v = c1 u1 + c2 u2 + ... + cr ur be a nonzero vector in V. Then,
1. cj ≠ 0 for some j ∈ {1, 2, ..., r}.
2. If cj ≠ 0, then v, u1, ..., uj−1, uj+1, ..., ur span V.
3. If {u1, u2, ..., ur} is a basis for V, then {v, u1, ..., uj−1, uj+1, ..., ur} is also a basis for V.

Theorem
Let n and r be positive integers and let V be a vector space over F .
Let v1 , v2 ,    , vn  be a basis of V and let u1 , u2 ,    , ur be linearly independent. Then,
1. r ≤ n
2. If r = n, then u1 , u2 ,    , ur  is a basis of V .

Theorem
If v1 , v2 ,    , vn  and u1 , u2 ,    , ur  are bases of V , then n = r.

Denition Dimension
Let V be a vector space and let n be a positive integer.
• If v1 , v2 ,    , vn  is a basis of v, then n is the dimension of V .
• If V = 0, then V has dimension zero.
• If V has dimension n for some n  N, then V is nite dimensional (notation: dim V = n)
• Otherwise, V is innite dimensional.
If span(v1 , v2 ,    , vr ) = V , then dim V ≤ r
If V is nite dimensional, every set of linearly independent vectors in V can be extended to a basis.
If V is nite dimensional and U is a subspace of V , then dim U ≤ dim V .
The set of sequences in R is an example of an innite-dimensional vector space.
Theorem Grassmann's formula

dim(U ∩ W) + dim(U + W) = dim U + dim W
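
For subspaces of F^n given by spanning sets, the dimensions in Grassmann's formula can be computed as matrix ranks, since dim(U + W) is the rank of the columns of U and W taken together. A NumPy sketch with illustrative subspaces of R^4:

import numpy as np

# Subspaces of R^4 spanned by the columns of U and W (illustrative choices)
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])                 # dim U = 2
W = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])                 # dim W = 2

dim_U = np.linalg.matrix_rank(U)
dim_W = np.linalg.matrix_rank(W)
dim_sum = np.linalg.matrix_rank(np.hstack([U, W]))    # dim(U + W)
dim_intersection = dim_U + dim_W - dim_sum            # by Grassmann's formula
print(dim_U, dim_W, dim_sum, dim_intersection)        # 2 2 3 1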

1.2.1 Linear transformations


Denition Linear transformation
Let V and W be vector spaces over the same eld F.
A function T : V → W is called a linear transformation (a linear operator if V = W ) if

T (au + bv) = aT (u) + bT (v) ∀a, b ∈ F ∀u, v ∈ V

Let A ∈ F^{n×m} and T(x) = Ax. Then T is a linear transformation from F^m to F^n.


Differentiation and integration are examples of linear transformations.
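
The defining identity T(au + bv) = aT(u) + bT(v) can be spot-checked numerically for T(x) = Ax. A small NumPy sketch with a random (illustrative) matrix and vectors:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))           # A in R^{3 x 2}, so T : R^2 -> R^3, T(x) = A x

u, v = rng.standard_normal(2), rng.standard_normal(2)
a, b = 2.0, -0.5

# Check T(a u + b v) = a T(u) + b T(v) for this particular choice
print(np.allclose(A @ (a * u + b * v), a * (A @ u) + b * (A @ v)))   # True
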
Lemma
Let V and W be vector spaces over F and let T : V → W be a linear transformation. Then,
1. T (cv) = cT (v) for all c ∈ F and v ∈ V
2. T (0) = 0 (the first 0 is in V, the second 0 is in W)
3. T (a1 v1 + a2 v2 + ... + an vn) = a1 T (v1) + a2 T (v2) + ... + an T (vn) ∀vk ∈ V ∀ak ∈ F

Denition Kernel and range


Let T : V → W be a linear transformation.
• The kernel of T is defined as ker T = {v ∈ V : T(v) = 0}
• The range of T is defined as ran T = {w ∈ W : ∃v ∈ V s.t. T(v) = w}

Theorem
The kernel of T is a subspace of V and the range of T is a subspace of W .
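
For T(x) = Ax both subspaces can be read off from a singular value decomposition: the right singular vectors for (numerically) zero singular values span the kernel, the leading left singular vectors span the range. A NumPy sketch with an illustrative rank-1 matrix:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])            # rank 1: T : R^3 -> R^2

U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
kernel_basis = Vt[rank:]                   # rows span ker T (dimension 3 - rank = 2)
range_basis = U[:, :rank]                  # columns span ran T (the column space of A)
print(rank, kernel_basis.shape[0])         # 1 2
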
Any element of an n-dimensional vector space can be represented by the
E-coordinate vector ⟨c1, c2, ..., cn⟩ ∈ F^n by fixing an ordered basis E.
Denition E-basis representation
Let E = (v1 , v2 ,    , vn ) be an ordered basis of V .
For any u  V , write u = c1 v1 + c2 v2 +    + cn vn where ck  F.
The function [ · ]E : V → Fn , dened by [u]E = ⟨c1 , c2 ,    , cn ⟩ is called the E-basis representation.
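
When V = F^n and the basis vectors are given explicitly, the coordinates c1, ..., cn are obtained by solving the linear system whose coefficient matrix has the basis vectors as columns. A NumPy sketch with an illustrative ordered basis of R^3:

import numpy as np

# Ordered basis E = (v1, v2, v3) of R^3 (illustrative; the columns must be independent)
E = np.column_stack([[1.0, 0.0, 0.0],
                     [1.0, 1.0, 0.0],
                     [1.0, 1.0, 1.0]])

u = np.array([3.0, 2.0, 1.0])
coords = np.linalg.solve(E, u)             # [u]_E: coefficients with E @ coords == u
print(coords, np.allclose(E @ coords, u))  # [1. 1. 1.] True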

Theorem
[ · ]E : V → F^n is a linear transformation.

Theorem Isomorphism between V and F^n


[ · ]E is a bijection between V and F^n.
We say that [ · ]E : V → F^n is a (linear) isomorphism and that V and F^n are isomorphic.

Theorem Dimension theorem


Let T : V → W be a linear transformation with V finite-dimensional. Then dim ker(T) + dim ran(T) = dim V
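
For T(x) = Ax this is the rank-nullity relation: rank(A) + nullity(A) equals the number of columns of A. A quick NumPy check on an illustrative matrix:

import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 2.0]])       # T : R^4 -> R^3; third row = first + second

rank = np.linalg.matrix_rank(A)            # dim ran(T)
nullity = A.shape[1] - rank                # dim ker(T)
print(rank, nullity, rank + nullity)       # 2 2 4 = dim R^4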

The range of a matrix (viewed as the linear transformation x ↦ Ax) is equal to the span of its columns.
Lemma
Let T : V → W be a linear map. Then ker(T) = {0} if and only if T is injective.

Lemma
Let T : V → W be a linear map with dim V = dim W < ∞. Then T is injective ⇐⇒ T is surjective.

1.2.2 Matrix representations


Denition Matrix representation of a linear transformation
Let V be a n-dimensional vector space over F with basis  and linear transformation .
Then the matrix representation with respect to  is the matrix β [T ]β
where each column is a vector of the basis with T applied to it.
     

 1 0 0  

  0  1  0  
     
Every n-dimensional vector space (mapped to Fn ) has the standard basis  .  ,  .  ,    ,  . 

  ..   ..   .. 

 

0 0 1
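
As a small worked example (not from the notes): the differentiation operator T = d/dx on polynomials of degree at most 2, with ordered basis β = (1, x, x²). Column j of β[T]β holds the β-coordinates of T applied to the j-th basis vector:

import numpy as np

# beta = (1, x, x^2); a polynomial a + b x + c x^2 has beta-coordinates (a, b, c)
# T = d/dx:  T(1) = 0,  T(x) = 1,  T(x^2) = 2x
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])            # column j = [T(v_j)]_beta

p = np.array([5.0, 3.0, 4.0])              # beta-coordinates of 5 + 3x + 4x^2
print(D @ p)                               # [3. 8. 0.]  i.e. the derivative 3 + 8x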

Denition Change of basis matrix


The change of basis matrix from γ to β is the n × n matrix β[I]γ,
whose columns are the elements of γ expressed in β-coordinates.

Theorem
Let V be a finite-dimensional vector space with bases β, γ and a linear transformation T : V → V.
Let S = γ[I]β. Then β[I]γ = S⁻¹ and γ[T]γ = S β[T]β S⁻¹
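
The similarity relation can be sanity-checked numerically. In the sketch below (illustrative matrices, with β taken as the standard basis of R^2) the columns of G are the γ-basis vectors in β-coordinates, so G = β[I]γ and S = γ[I]β = G⁻¹:

import numpy as np

G = np.array([[1.0, 1.0],
              [0.0, 1.0]])                 # beta[I]gamma: gamma-vectors in beta-coordinates
M_beta = np.array([[2.0, 1.0],
                   [0.0, 3.0]])            # beta[T]beta for some operator T (illustrative)

S = np.linalg.inv(G)                       # S = gamma[I]beta
M_gamma = S @ M_beta @ np.linalg.inv(S)    # gamma[T]gamma = S beta[T]beta S^{-1}

# Same operator in two coordinate systems: apply T then change basis, or vice versa
v = np.array([1.0, 2.0])                   # beta-coordinates of some vector
print(np.allclose(S @ (M_beta @ v), M_gamma @ (S @ v)))   # True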

Denition Similarity
Two matrices A and B are similar if and only if A = Q−1 BQ for some Q.

1.3 Inner product spaces


Denition Inner product space over R
An inner product space over R is an R-vectorspace V together with a map

V × V → R : (v1, v2) ↦ ⟨v1, v2⟩

This map satises the axioms of an inner product:


1. Linearity For any fixed v2, the map V → R : v ↦ ⟨v, v2⟩ is linear.
2. Symmetry ⟨v1, v2⟩ = ⟨v2, v1⟩ for all v1, v2 ∈ V
3. Positivity ⟨v, v⟩ ≥ 0 and if v ̸= 0, ⟨v, v⟩ > 0

Denition Hermitian inner product space (Inner product space over C)


An inner product space over C is a C-vector space V together with a map

V × V → C : (v1, v2) ↦ ⟨v1, v2⟩

This map satises the following axioms:


1. Linearity For any fixed v2, the map V → C : v ↦ ⟨v, v2⟩ is linear.
The map is not necessarily linear if v1 is fixed instead of v2.
2. Symmetry ⟨v1, v2⟩ = ⟨v2, v1⟩* for all v1, v2 ∈ V, where * denotes the complex conjugate
3. Positivity ⟨v, v⟩ ∈ R and if v ≠ 0, ⟨v, v⟩ > 0
Notation: v1 ⊥ v2 means that v1 and v2 have inner product 0.
The standard inner product on C^n is
(a1, ..., an) · (b1, ..., bn) = a1 b1* + a2 b2* + ... + an bn*
where * denotes the complex conjugate.
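
In NumPy the same inner product is obtained by conjugating the second argument explicitly (note that np.vdot conjugates its first argument instead, which is the other common convention). An illustrative check of positivity:

import numpy as np

a = np.array([1 + 2j, 3 - 1j])
b = np.array([2 - 1j, 1j])

inner = np.sum(a * np.conj(b))             # convention of these notes: conjugate the second vector
norm_sq = np.sum(a * np.conj(a))           # <a, a> should be real and positive
print(inner)
print(np.isreal(norm_sq), norm_sq.real > 0)   # True True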

Denition W ⊥
Let W ⊆ V be a subspace. Then W ⊥ := v  V : ⟨v, w⟩ = 0 ∀w  W 
W ⊥ is a subspace of V .
Theorem
If V is finite-dimensional and W ⊆ V is a subspace, then dim W + dim W⊥ = dim V.
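
If W is the span of the columns of a matrix B in R^n, then W⊥ is the null space of Bᵀ, which can be read off from an SVD. A NumPy sketch that also verifies the dimension formula on an illustrative example:

import numpy as np

B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])                 # W = span of the columns of B, a subspace of R^4

U, s, Vt = np.linalg.svd(B.T)              # W_perp = null space of B^T
rank = int(np.sum(s > 1e-10))
W_perp = Vt[rank:]                         # rows form an orthonormal basis of W_perp

dim_W = np.linalg.matrix_rank(B)
print(dim_W + W_perp.shape[0] == B.shape[0])    # dim W + dim W_perp = dim V -> True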

1.3.1 Norms
Denition Norm
Let V be an F-vectorspace.
A vector-space norm on V is a map ∥ · ∥ : V → R where ∥v∥ is the norm of v  V .
This map has to satisfy the following conditions for all v  V :
1. Non-negativity ∥v∥ ≥ 0
2. Positivity ∥v∥ = 0 ⇐⇒ v = 0
3. Homogeneity ∥λv∥ = |λ| · ∥v∥ ∀λ ∈ F
4. Triangle inequality ∥v + w∥ ≤ ∥v∥ + ∥w∥

Theorem

Let V be an inner product space. Then we can define a norm as ∥v∥ = √⟨v, v⟩

Theorem Cauchy-Schwarz-Bunyakovsky inequality


⟨v, w⟩ ≤ ∥v∥ · ∥w∥
