CogSci 131 Linear Algebra: Tom Griffiths



Outline
Vectors
Break
Matrices

Vectors
A vector is an efficient way of expressing a set
of values of different features or dimensions:

x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad e.g. \quad x = \begin{pmatrix} 1 \\ 1 \end{pmatrix}

Often, it is helpful to think of a vector as
corresponding to a point in space.

[Figure: x plotted as a point in the (x_1, x_2) plane]

Multiplying a vector by a scalar


ax = \begin{pmatrix} a x_1 \\ a x_2 \end{pmatrix}

e.g. \quad 1.5x = 1.5 \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1.5 \cdot 1 \\ 1.5 \cdot 1 \end{pmatrix} = \begin{pmatrix} 1.5 \\ 1.5 \end{pmatrix}

[Figure: x and ax plotted in the (x_1, x_2) plane]

Adding vectors
x + y = \begin{pmatrix} x_1 + y_1 \\ x_2 + y_2 \end{pmatrix}

e.g. \quad \begin{pmatrix} 1 \\ 1 \end{pmatrix} + \begin{pmatrix} 2 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 + 2 \\ 1 + 1 \end{pmatrix} = \begin{pmatrix} 3 \\ 2 \end{pmatrix}

[Figure: x, y, and x + y plotted in the plane]
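Both operations can be checked numerically. A minimal sketch in NumPy (my choice of tool; the slides themselves use Matlab notation):

```python
import numpy as np

x = np.array([1.0, 1.0])
y = np.array([2.0, 1.0])

# Multiplying a vector by a scalar scales each component
ax = 1.5 * x

# Adding vectors adds them component-wise
s = x + y

print(ax)   # [1.5 1.5]
print(s)    # [3. 2.]
```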

Linear combinations
v_1 = \begin{pmatrix} 1 \\ 2 \end{pmatrix} \qquad v_2 = \begin{pmatrix} 3 \\ 2 \end{pmatrix} \qquad u = \begin{pmatrix} 9 \\ 10 \end{pmatrix}

Can we find c1 and c2 such that u = c1v1 + c2v2?

[Figure: regions of the plane reachable with c_1, c_2 > 0 versus with any c_1, c_2]
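The question above is a two-equation linear system, so it can be answered directly. A sketch using NumPy (an assumption on my part, not the course's tooling):

```python
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 2.0])
u = np.array([9.0, 10.0])

# Stack v1 and v2 as columns: then u = c1*v1 + c2*v2 is the system V c = u
V = np.column_stack([v1, v2])
c = np.linalg.solve(V, u)

print(c)                                     # [3. 2.]
assert np.allclose(c[0] * v1 + c[1] * v2, u)  # u really is this combination
```

So c1 = 3 and c2 = 2 works, and since both are positive, u lies in the c1, c2 > 0 region.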

Linear combinations
A vector v is a linear combination of v_1, …, v_n if
there are c_1, …, c_n s.t. v = c_1 v_1 + c_2 v_2 + … + c_n v_n
The set of all linear combinations is the span
of v_1, …, v_n

e.g. the span of \begin{pmatrix} 1 \\ 0 \end{pmatrix} and \begin{pmatrix} 0 \\ 1 \end{pmatrix} is all vectors in 2D space

A set of vectors spanning a space provides a
basis for that space

e.g. \begin{pmatrix} 1 \\ 0 \end{pmatrix} and \begin{pmatrix} 0 \\ 1 \end{pmatrix} form a basis for 2D space

Inner products

y = \begin{pmatrix} y_1 \\ y_2 \end{pmatrix}, \quad y^T = (y_1 \; y_2) \quad — the transpose (y' in Matlab)

x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad x^T = (x_1 \; x_2)

The inner product:

y^T x = (y_1 \; y_2) \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = y_1 x_1 + y_2 x_2
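In NumPy (my choice; the slides use Matlab's `'` for transpose), the inner product of two 1-D arrays is the `@` operator:

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

# y^T x = y1*x1 + y2*x2
inner = y @ x

print(inner)  # 11.0  (3*1 + 4*2)
assert inner == y[0] * x[0] + y[1] * x[1]
```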

Length

x^T x = x_1^2 + x_2^2

Based on the Pythagorean theorem, we can
define the length (or norm) of a vector to be

\|x\| = \sqrt{x^T x} = \sqrt{x_1^2 + x_2^2}
Angle
The cosine of the angle between two vectors
is given by

\cos(\theta) = \frac{x^T y}{\|x\| \, \|y\|}

So

x^T y = \|x\| \, \|y\| \cos(\theta)

Two vectors x and y are orthogonal when
x^T y = 0, corresponding to \cos(\theta) = 0, meaning
that the vectors form a right angle
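A sketch of both facts in NumPy (vector values are my own illustrative choices):

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

# cos(theta) = x^T y / (||x|| ||y||)
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.degrees(np.arccos(cos_theta))
print(theta)  # 45.0 — the angle between (1,0) and (1,1)

# Orthogonal vectors have inner product 0
z = np.array([0.0, 1.0])
print(x @ z)  # 0.0, so x and z are at a right angle
```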

Algebra of inner products


x^T y = y^T x \qquad (commutativity)

c(x^T y) = (cx)^T y = x^T (cy)

x^T (y_1 + y_2) = x^T y_1 + x^T y_2 \qquad (linearity)
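These identities can be verified on concrete vectors (the specific values below are my own, chosen only for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
y1 = np.array([0.5, 2.0])
y2 = np.array([-1.0, 4.0])
c = 3.0

# Commutativity: x^T y = y^T x
assert x @ y == y @ x

# Scalars move freely: c(x^T y) = (cx)^T y = x^T (cy)
assert np.isclose(c * (x @ y), (c * x) @ y)
assert np.isclose(c * (x @ y), x @ (c * y))

# Linearity: x^T (y1 + y2) = x^T y1 + x^T y2
assert np.isclose(x @ (y1 + y2), x @ y1 + x @ y2)

print("all inner-product identities hold")
```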

Break

Up next:
Matrices

Matrices
y = Wx
y = \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} \qquad x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \qquad W = \begin{pmatrix} w_{11} & w_{12} \\ w_{21} & w_{22} \end{pmatrix}

[Figure: x plotted in the (x_1, x_2) plane and y in the (y_1, y_2) plane]

Matrix multiplication
y = Wx
y = \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} \qquad W = \begin{pmatrix} w_{11} & w_{12} \\ w_{21} & w_{22} \end{pmatrix} \qquad x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}

y_1 = w_{11} x_1 + w_{12} x_2
y_2 = w_{21} x_1 + w_{22} x_2

[Figure: the mapping y = Wx illustrated in the plane]
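The component equations above are what `@` computes. A sketch in NumPy (matrix entries are my own example values):

```python
import numpy as np

W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

# y = Wx: each y_i is the inner product of row i of W with x
y = W @ x

assert y[0] == W[0, 0] * x[0] + W[0, 1] * x[1]  # y1 = w11*x1 + w12*x2
assert y[1] == W[1, 0] * x[0] + W[1, 1] * x[1]  # y2 = w21*x1 + w22*x2
print(y)  # [17. 39.]
```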

Inner and outer products

y = \begin{pmatrix} y_1 \\ y_2 \end{pmatrix}, \quad y^T = (y_1 \; y_2) \qquad x = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \quad x^T = (x_1 \; x_2)

inner:

y^T x = (y_1 \; y_2) \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = y_1 x_1 + y_2 x_2

outer:

y x^T = \begin{pmatrix} y_1 \\ y_2 \end{pmatrix} (x_1 \; x_2) = \begin{pmatrix} y_1 x_1 & y_1 x_2 \\ y_2 x_1 & y_2 x_2 \end{pmatrix}
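The inner product gives a scalar, the outer product a matrix. A NumPy sketch (values my own):

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

inner = y @ x            # scalar: y1*x1 + y2*x2
outer = np.outer(y, x)   # 2x2 matrix with entries y_i * x_j

print(inner)   # 11.0
print(outer)   # [[3. 6.]
               #  [4. 8.]]
```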

Basic operations on matrices


Multiplication by a scalar:

aW = \begin{pmatrix} a w_{11} & a w_{12} \\ a w_{21} & a w_{22} \end{pmatrix}

Addition of matrices:

W + M = \begin{pmatrix} w_{11} + m_{11} & w_{12} + m_{12} \\ w_{21} + m_{21} & w_{22} + m_{22} \end{pmatrix}

Algebra of matrices and vectors


W(ax) = aWx
W(x + y) = Wx + Wy

(W + M)x = Wx + Mx

Matrix multiplication
Basic rules follow from multiplication of a
vector by a matrix

Algebra of matrix multiplication


W(aM) = aWM

W(M + N) = WM + WN

(M + N)W = MW + NW

Linearity
A function f is linear if:
f(cx) = cf(x)
f(x + y) = f(x) + f(y)
A function using matrix multiplication (with
inner products as a special case) is linear
In fact, any linear function can be represented
using matrix multiplication
Linear algebra: the math of linear functions
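Both linearity conditions can be checked for a function defined by matrix multiplication. A sketch in NumPy (the matrix and vectors are my own example values):

```python
import numpy as np

W = np.array([[2.0, -1.0],
              [0.5, 3.0]])

def f(v):
    """A function defined by matrix multiplication."""
    return W @ v

x = np.array([1.0, 2.0])
y = np.array([-3.0, 4.0])
c = 2.5

# f(cx) = c f(x)
assert np.allclose(f(c * x), c * f(x))
# f(x + y) = f(x) + f(y)
assert np.allclose(f(x + y), f(x) + f(y))
print("f is linear")
```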

Eigenvectors and eigenvalues


Just think about square matrices mapping
from a space back to itself
Most vectors will change direction and length,
but some will just change length
these are the eigenvectors

Wv = \lambda v

(v is the eigenvector, \lambda is the eigenvalue)

Eigenvectors and eigenvalues


Examples: [Figure: example matrices and how they transform vectors]

Each matrix can have many eigenvectors and
corresponding eigenvalues
e.g. the second matrix shown above has eigenvectors \begin{pmatrix} 1 \\ 0 \end{pmatrix} and \begin{pmatrix} 0 \\ 1 \end{pmatrix}, with eigenvalues 3 and 4
any scalar multiple of an eigenvector is an
eigenvector, so typically the length is fixed
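Eigenvectors and eigenvalues can be computed with NumPy's `eig` (the matrix below is the example from the slide):

```python
import numpy as np

# The matrix from the slide: eigenvalues 3 and 4,
# eigenvectors (1, 0) and (0, 1)
W = np.array([[3.0, 0.0],
              [0.0, 4.0]])

eigenvalues, eigenvectors = np.linalg.eig(W)
print(eigenvalues)    # 3 and 4 (order may vary)
print(eigenvectors)   # columns are unit-length eigenvectors

# Check the defining equation Wv = lambda*v for each pair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(W @ v, lam * v)
```

Note that `eig` returns eigenvectors scaled to length 1, matching the convention on the slide that the length is fixed.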

Some special kinds of matrices


The identity matrix maps every vector to itself:

I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}

A diagonal matrix only has non-zero entries
along the diagonal, e.g.

D = \begin{pmatrix} 3 & 0 \\ 0 & 4 \end{pmatrix}

Its eigenvalues are the diagonal entries, and its
eigenvectors are \begin{pmatrix} 1 \\ 0 \end{pmatrix} and \begin{pmatrix} 0 \\ 1 \end{pmatrix}
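A quick NumPy sketch of both special cases (the test vector is my own):

```python
import numpy as np

x = np.array([2.0, 5.0])

# The identity matrix maps every vector to itself
I = np.eye(2)
assert np.array_equal(I @ x, x)

# A diagonal matrix scales each coordinate by its diagonal entry
D = np.diag([3.0, 4.0])
print(D @ x)  # [ 6. 20.]  (2*3 and 5*4)
```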

Thursday
Semantic networks and spreading activation
We will use some linear algebra
