UNIT II Eigenvalues and Eigenvectors


Linear Algebra and Partial Differentiation Eigen Values and Eigen Vectors

---------------------------------------------------------------------------------------------------------------------------

Unit 2: Eigen Values and Eigen Vectors

Syllabus content: - Eigenvalues, Eigenvectors and Properties


The word eigenvalue comes from the German word Eigenwert, where eigen means
characteristic and Wert means value.
Let A be a square matrix. A scalar λ is said to be an eigen value of the matrix A if there
exists a nonzero vector X such that AX = λX; then λ is called an eigen value and X is
called an eigen vector corresponding to the eigen value λ.
Steps to solve Eigen Value Problems
I. Find (A − λI), called the characteristic matrix.
II. |A − λI| is called the characteristic polynomial of A.
III. Setting |A − λI| = 0, called the characteristic equation of A, gives the eigen values
λ = λ1, λ2, λ3, …
IV. For a 2×2 matrix, the characteristic equation is λ² − S1λ + |A| = 0, where S1 = sum
of the diagonal elements.
V. For a 3×3 matrix, the characteristic equation is λ³ − S1λ² + S2λ − |A| = 0, where
S1 = sum of the diagonal elements and S2 = sum of the minors of the principal diagonal
elements.
VI. Using IV or V, find the values of λ and then find the corresponding eigen vectors.
VII. (A − λI)X = 0 is the eigen vector equation used to find the eigen vectors X =
X1, X2, X3, … corresponding to the eigen values λ = λ1, λ2, λ3, …
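Steps I-V can be sketched in code. The following is an illustrative check only (the function names are our own), using the 3×3 matrix from the worked example below, whose eigen values turn out to be 1, 2, 3:

```python
def det3(A):
    # 3x3 determinant by cofactor expansion along the first row
    (a, b, c), (d, e, f), (g, h, i) = A
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def char_coeffs(A):
    # Coefficients of the characteristic equation t^3 - S1 t^2 + S2 t - |A| = 0
    S1 = A[0][0] + A[1][1] + A[2][2]            # sum of diagonal elements
    S2 = ((A[1][1]*A[2][2] - A[1][2]*A[2][1])   # minor of a11
        + (A[0][0]*A[2][2] - A[0][2]*A[2][0])   # minor of a22
        + (A[0][0]*A[1][1] - A[0][1]*A[1][0]))  # minor of a33
    return S1, S2, det3(A)

A = [[1, 0, -1], [1, 2, 1], [2, 2, 3]]        # matrix of the worked example below
S1, S2, detA = char_coeffs(A)
p = lambda t: t**3 - S1*t**2 + S2*t - detA    # characteristic polynomial
```

Each eigen value must be a root of p; here p(1) = p(2) = p(3) = 0.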
Properties of Eigenvalues and Eigenvectors
• For a triangular or diagonal matrix, the eigen values are the diagonal elements.
• If λ1, λ2, …, λn are the eigen values of A, then 1/λ1, 1/λ2, …, 1/λn are the eigen
values of A⁻¹, and λ1ⁿ, λ2ⁿ, …, λnⁿ are the eigen values of Aⁿ.

• If matrix A is symmetric, then eigen vectors corresponding to distinct eigen values
are orthogonal, i.e. X1′X2 = 0.


• The trace of A, defined as the sum of the diagonal elements of A, equals the sum of
the eigen values of A.
• The product of the eigen values of matrix A = the determinant of matrix A.
• The set of all characteristic roots (eigen values) is called the spectrum of A.
• The eigen values of a symmetric matrix are real.
• The eigen values of a matrix A and of A′ are the same.
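A quick numerical check of three of these properties, as a sketch: the symmetric matrix and its eigen data below are taken from the orthogonal-transformation example later in this unit (eigen values 2, 3, 6).

```python
A = [[3, -1, 1], [-1, 5, -1], [1, -1, 3]]     # symmetric
eigs = [2, 3, 6]
X1, X2, X3 = [1, 0, -1], [1, 1, 1], [1, -2, 1]  # corresponding eigen vectors

dot = lambda u, v: sum(a*b for a, b in zip(u, v))
# Symmetric matrix => eigen vectors of distinct eigen values are orthogonal
orthogonal = dot(X1, X2) == dot(X1, X3) == dot(X2, X3) == 0

trace = A[0][0] + A[1][1] + A[2][2]           # trace = sum of eigen values
det = (A[0][0]*(A[1][1]*A[2][2] - A[1][2]*A[2][1])
     - A[0][1]*(A[1][0]*A[2][2] - A[1][2]*A[2][0])
     + A[0][2]*(A[1][0]*A[2][1] - A[1][1]*A[2][0]))
```

Here trace = 11 = 2 + 3 + 6 and det = 36 = 2 × 3 × 6, as the properties predict.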

Example : Find the eigen values and eigen vectors of the following matrix
1 0 −1
𝐴 = [1 2 1]
2 2 3

Sanjivani College of Engineering, Kopargaon (An Autonomous Institute) 2.1



Solution:
1 0 −1
Step I:- 𝐴 = [1 2 1 ]
2 2 3
1−𝜆 0 −1
𝐴 − 𝜆𝐼 = [ 1 2−𝜆 1 ]
2 2 3−𝜆
1−𝜆 0 −1
Step II:-|𝐴 − 𝜆𝐼| = 0 ↦ | 1 2−𝜆 1 |=0
2 2 3−𝜆
𝜆3 − 𝑆1 𝜆2 + 𝑆2 𝜆 − |𝐴| = 0
𝑆1=1+2+3 = 6
     |2 1|   |1 −1|   |1 0|
S2 = |2 3| + |2  3| + |1 2| = 4 + 5 + 2 = 11
|A| = 1(6 − 2) − 0(3 − 2) + (−1)(2 − 4) = 4 + 2 = 6
𝜆3 − 6𝜆2 + 11𝜆 − 6 = 0
∴ λ = 1, 2, 3 are the eigen values of A
Step III: (A − λI)X = 0, i.e.
[1−λ   0   −1 ] [x]   [0]
[ 1   2−λ   1 ] [y] = [0]
[ 2    2   3−λ] [z]   [0]
0 0 −1 𝑥 0
For 𝜆 = 1, [1 1 1 ] [𝑦] = [0] ⇒ 𝑥 + 𝑦 = 0, 𝑧 = 0
2 2 2 𝑧 0
−𝑘 −1
Let y = k, x = - k ,z = 0 so 𝑋1 = [ 𝑘 ] = 𝑘 [ 1 ]
0 0
−1 0 −1 𝑥 0
For 𝜆 = 2, [ 1 0 1 ] [𝑦] = [0] ⇒ 𝑥 + 𝑧 = 0,2𝑥 + 2𝑦 + 𝑧 = 0
2 2 1 𝑧 0
Let z = 2k, x = −2k, y = k, so
     [−2k]     [−2]
X2 = [ k ] = k [ 1]
     [ 2k]     [ 2]
−2 0 −1 𝑥 0
For 𝜆 = 3, [ 1 −1 1 ] [𝑦] = [0] ⇒ 2𝑥 + 𝑧 = 0, 2𝑥 + 2𝑦 = 0
2 2 0 𝑧 0
𝑘 1
Let x = k , y = -k, z = -2k so 𝑋3 = [− 𝑘 ] = 𝑘 [−1]
−2𝑘 −2
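Each eigen pair found above can be checked directly against the definition AX = λX. A small sketch (taking k = 1 in each eigen vector):

```python
def matvec(A, x):
    # multiply a 3x3 matrix by a column vector
    return [sum(A[i][j]*x[j] for j in range(3)) for i in range(3)]

A = [[1, 0, -1], [1, 2, 1], [2, 2, 3]]
# (eigen value, eigen vector) pairs from the example above
pairs = [(1, [-1, 1, 0]), (2, [-2, 1, 2]), (3, [1, -1, -2])]
checks = [matvec(A, X) == [lam*c for c in X] for lam, X in pairs]
```

All three checks hold, confirming the computed eigen pairs.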


Example: Find the eigen values and eigen vectors of the following matrix
1 1 1
𝐴 = [0 2 1]
0 0 3
Solution:-
1 1 1
Step I:- 𝐴 = [0 2 1 ]
0 0 3
The given matrix is an upper triangular matrix, so the diagonal elements are the eigen values of A.
∴ λ = 1, 2, 3 are the eigen values of A
To find eigen vector
Step II: (A − λI)X = 0, i.e.
[1−λ   1    1 ] [x]   [0]
[ 0   2−λ   1 ] [y] = [0]
[ 0    0   3−λ] [z]   [0]
For λ = 1,
[0 1 1] [x]   [0]
[0 1 1] [y] = [0] ⇒ y + z = 0, 2z = 0, so y = 0, z = 0
[0 0 2] [z]   [0]
                   [k]     [1]
Let x = k, so X1 = [0] = k [0]
                   [0]     [0]
−1 1 1 𝑥 0
For𝜆 = 2, [ 0 0 1] [𝑦] = [0] ⇒ 𝑧 = 0, −𝑥 + 𝑦 + 𝑧 = 0
0 0 1 𝑧 0
𝑘 1
Let x = k, y = k, z = 0 so𝑋2 = [𝑘] = 𝑘 [1]
0 0
−2 1 1 𝑥 0
For𝜆 = 3, [ 0 −1 1] [𝑦] = [0] ⇒ −2𝑥 + 𝑦 + 𝑧 = 0, −𝑦 + 𝑧 = 0
0 0 0 𝑧 0
𝑘 1
Let z = k , y = k, x = k so 𝑋3 = [𝑘] = 𝑘 [1]
𝑘 1
Example 1.16: Find the eigen values and eigen vectors of the following matrix
1 −6 −4
𝐴 = [0 4 2]
0 −6 −3
Solution:-
1 −6 −4
Step I: 𝐴 = [0 4 2]
0 −6 −3


1−𝜆 −6 −4
𝐴 − 𝜆𝐼 = [ 0 4−𝜆 2 ]
0 −6 −3 − 𝜆
1−𝜆 −6 −4
Step II:- |𝐴 − 𝜆𝐼| = 0 ↦ | 0 4−𝜆 2 |=0
0 −6 −3 − 𝜆
λ³ − S1λ² + S2λ − |A| = 0, with S1 = 2, S2 = 1, |A| = 0
λ³ − 2λ² + λ = 0 ⇒ λ(λ − 1)² = 0
∴ λ = 0, 1, 1 are the eigen values of A
1−𝜆 −6 −4 𝑥 0
Step III:- (𝐴 − 𝜆𝐼)𝑋 = 0 ↦ [ 0 4−𝜆 2 ] [𝑦 ] = [ 0]
0 −6 −3 − 𝜆 𝑧 0
1 −6 −4 𝑥 0
For 𝜆 = 0, [0 4 2 ] [𝑦] = [0]
0 −6 −3 𝑧 0
⇒ 𝑥 − 6𝑦 − 4𝑧 = 0,4𝑦 + 2𝑧 = 0, −6𝑦 − 3𝑧 = 0
Let z = 2k, y = −k, x = 2k, so
     [ 2k]     [ 2]
X1 = [−k ] = k [−1]
     [ 2k]     [ 2]
0 −6 −4 𝑥 0
For𝜆 = 1, [0 3 2 ] [𝑦] = [0]
0 −6 −4 𝑧 0
⇒ 3𝑦 + 2𝑧 = 0

Let z = k, x = t, y = −2k/3, so
    [x]   [  t  ]     [1]     [  0  ]
X = [y] = [−2k/3] = t [0] + k [−2/3]
    [z]   [  k  ]     [0]     [  1  ]
         [1]        [  0  ]
so X2 = [0] , X3 = [−2/3]
         [0]        [  1  ]
Exercise
1. Find the eigen values and eigen vectors for the lowest eigen value of the following
matrices
1) [ 4  6  6]        2) [ 4  2 −2]
   [ 1  3  2]           [−5  3  2]
   [−1 −4 −3]           [−2  4  1]
3) [0  2  0]         4) [1  0  0]
   [3 −2  3]            [0  3 −3]
   [0  3  0]            [0 −1  3]

5) [ 1.5  0    1  ]
   [−0.5  0.5 −0.5]
   [−0.5  0    0  ]

2. Find the eigen values and eigen vectors for the highest eigen value of the following
matrices
1) [ 1  1 −2]        2) [ 1  0 −4]
   [−1  2  1]           [ 0  5  4]
   [ 0  1 −1]           [−4  4  3]
3. Find the eigen values and eigen vectors of the following matrices
1) [1 1 1]        2) [0  1 0]        3) [14 −10]
   [0 2 1]           [0  0 1]           [ 5  −1]
   [0 0 3]           [1 −3 3]
4. Find the eigen values and eigen vectors for repeated eigen values of the following matrices
1) [ 4 −1  1]    2) [3  1  1]    3) [2 0 0]    4) [1 2 3]
   [−1  4 −1]       [1  3 −1]       [0 2 0]       [2 4 6]
   [ 1 −1  4]       [1 −1  3]       [0 0 2]       [3 6 9]
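For the "highest eigen value" exercises, answers can be checked numerically by power iteration, a method not covered in this unit and shown here only as a sketch; it assumes the eigen value of largest magnitude is unique. The matrix below is exercise 2.2, whose eigen values work out to 9, 3 and −3.

```python
def power_iteration(A, iters=100):
    # Repeatedly apply A and renormalize; the Rayleigh quotient converges
    # to the eigen value of largest magnitude (assumed unique here).
    n = len(A)
    x = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        y = [sum(A[i][j]*x[j] for j in range(n)) for i in range(n)]
        lam = sum(yi*xi for yi, xi in zip(y, x)) / sum(xi*xi for xi in x)
        norm = max(abs(c) for c in y)
        x = [c / norm for c in y]   # rescale to avoid overflow
    return lam

A = [[1, 0, -4], [0, 5, 4], [-4, 4, 3]]   # exercise 2.2 above
lam = power_iteration(A)
```

The iteration converges to 9, the dominant eigen value of this matrix.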

Some More Applications of the Eigenvalues and Eigenvectors of a square matrix

1. Communication systems: Eigenvalues were used by Claude Shannon to
determine the theoretical limit to how much information can be transmitted
through a communication medium like your telephone line or through the air.
This is done by calculating the eigenvectors and eigenvalues of the
communication channel (expressed as a matrix), and then water-filling on the
eigenvalues. The eigenvalues are then, in essence, the gains of the fundamental
modes of the channel, which themselves are captured by the eigenvectors.
2. Designing bridges: The natural frequency of the bridge is the eigenvalue of
smallest magnitude of a system that models the bridge. The engineers exploit
this knowledge to ensure the stability of their constructions. [Watch the video
on the collapse of the Tacoma Narrows Bridge, which was built in 1940.]
3. Designing car stereo system: Eigenvalue analysis is also used in the design of
the car stereo systems, where it helps to reproduce the vibration of the car due
to the music.
4. Electrical Engineering: The application of eigenvalues and eigenvectors is
useful for decoupling three-phase systems through symmetrical component
transformation.
5. Mechanical Engineering: Eigenvalues and eigenvectors allow us to "reduce" a
linear operation to separate, simpler, problems. For example, if a stress is
applied to a "plastic" solid, the deformation can be dissected into "principal
directions" - those directions in which the deformation is greatest. Vectors in the


principal directions are the eigenvectors and the percentage deformation in each
principal direction is the corresponding eigenvalue.
6. Oil companies frequently use eigenvalue analysis to explore land for oil. Oil,
dirt, and other substances all give rise to linear systems which have different
eigenvalues, so eigenvalue analysis can give a good indication of where oil
reserves are located. Oil companies place probes around a site to pick up the
waves that result from a huge truck used to vibrate the ground. The waves are
changed as they pass through the different substances in the ground. The
analysis of these waves directs the oil companies to possible drilling sites.
7. Eigenvalues are not only used to explain natural occurrences, but also to
discover new and better designs for the future. Some of the results are quite
surprising. If you were asked to build the strongest column that you could to
support the weight of a roof using only a specified amount of material, what
shape would that column take? Most of us would build a cylinder like most
other columns that we have seen. However, Steve Cox of Rice University and
Michael Overton of New York University proved, based on the work of J.
Keller and I. Tadjbakhsh, that the column would be stronger if it was largest at
the top, middle, and bottom. At points partway from either end, the
column could be smaller because the column would not naturally buckle there
anyway. Does that surprise you? This new design was discovered through the
study of the eigenvalues of the system involving the column and the weight
from above. Note that this column would not be the strongest design if any
significant pressure came from the side, but when a column supports a roof, the
vast majority of the pressure comes directly from above.
8. Very (very, very) roughly then, the eigenvalues of a linear mapping are a
measure of the distortion induced by the transformation, and the eigenvectors
tell you how the distortion is oriented. It is precisely this rough picture
which makes PCA (Principal Component Analysis, a statistical procedure)
very useful.
9. Using singular value decomposition for image compression. This is a note
explaining how you can compress an image by throwing away the small
eigenvalues of AAᵀ. It takes an 8 megapixel image of an Allosaurus and
shows how the image looks after compressing by
selecting 1, 10, 25, 50, 100 and 200 of the largest singular values.
10. Deriving Special Relativity is more natural in the language of linear algebra. In
fact, Einstein's second postulate really states that "Light is an eigenvector of the
Lorentz transform." This document goes over the full derivation in detail.
11. Spectral Clustering. Whether it's in plants and biology, medical imaging,
business and marketing, understanding the connections between fields on
Facebook, or even criminology, clustering is an extremely important part of
modern data analysis. It allows people to find important subsystems or patterns
inside noisy data sets. One such method is spectral clustering, which uses the
eigenvalues of the graph of a network. Even the eigenvector of the second


smallest eigenvalue of the Laplacian matrix allows us to find the two largest
clusters in a network.
12. Dimensionality Reduction/PCA. The principal components correspond to the
largest eigenvalues of AᵀA, and this yields the least-squares projection
onto a smaller-dimensional hyperplane; the eigenvectors become the axes
of the hyperplane. Dimensionality reduction is extremely useful in machine
learning and data analysis as it allows one to understand where most of the
variation in the data comes from.
13. Low rank factorization for collaborative prediction. This is what Netflix does (or
once did) to predict what rating you'll give a movie you have not yet
watched. It uses the SVD and throws away the smallest eigenvalues
of AᵀA.
14. The Google PageRank algorithm. The eigenvector corresponding to the largest
eigenvalue of the graph of the internet determines how the pages are ranked.
15. When you watch a movie on a screen (TV or movie theater), though the
picture you see is actually 2D, you do not lose much information from
the 3D real world it is capturing. That is because the principal eigen vector lies
close to the 2D plane in which the picture is being captured, and any small loss of
information (depth) is inferred automatically by our brain. (This is why we
usually take photos with the camera facing us directly, not from the
top of the head.) Each scene requires certain aspects of the image to be
enhanced; that is why the camera operator chooses a camera
angle that captures most of those visual aspects (apart from the colour of costumes,
background scene and background music).
16. If you eat pizza, french fries, or any food, you typically translate its
taste into sour, sweet, bitter, salty, hot, etc. - principal components of taste -
though in reality the way a food is prepared is formulated in terms of
ingredient ratios (sugar, flour, butter - tens or hundreds of things that go into
making a specific food). Our mind transforms all such
information into the principal components of taste (an eigen vector with sour,
bitter, sweet and hot components) automatically, along with the food's texture and smell. So we
use eigen vectors every day in many situations without realizing that this is how we
learn about a system more effectively. Our brain simply transforms all the
ingredients, cooking methods and the final food product into some very effective eigen
vector whose elements are taste sub-parts, smell and visual appearance.
(All the ingredients and their quantities, along with the cooking
procedure, represent some transformation matrix A, and we can find some
principal eigen vector(s) V with elements such as taste, smell, appearance and touch,
related by the linear transformation AV = wV, where w
represents a scalar eigen value and V an eigen vector. Top wine tasters probably
have a bigger taste-smell-appearance eigen vector, with much bigger
eigen values in each dimension. This concept can be extended to any field of
study.)


17. If we take pictures of a person from many angles (front, back, top, side) on a
daily basis and would like to measure the changes in the entire body as one
grows, we can get the most information from the front angle, with the axis of the
camera perpendicular to the line passing from the crown of the head to a point
between the feet. This axis/camera angle captures the most useful
information for measuring a person's outer physical body changes as age
progresses, so this axis becomes a principal eigen vector with the largest eigen
value. (Note: the data/images captured directly from the top of the
person give far less useful information than the camera directly
facing him/her in this situation. That is why we use the PCA (Principal
Component Analysis) technique to determine the most effective eigen vectors and
related eigen values that capture most of the needed information, without
bothering about all the remaining axes of data capture.) Hopefully this helps in
understanding why and how we use eigen vectors and eigen values for better
perception in whatever we do on a day-to-day basis. Eigen vectors represent those
axes of perception along which we can know, understand and perceive things
around us in a very effective way.
18. Finally, it boils down to the differences from person to person in
consciously or subconsciously building and refining such principal eigen vectors and
related eigen values in each field of learning; this is what differentiates one person from
another (e.g. musicians, artists, scientists, mathematicians, camera operators,
directors, teachers, doctors, engineers, parents, stock market brokers, weather
forecasters, ...).
19. In aeronautical engineering eigenvalues may determine whether the flow over a
wing is laminar or turbulent.
20. In electrical engineering they may determine the frequency response of an
amplifier or the reliability of a national power system.
21. In structural mechanics eigenvalues may determine whether an automobile is
too noisy or whether a building will collapse in an earthquake.
22. In probability they may determine the rate of convergence of a Markov process.
23. In ecology they may determine whether a food web will settle into a steady
equilibrium.
24. In numerical analysis they may determine whether a discretization of a
differential equation will get the right answer or how fast a conjugate gradient
iteration will converge.

Syllabus content:- Diagonalisation of a real matrix:-

A square matrix A of order n is diagonalizable if there exists a matrix P
such that P⁻¹AP = D, where D is a diagonal matrix and P is called the modal matrix.

Algebraic Multiplicity:-
Let A be 𝑛 × 𝑛 matrix with eigen value 𝜆. The algebraic multiplicity of 𝜆 is
the number of times λ is repeated as a root of the characteristic polynomial.

Geometric Multiplicity:-
Geometric multiplicity of an eigen value is the number of linearly independent
eigen vectors associated with it.
Note:-
1) Let A be 𝑛 × 𝑛 matrix with eigen value 𝜆.
The geometric multiplicity of 𝜆 = 𝑛 − 𝜌(𝐴 − 𝜆𝐼).
2) If algebraic multiplicity of each eigen value is equal to geometric multiplicity
then matrix is diagonalizable.
3) If all eigen values of matrix A are distinct then matrix A is diagonalizable.
4) Matrix P is formed by grouping the eigenvectors of A in square matrix form.
5) Diagonal matrix D is known as spectral matrix and it has eigen values of A as
its diagonal elements.
Example 1: Determine whether the matrix
    [1 1 3]
A = [1 5 1]
    [3 1 1]
is diagonalizable. If diagonalizable, find the modal matrix and diagonal matrix.
Solution: Characteristic equation is given by,
𝜆3 − 𝑆1 𝜆2 + 𝑆2 𝜆 − |𝐴| = 0
𝑆1 = 1 + 5 + 1 = 7
     |5 1|   |1 3|   |1 1|
S2 = |1 1| + |3 1| + |1 5| = 4 − 8 + 4 = 0
|𝐴| = 1(5 − 1) − 1(1 − 3) + 3(1 − 15) = 4 + 2 − 42
= −36
Therefore characteristic equation is
𝜆3 − 7𝜆2 + 36 = 0
Solving, we get
𝜆 = −2,3,6
Here all eigen values of matrix A are distinct
∴ Given matrix A is diagonalizable.
To find modal and diagonal matrix
consider
[𝐴 − 𝜆𝐼]𝑋 = 0
1−𝜆 1 3 𝑥
[ 1 5−𝜆 1 ] [ 𝑦 ]=0
3 1 1−𝜆 𝑧
For 𝜆 = −2
3 1 3 𝑥
[1 7 1] [𝑦] = 0
3 1 3 𝑧
3𝑥 + 𝑦 + 3𝑧 = 0
𝑥 + 7𝑦 + 𝑧 = 0
By Cramer’s rule,
   x        −y        z
 |1 3|  =  |3 3|  =  |3 1|
 |7 1|     |1 1|     |1 7|
i.e.  x/(−20) = −y/0 = z/20 = t (say)


𝑥 = −20𝑡, 𝑦 = 0, 𝑧 = 20𝑡
𝑥 −20𝑡 −1
[𝑦] = [ 0 ] = 20𝑡 [ 0 ]
𝑧 20𝑡 1
−1
∴ 𝑋1 = [ 0 ]
1
Similarly for 𝜆 = 3 and 𝜆 = 6 we get
−1 1
𝑋2 = [ 1 ] , 𝑋3 = [2]
−1 1
−1 −1 1
∴ Modal matrix 𝑃 = [ 0 1 2]
1 −1 1
−2 0 0
𝐷 = [ 0 3 0]
0 0 6
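The diagonalization above can be verified without computing P⁻¹, since P⁻¹AP = D is equivalent to AP = PD when P is invertible. A sketch of that check:

```python
def matmul(X, Y):
    # product of two 3x3 matrices
    return [[sum(X[i][k]*Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[1, 1, 3], [1, 5, 1], [3, 1, 1]]
P = [[-1, -1, 1], [0, 1, 2], [1, -1, 1]]   # modal matrix (columns X1, X2, X3)
D = [[-2, 0, 0], [0, 3, 0], [0, 0, 6]]     # spectral matrix
ok = matmul(A, P) == matmul(P, D)          # equivalent to P^{-1} A P = D
```

Each column of AP equals the corresponding eigen value times that column of P, so the two products agree exactly.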
Example: Find the matrix P which transforms the matrix
A = [2 1]
    [2 3]
into diagonal form.
Solution:
A = [2 1]
    [2 3]
Characteristic equation is given by,
𝜆2 − 𝑆1 𝜆 + |𝐴| = 0
𝑆1 = 2 + 3 = 5
|𝐴| = 6 − 2 = 4
∴ 𝜆2 − 5𝜆 + 4 = 0
𝜆 = 1, 4
Consider [A − λI]X = 0:
[2−λ   1 ] [x]   [0]
[ 2   3−λ] [y] = [0]
For λ = 1:
[1 1] [x]   [0]                  [ 1]
[2 2] [y] = [0]  This gives X1 = [−1]
For λ = 4:
[−2  1] [x]   [0]                [1]
[ 2 −1] [y] = [0]  This gives X2 = [2]

∴ Modal matrix P = [1  1]   and D = [4 0]
                   [2 −1]           [0 1]


Exercise

a) Determine whether the following matrices are diagonalizable. If diagonalizable,
find the modal matrix and diagonal matrix.
1) A = [ 0  1]
       [−6 −5]
2) A = [1 2]
       [3 2]
3) A = [3 2]
       [1 4]
4) A = [5 4]
       [1 2]
5) A = [−2  2 −3]
       [ 2  1 −6]
       [−1 −2  0]
6) A = [ 1 2 2]
       [ 0 2 1]
       [−1 2 2]
7) A = [ 1 1 −2]
       [−1 2  1]
       [ 0 1 −1]
8) A = [2 −2  2]
       [1  1  1]
       [1  3 −1]
9) A = [3 1 4]
       [0 2 6]
       [0 0 5]
10) A = [ 6 −2  2]
        [−2  3 −1]
        [ 2 −1  3]
11) A = [1 −2 0]
        [1  2 2]
        [1  2 3]
b) Diagonalise the matrix A, where
A = [ 9 −1  −]
    [ 3 −1  3]
    [−7  1 −7]
c) Diagonalise the matrix
A = [0 1 1]
    [1 0 1]
    [1 1 0]
d) Diagonalise the matrix
A = [ 8 −6  2]
    [−6  7 −4]
    [ 2 −4  3]
e) Diagonalise the matrix
[ 3 −1  1]
[−1  5 −1]
[ 1 −1  3]

Syllabus content:- Quadratic Forms: Positive definite

A homogeneous polynomial of the second degree in any number of variables
is called a quadratic form.
If a real quadratic form is
XᵀAX = a11x² + a22y² + a33z² + 2a23yz + 2a13xz + 2a12xy
then the matrix of the quadratic form is
𝑎11 𝑎12 𝑎13
𝐴 = [𝑎12 𝑎22 𝑎23 ]
𝑎13 𝑎23 𝑎33
Nature of quadratic form: A real quadratic form 𝑋 𝑇 𝐴𝑋 is said to be
i) Positive definite if all eigenvalues of 𝐴 > 0
ii) Negative definite if all eigenvalues of 𝐴 < 0
iii) Positive semidefinite if all eigenvalues of 𝐴 ≥ 0 (at least one eigenvalue
is 0)
iv) Negative semidefinite if all eigenvalues of 𝐴 ≤ 0 (at least one eigenvalue
is 0)
v) Indefinite if some of the eigenvalues are positive and some are negative.
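The five cases translate directly into a small helper function (the name `nature` is ours, used only for illustration):

```python
def nature(eigs):
    # Classify a quadratic form from the eigenvalues of its matrix A.
    if all(e > 0 for e in eigs):
        return "positive definite"
    if all(e < 0 for e in eigs):
        return "negative definite"
    if all(e >= 0 for e in eigs):
        return "positive semidefinite"    # at least one eigenvalue is 0
    if all(e <= 0 for e in eigs):
        return "negative semidefinite"    # at least one eigenvalue is 0
    return "indefinite"
```

For instance, the eigenvalues −2, 3, 6 of the first worked example below give "indefinite", while 2, 3, 6 give "positive definite".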

Example: Find the nature of the following quadratic form
x² + 5y² + z² + 2xy + 2yz + 6zx
Solution: The matrix of the quadratic form is
1 1 3
𝐴 = [1 5 1]
3 1 1
Characteristic equation is given by
𝜆3 − 𝑆1 𝜆2 + 𝑆2 𝜆 − |𝐴| = 0
𝑆1 = 1 + 5 + 1 = 7
𝑆2 = 4 − 8 + 4 = 0
|A| = 1(5 − 1) − 1(1 − 3) + 3(1 − 15) = 4 + 2 − 42 = −36
∴ 𝜆3 − 7𝜆2 + 36 = 0
∴ The eigenvalues are −2,3,6
Two eigenvalues are positive and one is negative
∴ The given quadratic form is indefinite
Example: Find the nature of the following quadratic form
3𝑥 2 + 5𝑦 2 + 3𝑧 2 − 2𝑦𝑧 + 2𝑧𝑥 − 2𝑥𝑦
Solution: The matrix of the quadratic form is
3 −1 1
𝐴 = [−1 5 −1]
1 −1 3
Characteristic equation is given by
λ³ − S1λ² + S2λ − |A| = 0
S1 = 3 + 5 + 3 = 11
S2 = 14 + 8 + 14 = 36
|A| = 36
∴ λ³ − 11λ² + 36λ − 36 = 0, giving λ = 2, 3, 6
All the eigenvalues being positive, given quadratic form is positive definite.


Exercise
State the nature of the following quadratic forms
1) 3𝑥 2 + 3𝑦 2 + 3𝑧 2 + 2𝑥𝑦 + 2𝑥𝑧 − 2𝑦𝑧
2) 4𝑥 2 + 3𝑦 2 + 𝑧 2 − 8𝑥𝑦 − 6𝑦𝑧 + 4𝑧𝑥
3) 8𝑥 2 + 7𝑦 2 + 3𝑧 2 − 12𝑥𝑦 − 8𝑦𝑧 + 4𝑧𝑥
4) 6𝑥 2 + 3𝑦 2 + 3𝑧 2 − 2𝑦𝑧 + 4𝑧𝑥 − 4𝑥𝑦
5) 𝑥 2 + 3𝑦 2 + 3𝑧 2 − 2𝑦𝑧
6) 2𝑥 2 − 6𝑥𝑦 + 𝑧 2
7) 5𝑥 2 + 26𝑦 2 + 2𝑦𝑧 + 6𝑥𝑧 + 4𝑥𝑦
8) 𝑥 2 + 2𝑦 2 + 3𝑧 2 − 4𝑦𝑧 + 6𝑥𝑦 + 2𝑥𝑧
9) 𝑥 2 − 2𝑦 2 − 3𝑧 2 + 5𝑥𝑦
10) 𝑥 2 + 2𝑧 2 + 8𝑦𝑧 + 6𝑥𝑦 + 4𝑥𝑧
11) 5𝑥 2 + 26𝑦 2 + 10𝑧 2 + 4𝑦𝑧 + 6𝑥𝑦 + 14𝑥𝑧
12) 8𝑥 2 + 7𝑦 2 + 3𝑧 2 − 8𝑦𝑧 + 12𝑥𝑦 + 4𝑥𝑧
13) −2𝑥 2 − 𝑦 2 − 3𝑧 2

Reduction of Quadratic form to canonical form by linear transformation:-

Definition: Linear Transformation:-
A transformation from (y1, y2, …, yn) to (x1, x2, …, xn) is defined as X = AY,
where A is the transformation matrix.

Definition: Congruent Matrices:-

A square matrix B of order n is said to be congruent to a matrix A of order n if
there exists a nonsingular matrix P such that B = PᵀAP.
Procedure:-
Step I: Write the matrix of the given quadratic form, say A.
Step II: Consider A = IAI and reduce the L.H.S. matrix A to diagonal form using row as
well as column transformations, without interchanging any two rows or columns.
Step III: Every row operation applied to matrix A must also be performed on the
pre-matrix I on the R.H.S., and similarly every column operation on the post-matrix I
on the R.H.S.
Step IV: Denote the reduced matrix A as D, the pre-matrix as Pᵀ and the post-matrix
as P.
We get 𝐷 = 𝑃𝑇 𝐴𝑃
Step V: Write the linear transformation X = PY.
The sum of squares is YᵀDY.
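Steps I-IV can be automated for a symmetric matrix whose pivots all stay nonzero, which is an assumption of this sketch (the function name is ours). Exact fractions avoid rounding error:

```python
from fractions import Fraction

def congruent_diagonalize(A):
    # Reduce symmetric A to diagonal D with paired row/column operations,
    # accumulating the column operations in the post-matrix P, so D = P^T A P.
    n = len(A)
    A = [[Fraction(x) for x in row] for row in A]
    P = [[Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for p in range(n - 1):
        for r in range(p + 1, n):
            m = A[r][p] / A[p][p]          # assumes pivot A[p][p] != 0
            for j in range(n):             # row operation R_r -> R_r - m R_p
                A[r][j] -= m * A[p][j]
            for i in range(n):             # mirrored column operation C_r -> C_r - m C_p
                A[i][r] -= m * A[i][p]
                P[i][r] -= m * P[i][p]     # record the column operation in P
    return A, P

A0 = [[6, -2, 2], [-2, 3, -1], [2, -1, 3]]  # matrix of the worked example below
D, P = congruent_diagonalize(A0)
```

On this matrix the routine reproduces D = diag(6, 7/3, 16/7) and the upper triangular P obtained in the worked example.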


Example. Express the following quadratic form in canonical form using linear
transformation 𝑄(𝑥) = 6𝑥12 + 3𝑥22 + 3𝑥32 − 4𝑥1 𝑥2 + 4𝑥1 𝑥3 − 2𝑥2 𝑥3
Solution:
Q(x) = [x1 x2 x3] [ 6 −2  2] [x1]
                  [−2  3 −1] [x2]
                  [ 2 −1  3] [x3]
    [ 6 −2  2]
A = [−2  3 −1]
    [ 2 −1  3]
Consider A = IAI:
[ 6 −2  2]   [1 0 0]   [1 0 0]
[−2  3 −1] = [0 1 0] A [0 1 0]
[ 2 −1  3]   [0 0 1]   [0 0 1]
R2 + (1/3)R1, R3 − (1/3)R1:
[6   −2    2 ]   [  1  0 0]   [1 0 0]
[0   7/3 −1/3] = [ 1/3 1 0] A [0 1 0]
[0  −1/3  7/3]   [−1/3 0 1]   [0 0 1]
C2 + (1/3)C1, C3 − (1/3)C1:
[6    0    0 ]   [  1  0 0]   [1 1/3 −1/3]
[0   7/3 −1/3] = [ 1/3 1 0] A [0  1    0 ]
[0  −1/3  7/3]   [−1/3 0 1]   [0  0    1 ]
R3 + (1/7)R2:
[6    0    0 ]   [  1   0  0]   [1 1/3 −1/3]
[0   7/3 −1/3] = [ 1/3  1  0] A [0  1    0 ]
[0    0  16/7]   [−2/7 1/7 1]   [0  0    1 ]
C3 + (1/7)C2:
[6    0    0 ]   [  1   0  0]   [1 1/3 −2/7]
[0   7/3   0 ] = [ 1/3  1  0] A [0  1   1/7]
[0    0  16/7]   [−2/7 1/7 1]   [0  0    1 ]
D = PᵀAP

    [1 1/3 −2/7]
P = [0  1   1/7]
    [0  0    1 ]
∴ The linear transformation is X = PY:
[x1]   [1 1/3 −2/7] [y1]
[x2] = [0  1   1/7] [y2]
[x3]   [0  0    1 ] [y3]
x1 = y1 + (1/3)y2 − (2/7)y3
x2 = y2 + (1/7)y3
x3 = y3
And the canonical form is
Q′(x) = YᵀDY
                  [6  0    0 ] [y1]
    = [y1 y2 y3]  [0 7/3   0 ] [y2]
                  [0  0  16/7] [y3]
    = 6y1² + (7/3)y2² + (16/7)y3²

Exercise
Express the following quadratic forms in canonical form using linear transformation
1. 𝑄(𝑥) = 10𝑥12 + 2𝑥22 + 5𝑥32 − 4𝑥1 𝑥2 − 10𝑥1 𝑥3 + 6𝑥2 𝑥3
2. 𝑄(𝑥) = 3𝑥12 + 2𝑥22 + 1𝑥32 + 4𝑥1 𝑥2 − 2𝑥1 𝑥3 + 6𝑥2 𝑥3
3. 𝑄(𝑥) = 2𝑥12 + 9𝑥22 + 6𝑥32 + 8𝑥1 𝑥2 + 6𝑥1 𝑥3 + 8𝑥2 𝑥3
4. 𝑄(𝑥) = 6𝑥12 + 3𝑥22 + 14𝑥32 + 4𝑥1 𝑥2 + 18𝑥1 𝑥3 + 4𝑥2 𝑥3
5. 𝑄(𝑥) = 3𝑥12 + 5𝑥22 + 3𝑥32 − 2𝑥1 𝑥2 + 2𝑥1 𝑥3 − 2𝑥2 𝑥3
6. 𝑄(𝑥) = 𝑥12 + 6𝑥22 + 18𝑥32 + 4𝑥1 𝑥2 + 8𝑥1 𝑥3 − 4𝑥2 𝑥3
7. 𝑄(𝑥) = 2𝑥12 + 𝑥22 − 3𝑥32 + 12𝑥1 𝑥2 − 4𝑥1 𝑥3 − 8𝑥2 𝑥3
8. 𝑄(𝑥) = 2𝑥12 + 7𝑥22 + 5𝑥32 − 8𝑥1 𝑥2 + 4𝑥1 𝑥3 − 10𝑥2 𝑥3
9. 𝑄(𝑥) = 3𝑥12 + 3𝑥22 + 3𝑥32 + 6𝑥1 𝑥2 + 2𝑥1 𝑥3 − 2𝑥2 𝑥3
10. 𝑄(𝑥) = 𝑥12 + 2𝑥22 − 7𝑥32 − 4𝑥1 𝑥2 + 8𝑥1 𝑥3
11. 𝑄(𝑥) = 10𝑥12 + 𝑥22 + 𝑥32 − 6𝑥1 𝑥2 + 6𝑥1 𝑥3 − 2𝑥2 𝑥3


Reduction to Canonical form by Orthogonal Transformation:-


Orthogonal Transformation:-
A transformation X=PY is said to be orthogonal if and only if matrix P is orthogonal.
Step I: From the quadratic form, write the matrix of the quadratic form, A.
Step II: Find all eigen values and orthogonal eigen vectors of matrix A. We get the
modal matrix
    [x1 y1 z1]                          [λ1  0  0]
M = [x2 y2 z2]  and diagonal matrix D = [ 0 λ2  0]
    [x3 y3 z3]                          [ 0  0 λ3]
Step III: To find matrix P, normalize each column vector of M, i.e. divide each eigen
vector by its length; for example, the first column of P is
[x1, x2, x3]ᵀ / √(x1² + x2² + x3²)
and similarly for the second and third columns.

The orthogonal transformation is X = PY, and then Q′ = YᵀDY.


Example: Reduce the following quadratic form to canonical form by orthogonal
transformation
𝑄(𝑥) = 3𝑥 2 + 5𝑦 2 + 3𝑧 2 − 2𝑥𝑦 + 2𝑥𝑧 − 2𝑦𝑧
Solution:-𝑄(𝑥) = 𝑋 𝑇 𝐴𝑋
3 −1 1
𝐴 = [−1 5 −1]
1 −1 3
Characteristic equation is 𝜆3 − 𝑆1 𝜆2 + 𝑆2 𝜆 − |𝐴| = 0
𝑆1 = 11, 𝑆2 = 36, |𝐴| = 36
𝜆3 − 11𝜆2 + 36𝜆 − 36 = 0
𝜆 = 2,3,6
For 𝜆 = 2 (𝐴 − 𝜆𝐼)𝑋 = 0
1 −1 1 𝑥 0
[−1 3 −1] [𝑦] = [0]
1 −1 1 𝑧 0
𝑥−𝑦+𝑧 = 0
−𝑥 + 3𝑦 − 𝑧 = 0

1
𝑋1 = [ 0 ]
−1
For λ = 3, (A − λI)X = 0
[ 0 −1  1] [x]   [0]
[−1  2 −1] [y] = [0]
[ 1 −1  0] [z]   [0]
−y + z = 0
−x + 2y − z = 0
     [1]
X2 = [1]
     [1]

For 𝜆 = 6 (𝐴 − 𝜆𝐼)𝑋 = 0
−3 −1 1 𝑥 0
[−1 −1 −1] [𝑦] = [0]
1 −1 −3 𝑧 0
−3𝑥 − 𝑦 + 𝑧 = 0
−𝑥 − 𝑦 − 𝑧 = 0
1
𝑋3 = [−2]
1
2 0 0 1 1 1
∴ 𝐷 = [0 3 0], 𝑀=[ 0 1 −2]
0 0 6 −1 1 1
The orthogonal transformation X = PY is
[x1]   [ 1/√2  1/√3   1/√6] [y1]
[x2] = [  0    1/√3  −2/√6] [y2]
[x3]   [−1/√2  1/√3   1/√6] [y3]
x1 = (1/√2)y1 + (1/√3)y2 + (1/√6)y3
x2 = (1/√3)y2 − (2/√6)y3
x3 = −(1/√2)y1 + (1/√3)y2 + (1/√6)y3


and 𝑄 ′ (𝑥) = 𝑌 𝑇 𝐷𝑌 = 2𝑦12 + 3𝑦22 + 6𝑦32
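A numerical check of this result: with P built from the normalized eigen vectors, PᵀP should be the identity and PᵀAP the diagonal matrix of eigen values, up to floating-point rounding.

```python
import math

A = [[3, -1, 1], [-1, 5, -1], [1, -1, 3]]
s2, s3, s6 = math.sqrt(2), math.sqrt(3), math.sqrt(6)
# Columns of P: normalized eigen vectors for eigen values 2, 3, 6
P = [[ 1/s2, 1/s3,  1/s6],
     [ 0,    1/s3, -2/s6],
     [-1/s2, 1/s3,  1/s6]]

def matmul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

T = [[P[j][i] for j in range(3)] for i in range(3)]   # P transpose
I3 = matmul(T, P)                                     # should be the identity
D = matmul(matmul(T, A), P)                           # should be diag(2, 3, 6)
```

Both products come out as expected to machine precision, confirming that P is orthogonal and that Q′ = 2y1² + 3y2² + 6y3².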

Exercise
1. 𝑄(𝑥) = 𝑥12 + 3𝑥22 + 3𝑥32 − 2𝑥2 𝑥3
2. 𝑄(𝑥) = 2(𝑥12 + 𝑥1 𝑥2 + 𝑥22 )
3. 𝑄(𝑥) = 7𝑥12 + 10𝑥22 + 7𝑥32 − 4𝑥1 𝑥2 + 2𝑥1 𝑥3 + 4𝑥2 𝑥3
4. 𝑄(𝑥) = 2𝑥12 + 2𝑥22 + 2𝑥32 − 2𝑥1 𝑥3
5. 𝑄(𝑥) = 2𝑥12 + 2𝑥22 − 𝑥32 − 8𝑥1 𝑥2 + 4𝑥1 𝑥3 − 4𝑥2 𝑥3
6. 𝑄(𝑥) = 7𝑥12 − 8𝑥22 − 8𝑥32 + 8𝑥1 𝑥2 − 8𝑥1 𝑥3 − 2𝑥2 𝑥3
7. 𝑄(𝑥) = 3𝑥12 − 2𝑥22 −𝑥32 − 4𝑥1 𝑥2 + 8𝑥1 𝑥3 + 12𝑥2 𝑥3
8. 𝑄(𝑥) = 8𝑥12 + 7𝑥22 + 3𝑥32 − 12𝑥1 𝑥2 + 4𝑥1 𝑥3 − 8𝑥2 𝑥3
9. 𝑄(𝑥) = 𝑥12 + 4𝑥22 + 9𝑥32 + 4𝑥1 𝑥2 + 6𝑥1 𝑥3 + 12𝑥2 𝑥3
