Orthogonal Trans 2
Definition: A linear transformation T : R^n → R^n is orthogonal if |T(~x)| = |~x| for all ~x ∈ R^n.
Theorem: If T : R^n → R^n is orthogonal, then ~x · ~y = T~x · T~y for all vectors ~x and ~y in R^n.
Discuss with your table the geometric intuition of each of these statements. Why do they make
sense, for example, in R^3?
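As a quick numerical sanity check of both statements (a Python/NumPy sketch, not part of the worksheet): a rotation about the ~e3-axis in R^3 is orthogonal, and it preserves both lengths and dot products. The angle 0.7 and the random test vectors are arbitrary choices.

```python
import numpy as np

# Rotation about the e3-axis in R^3 (an arbitrary example angle):
# rotations are orthogonal transformations.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

rng = np.random.default_rng(0)
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# |T(x)| = |x|: lengths are preserved (the Definition).
assert np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x))
# T(x) . T(y) = x . y: dot products are preserved too (the Theorem).
assert np.isclose((A @ x) @ (A @ y), x @ y)
```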
B. Suppose T : R^2 → R^2 is given by left multiplication by A = [a c; b d]. Assuming T is orthogonal,
illustrate an example of such a T by showing what it does to a unit square. Label your picture
with a, b, c, d. What do you think det A can be?
Solution note: Your picture should show a square with side length 1 and vertices (0, 0), (a, b), (c, d),
and (a + c, b + d). The determinant of A is either 1 or −1, since |det A| is the area of
this square.
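The claim det A = ±1 can be spot-checked numerically (a Python/NumPy sketch, not part of the worksheet; the angle t = 0.3 is an arbitrary choice): a rotation has determinant +1 and a reflection has determinant −1.

```python
import numpy as np

# For an orthogonal 2x2 matrix, |det A| = 1: the unit square maps to
# a square of area 1. A rotation gives det = +1, a reflection det = -1.
t = 0.3
rotation   = np.array([[np.cos(t), -np.sin(t)],
                       [np.sin(t),  np.cos(t)]])
reflection = np.array([[np.cos(t),  np.sin(t)],
                       [np.sin(t), -np.cos(t)]])

assert np.isclose(np.linalg.det(rotation), 1.0)
assert np.isclose(np.linalg.det(reflection), -1.0)
```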
Solution note: Yes, the columns are orthonormal, since (cos θ)^2 + (−sin θ)^2 = 1
and (cos θ)(−sin θ) + (sin θ)(cos θ) = 0. Multiplication by A is rotation through an
angle of θ (counterclockwise), so it preserves lengths and hence is orthogonal.
2. Is B = [3/5 4/5 0; −4/5 3/5 0; 0 0 1] orthogonal? Is B^T orthogonal? Is the map "multiplication by B"
orthogonal? Why? Describe it geometrically.
Solution note: Yes, again, just check that the dot product of the columns is either 1
or 0, depending on whether we dot a column with itself or a different column. The same
is true for B^T. Multiplication by B is orthogonal; in fact it is a rotation around
the ~e3-axis.
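One can verify the orthonormality claims for this B directly (a Python/NumPy sketch, not part of the worksheet): entry (i, j) of B^T B is the dot product of column i with column j, so B^T B = I says exactly that the columns are orthonormal.

```python
import numpy as np

# The matrix B from problem 2.
B = np.array([[ 3/5, 4/5, 0],
              [-4/5, 3/5, 0],
              [ 0,   0,   1]])

# B^T B = I: the columns of B are orthonormal.
assert np.allclose(B.T @ B, np.eye(3))
# B B^T = I: the rows of B are orthonormal, so B^T is orthogonal too.
assert np.allclose(B @ B.T, np.eye(3))
```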
D. Prove the following Theorem: Let T : R^n → R^n be the linear transformation T(~x) = A~x, where
A is an n × n matrix. Then T is orthogonal if and only if the matrix A has orthonormal columns.
[Hint: Scaffold first. What are the two things you need to show?]
Solution note: We need to show two things:
1). If T is orthogonal, then A has orthonormal columns. Assume T is orthogonal.
The columns of A are [T (~e1 ) T (~e2 ) . . . T (~en )]. To check they are orthonormal, we
need two things: T(~ei) · T(~ei) = 1 for all i, and T(~ei) · T(~ej) = 0 if i ≠ j. For the
first, note
T(~ei) · T(~ei) = ||T(~ei)||^2 = ||~ei||^2 = 1,
with the first equality coming from the definition of the length of the vector and the
second coming from the definition of T being orthogonal. For the second, T (~ei ) ·
T (~ej ) = ~ei · ~ej by the Theorem above. This is zero since the standard basis is
orthonormal.
2). Assume A has orthonormal columns. We need to show that for any ~x ∈ Rn ,
||T (~x)|| = ||~x||. Since the length is always a non-negative number, it suffices to show
||T(~x)||^2 = ||~x||^2. That is, it suffices to show T(~x) · T(~x) = ~x · ~x. For this, we take an
arbitrary ~x and write it in the basis {~e1 , . . . , ~en }. Note that
~x · ~x = (x1~e1 + · · · + xn~en) · (x1~e1 + · · · + xn~en) = Σ_{i,j} (xi~ei) · (xj~ej) = x1^2 + · · · + xn^2,
where the second equality uses basic properties of the dot product (like "foil"), and
the third uses the fact that the ~ei are orthonormal, so that (xi~ei) · (xj~ej) = 0
if i ≠ j. On the other hand, writing T(~x) = x1 T(~e1) + · · · + xn T(~en) by linearity,
the same computation gives
T(~x) · T(~x) = Σ_{i,j} (xi T(~ei)) · (xj T(~ej)) = x1^2 + · · · + xn^2,
with the last equality coming from the fact that the T(~ei)'s are the columns of A and
hence orthonormal. So T(~x) · T(~x) = ~x · ~x, as needed. QED.
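Both directions of the theorem can be spot-checked numerically (a Python/NumPy sketch, not part of the worksheet; the matrix Q below is a random matrix with orthonormal columns produced by a QR factorization, an arbitrary choice for the check):

```python
import numpy as np

# Q has orthonormal columns (Q^T Q = I), so multiplication by Q
# should preserve the length of every vector, per the theorem.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

assert np.allclose(Q.T @ Q, np.eye(4))   # orthonormal columns
for _ in range(5):
    x = rng.standard_normal(4)
    assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```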
G. Prove that the rows of an orthogonal matrix are also orthonormal. [Hint: don’t forget that
the rows of A are the columns of A^T. Remember that the inverse of an orthogonal map is also
orthogonal.]
Solution note: Say A is orthogonal. Then the map T_A is orthogonal. Hence its inverse
is orthogonal, and so the matrix of the inverse, which is A^{-1}, is orthogonal. By the
previous problem, we know also that A^{-1} = A^T, so A^T is orthogonal. Since the columns
of A^T are orthonormal, the rows of A are orthonormal.
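A numerical illustration of this argument (a Python/NumPy sketch with a randomly generated orthogonal A, not part of the worksheet): the inverse of an orthogonal matrix is its transpose, and A A^T = I says exactly that the rows of A are orthonormal.

```python
import numpy as np

# A random orthogonal matrix A (orthonormal columns via QR).
rng = np.random.default_rng(2)
A, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# The inverse of an orthogonal matrix is its transpose...
assert np.allclose(np.linalg.inv(A), A.T)
# ...and A A^T = I means the rows of A are orthonormal.
assert np.allclose(A @ A.T, np.eye(3))
```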
H. Prove the very important Theorem on the first page saying that orthogonal transformations
preserve dot products. Why does this tell us that orthogonal transformations preserve angles?
[Hint: consider x + y.]
Solution note: Assume T is orthogonal, so ||T(x + y)|| = ||x + y|| for all x, y ∈ R^n,
by definition of orthogonal. Since T is linear, T(x + y) = T(x) + T(y), and expanding
both squared lengths gives
||T(x + y)||^2 = (T(x) + T(y)) · (T(x) + T(y)) = ||T(x)||^2 + 2 T(x) · T(y) + ||T(y)||^2
and
||x + y||^2 = (x + y) · (x + y) = ||x||^2 + 2 x · y + ||y||^2.
That is,
||T(x)||^2 + 2 T(x) · T(y) + ||T(y)||^2 = ||x||^2 + 2 x · y + ||y||^2.
Because T is orthogonal, ||T (x)|| = ||x|| and ||T (y)|| = ||y||, so we can cancel these
from both sides to get 2T (x)·T (y) = 2x·y. Dividing by 2, we see that T (x)·T (y) = x·y.
The proof is complete. This tells us that orthogonal transformations preserve angles
because cos θ = (x · y)/(||x|| ||y||): since T preserves both dot products and lengths,
it preserves the angle between any two nonzero vectors.
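Angle preservation can be illustrated numerically (a Python/NumPy sketch, not part of the worksheet; the helper `angle` and the random orthogonal Q are arbitrary choices for the check):

```python
import numpy as np

def angle(u, v):
    """Angle between two vectors via cos(theta) = u.v / (|u||v|)."""
    return np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Since an orthogonal Q preserves dot products and lengths, it
# preserves the cosine above, hence the angle itself.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal Q
x = rng.standard_normal(3)
y = rng.standard_normal(3)

assert np.isclose(angle(Q @ x, Q @ y), angle(x, y))
```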
Solution note: We have A = QR, where Q is the orthogonal matrix above and
R = [5 1 1; 0 10 −1; 0 0 5].
5. Use your QR factorization to quickly solve the system A~x = [0 0 25]^T without row
reducing!
Solution note: To solve A~x = QR~x = ~b, multiply both sides by Q^{-1}, which is Q^T
since Q is orthogonal. We get the equivalent system R~x = Q^T~b = [15 0 20]^T. This is
easy to solve because R is upper triangular. The bottom row gives 5z = 20, so z = 4. Then
the second row gives 10y − 4 = 0, so y = 2/5. The top row gives 5x + 2/5 + 4 = 15,
so 5x = 53/5 and x = 53/25.
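The back-substitution can be carried out in code (a Python/NumPy sketch, not part of the worksheet, using the R and the right-hand side Q^T~b = [15 0 20]^T from the solution note; Q itself appears earlier in the worksheet). Note the top row yields 5x = 53/5, i.e. x = 53/25.

```python
import numpy as np

# Upper-triangular R and right-hand side Q^T b from the solution note.
R = np.array([[5, 1,  1],
              [0, 10, -1],
              [0, 0,  5]], dtype=float)
b = np.array([15, 0, 20], dtype=float)

# Back-substitution: solve from the bottom row up.
x = np.zeros(3)
for i in range(2, -1, -1):
    x[i] = (b[i] - R[i, i+1:] @ x[i+1:]) / R[i, i]

assert np.allclose(R @ x, b)
assert np.isclose(x[2], 4)      # z = 4
assert np.isclose(x[1], 2/5)    # y = 2/5
assert np.isclose(x[0], 53/25)  # x = 53/25
```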
G. TRUE OR FALSE. Justify. In all problems, T denotes a linear transformation from Rn to itself,
and A is its matrix in the standard basis.
2. If T sends every pair of orthogonal vectors to another pair of orthogonal vectors, then T is
orthogonal.
7. The product of two orthogonal matrices (of the same size) is orthogonal.