
111 TRUE/FALSE QUESTIONS

Chapter 1: Systems of Linear Equations

(1) A system of 3 linear equations in 2 unknowns must have no solution

(2) A system of 2 linear equations in 3 unknowns could have exactly one solution

(3) A system of linear equations could have exactly two solutions

(4) If there’s a pivot in every row of A, then Ax = b is consistent for every b

(5) If the augmented matrix has a pivot in the last column, then Ax = b is inconsistent

(6) If A has a row of zeros, then Ax = b is inconsistent for all b

(7) Ax = 0 is always consistent

(8) If {u, v, w} is linearly dependent, then {Au, Av, Aw} is also linearly dependent for every A

(9) If {u, v, w} is linearly independent and {v, w, p} is linearly independent, then so is {u, v, w, p}

(10) If {u, v, w} is linearly dependent, then u is in the span of {v, w}

(11) If {u, v, w} is linearly dependent and {u, v} is linearly independent, then w is in the span of {u, v}

(12) If T is a linear transformation from R² to R³, then the matrix A of T is a 2 × 3 matrix

Date: Thursday, March 14, 2019.



(13) If T(cu) = cT(u) for every real number c, then T is a linear transformation

(14) If T(u + v) = T(u) + T(v) for all u and v, then T is a linear transformation

(15) If T(u + cv) = T(u) + cT(v) for all u and v and every real number c, then T is a linear transformation

(16) If T : R² → R³ is linear, then T cannot be onto R³

(17) If T is one-to-one and {u, v, w} is linearly independent, then {T(u), T(v), T(w)} is also linearly independent
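Claims like (4)–(7) can be probed numerically. The sketch below is not part of the original question sheet, and the particular system is an arbitrary illustrative choice; it uses the rank criterion: Ax = b is consistent exactly when rank(A) equals the rank of the augmented matrix [A | b].

```python
import numpy as np

# A hypothetical 3x2 system (3 equations, 2 unknowns), chosen so the
# three lines happen to meet in a single point -- a reminder that
# question (1) asks about "must", not "can".
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 3.0, 5.0])  # consistent here: x = (2, 3)

# Rouche-Capelli criterion: Ax = b is consistent
# iff rank(A) == rank([A | b]).
rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
consistent = (rank_A == rank_Ab)
```

Changing the last entry of b to anything other than 5 makes the two ranks differ, which is the situation described in question (5).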

Chapter 2: Matrix Algebra

(18) AB + BᵀAᵀ is always symmetric

(19) Any matrix A can be written as the sum of a symmetric (Aᵀ = A) and an antisymmetric (Aᵀ = −A) matrix

(20) (AB)⁻¹ = A⁻¹B⁻¹

(21) If AB = AC, then B = C


 
(22) The 3 × 3 matrix [1 2 3; 2 4 6; 3 6 9] is not invertible
(23) If AB = I for some B, then A is invertible

(24) A 3 × 2 matrix could be invertible

(25) A 2 × 3 matrix could be invertible

(26) If AB is invertible, then A and B are invertible

(27) Same, but this time A and B are square

(28) If Nul(A) = {0}, then A is invertible



(29) Every linear transformation T : Rⁿ → Rᵐ has a matrix

(30) If T : Rⁿ → Rⁿ is one-to-one, then T is onto Rⁿ

(31) The row-operations that transform A to I also transform I to A⁻¹

(32) If A is square and Ax = 0 implies x = 0, then A is row-equivalent to the identity matrix
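Identities like (18) can be explored by plugging in random matrices; such a test can refute a claim but never proves it. The sketch below is my addition, not part of the sheet:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Question (18): is A@B + B^T @ A^T symmetric?  Note that
# (A@B).T == B.T @ A.T, so M has the form S + S.T.
M = A @ B + B.T @ A.T
is_symmetric = np.allclose(M, M.T)
```

The same random-matrix experiment quickly produces a counterexample for (20), where (AB)⁻¹ and A⁻¹B⁻¹ generally differ.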

Chapter 3: Determinants

(33) In general, det(2A) = 2 det(A)

(34) det(A + B) = det(A) + det(B)

(35) If det(A²) + 2 det(A) + det(I) = 0, then A is invertible

(36) det(A⁻¹) = −det(A)

(37) If A¹⁰⁰ is invertible, then A is invertible

(38) If det(A) = 1 and A has only integer entries, then A⁻¹ has integer entries

(39) If det(A) = 1 and A and b have only integer entries, then the solution x to Ax = b has only integer entries
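For determinant questions such as (33), a small concrete example often settles the "in general" part. A sketch (my addition; the matrix is chosen arbitrarily):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])  # det(A) = 1*4 - 2*3 = -2

# For an n x n matrix, det(cA) = c**n * det(A).  Here n = 2, so
# det(2A) = 4 * det(A), not 2 * det(A) -- relevant to question (33).
d = np.linalg.det(A)
d2 = np.linalg.det(2 * A)
ratio = d2 / d
```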

Chapter 4: Vector Spaces and Subspaces

(40) {(x, y) ∈ R² | x² + y² = 0} is a subspace of R²

(41) The union of two subspaces of V is still a subspace of V

(42) The intersection of two subspaces of V is still a subspace of V

(43) Given any basis B of V and a subspace W of V, there is a subset of B that is a basis of W

(44) R² is a subspace of R³
   
(45) Nul([1 0; 0 0]) = Span{[1; 0]}
(46) For a fixed b ≠ 0, the set of solutions to Ax = b is a subspace of Rⁿ

(47) If A is a 4 × 6 matrix with 2 pivot columns, then Nul(A) = R⁴

(48) If A is m × n and has n pivot columns, then Nul(A) = {0}

(49) If A is m × n and has n pivot columns, then Col(A) = Rᵐ

(50) If A is row-equivalent to B, then Col(A) = Col(B)

(51) Rank(A²) = Rank(A)

(52) The set W of polynomials of degree n is a subspace of the set V of polynomials (of any degree)

(53) If A is 5 × 9, then Nul(A) is at least 4-dimensional

(54) {cos²(t), sin²(t), cos(2t)} is linearly dependent
(55) Z is a subspace of R

(56) If W is a subset of V such that 0 (the zero vector in V) is in W and W is closed under addition, then W is a subspace of V

(57) If 0 is in W and W is closed under scalar multiplication, then W is a subspace of V

(58) If W is closed under addition and scalar multiplication, then W is a subspace of V

(59) A vector space V is always a subspace of something

(60) If A is row-equivalent to B, then the pivot columns of B form a basis for Col(A)

(61) Row-operations preserve the span of the columns of a matrix



(62) Row-operations preserve the linear independence relations of the columns of a matrix

(63) If B spans a space V , then there is a subset of B that is a basis for V

(64) If B = {v1, …, vn} is a linearly independent subset of an n-dimensional vector space V, then B is a basis for V

(65) If B = {v1, …, vn} is a spanning subset of an n-dimensional vector space V, then B is a basis for V

(66) dim(P₄) = 4

(67) If B is a basis for Rⁿ and P is a matrix with the vectors of B as its columns, then Px = [x]_B (the coordinates of x with respect to B)
 
(68) If B = {b1, …, bn} and C are bases of Rⁿ and P = [ [b1]_C … [bn]_C ], then P[x]_C = [x]_B

(69) If B = {e1, …, en} is the standard basis of Rⁿ, then [x]_B = x

(70) Rank(A) = Rank(Aᵀ)
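Rank and nullity claims such as (47), (48), and (70) can be checked with np.linalg.matrix_rank together with the rank–nullity theorem (dim Nul(A) = n − rank(A) for an m × n matrix). A sketch, not from the sheet, using a hypothetical 4 × 6 matrix with exactly 2 pivot columns:

```python
import numpy as np

# A 4 x 6 matrix with exactly 2 pivot columns, as in question (47):
# rank 2, so dim Nul(A) = 6 - 2 = 4 -- a 4-dimensional subspace of R^6.
A = np.zeros((4, 6))
A[0, 0] = 1.0
A[1, 1] = 1.0

rank = np.linalg.matrix_rank(A)
rank_T = np.linalg.matrix_rank(A.T)   # question (70): rank(A) vs rank(A^T)
nullity = A.shape[1] - rank           # rank-nullity theorem
```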

Chapter 5: Eigenvalues and Eigenvectors

(71) A 3 × 3 matrix with eigenvalues λ = 1, 2, 4 must be diagonalizable

(72) A 3 × 3 matrix with eigenvalues λ = 1, 1, 2 is never diagonalizable

(73) Every matrix is diagonalizable

(74) If A is similar to B, then det(A) = det(B)

(75) If A is similar to B, then A and B have the same eigenvalues

(76) If A is diagonalizable, then det(A) is the product of the eigenvalues of A

(77) If A is similar to B, then A and B have the same eigenvectors



(78) If A is invertible, then A is diagonalizable

(79) If A is diagonalizable, then A is invertible

(80) If A is similar to B, then A² is similar to B²

(81) If A is diagonalizable and invertible, then A⁻¹ is diagonalizable

(82) If λ = 0 is an eigenvalue of A, then A is not invertible

(83) (Nonzero) eigenvectors corresponding to different eigenvalues of A are linearly independent

(84) Every matrix has a real eigenvalue

(85) Every matrix has a complex eigenvalue

(86) If the characteristic polynomial of A is λ² − 3λ + 2 = 0, then A² − 3A + 2I = O (the zero matrix)
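Question (86) is an instance of the Cayley–Hamilton theorem: a matrix satisfies its own characteristic polynomial. For a 2 × 2 matrix the characteristic polynomial is λ² − tr(A)λ + det(A), which makes the claim easy to verify numerically. In this sketch (my addition) the matrix is an arbitrary example with eigenvalues 1 and 2:

```python
import numpy as np

# Upper-triangular, so the eigenvalues 2 and 1 are on the diagonal;
# characteristic polynomial: l^2 - 3l + 2, as in question (86).
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])

tr, det = np.trace(A), np.linalg.det(A)
# Cayley-Hamilton: A^2 - tr(A)*A + det(A)*I should be the zero matrix.
residual = A @ A - tr * A + det * np.eye(2)
is_zero = np.allclose(residual, 0.0)
```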

Chapter 6: Orthogonality and Least-Squares

(87) If x̂ is the orthogonal projection of x on a subspace W, then x̂ is perpendicular to x

(88) The orthogonal projection of x̂ on W is x̂ itself (projecting twice gives the same result)

(89) The orthogonal projection of x on W ⊥ is x − x̂

(90) Every (nonzero) subspace W has an orthonormal basis

(91) W ∩ W⊥ = {0}

(92) AAᵀx is the projection of x on Col(A)

(93) Same, but the columns of A are orthonormal

(94) Rank(AᵀA) = Rank(A)

(95) If Q is an orthogonal matrix, then Q is invertible



(96) If Q is a matrix with orthonormal columns, then ‖Qx‖ = ‖x‖

(97) An orthogonal set without the zero-vector is linearly independent


(98) The orthogonal projection of v on W = Span{u} is ((u·v)/(v·v)) v

(99) An orthogonal matrix has orthogonal columns

(100) If x̂ is a least-squares solution of Ax = b, then x̂ is the orthogonal projection of x on Col(A).

(101) If x̂ is a least-squares solution of Ax = b, then Ax̂ is the point on Col(A) that is closest to b

(102) Ax = b has only one least-squares solution

(103) If ‖u + v‖² = ‖u‖² + ‖v‖², then u is orthogonal to v (assume that everything is real)
(104) ∫₀¹ f(x)g(x) dx ≤ (∫₀¹ f(x)² dx)^(1/2) (∫₀¹ g(x)² dx)^(1/2)

(105) The product of two orthogonal matrices (if it’s defined) is orthogonal

(106) Col(A) is orthogonal to Nul(Aᵀ)
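The geometry behind (100)–(106) can be explored with np.linalg.lstsq: the residual b − Ax̂ of a least-squares solution lies in Nul(Aᵀ), i.e. it is orthogonal to Col(A). A sketch (my addition) using an arbitrary inconsistent system:

```python
import numpy as np

# An overdetermined system whose right-hand side is not in Col(A),
# so Ax = b has no exact solution and least squares is needed.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 0.0, 2.0])

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ x_hat
# Normal equations A^T A x_hat = A^T b  <=>  A^T (b - A x_hat) = 0,
# so the residual is orthogonal to every column of A.
orthogonal_to_col = np.allclose(A.T @ residual, 0.0)
```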

Chapter 7: Symmetric Matrices

(107) If A is symmetric, then eigenvectors corresponding to different eigenvalues of A are orthogonal

(108) A symmetric matrix has only real eigenvalues

(109) Linearly independent eigenvectors of a symmetric matrix are orthogonal

(110) If A is symmetric, then it is orthogonally diagonalizable

(111) If A is orthogonally diagonalizable, then it is symmetric
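The spectral theorem behind (107)–(111) can be seen numerically with np.linalg.eigh, which is specialized to symmetric (Hermitian) matrices and returns real eigenvalues with orthonormal eigenvectors. The matrix below is an arbitrary symmetric example, not from the sheet:

```python
import numpy as np

# A real symmetric matrix; by the spectral theorem it is orthogonally
# diagonalizable: S = Q diag(eigenvalues) Q^T with Q orthogonal.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, Q = np.linalg.eigh(S)   # eigenvalues in ascending order
Q_orthogonal = np.allclose(Q.T @ Q, np.eye(2))
reconstructs = np.allclose(Q @ np.diag(eigenvalues) @ Q.T, S)
```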
