
UNIVERSITY OF CALIFORNIA, BERKELEY
Math 54, Sections 212 & 214
7th Quiz (10/23/2013) Solution
Bo Lin
Note: Write your answers on this paper. Please do not divulge the contents of this paper until
5pm today.
Questions 1 ∼ 2: Mark the single correct choice. Each question is worth 2 points.
1. What are the real part and imaginary part of −i(1 + i) respectively?
(a) 1, 1 ; (b) 1, −1 ; (c) −1, 1 ; (d) −1, −1.
Solution. The answer is (b).
−i(1 + i) = −i − i² = −i + 1 = 1 + (−1)i, so the real part is 1 and the imaginary part is −1.
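As a quick sanity check (outside the quiz itself), Python's built-in complex type reproduces the computation, with 1j denoting the imaginary unit i:

```python
# Verify the arithmetic in Question 1: -i(1 + i) = 1 - i.
z = -1j * (1 + 1j)
print(z.real, z.imag)  # 1.0 -1.0
```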
2. Under the standard basis, what kind of linear transformation from R2 to itself corresponds
to a 2 × 2 matrix without real eigenvalues?
(a) reflection through y-axis;
(b) scaling of x-axis by a positive number;
(c) rotation about the origin by 90°;
(d) shear transformation.
Solution. The answer is (c).
(a) corresponds to the matrix [1 0; 0 −1] (rows separated by semicolons), which has real eigenvalues 1, −1.
(b) corresponds to the matrix [a 0; 0 1], where a > 0. Then it has real eigenvalues a, 1.
(c) corresponds to the matrix [0 −1; 1 0], which has complex eigenvalues i, −i and no real eigenvalues.
(d) corresponds to the matrix [1 a; 0 1], where a ≠ 0. Then it has real eigenvalue 1 with multiplicity 2.
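As an informal numerical check (not part of the quiz), NumPy confirms that the rotation matrix is the only one of these without real eigenvalues:

```python
import numpy as np

reflection = np.array([[1, 0], [0, -1]])   # choice (a)
rotation = np.array([[0, -1], [1, 0]])     # choice (c)

print(np.linalg.eigvals(reflection))   # real eigenvalues 1 and -1
print(np.linalg.eigvals(rotation))     # purely imaginary: i and -i
```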
Questions 3 ∼ 9: There is at least one correct choice for each question. Mark all correct
choices. Each question is worth 3 points. Partial credit is 1 point; however, if you mark any
incorrect choice, you get 0 points.
3. Let λ be a real eigenvalue of n × n matrix A. Let d be the dimension of the eigenspace of
λ and m be the multiplicity of λ in char(A), then which of the following must be true?
(a) d > 1 ; (b) d ≥ m ; (c) d ≤ m ; (d) d < n.
Solution. The answer is (c).
We have the proposition that d ≤ m. The matrix [1 1; 0 1] is a counterexample to both (a) and (b)
(λ = 1, d = 1, m = 2), and identity matrices give a counterexample to (d) (λ = 1, d = m = n). Actually the
complete chain of correct relations among d, m, n is
1 ≤ d ≤ m ≤ n.
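The counterexample can be checked numerically. The geometric multiplicity d equals n minus the rank of A − λI (a standard rank-nullity fact, not stated in the quiz):

```python
import numpy as np

# Counterexample matrix from Question 3: eigenvalue 1 has algebraic
# multiplicity m = 2 but eigenspace dimension d = 1.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0
n = A.shape[0]
d = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(d)  # 1
```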
4. Which of the following matrices are diagonalizable over real numbers?
(a) [1 0; 0 0]; (b) [2 1 0; 0 1 0; 0 0 0]; (c) [1 0 0; 0 0 −1; 0 1 0]; (d) [2 1 0; 0 2 0; 0 0 0].
Solution. The answers are (a)(b).
Recall: A matrix is diagonalizable over real numbers if and only if
1. its characteristic polynomial factors into linear factors over real numbers, and
2. for each eigenvalue, the dimension of its eigenspace equals its multiplicity in the characteristic polynomial.
(a) is itself diagonal, so it is diagonalizable. (b) is triangular, so it has three distinct eigenvalues
2, 1, 0, which means all eigenspaces have dimension 1, so it is diagonalizable. As for (c), note
that the characteristic polynomial is (λ − 1)(λ² + 1) = (λ − 1)(λ + i)(λ − i), which does not
factor into linear factors over real numbers. (d) is triangular with eigenvalue 2 of multiplicity 2
(and eigenvalue 0 of multiplicity 1); however, the eigenspace of 2 has dimension only 1, so it is not diagonalizable.
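The two-condition criterion above can be sketched as a NumPy routine. This is an informal sketch, not part of the quiz: `diagonalizable_over_R` is a hypothetical helper name, and the numerical tolerances are arbitrary choices.

```python
import numpy as np

def diagonalizable_over_R(A, tol=1e-8):
    """Check the two conditions: all eigenvalues real, and for each
    eigenvalue the geometric multiplicity equals the algebraic one."""
    n = A.shape[0]
    eigvals = np.linalg.eigvals(A)
    if np.any(np.abs(eigvals.imag) > tol):     # condition 1 fails
        return False
    for lam in np.unique(np.round(eigvals.real, 6)):
        m = int(np.sum(np.abs(eigvals.real - lam) < 1e-6))  # algebraic
        d = n - np.linalg.matrix_rank(A - lam * np.eye(n))  # geometric
        if d != m:                              # condition 2 fails
            return False
    return True

print(diagonalizable_over_R(np.array([[1., 0.], [0., 0.]])))  # True: choice (a)
print(diagonalizable_over_R(np.array([[2., 1., 0.],
                                      [0., 2., 0.],
                                      [0., 0., 0.]])))         # False: choice (d)
```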
5. If two square matrices A and B are similar, then which of the following statements is true?
(a) det(A) = det(B);
(b) char(A) = char(B) ;
(c) they have the same set of eigenvalues;
(d) they are row equivalent.
Solution. The answers are (a)(b)(c).
If A, B are similar, then there exists invertible square matrix P such that
A = P BP −1 .
Then, taking determinants, we have
det(A) = det(P BP −1 ) = det(P )det(B)det(P −1 ) = det(B)det(P )(det(P ))−1 = det(B).
So (a) is true.
We also have
char(A)
= det(A − λI)
= det(P BP −1 − λI)
= det(P BP −1 − P (λI)P −1 )
= det(P (B − λI)P −1 )
= det(P )det(B − λI)det(P −1 )
= det(B − λI)
= char(B).
So (b) is true. Actually (a) could be derived from (b) by taking λ = 0. (c) follows directly from
(b), since eigenvalues are roots of the characteristic polynomials. (d) is unrelated to similarity.
One counterexample is the following pair of matrices:
[0 0; 0 1] and [1 0; 0 0].
They are similar because if we change the basis by swapping the two vectors in the standard
basis, we get one from the other; however, they are not row equivalent, because no matter
what elementary row operations we apply to the former, its first column remains the zero
vector, while the latter's first column is not the zero vector.
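The similarity claim can be verified numerically; here P is the permutation matrix that swaps the two standard basis vectors:

```python
import numpy as np

A = np.array([[0.0, 0.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 0.0]])
P = np.array([[0.0, 1.0], [1.0, 0.0]])  # basis-swap permutation matrix

# Conjugating A by P yields B, so A and B are similar.
print(np.allclose(P @ A @ np.linalg.inv(P), B))        # True
# Similar matrices share the same determinant, property (a).
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # True
```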
6. Which of the following vectors has length 5?
(a) [−5]T ;
(b) [1, 2]T ;
(c) [5 cos θ, −5 sin θ]T (0 ≤ θ ≤ 90°);
(d) [3, 4, 0]T .
Solution. The answers are (a)(c)(d).
Recall: the length of a vector ~v = (v1, · · · , vm) is √(~v · ~v) = √(v1² + · · · + vm²).
Then we can check them individually:
(a) √((−5)²) = √25 = 5.
(b) √(1² + 2²) = √5 ≠ 5.
(c) √((5 cos θ)² + (−5 sin θ)²) = √(25(cos² θ + sin² θ)) = √25 = 5.
(d) √(3² + 4² + 0²) = √(9 + 16 + 0) = √25 = 5.
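The four lengths can be spot-checked with NumPy (θ = 30° is an arbitrary choice within the allowed range):

```python
import numpy as np

theta = np.deg2rad(30)  # arbitrary angle in [0, 90] degrees
vectors = {
    "(a)": np.array([-5.0]),
    "(b)": np.array([1.0, 2.0]),
    "(c)": np.array([5 * np.cos(theta), -5 * np.sin(theta)]),
    "(d)": np.array([3.0, 4.0, 0.0]),
}
for name, v in vectors.items():
    print(name, np.linalg.norm(v))  # 5.0 for (a), (c), (d); sqrt(5) for (b)
```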
7. If u, v, w are three vectors in R3 , then which of the following statements is always true?
(a) u · v = v · u;
(b) (u · v)w = (w · v)u;
(c) ||u + v||² + ||u − v||² = 2(||u||² + ||v||²);
(d) u · v ≥ 0.
Solution. The answers are (a)(c).
(a) is true by the definition of the inner product. As for (b), note that the inner product of two vectors
is a number, not a vector: (u · v)w is just a scalar multiple of w, and (w · v)u is just a scalar
multiple of u. If w and u are linearly independent and the scalars are not both zero, they cannot
be equal; for example, u = v = [1, 0, 0]T and w = [0, 1, 0]T give (u · v)w = w but (w · v)u = ~0. So (b)
is not true. Please remember that the inner product does not have an analogue of associativity.
(c) is true because
||u + v||² + ||u − v||²
= (u + v) · (u + v) + (u − v) · (u − v)
= [u · u + v · v + 2u · v] + [u · u + v · v − 2u · v]
= 2(u · u + v · v)
= 2(||u||² + ||v||²).
(d) is not true. A counterexample could be
[1, 0]T · [−1, 0]T = −1.
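The parallelogram law in (c) and the counterexample for (d) can be spot-checked numerically (the random vectors are an arbitrary choice):

```python
import numpy as np

# Spot-check the parallelogram law (c) on random vectors in R^3.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
lhs = np.linalg.norm(u + v)**2 + np.linalg.norm(u - v)**2
rhs = 2 * (np.linalg.norm(u)**2 + np.linalg.norm(v)**2)
print(np.isclose(lhs, rhs))        # True

# The counterexample for (d): a dot product can be negative.
print(np.dot([1, 0], [-1, 0]))     # -1
```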
8. Which of the following is an orthogonal basis of R2 or R3 ?
(a) {[1, 0]T , [0, 1]T };
(b) {[1, 2]T , [−1, 2]T };
(c) {[1, 1, 1]T , [0, 1, −1]T , [−6, 3, 3]T };
(d) {[1, 2, 0]T , [2, −1, 1]T };
Solution. The answers are (a)(c).
Recall: an orthogonal basis of Rn is a set of vectors that is both an orthogonal set and a basis of
Rn .
(a) is the standard basis of R2 and it’s also an orthogonal set.
(b) is not an orthogonal set, since [1, 2]T · [−1, 2]T = 3 ≠ 0.
(c) is true. Just verify both conditions.
(d) is an orthogonal set in R3 , but it only contains two vectors, so it cannot be a basis of R3 .
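Pairwise orthogonality can be checked by forming the Gram matrix V Vᵀ, whose off-diagonal entries are exactly the pairwise dot products. `is_orthogonal_set` is a hypothetical helper (it tests only orthogonality, not the basis property):

```python
import numpy as np

def is_orthogonal_set(vectors):
    """Return True if every pair of distinct vectors has dot product 0."""
    V = np.array(vectors, dtype=float)
    G = V @ V.T  # Gram matrix; off-diagonal entries are pairwise dot products
    return np.allclose(G - np.diag(np.diag(G)), 0)

print(is_orthogonal_set([[1, 0], [0, 1]]))                     # True:  (a)
print(is_orthogonal_set([[1, 2], [-1, 2]]))                    # False: (b)
print(is_orthogonal_set([[1, 1, 1], [0, 1, -1], [-6, 3, 3]]))  # True:  (c)
```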
9. W is a subspace of Rn , let W ⊥ be the orthogonal complement of W . Then which of the
following statements is true?
(a) W ⊥ is also a subspace of Rn ;
(b) W ∩ W ⊥ = {~0};
(c) dimW + dimW ⊥ = n;
(d) ∀v ∈ W, v 0 ∈ W ⊥ we have v · v 0 = 0;
Solution. The answers are (a)(b)(c)(d).
(a) and (d) follow from the definition of W ⊥ . As for (b), first, since W and W ⊥ are
both subspaces, they both contain ~0. Second, if a vector x ∈ W ∩ W ⊥ , then by (d)
x · x = 0,
which implies that x = ~0. So (b) is true. (c) is easy to believe but a little tricky to prove
at the moment; after learning about orthogonal projections it will become clear.
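A numerical illustration of (c), using the rank-nullity theorem: if W is the column space of a matrix A in Rⁿ, then W ⊥ is the null space of Aᵀ, whose dimension is n − rank(A). The matrix A below is an arbitrary example:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])  # W = Col(A), a subspace of R^3
n = A.shape[0]
dim_W = np.linalg.matrix_rank(A)            # dim W = rank(A)
dim_W_perp = n - np.linalg.matrix_rank(A.T) # dim W-perp = n - rank(A^T)
print(dim_W, dim_W_perp, dim_W + dim_W_perp == n)  # 2 1 True
```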