Math 54. Selected Solutions for Week 9

Section 6.5 (Page 312)

15. Use the factorization $A = QR$ to find the least-squares solution of $A\vec{x} = \vec{b}$.
\[
A = \begin{pmatrix} 2 & 3 \\ 2 & 4 \\ 1 & 1 \end{pmatrix}
  = \begin{pmatrix} 2/3 & -1/3 \\ 2/3 & 2/3 \\ 1/3 & -2/3 \end{pmatrix}
    \begin{pmatrix} 3 & 5 \\ 0 & 1 \end{pmatrix} ,
\qquad
\vec{b} = \begin{pmatrix} 7 \\ 3 \\ 1 \end{pmatrix} .
\]
By Theorem 15 on page 311,
\[
\hat{x} = R^{-1} Q^T \vec{b}
= \begin{pmatrix} 3 & 5 \\ 0 & 1 \end{pmatrix}^{-1}
  \begin{pmatrix} 2/3 & 2/3 & 1/3 \\ -1/3 & 2/3 & -2/3 \end{pmatrix}
  \begin{pmatrix} 7 \\ 3 \\ 1 \end{pmatrix}
= \begin{pmatrix} 1/3 & -5/3 \\ 0 & 1 \end{pmatrix}
  \begin{pmatrix} 7 \\ -1 \end{pmatrix}
= \begin{pmatrix} 4 \\ -1 \end{pmatrix} .
\]
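As a quick numerical sanity check (my addition, not part of the assigned solution), NumPy confirms both the factorization and the least-squares solution:

```python
import numpy as np

# Data from the problem and its given QR factorization.
A = np.array([[2., 3.], [2., 4.], [1., 1.]])
Q = np.array([[2/3, -1/3], [2/3, 2/3], [1/3, -2/3]])
R = np.array([[3., 5.], [0., 1.]])
b = np.array([7., 3., 1.])

assert np.allclose(Q @ R, A)            # QR really factors A
x_hat = np.linalg.solve(R, Q.T @ b)     # x-hat = R^{-1} Q^T b
assert np.allclose(x_hat, [4., -1.])    # matches the answer above
```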
20. Let $A$ be an $m \times n$ matrix such that $A^T A$ is invertible. Show that the columns of $A$ are linearly independent. [Careful: You may not assume that $A$ is invertible; it may not even be square.]
Let $\vec{v}_1, \dots, \vec{v}_n$ be the columns of $A$. Let $c_1, \dots, c_n$ be constants such that
\[
c_1 \vec{v}_1 + \cdots + c_n \vec{v}_n = \vec{0} .
\]
Letting $\vec{u} = (c_1, \dots, c_n)$, this gives $A\vec{u} = \vec{0}$. Multiplying on the left by $A^T$ gives $A^T A \vec{u} = \vec{0}$. Since $A^T A$ is invertible, this implies $\vec{u} = \vec{0}$. Therefore $c_1 = \cdots = c_n = 0$, so $\vec{v}_1, \dots, \vec{v}_n$ must be linearly independent.
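The same conclusion can be spot-checked numerically; the $3 \times 2$ example matrix below is my own choice, not from the text:

```python
import numpy as np

# A non-square matrix whose columns are linearly independent (my example).
A = np.array([[1., 0.], [1., 1.], [0., 1.]])
AtA = A.T @ A

assert abs(np.linalg.det(AtA)) > 1e-12        # A^T A is invertible
# Invertibility of A^T A forces A u = 0  =>  u = 0, i.e. full column rank:
assert np.linalg.matrix_rank(A) == A.shape[1]
```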
24. Find a formula for the least-squares solution of $A\vec{x} = \vec{b}$ when the columns of $A$ are orthonormal.

If the columns of $A$ are orthonormal, then we can take $Q = A$ and $R = I$ as a QR factorization of $A$. Therefore, by Theorem 15 on page 311,
\[
\hat{x} = R^{-1} Q^T \vec{b} = A^T \vec{b} .
\]
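A numerical illustration of this formula (the orthonormal-column matrix below is the $Q$ from Exercise 15; the check itself is my addition):

```python
import numpy as np

# A matrix with orthonormal columns (the Q from Exercise 15 above).
A = np.array([[2/3, -1/3], [2/3, 2/3], [1/3, -2/3]])
b = np.array([7., 3., 1.])

assert np.allclose(A.T @ A, np.eye(2))          # columns are orthonormal
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)   # generic least squares
assert np.allclose(x_hat, A.T @ b)              # agrees with x-hat = A^T b
```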
Section 6.7 (Page 328)
17. Use the inner product axioms and other results of this section to verify that
\[
\langle \vec{u}, \vec{v} \rangle = \tfrac{1}{4} \|\vec{u} + \vec{v}\|^2 - \tfrac{1}{4} \|\vec{u} - \vec{v}\|^2 .
\]
We have
\begin{align*}
\|\vec{u} + \vec{v}\|^2 &= \langle \vec{u} + \vec{v}, \vec{u} + \vec{v} \rangle \\
&= \langle \vec{u}, \vec{u} \rangle + \langle \vec{u}, \vec{v} \rangle + \langle \vec{v}, \vec{u} \rangle + \langle \vec{v}, \vec{v} \rangle \\
&= \|\vec{u}\|^2 + 2\langle \vec{u}, \vec{v} \rangle + \|\vec{v}\|^2 .
\end{align*}
Similarly,
\begin{align*}
\|\vec{u} - \vec{v}\|^2 &= \langle \vec{u} - \vec{v}, \vec{u} - \vec{v} \rangle \\
&= \langle \vec{u}, \vec{u} \rangle - \langle \vec{u}, \vec{v} \rangle - \langle \vec{v}, \vec{u} \rangle + \langle \vec{v}, \vec{v} \rangle \\
&= \|\vec{u}\|^2 - 2\langle \vec{u}, \vec{v} \rangle + \|\vec{v}\|^2 .
\end{align*}
Therefore, four times the right-hand side is
\[
\|\vec{u}\|^2 + 2\langle \vec{u}, \vec{v} \rangle + \|\vec{v}\|^2 - \left( \|\vec{u}\|^2 - 2\langle \vec{u}, \vec{v} \rangle + \|\vec{v}\|^2 \right) = 4\langle \vec{u}, \vec{v} \rangle ,
\]
which is four times the left-hand side.
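The identity can also be spot-checked numerically for the standard dot product (a sketch I added; the vectors are arbitrary random choices):

```python
import numpy as np

# Spot-check of <u, v> = (1/4)||u+v||^2 - (1/4)||u-v||^2 on R^5.
rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = u @ v
rhs = 0.25 * np.linalg.norm(u + v) ** 2 - 0.25 * np.linalg.norm(u - v) ** 2
assert np.isclose(lhs, rhs)
```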
19. Given $a \geq 0$ and $b \geq 0$, let $\vec{u} = \begin{pmatrix} \sqrt{a} \\ \sqrt{b} \end{pmatrix}$ and $\vec{v} = \begin{pmatrix} \sqrt{b} \\ \sqrt{a} \end{pmatrix}$. Use the Cauchy–Schwarz inequality to compare the geometric mean $\sqrt{ab}$ with the arithmetic mean $(a+b)/2$.
We have
\[
|\langle \vec{u}, \vec{v} \rangle| = \sqrt{a} \cdot \sqrt{b} + \sqrt{b} \cdot \sqrt{a} = 2\sqrt{ab} ,
\]
and
\[
\|\vec{u}\| = \left\| \begin{pmatrix} \sqrt{a} \\ \sqrt{b} \end{pmatrix} \right\| = \sqrt{a+b} ,
\qquad
\|\vec{v}\| = \left\| \begin{pmatrix} \sqrt{b} \\ \sqrt{a} \end{pmatrix} \right\| = \sqrt{a+b} .
\]
Therefore, the Cauchy–Schwarz inequality $|\langle \vec{u}, \vec{v} \rangle| \leq \|\vec{u}\| \|\vec{v}\|$ with the given choices of $\vec{u}$ and $\vec{v}$ gives
\[
2\sqrt{ab} \leq a + b .
\]
Dividing both sides by 2 gives that the geometric mean is always less than or equal to the arithmetic mean:
\[
\sqrt{ab} \leq \frac{a+b}{2} .
\]
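A quick numerical check of the resulting inequality on a few sample pairs (the pairs are my own choices):

```python
import math

# sqrt(ab) <= (a+b)/2 for a, b >= 0, with equality exactly when a == b.
for a, b in [(0.0, 5.0), (1.0, 1.0), (2.0, 8.0), (3.7, 0.2)]:
    assert math.sqrt(a * b) <= (a + b) / 2

assert math.isclose(math.sqrt(4.0 * 4.0), (4.0 + 4.0) / 2)  # equality at a == b
```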
Section 7.1 (Page 345)
28. Show that if $A$ is an $n \times n$ symmetric matrix, then $(A\vec{x}) \cdot \vec{y} = \vec{x} \cdot (A\vec{y})$ for all $\vec{x}, \vec{y} \in \mathbb{R}^n$.

We have
\[
(A\vec{x}) \cdot \vec{y} = (A\vec{x})^T \vec{y} = \vec{x}^T A^T \vec{y} = \vec{x} \cdot (A^T \vec{y}) = \vec{x} \cdot (A\vec{y})
\]
(the last step uses the fact that $A$ is symmetric).
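A numerical spot-check of this identity (the matrix and vectors are my own examples):

```python
import numpy as np

# (A x) . y == x . (A y) for a symmetric A.
A = np.array([[2., 1., 0.], [1., 3., -1.], [0., -1., 1.]])
assert np.allclose(A, A.T)                 # A is symmetric

x = np.array([1., -2., 3.])
y = np.array([0., 1., 4.])
assert np.isclose((A @ x) @ y, x @ (A @ y))
```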
30. Suppose $A$ and $B$ are both orthogonally diagonalizable and $AB = BA$. Explain why $AB$ is also orthogonally diagonalizable.

The matrix $AB$ is orthogonally diagonalizable because it is symmetric:
\[
(AB)^T = B^T A^T = BA = AB .
\]
The second step uses the fact that $A$ and $B$ are symmetric, because both are orthogonally diagonalizable (Theorem 2), and the third step uses the assumption that $AB = BA$.
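For a concrete sanity check, any symmetric $A$ commutes with $B = A^2$, which gives a pair satisfying the hypotheses (the example matrix is mine):

```python
import numpy as np

# B = A^2 always commutes with A; with A symmetric, AB comes out symmetric.
A = np.array([[1., 2.], [2., 0.]])
B = A @ A

assert np.allclose(A, A.T) and np.allclose(B, B.T)   # both symmetric
assert np.allclose(A @ B, B @ A)                     # they commute
assert np.allclose(A @ B, (A @ B).T)                 # so AB is symmetric
```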
35. Let $\vec{u}$ be a unit vector in $\mathbb{R}^n$, and let $B = \vec{u}\vec{u}^T$.
(a) Given any $\vec{x}$ in $\mathbb{R}^n$, compute $B\vec{x}$ and show that $B\vec{x}$ is the orthogonal projection of $\vec{x}$ onto $\vec{u}$, as described in Section 6.2.
(b) Show that $B$ is a symmetric matrix and $B^2 = B$.
(c) Show that $\vec{u}$ is an eigenvector of $B$. What is the corresponding eigenvalue?
a. We have
\[
B\vec{x} = \vec{u}\vec{u}^T\vec{x} = \vec{u}(\vec{u}^T\vec{x}) = \vec{u}(\vec{u} \cdot \vec{x}) = (\vec{u} \cdot \vec{x})\vec{u} = \frac{\vec{x} \cdot \vec{u}}{\vec{u} \cdot \vec{u}}\,\vec{u} = \operatorname{proj}_{\vec{u}} \vec{x} .
\]
(Here we used that $\vec{u}$ is a unit vector, so $\vec{u} \cdot \vec{u} = 1$.)
b. $B$ is a symmetric matrix because
\[
B^T = (\vec{u}\vec{u}^T)^T = (\vec{u}^T)^T\vec{u}^T = \vec{u}\vec{u}^T = B .
\]
Also $B^2 = B$ because
\[
B^2 = (\vec{u}\vec{u}^T)(\vec{u}\vec{u}^T) = \vec{u}(\vec{u}^T\vec{u})\vec{u}^T = \vec{u}(\vec{u} \cdot \vec{u})\vec{u}^T = \vec{u}\,[\,1\,]\,\vec{u}^T = \vec{u}\vec{u}^T = B .
\]
c. Finally, we have
\[
B\vec{u} = (\vec{u}\vec{u}^T)\vec{u} = \vec{u}(\vec{u}^T\vec{u}) = \vec{u}(\vec{u} \cdot \vec{u}) = \vec{u}(1) = \vec{u} .
\]
The eigenvalue is 1.
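All three parts can be verified numerically for a concrete unit vector (the vector and test point below are my own choices):

```python
import numpy as np

# Checks for B = u u^T with a unit vector u.
u = np.array([2., 1., 2.]) / 3.0          # unit: (4 + 1 + 4)/9 = 1
B = np.outer(u, u)

x = np.array([1., 0., 2.])
proj = (x @ u) / (u @ u) * u              # orthogonal projection onto u
assert np.allclose(B @ x, proj)           # part (a)
assert np.allclose(B, B.T)                # part (b): B is symmetric
assert np.allclose(B @ B, B)              # part (b): B^2 = B
assert np.allclose(B @ u, u)              # part (c): eigenvalue 1
```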