Review for Lectures 14-16
• Inner Product:
$$\vec{u} \cdot \vec{v} = \vec{u}^{T}\vec{v} = [u_1, u_2, \dots, u_n]\begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = u_1 v_1 + u_2 v_2 + \dots + u_n v_n.$$
• Norm, unit vector & distance:
$$\|\vec{u}\| = \sqrt{\vec{u} \cdot \vec{u}} = \sqrt{u_1^2 + u_2^2 + \dots + u_n^2}.$$
The vector $\dfrac{\vec{u}}{\|\vec{u}\|}$ is called the unit vector in the same direction as $\vec{u}$. Then,
$$\operatorname{dist}(\vec{u}, \vec{v}) = \|\vec{u} - \vec{v}\| = \sqrt{(u_1 - v_1)^2 + (u_2 - v_2)^2 + \dots + (u_n - v_n)^2}.$$
For example,
$$\operatorname{dist}(\vec{u}, \vec{v}) = \|\vec{u} - \vec{v}\| = \left\| \begin{bmatrix} 3 \\ -5 \\ 1 \\ 2 \end{bmatrix} - \begin{bmatrix} 2 \\ 1 \\ 0 \\ -1 \end{bmatrix} \right\| = \left\| \begin{bmatrix} 1 \\ -6 \\ 1 \\ 3 \end{bmatrix} \right\| = \sqrt{47}.$$
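A quick NumPy check of the inner product, norm, and distance formulas above, using the vectors from this example; a minimal sketch:

import numpy as np

u = np.array([3, -5, 1, 2])
v = np.array([2, 1, 0, -1])

print(np.dot(u, v))              # inner product u . v
print(np.linalg.norm(u))         # ||u|| = sqrt(u . u)
print(np.linalg.norm(u - v))     # dist(u, v) = sqrt(47), about 6.856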
• Orthogonality: $\vec{u}$ is said to be orthogonal to $\vec{v}$ if $\vec{u} \cdot \vec{v} = 0$.
Theorem 14.1 (Pythagorean & Triangle Inequality). Let $\vec{u}$ and $\vec{v}$ be two vectors in $\mathbb{R}^n$. Then:
1. $\vec{u}$ and $\vec{v}$ are perpendicular, denoted by $\vec{u} \perp \vec{v}$, iff
$$\|\vec{u} - \vec{v}\|^2 = \|\vec{u}\|^2 + \|\vec{v}\|^2 \quad \text{or} \quad \|\vec{u} + \vec{v}\|^2 = \|\vec{u}\|^2 + \|\vec{v}\|^2.$$
2. The following triangle inequality holds:
$$\|\vec{u} - \vec{v}\| \le \|\vec{u}\| + \|\vec{v}\|.$$
3. The above triangle inequality becomes an equality, i.e.,
$$\|\vec{u} - \vec{v}\| = \|\vec{u}\| + \|\vec{v}\|,$$
iff $\vec{u} = \lambda \vec{v}$ for some scalar $\lambda$.
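As a numerical illustration of part 1, the Pythagorean identity can be checked for a pair of orthogonal vectors; the vectors below are chosen only for illustration:

import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 3.0])
assert np.isclose(np.dot(u, v), 0.0)     # u and v are orthogonal

lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))              # True: ||u + v||^2 = ||u||^2 + ||v||^2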
Definition 14.5. Let $W$ be a subspace of $\mathbb{R}^n$. A vector $\vec{u}$ is said to be orthogonal to $W$, denoted by $\vec{u} \perp W$, if $\vec{u}$ is orthogonal to every vector in $W$, i.e.,
$$\vec{u} \cdot \vec{w} = 0 \text{ for any } \vec{w} \in W \quad (\vec{w} \in W \text{ means } \vec{w} \text{ belongs to } W).$$
We call the subspace
$$W^{\perp} = \{\vec{v} \mid \vec{v} \perp W\}$$
the orthogonal complement of $W$.
• Finding the orthogonal complement: let $W = \operatorname{Span}\{\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p\}$ be a subspace of $\mathbb{R}^n$. Then
$$W^{\perp} = \operatorname{Null}(A), \qquad A = \left([\vec{u}_1\ \vec{u}_2\ \dots\ \vec{u}_p]_{n \times p}\right)^{T} = \begin{bmatrix} \vec{u}_1^{T} \\ \vec{u}_2^{T} \\ \vdots \\ \vec{u}_p^{T} \end{bmatrix}_{p \times n}.$$
This relation may be summarized as
$$\operatorname{Col}(A)^{\perp} = \operatorname{Row}\left(A^{T}\right)^{\perp} = \operatorname{Null}\left(A^{T}\right).$$
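In practice, a basis of $W^{\perp}$ can be computed as the null space of the matrix whose rows are the spanning vectors of $W$; a minimal sketch, assuming SciPy is available and using illustrative spanning vectors:

import numpy as np
from scipy.linalg import null_space

# W = Span{u1, u2} in R^4; stack the spanning vectors as the rows of A (p x n).
u1 = np.array([1.0, 0.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0, 1.0])
A = np.vstack([u1, u2])

W_perp = null_space(A)              # columns form an orthonormal basis of Null(A) = W_perp
print(np.allclose(A @ W_perp, 0))   # True: every basis vector of W_perp is orthogonal to W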
• Orthogonal Set: $\{\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p\}$ is called an orthogonal set if $\vec{u}_i \cdot \vec{u}_j = 0$ for $i \ne j$.
Theorem 15.1. Any orthogonal set of nonzero vectors is linearly independent.
Theorem 15.2. Let $\mathcal{B} = \{\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p\}$ be an orthogonal basis for a subspace $W$. Then, for each $\vec{w} \in W$, its coordinate vector $[\vec{w}]_{\mathcal{B}}$ relative to this orthogonal basis can be expressed as
$$[\vec{w}]_{\mathcal{B}} = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_p \end{bmatrix}, \qquad c_i = \frac{\vec{w} \cdot \vec{u}_i}{\|\vec{u}_i\|^2}, \quad i = 1, 2, \dots, p. \tag{1}$$
In other words,
$$\vec{w} = c_1 \vec{u}_1 + c_2 \vec{u}_2 + \dots + c_p \vec{u}_p = \frac{\vec{w} \cdot \vec{u}_1}{\|\vec{u}_1\|^2}\,\vec{u}_1 + \frac{\vec{w} \cdot \vec{u}_2}{\|\vec{u}_2\|^2}\,\vec{u}_2 + \dots + \frac{\vec{w} \cdot \vec{u}_p}{\|\vec{u}_p\|^2}\,\vec{u}_p. \tag{2}$$
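Formulas (1)-(2) are easy to verify numerically; a minimal NumPy sketch with an illustrative orthogonal (not orthonormal) basis of a plane $W$ in $\mathbb{R}^3$:

import numpy as np

u = [np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 0.0])]   # orthogonal basis of W
w = np.array([3.0, 1.0, 0.0])                                  # a vector lying in W

c = [np.dot(w, ui) / np.dot(ui, ui) for ui in u]   # formula (1): c_i = (w . u_i)/||u_i||^2
print(c)                                           # coordinates c_1 = 2, c_2 = 1
print(np.allclose(sum(ci * ui for ci, ui in zip(c, u)), w))    # True: formula (2) recovers w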
• Orthogonal Projections.
Definition 15.3. Given a vector $\vec{u}$, the orthogonal projection of $\vec{y}$ onto $\vec{u}$, denoted by
$$\hat{y} = \operatorname{Proj}_{\vec{u}}(\vec{y}),$$
is defined as the vector $\hat{y}$ parallel to $\vec{u}$ such that
$$\vec{y} = \hat{y} + \vec{z}, \qquad \vec{z} \perp \vec{u}.$$
In general, for any subspace $W$, the orthogonal projection of $\vec{y}$ onto $W$, denoted by
$$\hat{y} = \operatorname{Proj}_{W}(\vec{y}),$$
is defined as the vector in $W$ such that
$$(\vec{y} - \hat{y}) \perp W.$$
In other words,
$$\vec{y} = \hat{y} + \vec{z}, \qquad \hat{y} \in W, \quad \vec{z} \perp W.$$
This means that any vector $\vec{y}$ can be decomposed into two components: the projection $\hat{y}$ onto $W$ (which lies in $W$) and a component perpendicular to $W$.
Suppose $W$ has an orthogonal basis $\mathcal{B} = \{\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p\}$. Then
$$\hat{y} = \operatorname{Proj}_{W}(\vec{y}) = \operatorname{Proj}_{\vec{u}_1}(\vec{y}) + \operatorname{Proj}_{\vec{u}_2}(\vec{y}) + \dots + \operatorname{Proj}_{\vec{u}_p}(\vec{y}) = \frac{\vec{y} \cdot \vec{u}_1}{\|\vec{u}_1\|^2}\,\vec{u}_1 + \frac{\vec{y} \cdot \vec{u}_2}{\|\vec{u}_2\|^2}\,\vec{u}_2 + \dots + \frac{\vec{y} \cdot \vec{u}_p}{\|\vec{u}_p\|^2}\,\vec{u}_p. \tag{3}$$
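Formula (3) translates directly into code; a minimal sketch (the orthogonal basis and the vector $\vec{y}$ are illustrative):

import numpy as np

def proj_W(y, basis):
    """Orthogonal projection of y onto Span(basis), where basis is an orthogonal set."""
    return sum((np.dot(y, u) / np.dot(u, u)) * u for u in basis)

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([2.0, 3.0, 5.0])

y_hat = proj_W(y, [u1, u2])
print(y_hat)                                          # [2. 3. 0.]
print(np.dot(y - y_hat, u1), np.dot(y - y_hat, u2))   # both 0: (y - y_hat) is orthogonal to W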
Theorem 15.3. Let $U = [\vec{u}_1, \vec{u}_2, \dots, \vec{u}_n]$ be an $n \times n$ matrix with columns $\vec{u}_1, \vec{u}_2, \dots, \vec{u}_n$. Suppose that the columns of $U$ form an orthonormal set. Then
$$U^{-1} = U^{T}, \quad \text{i.e.,} \quad UU^{T} = U^{T}U = I.$$
We call such a matrix $U$ an orthogonal matrix.
Note that the same technique may be used to calculate the inverse of a matrix $A = [\vec{u}_1, \vec{u}_2, \dots, \vec{u}_n]$ whose column vectors $\vec{u}_1, \vec{u}_2, \dots, \vec{u}_n$ form an orthogonal set, but not an orthonormal set. In this case,
$$A^{T}A = \begin{bmatrix} \vec{u}_1^{T}\vec{u}_1 & \vec{u}_1^{T}\vec{u}_2 & \dots & \vec{u}_1^{T}\vec{u}_n \\ \vec{u}_2^{T}\vec{u}_1 & \vec{u}_2^{T}\vec{u}_2 & \dots & \vec{u}_2^{T}\vec{u}_n \\ \vdots & \vdots & \ddots & \vdots \\ \vec{u}_n^{T}\vec{u}_1 & \vec{u}_n^{T}\vec{u}_2 & \dots & \vec{u}_n^{T}\vec{u}_n \end{bmatrix} = \begin{bmatrix} \vec{u}_1 \cdot \vec{u}_1 & 0 & \dots & 0 \\ 0 & \vec{u}_2 \cdot \vec{u}_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \vec{u}_n \cdot \vec{u}_n \end{bmatrix}.$$
So
$$\begin{bmatrix} \vec{u}_1 \cdot \vec{u}_1 & 0 & \dots & 0 \\ 0 & \vec{u}_2 \cdot \vec{u}_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \vec{u}_n \cdot \vec{u}_n \end{bmatrix}^{-1} A^{T}A = I,$$
i.e.,
$$A^{-1} = \begin{bmatrix} \vec{u}_1 \cdot \vec{u}_1 & 0 & \dots & 0 \\ 0 & \vec{u}_2 \cdot \vec{u}_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \vec{u}_n \cdot \vec{u}_n \end{bmatrix}^{-1} A^{T} = \begin{bmatrix} \dfrac{1}{\vec{u}_1 \cdot \vec{u}_1} & 0 & \dots & 0 \\ 0 & \dfrac{1}{\vec{u}_2 \cdot \vec{u}_2} & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \dfrac{1}{\vec{u}_n \cdot \vec{u}_n} \end{bmatrix} \begin{bmatrix} \vec{u}_1^{T} \\ \vec{u}_2^{T} \\ \vdots \\ \vec{u}_n^{T} \end{bmatrix} = \begin{bmatrix} \dfrac{(\vec{u}_1)^{T}}{\vec{u}_1 \cdot \vec{u}_1} \\[2mm] \dfrac{(\vec{u}_2)^{T}}{\vec{u}_2 \cdot \vec{u}_2} \\[2mm] \vdots \\[2mm] \dfrac{(\vec{u}_n)^{T}}{\vec{u}_n \cdot \vec{u}_n} \end{bmatrix}.$$
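This inverse formula, $A^{-1} = D^{-1}A^{T}$ with $D = \operatorname{diag}(\vec{u}_i \cdot \vec{u}_i)$, can be checked numerically; a small sketch with an illustrative matrix whose columns are orthogonal but not orthonormal:

import numpy as np

# Columns [1, 1] and [1, -1] are orthogonal but not unit length.
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])

D = np.diag(np.sum(A * A, axis=0))            # diag(u_i . u_i), equal to A^T A here
A_inv = np.linalg.inv(D) @ A.T                # A^{-1} = D^{-1} A^T
print(np.allclose(A_inv @ A, np.eye(2)))      # True
print(np.allclose(A_inv, np.linalg.inv(A)))   # True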
Example 15.5. We know from previous examples that
$$\vec{u}_1 = \frac{1}{\sqrt{11}}\begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \quad \vec{u}_2 = \frac{1}{\sqrt{6}}\begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \quad \vec{u}_3 = \frac{1}{\sqrt{66}}\begin{bmatrix} -1 \\ -4 \\ 7 \end{bmatrix}$$
form an orthonormal basis, but
$$\vec{v}_1 = \begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \quad \vec{v}_2 = \begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \quad \vec{v}_3 = \begin{bmatrix} -1 \\ -4 \\ 7 \end{bmatrix}$$
form only an orthogonal basis. Set
$$U = [\vec{u}_1, \vec{u}_2, \vec{u}_3] = \begin{bmatrix} \dfrac{3}{\sqrt{11}} & \dfrac{-1}{\sqrt{6}} & \dfrac{-1}{\sqrt{66}} \\[2mm] \dfrac{1}{\sqrt{11}} & \dfrac{2}{\sqrt{6}} & \dfrac{-4}{\sqrt{66}} \\[2mm] \dfrac{1}{\sqrt{11}} & \dfrac{1}{\sqrt{6}} & \dfrac{7}{\sqrt{66}} \end{bmatrix}, \qquad V = [\vec{v}_1, \vec{v}_2, \vec{v}_3] = \begin{bmatrix} 3 & -1 & -1 \\ 1 & 2 & -4 \\ 1 & 1 & 7 \end{bmatrix}.$$
The first matrix $U$ is an orthogonal matrix, and
$$U^{-1} = U^{T} = \begin{bmatrix} \dfrac{3}{\sqrt{11}} & \dfrac{1}{\sqrt{11}} & \dfrac{1}{\sqrt{11}} \\[2mm] \dfrac{-1}{\sqrt{6}} & \dfrac{2}{\sqrt{6}} & \dfrac{1}{\sqrt{6}} \\[2mm] \dfrac{-1}{\sqrt{66}} & \dfrac{-4}{\sqrt{66}} & \dfrac{7}{\sqrt{66}} \end{bmatrix}.$$
The second matrix $V$ is not an orthogonal matrix, since its columns are orthogonal but not orthonormal. However,
$$V^{T}V = \begin{bmatrix} 3 & 1 & 1 \\ -1 & 2 & 1 \\ -1 & -4 & 7 \end{bmatrix}\begin{bmatrix} 3 & -1 & -1 \\ 1 & 2 & -4 \\ 1 & 1 & 7 \end{bmatrix} = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 6 & 0 \\ 0 & 0 & 66 \end{bmatrix}.$$
Therefore,
$$\begin{bmatrix} 11 & 0 & 0 \\ 0 & 6 & 0 \\ 0 & 0 & 66 \end{bmatrix}^{-1} V^{T}V = I,$$
or
$$V^{-1} = \begin{bmatrix} 11 & 0 & 0 \\ 0 & 6 & 0 \\ 0 & 0 & 66 \end{bmatrix}^{-1} V^{T} = \begin{bmatrix} \dfrac{1}{11} & 0 & 0 \\[1mm] 0 & \dfrac{1}{6} & 0 \\[1mm] 0 & 0 & \dfrac{1}{66} \end{bmatrix}\begin{bmatrix} 3 & 1 & 1 \\ -1 & 2 & 1 \\ -1 & -4 & 7 \end{bmatrix} = \begin{bmatrix} \dfrac{3}{11} & \dfrac{1}{11} & \dfrac{1}{11} \\[2mm] -\dfrac{1}{6} & \dfrac{1}{3} & \dfrac{1}{6} \\[2mm] -\dfrac{1}{66} & -\dfrac{2}{33} & \dfrac{7}{66} \end{bmatrix}.$$
• Applications:
1. Theorem 15.4 (Orthonormal Decomposition). Let $\mathcal{B} = \{\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p\}$ be an orthonormal basis of a subspace $W$ in $\mathbb{R}^n$. Then the orthogonal projection of any vector $\vec{y}$ onto $\vec{u}_i$ and the orthogonal projection of $\vec{y}$ onto $W$ have the following expressions, respectively:
$$\operatorname{Proj}_{\vec{u}_i}(\vec{y}) = (\vec{y} \cdot \vec{u}_i)\,\vec{u}_i, \quad i = 1, 2, \dots, p,$$
$$\operatorname{Proj}_{W}(\vec{y}) = \sum_{i=1}^{p} (\vec{y} \cdot \vec{u}_i)\,\vec{u}_i = \sum_{i=1}^{p} \operatorname{Proj}_{\vec{u}_i}(\vec{y}),$$
and
$$\vec{y} = \operatorname{Proj}_{W}(\vec{y}) + \vec{z}, \qquad \vec{z} \perp W.$$
Moreover, if we set $U = [\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p]_{n \times p}$ to be the matrix whose columns are $\vec{u}_1, \vec{u}_2, \dots, \vec{u}_p$, then
$$\operatorname{Proj}_{W}(\vec{y}) = UU^{T}\vec{y}. \tag{4}$$
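Formula (4) packages the projection into a single matrix $UU^{T}$ built from an orthonormal basis; a brief NumPy sketch, where the orthonormal basis is obtained by normalizing the illustrative orthogonal basis used earlier:

import numpy as np

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
U = np.column_stack([u1 / np.linalg.norm(u1), u2 / np.linalg.norm(u2)])  # orthonormal columns

y = np.array([2.0, 3.0, 5.0])
P = U @ U.T                                 # projection matrix onto W = Col(U)
print(P @ y)                                # [2. 3. 0.], agreeing with formula (3)
print(np.allclose(U.T @ U, np.eye(2)))      # True: the columns of U are orthonormal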
2. Theorem 15.5 (Best Approximation). Let $W$ be a subspace. Then, for any vector $\vec{y}$, $\operatorname{Proj}_{W}(\vec{y}) \in W$ is the best approximation to $\vec{y}$ by vectors in $W$. More precisely, $\operatorname{Proj}_{W}(\vec{y})$ is the closest point in $W$ to $\vec{y}$, i.e.,
$$\|\vec{y} - \operatorname{Proj}_{W}(\vec{y})\| \le \|\vec{y} - \vec{w}\| \quad \text{for any } \vec{w} \in W. \tag{5}$$
3. Least-Squares Approximation Solution
Theorem 16.1. Consider an inconsistent linear system
$$A\vec{x} = \vec{b}.$$
Then the linear system
$$A^{T}A\,\hat{y} = A^{T}\vec{b}$$
is always consistent, and any solution $\hat{y}$ of this consistent system is a least-squares approximation solution.
• Two-step procedure (illustrated below):
1. Calculate $B = A^{T}A$ and $\hat{b} = A^{T}\vec{b}$.
2. Solve $B\hat{y} = \hat{b}$.
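A minimal NumPy sketch of the two-step procedure on a small inconsistent system ($A$ and $\vec{b}$ are illustrative), compared against NumPy's built-in least-squares solver:

import numpy as np

# An inconsistent 3 x 2 system A x = b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Step 1: form B = A^T A and b_hat = A^T b.
B = A.T @ A
b_hat = A.T @ b

# Step 2: solve B y = b_hat.
y = np.linalg.solve(B, b_hat)
print(y)                                                       # [ 5. -3.], the least-squares solution
print(np.allclose(y, np.linalg.lstsq(A, b, rcond=None)[0]))    # True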