Chapter 9. Orthogonal Projections in Function Spaces

Orthogonal Projection to Span{g} in PS[a, b]
PS[a, b] is the vector space of real-valued piecewise smooth functions defined on the interval [a, b]. Given such a function g (not the zero function) and another one f, the (orthogonal) projection of f in the direction of g is defined to be the function

    Proj_g f(x) = ( IP(f(x), g(x)) / IP(g(x), g(x)) ) g(x),

where, by definition, IP(u, v) = ∫_a^b u·v dx. The map IP : PS[a, b] × PS[a, b] → R is called the standard inner product on PS[a, b]. This is a natural generalization of the Euclidean dot product on R^n, and has all the same properties. In particular, Proj_g f is the function in Span{g} that is closest to f using the inner product distance defined as follows.
    ‖h - k‖ = √( IP(h(x) - k(x), h(x) - k(x)) ) = √( ∫_a^b (h(x) - k(x))^2 dx )
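In 1-D Maple input these definitions can be entered as follows. This is only a minimal sketch: the names Proj and dist, and the sample interval [0, 1], are chosen here for illustration.

    a, b := 0, 1:                                                        # choose the interval, e.g. PS[0, 1]
    IP := (u, v) -> int(u*v, x = a..b):                                  # standard inner product on PS[a, b]
    Proj := (f, g) -> unapply(IP(f(x), g(x))/IP(g(x), g(x))*g(x), x):    # Proj_g f, returned as a function of x
    dist := (h, k) -> sqrt(IP(h(x) - k(x), h(x) - k(x))):                # inner product distance ||h - k||
    dist(x -> x, x -> x^2);                                              # = sqrt(1/30) on PS[0, 1]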
Example 1. Let f(x) := x^2 and g(x) := x in PS[0, 1], so IP(u, v) := ∫_0^1 u·v dx. Then Proj_g f is the function h defined as follows.

    h := unapply( IP(f(x), g(x))/IP(g(x), g(x)) * g(x), x ):

This yields h(x) = (3/4) x. See the plot.
    plot([f(x), g(x), h(x)], x = 0..1, linestyle = [1, 1, 2])

[Plot of f(x), g(x), and h(x) on 0..1]
The dotted line is the graph of the projection of x^2 onto Span{x}.
The function h is that function in Span{x} that is closest to f in the least squares sense. Here is the squared error (SE).

    SE = IP(f(x) - h(x), f(x) - h(x)) = 1/80
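For reference, Example 1 can be reproduced in 1-D Maple input as follows. This is a sketch only; the names match the ones used in the text.

    IP := (u, v) -> int(u*v, x = 0..1):                    # standard inner product on PS[0, 1]
    f := x -> x^2:  g := x -> x:
    h := unapply(IP(f(x), g(x))/IP(g(x), g(x))*g(x), x):   # projection of f onto Span{g}
    h(x);                                                  # returns 3/4*x
    IP(f(x) - h(x), f(x) - h(x));                          # squared error: 1/80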
Improving the Approximation
The constant function 1 is not orthogonal to g(x) = x in PS[0, 1] (verify). However, using Gram-Schmidt in this context,

    1 - Proj_x 1 = 1 - ( (1/2) / (1/3) ) x = 1 - (3/2) x

is orthogonal to x: IP(x, 1 - 3x/2) = 0. Therefore, the approximation above can be improved by adding to h the projection of x^2 onto Span{1 - 3x/2}. This projection is the function k defined below.
    k := unapply( IP(x^2, 1 - 3*x/2)/IP(1 - 3*x/2, 1 - 3*x/2) * (1 - 3*x/2), x ):

This yields k(x) = -1/6 + (1/4) x, and the improved approximation is h(x) + k(x) = x - 1/6.
    plot([f(x), g(x), h(x) + k(x)], x = 0..1, linestyle = [1, 1, 2])

[Plot of f(x), g(x), and h(x) + k(x) on 0..1: an improved approximation]
The function h + k is that function in Span{1, x} that is closest to f in the least squares sense. Here is the squared error associated with the improved approximation. It is less than half of the error associated with the first approximation.

    SE = IP(f(x) - (h(x) + k(x)), f(x) - (h(x) + k(x))) = 1/180
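Continuing the Maple sketch from Example 1, the correction k and the improved error can be computed like this (again just an illustrative 1-D entry).

    k := unapply(IP(x^2, 1 - 3*x/2)/IP(1 - 3*x/2, 1 - 3*x/2)*(1 - 3*x/2), x):  # projection of x^2 onto Span{1 - 3x/2}
    h(x) + k(x);                                            # returns x - 1/6
    IP(f(x) - h(x) - k(x), f(x) - h(x) - k(x));             # squared error: 1/180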
Gram-Schmidt in PS[a, b]
Given a family of functions { f_n }_{n=1}^{m} that is linearly independent in PS[a, b], the Gram-Schmidt algorithm will generate an orthogonal family spanning the same subspace. For example, starting with the 10 functions { x^n }_{n=0}^{9} on [-1, 1], the for..do loop below generates an orthogonal basis for Span{ x^n }_{n=0}^{9} in PS[-1, 1].
Define the inner product, IP(u, v) := ∫_{-1}^{1} u·v dx, and the projection function, Pr(u, v) := ( IP(u, v)/IP(v, v) ) v. Then test them: w := Pr(x^4, x^2) = (5/7) x^2, and IP(x^4 - w, x^2) = 0.
The for..do loop for Gram-Schmidt, as it applies to 1, x, x^2, ..., x^9, looks like this.
    v[0] := 1:
    for n from 1 to 9 do v[n] := x^n - add(Pr(x^n, v[k]), k = 0..n-1) end do:  unassign('n')
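For reference, a self-contained 1-D Maple version of this construction might look like the following sketch (the IP and Pr definitions are the ones given above).

    IP := (u, v) -> int(u*v, x = -1..1):                   # inner product on PS[-1, 1]
    Pr := (u, v) -> IP(u, v)/IP(v, v)*v:                   # projection of u onto Span{v}
    v[0] := 1:
    for n from 1 to 9 do
        # subtract from x^n its projections onto v[0], ..., v[n-1]; expand just tidies the output
        v[n] := expand(x^n - add(Pr(x^n, v[k]), k = 0..n-1))
    end do:
    unassign('n');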
The first 5 orthogonal functions that Gram-Schmidt generates are shown below.
    [ v[n] $ n = 0..4 ] = [ 1, x, x^2 - 1/3, x^3 - (3/5) x, x^4 - (6/7) x^2 + 3/35 ]
We can verify orthogonality with the matrix of inner products.
    Matrix(5, 5, (i, j) -> IP(v[i-1], v[j-1])) =

        [ 2    0      0       0        0        ]
        [ 0   2/3     0       0        0        ]
        [ 0    0    8/45      0        0        ]
        [ 0    0      0     8/175      0        ]
        [ 0    0      0       0    128/11025    ]
Normalizing by dividing each expression v[n] by its value at x = 1,

    for n from 0 to 9 do w[n] := v[n]/eval(v[n], x = 1) end do:  unassign('n')

yields the first 10 Legendre polynomials. See the first 5 below, followed by their inner product matrix.
These should look familiar to you; see page 32 in Chapter 6.
    [ w[n] $ n = 0..4 ] = [ 1, x, (3/2) x^2 - 1/2, (5/2) x^3 - (3/2) x, (35/8) x^4 - (15/4) x^2 + 3/8 ]        (1)
    Matrix(5, 5, (i, j) -> IP(w[i-1], w[j-1])) =

        [ 2    0     0     0     0   ]
        [ 0   2/3    0     0     0   ]
        [ 0    0    2/5    0     0   ]
        [ 0    0     0    2/7    0   ]
        [ 0    0     0     0    2/9  ]
You have also seen their graphs.

    plot((1), x = -1..1)

[Plot of the first five Legendre polynomials on -1..1]
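As a quick check, the normalized functions can be compared with Maple's built-in Legendre polynomials. This sketch assumes the loops above have been run; orthopoly is Maple's classical orthogonal polynomial package, and P(n, x) is its nth Legendre polynomial.

    with(orthopoly):                                   # provides P(n, x)
    seq(expand(w[n] - P(n, x)), n = 0..9);             # expect a sequence of ten zeros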
Natural Orthogonal Families
In Chapter 6 we saw that the Legendre polynomials, just obtained using the Gram-Schmidt process, are also solutions to Legendre's equation:

    (1 - x^2) y'' - 2x y' + n(n + 1) y = 0.

Because of this, the Legendre polynomials are regarded as a natural orthogonal family in PS[-1, 1].
In Chapter 6 we also saw that the family { J_n(z_k x) }_{k=1}^{∞} of nth order Bessel functions of the first kind is orthogonal with respect to the weight x in PS[0, 1]. Recall that z_k denotes the kth positive zero of J_n. Because these functions are solutions of the nth order Bessel equation:

    x^2 y'' + x y' + (x^2 - n^2) y = 0,

they are also considered to be a natural orthogonal family.
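Both claims can be checked directly in Maple. The sketch below uses the fourth Legendre polynomial listed above and Maple's built-in BesselJ, with the specific orders n = 4 and n = 1 chosen for concreteness.

    p4 := 35/8*x^4 - 15/4*x^2 + 3/8:                                               # w[4] from the list above
    simplify((1 - x^2)*diff(p4, x, x) - 2*x*diff(p4, x) + 4*5*p4);                 # Legendre's equation, n = 4: returns 0
    simplify(x^2*diff(BesselJ(1, x), x, x) + x*diff(BesselJ(1, x), x) + (x^2 - 1)*BesselJ(1, x));   # Bessel's equation, n = 1: returns 0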
Legendre and Bessel equations are similar in that they are both so-called Sturm-Liouville equations (SL). By definition, these are second order linear differential equations of the form

    ( p(x) y' )' + q(x) y = λ w(x) y,                                              (SL)

where λ is a constant and p, q, w are functions of the independent variable x.
1. For Legendre's equation, p(x) = 1 - x^2, q(x) = 0, w(x) = 1, and λ = -n(n + 1).
2. For Bessel's equation, after dividing through by x to obtain x y'' + y' + (x - n^2/x) y = 0, p(x) = x, q(x) = x, w(x) = 1/x, and λ = n^2.
Sturm-Liouville equations, in conjunction with certain conditions at the two boundary points a and b,
generate families of solutions having useful orthogonality properties in PS[a, b].
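To see how item 1 above matches the (SL) template, the left-hand side can be expanded symbolically. Here is a brief Maple sketch; the names p, q, wt, and lam are chosen for illustration.

    p := 1 - x^2:  q := 0:  wt := 1:  lam := -n*(n + 1):   # the data from item 1
    diff(p*diff(y(x), x), x) + q*y(x) - lam*wt*y(x);
    # Maple applies the product rule and returns, up to the order of terms,
    #   (1 - x^2)*diff(y(x), x, x) - 2*x*diff(y(x), x) + n*(n + 1)*y(x),
    # which is the left-hand side of Legendre's equation.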
One More Example
Finally, consider the undamped, unforced mass-spring equation m y'' + k y = 0 or, setting ω = √(k/m),

    y'' + ω^2 y = 0.

This is also Sturm-Liouville with p(x) = 1, q(x) = 0, w(x) = 1, and λ = -ω^2. The general solution is

    y(x) = A cos(ωx) + B sin(ωx).

If we specify solutions on the symmetric interval [-L, L] satisfying periodic boundary conditions

    y(-L) = y(L) and y'(-L) = y'(L),

it readily follows that A and B can be chosen arbitrarily provided that ω = nπ/L for n = 0, 1, 2, 3, ... (see Problem 4 in Lab 9). This yields the family of solutions

    { a_n cos(nπx/L) + b_n sin(nπx/L) }_{n=0}^{∞},

which are orthogonal in PS[-L, L]. This is confirmed by the following inner product computation.

    ∫_{-L}^{L} ( a_n cos(nπx/L) + b_n sin(nπx/L) ) ( a_m cos(mπx/L) + b_m sin(mπx/L) ) dx
        assuming m::integer, n::integer
    = 0                                                                            (2)
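The computation labeled (2) can be entered in 1-D Maple as follows. This is a sketch: the names an, bn, am, bm stand in for a_n, b_n, a_m, b_m, and the result 0 reflects the generic case of distinct indices m and n.

    int((an*cos(n*Pi*x/L) + bn*sin(n*Pi*x/L))*(am*cos(m*Pi*x/L) + bm*sin(m*Pi*x/L)), x = -L..L)
        assuming m::integer, n::integer;
    # returns 0, as in (2)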
Because a_n and b_n can be chosen arbitrarily, this orthogonal family in PS[-L, L] should be expressed in the following form.

    1, cos(πx/L), sin(πx/L), cos(2πx/L), sin(2πx/L), ..., cos(nπx/L), sin(nπx/L), ...
This is, without question, one of the most important orthogonal families in applied mathematics. We will discuss it in more detail in Chapter 10.
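As a preview of how this family is used, here is a sketch that projects a function onto Span{1, cos(πx), sin(πx)} in PS[-1, 1], in the spirit of the Pr function above; the choices L = 1 and f(x) = x^2 are purely for illustration.

    IP := (u, v) -> int(u*v, x = -1..1):                   # inner product on PS[-1, 1]
    Pr := (u, v) -> IP(u, v)/IP(v, v)*v:                   # projection onto Span{v}
    Pr(x^2, 1) + Pr(x^2, cos(Pi*x)) + Pr(x^2, sin(Pi*x));
    # returns 1/3 - 4*cos(Pi*x)/Pi^2; the sine term drops out because x^2 is even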
Chapter 9 Procedures
Calculation: Inner Product in PS[a, b]
Math Entry (Example): The natural inner product in PS[a, b] is defined by the integral IP(u, v) = ∫_a^b u·v dx. The associated distance function is ‖u - v‖ = √( IP(u - v, u - v) ) = √( ∫_a^b (u - v)^2 dx ).
Typical Application: Find the distance between u = x and v = x^2 in PS[0, 1]. Compare it to the distance in PS[-1, 1].
    In PS[0, 1]: √( ∫_0^1 (x - x^2)^2 dx ) = √(1/30) = 0.18257 at 5 digits.
    In PS[-1, 1]: √( ∫_{-1}^{1} (x - x^2)^2 dx ) = (4/15)√15 = 1.0328 at 5 digits.
    This plot shows why the distance in PS[-1, 1] is much larger than the distance in PS[0, 1]: plot([x, x^2], x = -1..1). [Plot of x and x^2 on -1..1]

Calculation: Project to Span{g(x)} in PS[a, b]
Math Entry (Example): The projection of f(x) in the direction of g(x) is the function ( IP(f(x), g(x)) / IP(g(x), g(x)) ) g(x).
Typical Application: To define the projection function, first define the appropriate inner product. In PS[0, 1]: IP(u, v) := ∫_0^1 u·v dx. Then define the projection, Pr(u, v) := ( IP(u, v)/IP(v, v) ) v, and calculate: Pr(x^2, x) = (3/4) x.
    If we require the same projection in PS[-1, 1], just change the definition of the inner product: IP(u, v) := ∫_{-1}^{1} u·v dx. Now Pr(x^2, x) = 0; they are orthogonal!

Calculation: Gram-Schmidt in PS[a, b]
Math Entry (Example): Given u_k, 1 ≤ k ≤ m, linearly independent in PS[a, b], an orthogonal family spanning the same subspace is obtained using the for..do loop shown below. See the example that follows.
Typical Application:
    v[1] := u[1]:
    for n from 2 to m do v[n] := u[n] - add(Pr(u[n], v[k]), k = 1..n-1) end do:  unassign('n')
We will orthogonalize the list u := [1, x, sin(Pi*x), cos(Pi*x)]: in PS[0, 1]. The inner product and projection functions are already defined.

    v[1] := u[1]:
    for n from 2 to 4 do v[n] := u[n] - add(Pr(u[n], v[k]), k = 1..n-1) end do:  unassign('n')

This yields, via [v[k] $ k = 1..4]: evalf(%, 4):

    [ 1., x - 0.5000, sin(3.142 x) - 0.6366, cos(3.142 x) + 2.431 x - 1.216 ]

The "exact" inner product matrix, Matrix(4, 4, (i, j) -> IP(v[i], v[j])):

    [ 1     0           0                    0            ]
    [ 0    1/12         0                    0            ]
    [ 0     0     (π^2 - 8)/(2π^2)           0            ]
    [ 0     0           0             (π^4 - 96)/(2π^4)   ]