Introduction to finite differences

Finite Differences
February 12, 2006
1 Finite Differences
Finite differences are a way of approximating derivatives. The derivative is the
limit of ∆f/∆x as ∆x → 0; a finite difference approximation is, roughly
speaking, ∆f/∆x evaluated at a small but finite value of ∆x.
Several variations are possible. Use h in place of ∆x and let
\[
f_{i-1} = f(x - h), \qquad f_i = f(x), \qquad f_{i+1} = f(x + h).
\]
Then we have the forward finite difference
\[
f' \approx \frac{f_{i+1} - f_i}{h},
\]
the backward finite difference
\[
f' \approx \frac{f_i - f_{i-1}}{h},
\]
and the central finite difference
\[
f' \approx \frac{f_{i+1} - f_{i-1}}{2h}.
\]
1.1 Example
Consider f (x) = x3 . Its derivative at x = 5 is 3 × 52 = 75. Take h = 0.1.
Forward difference:
\[
\frac{(5 + 0.1)^3 - 5^3}{0.1} = 76.51
\]
Backward difference:
\[
\frac{5^3 - (5 - 0.1)^3}{0.1} = 73.51
\]
Central difference:
\[
\frac{(5 + 0.1)^3 - (5 - 0.1)^3}{2 \times 0.1} = 75.01
\]
Central difference is clearly the best estimate.
1.2 FD rate of convergence
The following code reveals the rate of convergence of finite difference estimates.
f = inline('x.^3', 'x');                  % function to differentiate
h = 10.^[ -1 -2 -3 -4 ];                  % step sizes
forward  = (f(5 + h) - f(5))./h;          % forward differences at x = 5
backward = (f(5) - f(5 - h))./h;          % backward differences
central  = (f(5 + h) - f(5 - h))./(2*h);  % central differences
clf;
% plot the absolute error of each estimate (the exact derivative is 75)
loglog(h, abs(forward - 75), 'bo-', 'LineWidth', 2)
hold on;
loglog(h, abs(backward - 75), 'go-', 'LineWidth', 2);
loglog(h, abs(central - 75), 'ro-', 'LineWidth', 2);
grid on;
grid minor;
(Blue is hiding behind green.)
We observe that forward difference and backward difference yield linear convergence, while central difference yields quadratic.
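This matches what a Taylor expansion predicts; a brief sketch (standard material, added here for completeness):
\[
\frac{f(x+h) - f(x)}{h} = f'(x) + \frac{h}{2}\,f''(x) + O(h^2),
\qquad
\frac{f(x+h) - f(x-h)}{2h} = f'(x) + \frac{h^2}{6}\,f'''(x) + O(h^4),
\]
so the forward (and backward) error shrinks like h, while the central error shrinks like h^2.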
1.3 Don't let h be too small
The smaller the h, the better the FD estimate. That's certainly true "early on",
but at some point you run into trouble. For example, consider the same plot as
above for f(x) = sin x at x = π/10 and h = 10.^-(1:10):
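A minimal sketch of the code behind that plot, adapting the script above (the plotting details of the original figure are assumed here):
f = inline('sin(x)', 'x');
x = pi/10;
h = 10.^-(1:10);                          % step sizes from 1e-1 down to 1e-10
forward  = (f(x + h) - f(x))./h;
backward = (f(x) - f(x - h))./h;
central  = (f(x + h) - f(x - h))./(2*h);
exact = cos(x);                           % exact derivative of sin x
clf;
loglog(h, abs(forward - exact), 'bo-', 'LineWidth', 2)
hold on;
loglog(h, abs(backward - exact), 'go-', 'LineWidth', 2);
loglog(h, abs(central - exact), 'ro-', 'LineWidth', 2);
grid on;
grid minor;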
We observe that the central difference estimate starts falling apart around
h = 10^{-5}, while the forward and backward differences fall apart around
h = 10^{-7}.
Why is this happening? The answer is lack of precision: double-precision
arithmetic carries only about 16 significant digits.
Let us consider the function sin x at x = π/10. The values of
sin(π/10) and sin(π/10 + 10^{-10}) agree to 10 digits:
\[
\sin\frac{\pi}{10} = 0.309\,016\,994\,374\,947\,4
\]
\[
\sin\left(\frac{\pi}{10} + 10^{-10}\right) = 0.309\,016\,994\,470\,053\,1
\]
so when we subtract one from the other and divide by h = 10^{-10}, we get
\[
\frac{\sin\left(\frac{\pi}{10} + 10^{-10}\right) - \sin\frac{\pi}{10}}{10^{-10}}
= \frac{0.309\,016\,994\,470\,053\,1 - 0.309\,016\,994\,374\,947\,4}{10^{-10}}
= 0.951\,057
\]
and we lose 10 digits of precision! So the best we can hope for is 6 digits of
accuracy, which is what we observe on the plot above.
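The cancellation is easy to reproduce directly in MATLAB (an illustrative snippet, not part of the original notes):
x = pi/10;
h = 1e-10;
fd = (sin(x + h) - sin(x))/h;    % forward difference with a very small h
exact = cos(x);                  % exact derivative
disp([fd exact])
disp(abs(fd - exact))            % typically only about 6 or 7 digits agree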
1.4 What is the best h?
The best h is a trade-off between the truncation error of the estimate (which shrinks as h shrinks) and the round-off error from the limited precision (which grows as h shrinks).
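A rough back-of-the-envelope balance makes the trade-off concrete (a standard estimate, not worked out in the original notes; constants of order one are ignored). For the forward difference, with machine precision ε ≈ 10^{-16},
\[
\text{error} \;\approx\; \underbrace{\frac{h}{2}\,|f''(x)|}_{\text{truncation}} \;+\; \underbrace{\frac{2\varepsilon\,|f(x)|}{h}}_{\text{round-off}},
\]
which is smallest near h ≈ √ε ≈ 10^{-8}; the analogous balance for the central difference gives h ≈ ε^{1/3} ≈ 10^{-5}, broadly consistent with where the plots above start to deteriorate.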
1.5 So how to get more accurate estimates?
Central differences + Richardson extrapolation!
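A minimal sketch of the idea (illustrative; the formula itself is not derived in these notes): the central difference has error proportional to h^2, so combining the estimates at steps h and h/2 cancels the leading error term.
f = inline('sin(x)', 'x');
x = pi/10;
h = 1e-2;
d1 = (f(x + h)   - f(x - h))/(2*h);      % central difference with step h
d2 = (f(x + h/2) - f(x - h/2))/h;        % central difference with step h/2
richardson = (4*d2 - d1)/3;              % leading O(h^2) error terms cancel
disp(abs([d1 d2 richardson] - cos(x)))   % the extrapolated value is far more accurate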
1.6 FD estimate for the second derivative
Here's a systematic way to derive finite difference estimates for the second
derivative: approximate the original function by a quadratic interpolant, then
compute the exact second derivative of the approximation. Once again, let
\[
f_{i-1} = f(x_0 - h), \qquad f_i = f(x_0), \qquad f_{i+1} = f(x_0 + h),
\]
and let us interpolate the values f_{i-1}, f_i, and f_{i+1} by a quadratic function
\[
y(x) = ax^2 + bx + c.
\]
The coefficients a, b, and c can be determined from the following linear system:
\[
\begin{bmatrix}
(x_0 - h)^2 & x_0 - h & 1 \\
x_0^2 & x_0 & 1 \\
(x_0 + h)^2 & x_0 + h & 1
\end{bmatrix}
\begin{bmatrix} a \\ b \\ c \end{bmatrix}
=
\begin{bmatrix} f_{i-1} \\ f_i \\ f_{i+1} \end{bmatrix}
\]
Now, while a, b, and c certainly depend on x_0 (i.e. a, b, and c are different for
different values of x_0), the eventual FD estimate is only a function of f_{i-1}, f_i,
f_{i+1}, and h. Since the eventual answer does not depend on x_0, we can simplify
the algebra by picking a convenient value of x_0, such as x_0 = 0. The system
simplifies to
\[
\begin{bmatrix}
(-h)^2 & -h & 1 \\
0 & 0 & 1 \\
h^2 & h & 1
\end{bmatrix}
\begin{bmatrix} a \\ b \\ c \end{bmatrix}
=
\begin{bmatrix} f_{i-1} \\ f_i \\ f_{i+1} \end{bmatrix}
\]
and the solution is
\[
\begin{bmatrix} a \\ b \\ c \end{bmatrix}
=
\begin{bmatrix}
(-h)^2 & -h & 1 \\
0 & 0 & 1 \\
h^2 & h & 1
\end{bmatrix}^{-1}
\begin{bmatrix} f_{i-1} \\ f_i \\ f_{i+1} \end{bmatrix}
=
\begin{bmatrix}
\frac{1}{2}\,\frac{f_{i-1} - 2f_i + f_{i+1}}{h^2} \\
\frac{1}{2}\,\frac{f_{i+1} - f_{i-1}}{h} \\
f_i
\end{bmatrix}
\]
so the interpolant is
\[
y(x) = \frac{1}{2}\,\frac{f_{i-1} - 2f_i + f_{i+1}}{h^2}\,x^2 + \frac{1}{2}\,\frac{f_{i+1} - f_{i-1}}{h}\,x + f_i.
\]
Its second derivative is
\[
y''(x) = \frac{f_{i-1} - 2f_i + f_{i+1}}{h^2}
\]
which motivates the following FD estimate for the second derivative of f(x):
\[
f''(x) \approx \frac{f_{i-1} - 2f_i + f_{i+1}}{h^2}
\]
1.7 FD rate of convergence for the second derivative
Here's the code that provides the answer, for f(x) = sin x at x = π/10 (the exact second derivative there is −sin(π/10)).
f = inline('sin(x)', 'x');
x = pi/10;
h = 10.^[ -1 -2 -3 -4 ];
% second-difference estimate of f''(x); the exact value is -sin(x)
fd = (f(x - h) - 2*f(x) + f(x + h))./h.^2;
loglog(h, abs(fd - (-sin(x))), 'bo-', 'LineWidth', 2)
grid on;
grid minor;
We observe quadratic convergence, and loss-of-precision problems start as early as
h = 10^{-4}.
We know that this finite difference estimate is exact for linear and quadratic
functions (since the original function and the interpolant are the same thing).
Interestingly, this finite difference approximation is exact for cubic polynomials:
\[
\frac{(x - h)^3 - 2x^3 + (x + h)^3}{h^2} = 6x
\]
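A quick numerical confirmation of this identity (illustrative snippet, not part of the original notes):
x = 5;
h = 0.1;
fd = ((x - h)^3 - 2*x^3 + (x + h)^3)/h^2;   % second-difference estimate for x^3
disp([fd 6*x])                              % matches the exact value 6x up to rounding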