Lecture on Nonlinear Equations
Eskil Hansen and Gustaf Söderlind
Lund University
Nonlinear equations
Two main types of nonlinear equations:
♥ x = g(x)
♠ f(x) = 0
Examples:
♥ x = e^(−x)
♥ x = cos x
♠ x^2 − 2 = 0
♠ Ax = b
Problems:
1. How many solutions x*? None, one, or many?
2. Computation? Almost no nonlinear equations can be solved
analytically or with finite algorithms.
Numerical solution: Construct a sequence {x_n}_{n=0}^∞ such that
lim_{n→∞} x_n = x*
Eskil Hansen (Lund University), FMN050 Nonlinear Equations
Iteration
Principle: Given x_n, find a rule to compute x_{n+1}.
Example: x = cos x; unique root x* = 0.739 085 133.
Choose a starting value, say x0 = 0.7.
Compute x_{n+1} = cos x_n, n = 0, 1, . . .
x1 = 0.764 842
x2 = 0.721 492
x3 = 0.750 821
[Figure: iterates of x_{n+1} = cos x_n on [0, 1]]
Convergence? How many iterations?
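As a minimal sketch (tolerance, starting value, and iteration cap chosen here for illustration), the iteration can be written as:

```python
import math

def fixed_point(g, x0, tol=1e-9, max_iter=200):
    """Iterate x_{n+1} = g(x_n) until successive iterates differ by less than tol."""
    x = x0
    for n in range(1, max_iter + 1):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")

root, n_iter = fixed_point(math.cos, 0.7)  # root ≈ 0.739 085 133
```

Around 45 iterations are needed here, since the error only shrinks by the factor |g′(x*)| = sin x* ≈ 0.67 per step.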
Iteration. . .
Example: x = e^(−x).
Compute x_{n+1} = e^(−x_n), x0 = 0.55
x1 = 0.576 950
x2 = 0.561 609
x3 = 0.570 291
···
x* = 0.567 143 290 4
Converges, but quite slowly.
Iteration. . .
Same example: x = e^(−x) ⇒ log x = −x.
Try x_{n+1} = −log x_n, x0 = 0.55
x1 = 0.597 837
x2 = 0.514 437
x3 = 0.664 682
x4 = 0.408 447
x5 = 0.895 394
x6 = 0.110 492
x7 = 2.202 816
x8 = −0.789 736
[Figure: the iterates bounce away from the fixed point]
Divergence! (“Log of negative number”)
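A short sketch of the breakdown (fixed-point value hard-coded here for the error computation): the fixed point of g(x) = −log x is repelling, since |g′(x*)| = 1/x* ≈ 1.76 > 1, so the errors grow until an iterate goes negative.

```python
import math

# The fixed point x* ≈ 0.5671 of g(x) = -log(x) repels nearby iterates:
# |g'(x*)| = 1/x* ≈ 1.76 > 1, so the error grows each step.
x_star = 0.5671432904
x = 0.55
errors = []
try:
    for _ in range(20):
        x = -math.log(x)
        errors.append(abs(x - x_star))
except ValueError:
    pass  # an iterate went negative: "log of negative number"
```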
Fixed point iteration
When does x_{n+1} = g(x_n) converge, and how fast?
x_{n+1} = g(x_n) and x* = g(x*)
Terminology: x* is a fixed point of the map g.
By the mean value theorem, for some ξ between x_n and x*,
x_{n+1} − x* = g(x_n) − g(x*) = g′(ξ)(x_n − x*)
so that
|x_{n+1} − x*| = |g′(ξ)| |x_n − x*|
Conclusion: If |g′(ξ)| < 1, the error decreases.
Convergence is fast if |g′(ξ)| ≪ 1.
Fixed point theorem
Theorem. Assume that g is continuously differentiable on the
compact interval I, i.e., g ∈ C¹(I).
• If g : I → I, then there exists an x* ∈ I such that x* = g(x*).
• If in addition |g′(x)| < 1 for all x ∈ I, then x* is unique, and the iteration
x_{n+1} = g(x_n)
converges to x* for all x0 ∈ I.
Fixed point theorem. . .
Note: Both conditions are necessary:
g : I → J ≠ I: no fixed point.
|g′(x)| ≥ 1: many fixed points.
g : I → I and |g′(x)| < 1: unique fixed point!
[Figure: three panels on [0, 1], one for each case]
Error estimation in fixed point iteration
Assume |g′(x)| ≤ m < 1 for all x ∈ I.
x_{n+1} = g(x_n) and x* = g(x*)
x_{n+1} − x* = g(x_n) − g(x*)
            = g(x_n) − g(x_{n+1}) + g(x_{n+1}) − g(x*)
By the mean value theorem,
|x_{n+1} − x*| ≤ m |x_n − x_{n+1}| + m |x_{n+1} − x*|
Error estimate (a computable error bound):
|x_{n+1} − x*| ≤ (m / (1 − m)) |x_n − x_{n+1}|
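This bound gives a practical stopping criterion. A sketch (function, interval, and tolerance chosen here for illustration): for g(x) = e^(−x) starting at x0 = 0.55, the iterates stay in [0.55, 0.58], where |g′(x)| = e^(−x) ≤ e^(−0.55), so m = e^(−0.55) is a valid bound.

```python
import math

# Fixed-point iteration stopped with the computable bound
# |x_{n+1} - x*| <= (m/(1-m)) |x_n - x_{n+1}|, valid when |g'| <= m < 1.
def fixed_point_bounded(g, x0, m, tol=1e-9, max_iter=200):
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if m / (1 - m) * abs(x - x_new) < tol:
            return x_new  # guaranteed within tol of the fixed point
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")

root = fixed_point_bounded(lambda x: math.exp(-x), 0.55, m=math.exp(-0.55))
```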
Newton’s method
Newton’s method solves f(x) = 0 using repeated linearizations.
Linearize at the point (x_n, f(x_n))!
Equation of the tangent line:
y − f(x_n) = f′(x_n)(x − x_n)
Define x = x_{n+1} so that y = 0, i.e.,
−f(x_n) = f′(x_n)(x_{n+1} − x_n)
Solve for x_{n+1} to get Newton’s method:
x_{n+1} = x_n − f(x_n)/f′(x_n)
[Figure: the tangent at (x_n, f(x_n)) crosses the x-axis at (x_{n+1}, 0)]
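A minimal sketch of the method (tolerance and starting value chosen here for illustration), applied to the earlier example recast as f(x) = cos x − x = 0:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x = x - step
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence within max_iter iterations")

# The fixed-point example x = cos x, recast as f(x) = cos(x) - x = 0:
root = newton(lambda x: math.cos(x) - x, lambda x: -math.sin(x) - 1, 0.7)
```

Here a handful of iterations reach full double precision, compared with dozens for the plain fixed-point iteration x_{n+1} = cos x_n.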
Newton’s method: alternative derivation
Expand f(x_{n+1}) in a Taylor series around x_n:
f(x_{n+1}) = f(x_n + (x_{n+1} − x_n))
           ≈ f(x_n) + f′(x_n)(x_{n+1} − x_n) := 0
Newton’s method:
x_{n+1} = x_n − f(x_n)/f′(x_n)
Note: The same formula applies if f is vector valued (systems of equations):
x_{n+1} = x_n − (f′(x_n))^(−1) f(x_n)
where f′(x_n) is the Jacobian matrix of f.
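For a system one solves the linear system f′(x_n) d = −f(x_n) rather than forming the inverse. A sketch for a 2×2 system (the example equations x² + y² = 4, xy = 1 and the starting point are chosen here for illustration), solving the linear step by Cramer's rule:

```python
def newton_system_2x2(F, J, x0, tol=1e-12, max_iter=50):
    """Newton's method for a 2x2 system: solve J(x) d = -F(x), then update x += d."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        (a, b), (c, d) = J(x, y)
        det = a * d - b * c
        # Solve the 2x2 linear system [a b; c d][dx; dy] = [-f1; -f2] by Cramer's rule
        dx = (-f1 * d + f2 * b) / det
        dy = (-a * f2 + c * f1) / det
        x, y = x + dx, y + dy
        if abs(dx) + abs(dy) < tol:
            return x, y
    raise RuntimeError("no convergence within max_iter iterations")

# Example system (chosen for illustration): x^2 + y^2 = 4, x*y = 1
F = lambda x, y: (x * x + y * y - 4, x * y - 1)
J = lambda x, y: ((2 * x, 2 * y), (y, x))  # Jacobian of F
x, y = newton_system_2x2(F, J, (2.0, 0.5))
```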
Newton’s method: convergence
Newton iteration function: g(x) := x − f(x)/f′(x), so that x_{n+1} = g(x_n).
Note: Newton’s method converges fast if f′(x*) ≠ 0,
since g′(x*) = f(x*) f″(x*)/f′(x*)^2 = 0.
Expand g(x) in a Taylor series around x*:
g(x_n) − g(x*) ≈ g′(x*)(x_n − x*) + (g″(x*)/2)(x_n − x*)^2
x_{n+1} − x* ≈ (g″(x*)/2)(x_n − x*)^2
Define the error by ε_n = x_n − x*. Then ε_{n+1} ∼ ε_n^2.
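A quick numerical check of ε_{n+1} ∼ ε_n² (example f(x) = x² − 2 chosen here, with the exact root √2 used to compute the errors):

```python
import math

# Track Newton errors e_n = |x_n - sqrt(2)| for f(x) = x^2 - 2 and check
# that e_{n+1}/e_n^2 settles near the asymptotic error constant
# |f''(x*)/(2 f'(x*))| = 2/(4*sqrt(2)) = 1/(2*sqrt(2)) ≈ 0.354.
root = math.sqrt(2)
x = 1.0
errors = [abs(x - root)]
for _ in range(4):
    x = x - (x * x - 2) / (2 * x)
    errors.append(abs(x - root))
ratios = [errors[n + 1] / errors[n] ** 2 for n in range(3)]
```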
Convergence order
Given {x_n}_{n=0}^∞ with lim x_n = x*. Let ε_n = x_n − x*.
Definition: The convergence order is p, with (asymptotic) error
constant C_p, if
0 < lim_{n→∞} |ε_{n+1}| / |ε_n|^p = C_p < ∞.
Important special cases: |ε_{n+1}| ≈ C_p |ε_n|^p
p = 1: Linear convergence. Ex: fixed point iteration, C_p = |g′(x*)|.
p = 2: Quadratic convergence. Ex: Newton iteration, C_p = |f″(x*)/(2 f′(x*))|.
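The order p can be estimated from three consecutive errors, since |ε_{n+1}| ≈ C_p |ε_n|^p gives p ≈ log(ε_{n+1}/ε_n) / log(ε_n/ε_{n−1}). A sketch (the exact root value is hard-coded here just to compute the errors):

```python
import math

# Estimate the convergence order p from three consecutive errors:
# p ≈ log(e_next/e_mid) / log(e_mid/e_prev).
def estimate_order(e_prev, e_mid, e_next):
    return math.log(e_next / e_mid) / math.log(e_mid / e_prev)

# Errors of the fixed-point iteration x_{n+1} = cos(x_n); expected p ≈ 1
x_star = 0.7390851332151607
x, errors = 0.7, []
for _ in range(8):
    x = math.cos(x)
    errors.append(abs(x - x_star))
p = estimate_order(errors[4], errors[5], errors[6])
```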
When does Newton’s method converge?
Answer: If the starting value x0 is “close enough” to the root.
Example 1: x^2 − a = 0 ⇒ x_{n+1} = (x_n + a/x_n)/2
Convergence for every x0 > 0.
Example 2: arctan x = 0 ⇒ x_{n+1} = x_n − (1 + x_n^2) arctan x_n
Convergence if |x0| < α ≈ 1.391 745 2.
Example 3: x^(1/3) = 0 ⇒ x_{n+1} = −2 x_n
No convergence for any x0 ≠ 0; the error increases
by a factor of −2 every iteration!
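Example 2 is easy to probe numerically. A sketch with two starting values on either side of the critical value (step counts chosen here for illustration):

```python
import math

# Newton's method for arctan(x) = 0 (Example 2 above):
# x_{n+1} = x_n - (1 + x_n^2) * atan(x_n).
# Iterates converge only when |x0| is below the critical value α ≈ 1.3917452.
def newton_arctan(x0, steps):
    x = x0
    for _ in range(steps):
        x = x - (1 + x * x) * math.atan(x)
    return x

inside = newton_arctan(1.3, 10)   # |x0| < α: iterates approach the root 0
outside = newton_arctan(1.5, 8)   # |x0| > α: iterates grow without bound
```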
Conditioning of a root
δy ≈ f′(x) δx  ⇒  Δx ≈ Δy/|f′(x)|
[Figure: two roots of the same function, a flat crossing (left) and a steep crossing (right)]
Left root is ill conditioned, right root is well conditioned.
A root where f 0 (x) is small is not well conditioned.
“Accuracy limit” independent of the method.
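A small illustration of Δx ≈ Δy/|f′(x)| (the polynomials here are chosen for illustration): perturbing f by the same δ moves a simple root by about δ/|f′(x*)|, but a root where f′ vanishes by far more.

```python
import math

delta = 1e-8  # perturbation of the function value

# f(x) = (x - 1)(x - 2): simple root at x* = 1 with f'(1) = -1 (well conditioned).
# Root of f(x) = delta near 1, via the quadratic formula:
root_simple = (3 - math.sqrt(1 + 4 * delta)) / 2
shift_simple = abs(root_simple - 1)   # ≈ delta / |f'(1)| = 1e-8

# g(x) = (x - 1)^2: double root at x* = 1 with g'(1) = 0 (ill conditioned).
# Root of g(x) = delta near 1:
root_double = 1 + math.sqrt(delta)
shift_double = abs(root_double - 1)   # = sqrt(delta) = 1e-4
```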