
Math 127H: Lecture 15
Dynamical Systems, Equilibrium Points, Stability
Let P_n denote the population of some group of animals at time n. It is governed by the
equation
P_{n+1} = f(P_n).
We call this a dynamical system.
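As a concrete illustration (not part of the lecture notes), one can iterate such a map in a few lines of Python; the helper name iterate and the sample map P → 0.5P are assumptions made purely for the example.

def iterate(f, P0, steps):
    """Return the orbit P0, f(P0), f(f(P0)), ... of the map P_{n+1} = f(P_n)."""
    orbit = [P0]
    for _ in range(steps):
        orbit.append(f(orbit[-1]))
    return orbit

# Toy map (illustrative only): halve the population at every step.
print(iterate(lambda P: 0.5 * P, 8.0, 5))   # [8.0, 4.0, 2.0, 1.0, 0.5, 0.25]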
Definition 1. We say a population P* is an equilibrium population if f(P*) = P*.
Definition 2. We say an equilibrium population P* is stable provided that for ANY initial population P_0 sufficiently near P*, we have
P_n → P* as n → ∞.
If P* is not stable, then we say it is unstable.
We study the following family of dynamical systems. We assume that the population
grows according to
(P_{n+1} − P_n)/h = rate · P_n.
If the rate is constant, then P_{n+1} = (1 + h · rate)P_n, and we obtain the model
P_n = P_0 R^n
for the constant R = 1 + h · rate.
We are interested in the case where the rate is not constant. We assume that the rate
decreases as the population increases. We take the simplest kind of function with that
property. Somewhat arbitrarily we set
rate = 10 − P_n.
This gives
P_{n+1} = P_n + h(10 − P_n)P_n.
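A short simulation of this model; the step size h = 0.1 and the starting population P_0 = 2 are illustrative choices, not values fixed by the lecture.

def f(P, h=0.1):
    """One step of the model: P_{n+1} = P_n + h(10 - P_n)P_n."""
    return P + h * (10 - P) * P

P = 2.0                      # assumed starting population
for n in range(25):
    P = f(P)
print(round(P, 4))           # prints 10.0: the iterates approach the value 10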
We have discovered the following.
Theorem 1. Consider the dynamical system
P_n → f(P_n) = P_{n+1}.
Let P* be an equilibrium point for this system. It is stable if
|f'(P*)| < 1.
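A minimal sketch of the criterion in code; the helper is_stable, the finite-difference step eps, and the toy map f(P) = 0.5P + 3 are all assumptions for illustration.

def is_stable(f, P_star, eps=1e-6):
    """Test the criterion |f'(P*)| < 1 using a central-difference derivative estimate."""
    deriv = (f(P_star + eps) - f(P_star - eps)) / (2 * eps)
    return abs(deriv) < 1

# Toy map (illustrative): f(P) = 0.5*P + 3 has equilibrium P* = 6 with f'(6) = 0.5.
print(is_stable(lambda P: 0.5 * P + 3, 6.0))   # True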
We apply this to our system above. First we find the equilibrium points. They are the
solutions to the equation
P* = f(P*) = P* + h(10 − P*)P*, so
h(10 − P*)P* = 0, so
P* = 0 or P* = 10.
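As a numerical cross-check (an illustration, not part of the lecture), one can scan a grid of populations for fixed points of f; the choice h = 0.1, the grid spacing, and the tolerance are assumptions.

def f(P, h=0.1):
    # the model from the lecture: f(P) = P + h(10 - P)P
    return P + h * (10 - P) * P

# Scan P = 0.0, 0.1, ..., 15.0 and keep the values with f(P) = P (up to a tolerance).
candidates = [k / 10 for k in range(151)]
equilibria = [P for P in candidates if abs(f(P) - P) < 1e-9]
print(equilibria)            # [0.0, 10.0]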
We limit our discussion to P* = 10. Since f(P) = (1 + 10h)P − hP^2, we have f'(P) = (1 + 10h) − 2hP. Evaluating at P = 10 gives
f'(10) = (1 + 10h) − 20h = 1 − 10h.
We find that |1 − 10h| < 1 (so the system is stable at P* = 10) exactly when 0 < 10h < 2, that is, provided
0 < h < 0.2.
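To see this threshold numerically, one can compare a step size inside the stable range with one outside it; the values h = 0.1 and h = 0.25 and the starting population are illustrative assumptions.

def step(P, h):
    # one step of P_{n+1} = P_n + h(10 - P_n)P_n
    return P + h * (10 - P) * P

for h in (0.1, 0.25):
    P = 9.0                                  # start near the equilibrium P* = 10
    orbit = []
    for n in range(200):
        P = step(P, h)
        orbit.append(P)
    print(h, [round(v, 3) for v in orbit[-4:]])
# h = 0.10 (|1 - 10h| = 0 < 1): the last iterates are all 10.0
# h = 0.25 (|1 - 10h| = 1.5 > 1): the iterates keep cycling and never settle at 10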