LINKÖPING UNIVERSITY
Department of Mathematics
Mathematical Statistics
John Karlsson
TAMS29
Stochastic Processes with
Applications in Finance
3. Markov chains in discrete time
Definition 3.1. For a Markov chain with n states, the matrix of transition probabilities P is the matrix

        ⎛ p(1, 1)  p(1, 2)  · · ·  p(1, n) ⎞
    P = ⎜ p(2, 1)  p(2, 2)  · · ·  p(2, n) ⎟
        ⎜    ⋮        ⋮       ⋱       ⋮    ⎟
        ⎝ p(n, 1)  p(n, 2)  · · ·  p(n, n) ⎠
where p(i, j) denotes the probability to move from state i to state j. Alternative
notation: pᵢⱼ.
NOTE: The two-step transition probabilities p⁽²⁾(i, j), that is, the probability to go
from state i to state j in exactly two steps, can be obtained from the matrix P².
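As a small sketch of this note in Python (the 2-state matrix P below is a made-up example, not from the notes): squaring P gives the two-step transition probabilities.

```python
# Hypothetical 2-state chain; the matrix P is a made-up example.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)
# Entry (0, 1) of P^2 is the two-step probability p^(2)(0, 1):
# p(0,0)p(0,1) + p(0,1)p(1,1) = 0.9*0.1 + 0.1*0.5 ≈ 0.14
print(P2[0][1])
```

Note that each row of P² again sums to 1, as it must for a transition matrix.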
Definition 3.2. A distribution π is said to be a stationary distribution for the
Markov chain (Xn , n ≥ 0) if
π = πP.
NOTE: A stationary distribution does not necessarily exist, nor is it necessarily
unique. However, if it exists and is unique, one can interpret the element πᵢ as the
long-run average proportion of time spent in state i.
Example 3.3. Let P be the 2 × 2 unit matrix I2 . Then every distribution is
stationary.
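As a sketch of how Definition 3.2 can be checked numerically (the 2-state chain below is a made-up example; for P = [[1−a, a], [b, 1−b]] the stationary distribution is (b/(a+b), a/(a+b)), a standard closed form):

```python
# Hypothetical 2-state chain with P = [[1-a, a], [b, 1-b]].
a, b = 0.1, 0.5
P = [[1 - a, a],
     [b, 1 - b]]

# Closed-form stationary distribution for a two-state chain:
pi = [b / (a + b), a / (a + b)]          # here (5/6, 1/6)

# Verify the stationary equation pi = pi * P component-wise.
pi_P = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print(pi_P)   # same vector as pi, up to rounding
```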
Definition 3.4. A distribution π is said to be a limiting distribution for the Markov
chain (Xn, n ≥ 0) if for every initial distribution π₀ of the chain we have
    lim π₀Pⁿ = π.
    n→∞
NOTE: A limiting distribution is a stationary distribution. A limiting distribution does
not necessarily exist, but if it does then it is unique and is the only stationary
distribution.
Example 3.5. Let
        ⎛ 1/2  1/2 ⎞
    P = ⎝ 1/2  1/2 ⎠
then the limiting distribution is π = (1/2, 1/2).
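A quick numerical check of Example 3.5 (pure-Python sketch): since both rows of P equal (1/2, 1/2), any initial distribution is mapped to (1/2, 1/2) in a single step, so π₀Pⁿ = (1/2, 1/2) for all n ≥ 1.

```python
# P from Example 3.5: both rows equal (1/2, 1/2).
P = [[0.5, 0.5],
     [0.5, 0.5]]

def step(dist, P):
    """One step of the chain: the row vector dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Any initial distribution lands on (1/2, 1/2) immediately.
for pi0 in ([1.0, 0.0], [0.0, 1.0], [0.3, 0.7]):
    print(step(pi0, P))
```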
Definition 3.6. A state j is accessible from state i if p⁽ⁿ⁾(i, j) > 0 for some n ≥ 0.
States i and j communicate if j is accessible from i and i is accessible from j. States
that communicate with each other are said to belong to the same equivalence class.
A Markov chain is said to be irreducible if all states communicate with each other.
Definition 3.7. A state i is periodic with period d if d is the largest integer such
that p⁽ⁿ⁾(i, i) = 0 for all n which are not multiples of d; equivalently, d = gcd{n ≥ 1 :
p⁽ⁿ⁾(i, i) > 0}. If d = 1 then the state is said to be aperiodic. If a chain is
irreducible then all states have the same period.
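The period can be computed directly as gcd{n ≥ 1 : p⁽ⁿ⁾(i, i) > 0}. A sketch (the deterministic 2-state flip chain is a made-up example, and the cutoff max_n is a practical assumption for the sketch, not part of the definition):

```python
from math import gcd
from functools import reduce

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_n=20):
    """gcd of {n <= max_n : p^(n)(i, i) > 0}; max_n is a cutoff for this sketch."""
    Pn, return_times = P, []
    for n in range(1, max_n + 1):
        if Pn[i][i] > 0:
            return_times.append(n)
        Pn = matmul(Pn, P)
    return reduce(gcd, return_times)

flip = [[0.0, 1.0],   # deterministic flip: 0 -> 1 -> 0 -> ...
        [1.0, 0.0]]
print(period(flip, 0))   # 2: the chain returns to state 0 only at even times
```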
Definition 3.8. Let fᵢ := P(Xn = i for some n ≥ 1 | X0 = i), the probability that the
chain ever returns to i. A state i is said to be recurrent if fᵢ = 1 and transient if
fᵢ < 1.
Proposition 3.9. A state i is recurrent if and only if
     ∞
     Σ  p⁽ⁿ⁾(i, i) = ∞.
    n=1
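A numerical illustration of Proposition 3.9 (made-up 2-state chain: state 1 is absorbing, hence recurrent, while state 0 is transient with p⁽ⁿ⁾(0, 0) = (1/2)ⁿ): the partial sums for the transient state stay bounded, while those for the recurrent state grow without bound.

```python
# Hypothetical chain: from state 0 stay with prob. 1/2, else jump to the
# absorbing state 1. So p^(n)(0,0) = (1/2)^n (summable) and p^(n)(1,1) = 1.
P = [[0.5, 0.5],
     [0.0, 1.0]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

partial = [0.0, 0.0]    # partial sums of p^(n)(i, i) for i = 0, 1
Pn = P
for n in range(1, 51):
    partial[0] += Pn[0][0]
    partial[1] += Pn[1][1]
    Pn = matmul(Pn, P)

print(partial)   # transient state: sum -> 1 < infinity; recurrent state: sum = 50 and growing
```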
Definition 3.10. Let i be a recurrent state and let Tᵢ := min{n ≥ 1 : Xn = i} be the
first return time to i. The state i is said to be positive recurrent if E[Tᵢ | X0 = i] < ∞
and null recurrent if E[Tᵢ | X0 = i] = ∞.
Remark 3.11. In a given class all states are either positive recurrent, null recurrent
or transient. In a finite state Markov chain all recurrent states are positive recurrent.
In the simple symmetric random walk on ℤ all states are null recurrent.
Theorem 3.12. Let (Xn , n ≥ 0) be an irreducible and aperiodic Markov chain. If
the Markov chain has a stationary distribution π then π is a limiting distribution.
Theorem 3.13. Let (Xn , n ≥ 0) be an irreducible and positive recurrent Markov
chain. Then the Markov chain has a unique stationary distribution π.
Remark 3.14. An irreducible finite state Markov chain is always positive recurrent
so it always has a unique stationary distribution.
Corollary 3.15. If a Markov chain is irreducible, aperiodic and positive recurrent
then it has a unique stationary distribution which is also a limiting distribution. A
Markov chain with these 3 properties is called ergodic.
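A simulation sketch of ergodicity (the 2-state chain below is a made-up example; its stationary distribution (5/6, 1/6) follows from the two-state closed form): the empirical fraction of time spent in each state approaches π, matching the interpretation of πᵢ as a long-run proportion.

```python
import random

# Hypothetical ergodic 2-state chain; stationary distribution is (5/6, 1/6).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [5 / 6, 1 / 6]

random.seed(0)                      # fixed seed for reproducibility
state, N = 0, 200_000
counts = [0, 0]
for _ in range(N):
    counts[state] += 1
    # Draw the next state according to the current row of P.
    state = random.choices([0, 1], weights=P[state])[0]

freq = [c / N for c in counts]
print(freq)   # close to pi, with Monte Carlo error on the order of 1/sqrt(N)
```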