Discrete Time Markov Chains, Limiting
Distribution and Classification
Bo Friis Nielsen
DTU Informatics
02407 Stochastic Processes 3, September 13 2016
Discrete time Markov chains
Today:
- Discrete time Markov chains - invariant probability distribution
- Classification of states
- Classification of chains

Next week
- Poisson process

Two weeks from now
- Birth- and Death Processes
Regular Transition Probability Matrices
P = (Pij),  0 ≤ i, j ≤ N
Regular: P^k > 0 (all entries positive) for some k
In that case lim_{n→∞} Pij^(n) = πj
Theorem 4.1 (Page 168): Let P be a regular transition probability matrix on the states 0, 1, ..., N. Then the limiting distribution π = (π0, π1, ..., πN) is the unique nonnegative solution of the equations

πj = Σ_{k=0}^{N} πk Pkj,   i.e. π = πP

Σ_{k=0}^{N} πk = 1,   i.e. π1 = 1 (with 1 the column vector of ones)
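In practice this is a small linear system. A minimal numerical sketch (not part of the original slides; it assumes numpy, and the helper name stationary_distribution is ours): replace one balance equation by the normalising condition and solve.

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi = pi P together with sum(pi) = 1 for a regular matrix P (a sketch)."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    A = (P - np.eye(n)).T        # rows are the balance equations pi_j = sum_k pi_k P_kj
    A[-1, :] = 1.0               # overwrite one equation with the normalising condition
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)
```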
Interpretation of πj's
- Limiting probabilities: lim_{n→∞} Pij^(n) = πj
- Long term averages: lim_{m→∞} (1/m) Σ_{n=1}^{m} Pij^(n) = πj
- Stationary distribution: π = πP
A Social Mobility Example
                   Son's Class
Father's Class   Lower   Middle   Upper
Lower             0.40    0.50    0.10
Middle            0.05    0.70    0.25
Upper             0.05    0.50    0.45

P^8 =
0.0772  0.6250  0.2978
0.0769  0.6250  0.2981
0.0769  0.6250  0.2981
π0 = 0.40π0 + 0.05π1 + 0.05π2
π1 = 0.50π0 + 0.70π1 + 0.50π2
π2 = 0.10π0 + 0.25π1 + 0.45π2
1 = π0 + π1 + π2
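As a numerical check (a sketch assuming numpy, not part of the original slides), the four equations above can be solved directly, and raising P to the 8th power reproduces the matrix shown:

```python
import numpy as np

P = np.array([[0.40, 0.50, 0.10],      # rows: father's class, columns: son's class
              [0.05, 0.70, 0.25],
              [0.05, 0.50, 0.45]])

print(np.linalg.matrix_power(P, 8))    # rows approach the limiting distribution

# The three balance equations plus the normalising condition (4 equations, 3 unknowns).
A = np.vstack([(P - np.eye(3)).T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                              # approx. (0.0769, 0.6250, 0.2981)
```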
Classification of Markov chain states
- States which cannot be left, once entered - absorbing states
- States where the return some time in the future is certain - recurrent or persistent states
  The mean time to return can be
  - finite - positive recurrence/non-null recurrent
  - infinite - null recurrent
- States where the return some time in the future is uncertain - transient states
- States which can only be visited at certain time epochs - periodic states
Classification of States
- j is accessible from i if Pij^(n) > 0 for some n
- If j is accessible from i and i is accessible from j we say that the two states communicate
- Communicating states constitute equivalence classes (an equivalence relation)
  - If i communicates with j and j communicates with k, then i and k communicate
First passage and first return times
We can formalise the discussion of state classification by use of a certain class of probability distributions - first passage time distributions. Define the first passage probability

fij^(n) = P{X1 ≠ j, X2 ≠ j, ..., Xn−1 ≠ j, Xn = j | X0 = i}

This is the probability of reaching j for the first time at time n having started in i.
What is the probability of ever reaching j?

fij = Σ_{n=1}^{∞} fij^(n) ≤ 1

The probabilities fij^(n) constitute a probability distribution. In contrast, we cannot say anything in general about Σ_{n=1}^{∞} Pij^(n) (the n-step transition probabilities).
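A computational sketch of fij^(n) (an addition of this edit, assuming numpy; the helper name first_passage_probs is ours): propagate the probability mass of paths that avoid j and, at each step, collect the mass that enters j for the first time.

```python
import numpy as np

def first_passage_probs(P, i, j, N):
    """Return [f_ij^(1), ..., f_ij^(N)] for a finite transition matrix P (a sketch)."""
    P = np.asarray(P, dtype=float)
    Q = P.copy(); Q[:, j] = 0.0                 # transitions that avoid state j
    taboo = np.zeros(len(P)); taboo[i] = 1.0    # distribution at time 0 (start in i)
    f = []
    for _ in range(N):
        f.append(taboo @ P[:, j])               # enter j for the first time at this step
        taboo = taboo @ Q                       # keep avoiding j
    return np.array(f)

# Example with the social mobility chain from earlier:
P = np.array([[0.40, 0.50, 0.10],
              [0.05, 0.70, 0.25],
              [0.05, 0.50, 0.45]])
f = first_passage_probs(P, i=0, j=2, N=100)
print(f[:5], f.sum())                           # f_02^(n), n = 1..5; the sum is approx. 1
```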
State classification by fii^(n)
- A state is recurrent (persistent) if fii = Σ_{n=1}^{∞} fii^(n) = 1
- A state is positive or non-null recurrent if E(Ti) < ∞, where E(Ti) = Σ_{n=1}^{∞} n fii^(n) = µi
- A state is null recurrent if E(Ti) = µi = ∞
- A state is transient if fii < 1. In this case we define µi = ∞ for later convenience.
- A periodic state (with period d > 1) has pii^(n) > 0 only for n that are multiples of d
- A state is ergodic if it is positive recurrent and aperiodic.
Classification of Markov chains
- We can identify subclasses of states with the same properties
- All states which can mutually reach each other will be of the same type
- Once again the formal analysis is a little bit heavy, but try to stick to the fundamentals, definitions (concepts) and results
Properties of sets of intercommunicating states
- (a) i and j have the same period
- (b) i is transient if and only if j is transient
- (c) i is null persistent (null recurrent) if and only if j is null persistent
A set C of states is called
- (a) Closed if pij = 0 for all i ∈ C, j ∉ C
- (b) Irreducible if i ↔ j for all i, j ∈ C.

Theorem (Decomposition Theorem)
The state space S can be partitioned uniquely as

S = T ∪ C1 ∪ C2 ∪ ...

where T is the set of transient states, and the Ci are irreducible closed sets of persistent states.

Lemma
If S is finite, then at least one state is persistent (recurrent) and all persistent states are non-null (positive recurrent)
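For a finite chain this decomposition can be computed mechanically. Below is a sketch (not from the slides; it assumes numpy and the helper name decompose is ours) that finds the communicating classes by transitive closure and marks each class as closed (hence recurrent, for a finite chain) or not (hence transient).

```python
import numpy as np

def decompose(P, tol=1e-12):
    """Split the states of a finite chain into transient states and
    closed irreducible (recurrent) classes C1, C2, ... (a sketch)."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    R = (P > tol) | np.eye(n, dtype=bool)        # reachability in 0 or 1 steps
    for k in range(n):                           # Warshall transitive closure
        R |= np.outer(R[:, k], R[k, :])
    transient, closed_classes, seen = [], [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = [j for j in range(n) if R[i, j] and R[j, i]]   # communicating class of i
        seen.update(cls)
        if all(R[j, i] for j in range(n) if R[i, j]):        # nothing escapes the class
            closed_classes.append(cls)
        else:
            transient.extend(cls)
    return transient, closed_classes

# Hypothetical example: states 0 and 1 form a closed class, state 2 is transient.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.8, 0.0],
              [0.3, 0.3, 0.4]])
print(decompose(P))    # ([2], [[0, 1]])
```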
Basic Limit Theorem
Theorem 4.3 The basic limit theorem of Markov chains
(a) Consider a recurrent irreducible aperiodic Markov chain. Let Pii^(n) be the probability of entering state i at the nth transition, n = 1, 2, ..., given that X0 = i. By our earlier convention Pii^(0) = 1. Let fii^(n) be the probability of first returning to state i at the nth transition, n = 1, 2, ..., where fii^(0) = 0. Then

lim_{n→∞} Pii^(n) = 1 / Σ_{n=0}^{∞} n fii^(n) = 1/mi

(b) Under the same conditions as in (a), lim_{n→∞} Pji^(n) = lim_{n→∞} Pii^(n) for all j.
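A numerical illustration of part (a) (a sketch assuming numpy; the example is the social mobility chain from earlier, which is recurrent, irreducible and aperiodic): compute mi = Σ n fii^(n) with the same taboo recursion used for first passage probabilities and compare 1/mi with Pii^(n) for large n.

```python
import numpy as np

P = np.array([[0.40, 0.50, 0.10],     # social mobility chain from earlier
              [0.05, 0.70, 0.25],
              [0.05, 0.50, 0.45]])
i, N = 0, 500

# First-return probabilities f_ii^(n) via the taboo recursion, accumulated into m_i.
Q = P.copy(); Q[:, i] = 0.0
taboo = np.zeros(3); taboo[i] = 1.0
m_i = 0.0
for n in range(1, N + 1):
    m_i += n * (taboo @ P[:, i])      # n * f_ii^(n)
    taboo = taboo @ Q

print(1.0 / m_i)                                # 1 / m_i
print(np.linalg.matrix_power(P, 200)[i, i])     # lim_n P_ii^(n), approximately equal
```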
An example chain (random walk with reflecting barriers)

P =
0.6 0.4 0.0 0.0 0.0 0.0 0.0 0.0
0.3 0.3 0.4 0.0 0.0 0.0 0.0 0.0
0.0 0.3 0.3 0.4 0.0 0.0 0.0 0.0
0.0 0.0 0.3 0.3 0.4 0.0 0.0 0.0
0.0 0.0 0.0 0.3 0.3 0.4 0.0 0.0
0.0 0.0 0.0 0.0 0.3 0.3 0.4 0.0
0.0 0.0 0.0 0.0 0.0 0.3 0.3 0.4
0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.7

With initial probability distribution p^(0) = (1, 0, 0, 0, 0, 0, 0, 0) or X0 = 1.
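A sketch (assuming numpy) of how this matrix can be built programmatically; the same construction is repeated in the later snippets so that each remains self-contained.

```python
import numpy as np

# Random walk on states 1..8 with reflecting barriers:
# up with prob. 0.4, down with prob. 0.3, stay with prob. 0.3.
n = 8
P = 0.3 * np.eye(n) + 0.4 * np.eye(n, k=1) + 0.3 * np.eye(n, k=-1)
P[0, 0] += 0.3        # state 1 cannot move down: 0.3 + 0.3 = 0.6
P[-1, -1] += 0.4      # state 8 cannot move up:   0.3 + 0.4 = 0.7
assert np.allclose(P.sum(axis=1), 1.0)   # every row is a probability distribution
print(P)
```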
Properties of that chain
- We have a finite number of states
- From state 1 we can reach state j with a probability f1j ≥ 0.4^(j−1), j > 1.
- From state j we can reach state 1 with a probability fj1 ≥ 0.3^(j−1), j > 1.
- Thus all states communicate and the chain is irreducible. Generally we won't bother with bounds for the fij's.
- Since the chain is finite all states are positive recurrent
- A look at the behaviour of the chain
A number of different sample paths Xn's

[Figure: several independently simulated sample paths of Xn for n = 0, ..., 70, with the states 1-8 on the vertical axis]
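Sample paths like those in the figure can be simulated directly from P. A sketch, assuming numpy and matplotlib (the helper name sample_path is ours):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Reflecting random walk on states 1..8 (same chain as above, 0-based indices in code).
n = 8
P = 0.3 * np.eye(n) + 0.4 * np.eye(n, k=1) + 0.3 * np.eye(n, k=-1)
P[0, 0] += 0.3
P[-1, -1] += 0.4

def sample_path(P, x0, steps, rng):
    """Simulate X_0, ..., X_steps from the transition matrix P."""
    path = [x0]
    for _ in range(steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return np.array(path)

for _ in range(4):                           # a few independent paths
    plt.plot(sample_path(P, x0=0, steps=70, rng=rng) + 1, drawstyle="steps-post")
plt.xlabel("n"); plt.ylabel("X_n"); plt.show()
```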
The state probabilities pj^(n)

[Figure: the state probabilities pj^(n), j = 1, ..., 8, plotted against n = 0, ..., 70]
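The curves in the figure are the entries of p^(n) = p^(0) P^n. A sketch of the computation (assuming numpy):

```python
import numpy as np

n_states = 8
P = 0.3 * np.eye(n_states) + 0.4 * np.eye(n_states, k=1) + 0.3 * np.eye(n_states, k=-1)
P[0, 0] += 0.3
P[-1, -1] += 0.4

p = np.zeros(n_states); p[0] = 1.0       # p^(0): start in state 1
probs = [p]
for _ in range(70):
    p = p @ P                            # p^(n) = p^(n-1) P
    probs.append(p)
probs = np.array(probs)                  # row n holds p^(n), columns are the 8 states
print(probs[-1])                         # close to the stationary distribution
```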
Limiting distribution
For an irreducible aperiodic chain, we have that

pij^(n) → 1/µj as n → ∞, for all i and j

Three important remarks
- If the chain is transient or null-persistent (null-recurrent), pij^(n) → 0
- If the chain is positive recurrent, pij^(n) → 1/µj
- The limiting probability of Xn = j does not depend on the starting state X0 = i
The stationary distribution
- A distribution that does not change with n
- The elements of p^(n) are all constant
- The implication of this is p^(n) = p^(n−1) P = p^(n−1) by our assumption of p^(n) being constant
- Expressed differently: π = πP
Stationary distribution
Definition
The vector π is called a stationary distribution of the chain if π has entries (πj : j ∈ S) such that
- (a) πj ≥ 0 for all j, and Σ_j πj = 1
- (b) π = πP, which is to say that πj = Σ_i πi pij for all j.

VERY IMPORTANT
An irreducible chain has a stationary distribution π if and only if all the states are non-null persistent (positive recurrent); in this case, π is the unique stationary distribution and is given by πi = 1/µi for each i ∈ S, where µi is the mean recurrence time of i.
The example chain (random walk with reflecting barriers)

π = πP; elementwise the matrix equation is πi = Σ_j πj pji

P =
0.6 0.4 0.0 0.0 0.0 0.0 0.0 0.0
0.3 0.3 0.4 0.0 0.0 0.0 0.0 0.0
0.0 0.3 0.3 0.4 0.0 0.0 0.0 0.0
0.0 0.0 0.3 0.3 0.4 0.0 0.0 0.0
0.0 0.0 0.0 0.3 0.3 0.4 0.0 0.0
0.0 0.0 0.0 0.0 0.3 0.3 0.4 0.0
0.0 0.0 0.0 0.0 0.0 0.3 0.3 0.4
0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.7

π1 = π1 · 0.6 + π2 · 0.3
π2 = π1 · 0.4 + π2 · 0.3 + π3 · 0.3
π3 = π2 · 0.4 + π3 · 0.3 + π4 · 0.3
π1 = π1 · 0.6 + π2 · 0.3
πj = πj−1 · 0.4 + πj · 0.3 + πj+1 · 0.3
π8 = π7 · 0.4 + π8 · 0.7

Or

π2 = ((1 − 0.6)/0.3) π1
πj+1 = (1/0.3) ((1 − 0.3) πj − 0.4 πj−1)

Can be solved recursively to find:

πj = (0.4/0.3)^(j−1) π1
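A sketch of the recursive solution (assuming numpy, though plain Python would do): fix π1 = 1 provisionally, run the recursion, and normalise afterwards.

```python
import numpy as np

pi = [1.0, (1 - 0.6) / 0.3]               # provisional pi_1, pi_2 (unnormalised)
for _ in range(6):                        # pi_3, ..., pi_8
    pi.append(((1 - 0.3) * pi[-1] - 0.4 * pi[-2]) / 0.3)
pi = np.array(pi)
pi /= pi.sum()                            # impose the normalising condition

print(pi)
print(np.allclose(pi, pi[0] * (0.4 / 0.3) ** np.arange(8)))   # matches (0.4/0.3)^(j-1) pi_1
```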
The normalising condition
- We note that we don't have to use the last equation
- We need a solution which is a probability distribution

Σ_{j=1}^{8} πj = 1,   Σ_{j=1}^{8} (0.4/0.3)^(j−1) π1 = π1 Σ_{k=0}^{7} (0.4/0.3)^k

Σ_{i=0}^{N} a^i = (1 − a^(N+1))/(1 − a)   for N < ∞, a ≠ 1
               = N + 1                    for N < ∞, a = 1
               = 1/(1 − a)                for N = ∞, |a| < 1

Such that

1 = π1 (1 − (0.4/0.3)^8)/(1 − 0.4/0.3)   ⇔   π1 = (1 − 0.4/0.3)/(1 − (0.4/0.3)^8)
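A numerical check of this closed form (a sketch assuming numpy):

```python
import numpy as np

a = 0.4 / 0.3
pi1 = (1 - a) / (1 - a**8)                # the normalising condition solved for pi_1
pi = pi1 * a ** np.arange(8)              # pi_j = (0.4/0.3)^(j-1) * pi_1
print(pi, pi.sum())                       # sums to 1

# Verify pi = pi P for the reflecting random walk.
P = 0.3 * np.eye(8) + 0.4 * np.eye(8, k=1) + 0.3 * np.eye(8, k=-1)
P[0, 0] += 0.3
P[-1, -1] += 0.4
print(np.allclose(pi @ P, pi))            # True
```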
The solution of π = πP
- More or less straightforward, but one problem
- If x is a solution such that x = xP then obviously (kx) = (kx)P is also a solution.
- Recall the definition of eigenvalues/eigenvectors
- If Ay = λy we say that λ is an eigenvalue of A with an associated eigenvector y. Here y is a right eigenvector; there is also a left eigenvector
The solution of π = πP continued
- The vector π is a left eigenvector of P.
- The main theorem says that there is a unique eigenvector associated with the eigenvalue 1 of P
- In practice this means that we can only determine π up to a normalising condition
- But we have the normalising condition Σ_j πj = 1; this can be expressed as π1 = 1, where 1 = (1, 1, ..., 1)^T is the column vector of ones
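The eigenvector route in numpy (a sketch, not from the slides): take a left eigenvector of P for the eigenvalue 1, i.e. a right eigenvector of P transposed, and rescale it so that its entries sum to one.

```python
import numpy as np

P = 0.3 * np.eye(8) + 0.4 * np.eye(8, k=1) + 0.3 * np.eye(8, k=-1)
P[0, 0] += 0.3
P[-1, -1] += 0.4

eigvals, eigvecs = np.linalg.eig(P.T)        # left eigenvectors of P
k = np.argmin(np.abs(eigvals - 1.0))         # the eigenvalue (numerically) equal to 1
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()                           # fix the arbitrary scaling: pi 1 = 1

print(pi)
print(np.allclose(pi @ P, pi))               # True
```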
Roles of the solution to π = πP
For an irreducible Markov chain, (the condition we need to
verify)
I
I
I
I
I
The stationary solution. If p (0) = π then p (n) = π for all n.
The limiting distribution, i.e. p (n) → π for n → ∞ (the
(n)
Markov chain has to be aperiodic too). Also pij → πj .
The mean recurrence time for state i is µi = π1i .
The mean number of visits in state j between two
π
successive visits to state i is πji .
The long run average probability of finding the Markov
P
(k )
chain in state i is πi . πi = limn→∞ n1 nk =1 pi also true for
periodic chains.
Bo Friis Nielsen
Limiting Distribution and Classification
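The roles listed above can be checked numerically. The following is a minimal sketch (not part of the original slides) for an arbitrary irreducible, aperiodic 3-state chain: the same vector π solves π = πP, appears as every row of P^n for large n, and yields the mean recurrence times 1/πi.

```python
import numpy as np

# Arbitrary irreducible, aperiodic 3-state chain (illustration only).
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])

# Solve pi = pi P together with the normalisation sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones((1, 3))])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

print(pi)                              # stationary distribution
print(np.linalg.matrix_power(P, 50))   # every row is approximately pi
print(1.0 / pi)                        # mean recurrence times mu_i = 1/pi_i
```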
Example (null-recurrent) chain

P =
| p1  p2  p3  p4  p5  ... |
| 1   0   0   0   0   ... |
| 0   1   0   0   0   ... |
| 0   0   1   0   0   ... |
| 0   0   0   1   0   ... |
| 0   0   0   0   1   ... |
| ... ... ... ... ... ... |

From state 1 the chain jumps to state j with probability pj; from state j > 1 it steps down to state j − 1 with probability 1.

For pj > 0 the chain is obviously irreducible.
The main theorem tells us that we can investigate directly for π = πP:

π1 = π1 p1 + π2
π2 = π1 p2 + π3
...
πj = π1 pj + π_{j+1}
From

π1 = π1 p1 + π2
π2 = π1 p2 + π3
...
πj = π1 pj + π_{j+1}

we get

π2 = (1 − p1)π1
π3 = (1 − p1 − p2)π1
...
πj = (1 − p1 − · · · − p_{j−1})π1  ⇔  πj = π1 (1 − Σ_{i=1}^{j−1} pi)  ⇔  πj = π1 Σ_{i=j}^{∞} pi

Normalisation:

Σ_{j=1}^{∞} πj = 1,   where   Σ_{j=1}^{∞} π1 Σ_{i=j}^{∞} pi = π1 Σ_{i=1}^{∞} Σ_{j=1}^{i} pi = π1 Σ_{i=1}^{∞} i pi

A normalisable solution therefore exists exactly when Σ_{i=1}^{∞} i pi < ∞ (positive recurrence, with π1 = 1/Σ_{i=1}^{∞} i pi); when Σ_{i=1}^{∞} i pi = ∞ the chain is recurrent but null recurrent, as in the title of the example.
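As a concrete check of the formula πj = π1 Σ_{i=j}^{∞} pi, here is a small numerical sketch (not from the slides) of the positive-recurrent counterpart: with pj = (1/2)^j we have Σ_i i pi = 2 < ∞, so π1 = 1/2 and πj = (1/2)^j. The truncation level N is an arbitrary choice made only for the numerics.

```python
import numpy as np

# p_j = (1/2)^j gives sum_j j*p_j = 2 < infinity, so the chain is positive
# recurrent with pi_j = pi_1 * sum_{i>=j} p_i = (1/2)^j.
N = 30                                 # truncation level for the computation
j = np.arange(1, N + 1)
p = 0.5 ** j
p[-1] += 1.0 - p.sum()                 # put the (tiny) tail mass on the last state

P = np.zeros((N, N))
P[0, :] = p                            # from state 1: jump to state j w.p. p_j
for k in range(1, N):
    P[k, k - 1] = 1.0                  # from state j > 1: step down to j - 1

pi = np.full(N, 1.0 / N)               # power iteration towards the stationary pi
for _ in range(5000):
    pi = pi @ P

print(pi[:5])                          # ~ [0.5, 0.25, 0.125, 0.0625, 0.03125]
print(0.5 ** j[:5])                    # the closed-form pi_j = (1/2)^j
```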
A two-state chain. (Figure: states 1 and 2, with transition probabilities p11, p12, p21, p22.)

P = | p11 p12 |
    | p21 p22 |
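For this two-state chain the stationary distribution can be written down directly. The sketch below (not from the slides, with hypothetical values for p12 and p21) uses the standard result π = (p21, p12)/(p12 + p21), which is exactly what the local balance equation π1 p12 = π2 p21 of the following slides gives.

```python
import numpy as np

# Hypothetical example values for the two off-diagonal transition probabilities.
p12, p21 = 0.3, 0.1

# Local balance pi_1 p12 = pi_2 p21  =>  pi = (p21, p12) / (p12 + p21).
pi = np.array([p21, p12]) / (p12 + p21)
P = np.array([[1 - p12, p12],
              [p21, 1 - p21]])

print(pi)        # [0.25, 0.75]
print(pi @ P)    # equals pi, so pi also solves the global balance pi = pi P
```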
Reversible Markov chains

I Solve a sequence of linear equations instead of the whole system
I Local balance in probability flow as opposed to global balance
I Nice theoretical construction
Local balance equations

Start from the global balance equations and rewrite the left-hand side:

πi = Σ_j πj pji
πi · 1 = Σ_j πj pji
πi Σ_j pij = Σ_j πj pji
Σ_j πi pij = Σ_j πj pji

Term for term we get the local balance equations

πi pij = πj pji

If they are fulfilled for each i and j, the global balance equations can be obtained by summation.
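A standard example where local balance is immediate (a sketch, not from the slides) is a random walk on a graph with symmetric weights wij: taking pij = wij / Σ_k wik and πi proportional to Σ_k wik gives πi pij = πj pji, and global balance then follows by summation, as stated above.

```python
import numpy as np

# Random walk on a weighted graph with symmetric weights (arbitrary example).
rng = np.random.default_rng(0)
W = rng.random((5, 5))
W = W + W.T                                   # make the weights symmetric

P = W / W.sum(axis=1, keepdims=True)          # p_ij = w_ij / sum_k w_ik
pi = W.sum(axis=1) / W.sum()                  # pi_i proportional to sum_k w_ik

flow = pi[:, None] * P                        # entries pi_i * p_ij
print(np.allclose(flow, flow.T))              # local balance: pi_i p_ij = pi_j p_ji
print(np.allclose(pi @ P, pi))                # global balance follows by summation
```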
Why reversible?

P{Xn−1 = i ∩ Xn = j} = P{Xn−1 = i} P{Xn = j | Xn−1 = i} = P{Xn−1 = i} pij,

and for a stationary chain this equals πi pij.

For a reversible chain (local balance) this is

πi pij = πj pji = P{Xn−1 = j} P{Xn = i | Xn−1 = j} = P{Xn−1 = j ∩ Xn = i},

which is the corresponding probability for the reversed sequence.
Another look at a similar question

P{Xn−1 = j | Xn = i} = P{Xn−1 = j ∩ Xn = i} / P{Xn = i}
                     = P{Xn−1 = j} P{Xn = i | Xn−1 = j} / P{Xn = i}
                     = P{Xn−1 = j} pji / P{Xn = i}

For a stationary chain we get

P{Xn−1 = j | Xn = i} = πj pji / πi

The chain is reversible if P{Xn−1 = j | Xn = i} = pij, leading to the local balance equations

pij = πj pji / πi,   i.e.   πi pij = πj pji
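This suggests looking at the reversed chain directly. The sketch below (not from the slides) computes the reversed transition matrix p̂ij = πj pji / πi for an arbitrary stationary chain; its rows sum to one, and it coincides with P exactly when the chain is reversible (the example chain here is not reversible, so the two matrices differ).

```python
import numpy as np

# An arbitrary (non-reversible) 3-state chain, used only for illustration.
P = np.array([[0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6],
              [0.5, 0.3, 0.2]])

# Stationary distribution from pi = pi P and sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones((1, 3))])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)[0]

P_hat = (pi[None, :] * P.T) / pi[:, None]   # phat_ij = pi_j p_ji / pi_i
print(P_hat.sum(axis=1))                    # rows sum to 1: a proper chain
print(np.allclose(P_hat, P))                # False here; True iff P is reversible
```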
Exercise 10 (16/12/91 ex. 1)

In connection with an examination of the reliability of some software intended for use in the control of modern ferries, one is interested in a stochastic model of the use of a control program.

The control program works as a "state machine", i.e. it can be in a number of different levels; 4 are considered here. The levels depend on the physical state of the ferry. At every shift of time unit while the program is running, the program changes from level j to level k with probability pjk.
Two possibilities are considered:

The program has no errors and will run continuously, shifting between the four levels.

The program has a critical error. In this case it is possible that the error is found; this happens with probability qi, i = 1, 2, 3, 4, depending on the level. The error will be corrected immediately and the program will from then on be without faults.

Alternatively the program can stop with a critical error (the ferry will continue to sail, but without control). This happens with probability ri, i = 1, 2, 3, 4.

In general qi + ri < 1; a program with errors can thus keep working and the error is not necessarily discovered. It is assumed that detection of an error, as well as the appearance of a fault, happens coincident with a shift between levels.

The program starts running in level 1, and it is known that the program contains one critical error.
Solution: Question 1

Formulate a stochastic process (Markov chain) in discrete time describing this system.

The model is a discrete time Markov chain. A possible definition of states could be

0: The programme has stopped.
1-4: The programme is operating safely in level i.
5-8: The programme is operating in level i − 4, the critical error is not detected.

The transition matrix A is

    |  1        ~0           ~0       |
A = | ~0         P            0       |
    | ~r    Diag(qi)P    Diag(Si)P    |
Question 1 - continued

The model is a discrete time Markov chain, where P = {pij} and

     | r1 |                 | S1  0   0   0  |
~r = | r2 |     Diag(Si) =  | 0   S2  0   0  |     with Si = 1 − ri − qi,
     | r3 |                 | 0   0   S3  0  |
     | r4 |                 | 0   0   0   S4 |

            | q1  0   0   0  |
Diag(qi) =  | 0   q2  0   0  |
            | 0   0   q3  0  |
            | 0   0   0   q4 |
Question 1 - continued

Or without matrix notation:

    | 1    0       0       0       0       0       0       0       0      |
    | 0    p11     p12     p13     p14     0       0       0       0      |
    | 0    p21     p22     p23     p24     0       0       0       0      |
    | 0    p31     p32     p33     p34     0       0       0       0      |
A = | 0    p41     p42     p43     p44     0       0       0       0      |
    | r1   q1 p11  q1 p12  q1 p13  q1 p14  S1 p11  S1 p12  S1 p13  S1 p14 |
    | r2   q2 p21  q2 p22  q2 p23  q2 p24  S2 p21  S2 p22  S2 p23  S2 p24 |
    | r3   q3 p31  q3 p32  q3 p33  q3 p34  S3 p31  S3 p32  S3 p33  S3 p34 |
    | r4   q4 p41  q4 p42  q4 p43  q4 p44  S4 p41  S4 p42  S4 p43  S4 p44 |
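As a sanity check of this structure, the sketch below (not part of the original exercise solution) builds A numerically from the block formula. For concreteness it uses the parameter values given in question 3 further on; the diagonal of P is taken as the remaining probability mass so that each row of P sums to one.

```python
import numpy as np

# Level-transition matrix P from question 3: 0.6 up, 0.2 down, rest on the diagonal.
P = np.zeros((4, 4))
for i in range(3):
    P[i, i + 1] = 0.6                       # P_{i,i+1} = 0.6, i = 1, 2, 3
    P[i + 1, i] = 0.2                       # P_{i,i-1} = 0.2, i = 2, 3, 4
np.fill_diagonal(P, 1.0 - P.sum(axis=1))    # remaining mass stays in the level

i = np.arange(1, 5)
q = 10.0 ** (-3 * i)                        # q_i = 10^(-3i)
r = 10.0 ** (-3 * i - 5)                    # r_i = 10^(-3i-5)
S = 1.0 - q - r                             # S_i = 1 - r_i - q_i

A = np.zeros((9, 9))
A[0, 0] = 1.0                               # state 0 (stopped) is absorbing
A[1:5, 1:5] = P                             # error-free operation, states 1-4
A[5:9, 0] = r                               # stop with the critical error
A[5:9, 1:5] = np.diag(q) @ P                # error detected and corrected
A[5:9, 5:9] = np.diag(S) @ P                # error still undetected, states 5-8

print(np.allclose(A.sum(axis=1), 1.0))      # every row is a probability vector
```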
Solution question 2

Characterise the states in the Markov chain.

With reasonable assumptions on P (i.e. irreducible) we get

State 0:     Absorbing
States 1-4:  Positive recurrent
States 5-8:  Transient
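The transience of states 5-8 can also be supported numerically (a sketch, not from the slides, again using the question 3 parameters): transitions that stay within states 5-8 are governed by the sub-stochastic matrix Diag(Si)P, whose spectral radius is below one, so the expected number of visits to these states is finite.

```python
import numpy as np

# Question 3 parameters (see below); only the pieces needed here are rebuilt.
P = np.zeros((4, 4))
for i in range(3):
    P[i, i + 1] = 0.6
    P[i + 1, i] = 0.2
np.fill_diagonal(P, 1.0 - P.sum(axis=1))

i = np.arange(1, 5)
q, r = 10.0 ** (-3 * i), 10.0 ** (-3 * i - 5)
S = 1.0 - q - r

T = np.diag(S) @ P                              # transitions staying in states 5-8
print(max(abs(np.linalg.eigvals(T))))           # spectral radius < 1 => transient
print(np.linalg.inv(np.eye(4) - T).sum(axis=1)) # finite expected number of visits
```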
Solution question 3

We now consider the case where the stability of the system has been assured, i.e. the error has been found and corrected, and the program has been running for a long time without errors. The parameters are as follows:

P_{i,i+1} = 0.6,   i = 1, 2, 3
P_{i,i−1} = 0.2,   i = 2, 3, 4
P_{i,j}   = 0,     |i − j| > 1
qi = 10^{−3i}
ri = 10^{−3i−5}

Characterise the stochastic process that describes the stable system.

The system becomes stable by reaching one of the states 1-4. The process is ergodic from then on. The process is a reversible ergodic Markov chain in discrete time.
Solution question 4

For what fraction of time will the system be in level 1?

We obtain the following steady state equations (from local balance, πi · 0.6 = π_{i+1} · 0.2, so π_{i+1} = 3πi):

πi = 3^{i−1} π1

Σ_{i=1}^{4} 3^{i−1} π1 = 1  ⇔  (1 − 3^4)/(1 − 3) π1 = 1  ⇔  40 π1 = 1  ⇔  π1 = 1/40

The sum Σ_{i=1}^{4} 3^{i−1} is a finite geometric series and can be obtained by using Σ_{i=1}^{4} 3^{i−1} = (1 − 3^4)/(1 − 3) = 40.
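A numerical cross-check of question 4 (a sketch, not part of the original solution): solve π = πP for the four levels and verify both π1 = 1/40 and the local balance relations used above.

```python
import numpy as np

# Levels 1-4 with p_{i,i+1} = 0.6, p_{i,i-1} = 0.2, remaining mass on the diagonal.
P = np.zeros((4, 4))
for i in range(3):
    P[i, i + 1] = 0.6
    P[i + 1, i] = 0.2
np.fill_diagonal(P, 1.0 - P.sum(axis=1))

# Solve pi = pi P together with sum(pi) = 1.
M = np.vstack([P.T - np.eye(4), np.ones((1, 4))])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(M, b, rcond=None)[0]

print(pi)                                        # [1/40, 3/40, 9/40, 27/40]
print(pi[0], 1 / 40)                             # fraction of time in level 1
print(np.allclose(pi[:-1] * 0.6, pi[1:] * 0.2))  # local balance pi_i 0.6 = pi_{i+1} 0.2
```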