Time reversibility for continuous-time Markov Chains

December 3, 2012, LTCC Stochastic Processes
Outline
1 Time reversibility for continuous-time Markov Chains
Detailed balance equations
Example
Graph associated to a continuous-time Markov Chain
2 Full and detailed balance equations
Discrete-time Markov Chains
Continuous-time Markov Chains
3 Branching processes
Definition
Probability generating function
Mean and Variance of Zn
Total number of individuals
Probability of extinction by generation n
Probability of ultimate extinction
Detailed balance equations
Similarly to the discrete-time Markov Chains case, we have
Theorem
Let X(t), t ∈ [0, ∞), be a stationary irreducible continuous-time
Markov chain with generator Q. The chain is reversible if and only if
there exists a probability row vector π (0 ≤ πi ≤ 1, Σ_{i∈S} πi = 1)
such that
πi qij = πj qji for all i, j ∈ S.
If such a π exists, it is the equilibrium distribution of the chain.
Note that if a chain is stationary, i.e. has an equilibrium distribution,
but there is no solution to the detailed balance equations, then the
chain is not reversible.
Thus
πi qij = πj qji for all i, j ∈ S.
These are called the detailed balance equations. In words, they say
that, for all pairs of states, the rate at which transitions occur from
state i to state j (πi qij) balances the rate at which transitions occur
from j to i (πj qji).
Example
A single server has service rate µ; customers arrive individually at
rate λ. Let X(t) be the number of customers in the queue (including
the customer currently served) and let S = {0, 1, 2, . . .}. The
generator Q is given by

        ( −λ       λ        0        0       0   ... )
        (  µ    −(λ+µ)      λ        0       0   ... )
    Q = (  0       µ     −(λ+µ)      λ       0   ... )
        (  0       0        µ     −(λ+µ)     λ   ... )
        ( ...     ...      ...      ...     ...      )
We want to find the invariant distribution, if it exists.
The detailed balance equations are:
π0 λ = π1 µ, π1 λ = π2 µ, . . . , πn λ = πn+1 µ, for all n ≥ 2.
Let ρ = λ/µ be the traffic intensity. Solving the equations, we get

πi = π0 ρ^i, for all i ≥ 1.

Assume ρ < 1. Since Σ_{i∈S} πi = 1, we have

π0 Σ_{i=0}^∞ ρ^i = π0 / (1−ρ) = 1, from which π0 = 1−ρ and πi = (1−ρ)ρ^i, i ≥ 1.
See also Example 2.8 from the lecture notes.
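As a numerical sanity check (a sketch, not part of the notes; the rates and the truncation level are illustrative choices), one can verify in Python that πi = (1−ρ)ρ^i satisfies the detailed balance equations of this queue:

```python
# Numerical check of detailed balance for the M/M/1 queue.
# lam, mu and the truncation level N are illustrative choices.
lam, mu = 1.0, 2.0
rho = lam / mu          # traffic intensity, here rho < 1
N = 50                  # truncate the infinite state space for the check

# Candidate stationary distribution pi_i = (1 - rho) * rho**i
pi = [(1 - rho) * rho**i for i in range(N)]

# Off-diagonal generator entries: q_{i,i+1} = lam, q_{i+1,i} = mu
for i in range(N - 1):
    lhs = pi[i] * lam       # rate of i -> i+1 transitions
    rhs = pi[i + 1] * mu    # rate of i+1 -> i transitions
    assert abs(lhs - rhs) < 1e-12

# With rho < 1 the truncated probabilities sum to almost 1.
print(sum(pi))
```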
Graph associated to a continuous-time Markov Chain
A convenient representation of a continuous-time Markov chain is via
the associated graph: the states are the vertices of the graph, and j and k
are joined by an edge if qjk > 0 or qkj > 0. If a continuous-time
Markov chain is irreducible, the associated graph is connected. The
graph is a tree if the removal of any edge cuts the graph into two
unconnected components. Then we have the following:
Theorem
If the graph associated with a stationary irreducible continuous-time
Markov chain is a tree then the process is reversible.
(See also the in-class example).
Full and detailed balance equations - DTMC
Let Xn, n ∈ {. . . , −2, −1, 0, 1, 2, . . .}, be an irreducible discrete-time
Markov chain with state space S and equilibrium distribution π
satisfying π = πP. Then, for every i, we have Σ_{j∈S} πj pji = πi, and
we have

Σ_{j∈S, j≠i} πj pji = πi − πi pii = πi Σ_{j∈S} pij − πi pii = πi Σ_{j∈S, j≠i} pij, since Σ_{j∈S} pij = 1.
The equations

πi Σ_{j∈S, j≠i} pij = Σ_{j∈S, j≠i} πj pji
are called the full balance equations. They show that the probability
of leaving state i exactly balances the probability of entering state i.
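As a small illustration (a sketch; the 3-state transition matrix below is an arbitrary illustrative choice, not from the notes), the full balance equations can be checked numerically for any finite chain:

```python
import numpy as np

# An arbitrary irreducible 3-state transition matrix (illustrative choice).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4]])

# Equilibrium distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# Full balance: pi_i * sum_{j != i} p_ij == sum_{j != i} pi_j * p_ji.
for i in range(3):
    out_flow = pi[i] * (P[i].sum() - P[i, i])
    in_flow = sum(pi[j] * P[j, i] for j in range(3) if j != i)
    assert abs(out_flow - in_flow) < 1e-10
```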
Full and detailed balance equations - CTMC
Let X(t), t ∈ [0, ∞), be an irreducible continuous-time Markov chain
with state space S and equilibrium distribution π satisfying πQ = 0.
Then, for every i, we have Σ_{j∈S} πj qji = 0, and we have

Σ_{j∈S, j≠i} πj qji = −πi qii = πi Σ_{j∈S, j≠i} qij, since Σ_{j∈S} qij = 0.
The equations

πi Σ_{j∈S, j≠i} qij = Σ_{j∈S, j≠i} πj qji
are called the full balance equations. They show that for each state i,
the rate of transitions out of state i balances the rate of transitions into
i.
Branching processes

Definition
We will consider a random model for population growth in the
absence of spatial or any other resource constraints. So, consider a
population of individuals which evolves according to the following
rule: in every generation n = 0, 1, 2, . . . , each individual produces a
random number of offspring in the next generation, independently of
other individuals. The probability mass function for the offspring of
an individual is often called the offspring distribution and is given by
pi = P(number of offspring = i), i = 0, 1, 2, . . . , where Σ_{i≥0} pi = 1.
This model was introduced by F. Galton, in the late 1800s, to study the
disappearance of family names. In his model, pi is the probability that
a man has i sons.
Simple branching process: A population starts with a progenitor
(who forms generation number 0). The progenitor produces i
offspring with probability pi: these i offspring constitute the
first generation. Each of these offspring then independently
produces a random number of offspring, determined by the
distribution pi, and so on.
Notice that a branching process may either become extinct (we
say that the branching process dies out) or survive forever (we
say that the branching process survives). We are interested in the
conditions under which, and the probabilities with which, these
events occur.
Some generalizations of the simple branching process:
k individuals in generation 0.
Immigration: Wn immigrants arrive at the nth generation and start to
reproduce.
Definition
The simple branching process (Galton-Watson Branching Process) is
defined as follows: We consider a model which starts, in the zeroth
generation, with a single individual who has a random number, X, of
offspring who form the first generation. Each member of the first
generation, independently, gives rise to a random number of offspring,
with the same distribution as X. These offspring form the second
generation, and so on. Let Zn denote the size of the nth generation, so
that Z0 = 1, Z1 = X, and given Zn = k, Zn+1 = Xn1 + . . . + Xnk, where
all the variables Xnj are mutually independent and distributed as X.
Xnj can be thought of as the number of members of the (n + 1)th
generation which are offspring of the jth member of the nth
generation.
See also in-class explanation.
Examples of distributions of X: Bernoulli (p), Binomial (n, p),
Poisson (µ).
Zn, n = 0, 1, . . . , is a discrete-time Markov chain with state
space S = {0, 1, 2, . . .} and homogeneous transition probability

pij = P(Zn = j | Zn−1 = i) = P(Σ_{k=1}^{i} Xnk = j).
Without immigration a branching process either becomes extinct
or increases unboundedly.
Note that Xnj , n ≥ 1, j ≥ 1, are identically distributed (having a
common distribution {pi }, i ≥ 0) non-negative integer-valued
random variables.
Once the process hits zero, it stays at zero. In other words, if
Zi = 0, then Zi+1 = 0.
Generating functions are extremely helpful in handling sums of
independent random variables and thus provide a major tool in
the analysis of branching processes.
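The definition translates directly into a simulation. Below is a minimal sketch (the offspring distribution and parameters are illustrative choices, not from the notes) that generates one trajectory Z0, Z1, . . . of a Galton-Watson process:

```python
import random

def simulate_gw(offspring_probs, n_generations, seed=None):
    """Simulate Z_0, ..., Z_n for a Galton-Watson branching process.

    offspring_probs[i] = p_i = P(number of offspring = i).
    """
    rng = random.Random(seed)
    sizes = list(range(len(offspring_probs)))
    z = [1]  # Z_0 = 1: a single progenitor
    for _ in range(n_generations):
        if z[-1] == 0:          # once extinct, stays extinct
            z.append(0)
            continue
        # Each of the Z_n individuals reproduces independently.
        children = sum(rng.choices(sizes, weights=offspring_probs, k=z[-1]))
        z.append(children)
    return z

# Illustrative offspring distribution: p_0 = 0.25, p_1 = 0.25, p_2 = 0.5
print(simulate_gw([0.25, 0.25, 0.5], n_generations=10, seed=42))
```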
Some questions of interest
(1) Mean and Variance of size of the nth generation Zn of a
branching process.
(2) Total number of individuals up to and including generation n.
(3) What is the distribution of Zn (use the probability generating
function for this)?
(4) What is the probability of extinction by generation n?
(5) Probability of ultimate extinction: a necessary condition is that
P(X = 0) ≠ 0.
Probability generating function
Theorem
Let G(z) = E(z^X) be the probability generating function of X and
Gn(z) = E(z^Zn). Then

Gn+1(z) = G(G(G(. . . (z)))),

where G(G(G(. . . (z)))) is the (n + 1)-fold iterate of G(z).
Proof: We recall
Zn+1 = Xn1 + . . . + XnZn, with the Xni i.i.d. and Xni, Zn random variables on {0, 1, . . .}.
We have

Gn+1(z) = Σ_{s=0}^∞ P(Zn+1 = s) z^s, where

P(Zn+1 = s) = Σ_{k=0}^∞ P(Zn+1 = s | Zn = k) P(Zn = k), and
Gn+1(z) = Σ_{k=0}^∞ P(Zn = k) Σ_{s=0}^∞ P(Zn+1 = s | Zn = k) z^s
        = Σ_{k=0}^∞ P(Zn = k) Σ_{s=0}^∞ P(Xn1 + . . . + Xnk = s) z^s (since the Xni are i.i.d.)
        = Σ_{k=0}^∞ P(Zn = k) (G(z))^k = Gn(G(z)).

Since G1(z) = G(z), we now have by induction

Gn+1(z) = Gn(G(z)) = Gn−1(G(G(z))) = . . . = G(G(G(. . . (z)))).
(See the in-class Binomial (1, p) example).
Mean and Variance of Zn
Theorem
Assume that E(X) = µ, var(X) = σ² and Z0 = 1. Then
E(Zn) = µ^n, and var(Zn) = σ² µ^{n−1} (µ^n − 1)/(µ − 1) if µ ≠ 1,
while var(Zn) = σ² n if µ = 1.
Proof:
E(Zn+1) = E[E(Zn+1 | Zn)] = µ E(Zn) (alternatively, differentiate Gn+1(z) = Gn(G(z)) at z = 1)
and
var(Zn+1) = E[var(Zn+1 | Zn)] + var(E(Zn+1 | Zn)) = σ² E(Zn) + µ² var(Zn)
          = σ² µ^n + µ² [σ² µ^{n−1} + µ² var(Zn−1)] . . . .
(See also the in-class explanations and example).
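These formulas can be verified exactly on a small example by computing the distribution of Zn directly: the distribution of Zn+1 is the mixture, over k, of the k-fold convolution of the offspring pmf, weighted by P(Zn = k). A sketch (the offspring pmf below is an illustrative choice, not from the notes):

```python
import numpy as np

def next_generation(dist_zn, p):
    """Distribution of Z_{n+1} given the distribution of Z_n and offspring pmf p."""
    out = np.zeros((len(dist_zn) - 1) * (len(p) - 1) + 1)
    conv_k = np.array([1.0])              # pmf of the sum of k = 0 offspring
    for prob_k in dist_zn:
        out[:len(conv_k)] += prob_k * conv_k
        conv_k = np.convolve(conv_k, p)   # pmf of the sum of one more offspring
    return out

p = np.array([0.25, 0.25, 0.5])           # offspring pmf p_0, p_1, p_2 (illustrative)
mu = float(np.arange(3) @ p)              # mu = 1.25
sigma2 = float(np.arange(3)**2 @ p) - mu**2

dist, n = np.array([0.0, 1.0]), 4         # Z_0 = 1 with probability 1
for _ in range(n):
    dist = next_generation(dist, p)

support = np.arange(len(dist))
mean = float(support @ dist)
var = float(support**2 @ dist) - mean**2

assert abs(mean - mu**n) < 1e-9                                   # E(Z_n) = mu^n
assert abs(var - sigma2 * mu**(n - 1) * (mu**n - 1) / (mu - 1)) < 1e-9
```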
Total number of individuals
Let Tn be the total number of individuals up to and including
generation n. Then for E(X) = µ ≠ 1

E(Tn) = E(Z0) + E(Z1) + E(Z2) + . . . + E(Zn)
      = 1 + µ + µ² + . . . + µ^n = (µ^{n+1} − 1)/(µ − 1).

If µ = 1, we have E(Tn) = n + 1. Then

lim_{n→∞} E(Tn) = 1/(1 − µ) if µ < 1, and ∞ otherwise.
Probability of extinction by generation n
Let
θn+1 := P((n + 1)th generation contains 0 individuals)
     = P(extinction occurs by the (n + 1)th generation)
     = P(Zn+1 = 0) = Gn+1(0) = G(Gn(0)) = G(θn).
Since θ1 = G(θ0 ) = G(0), we can iteratively obtain θn , n = 2, 3, . . . .
(See also the in-class examples).
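The iteration θn+1 = G(θn), starting from θ0 = 0, is easy to carry out numerically. A sketch, assuming a Poisson offspring distribution (an illustrative choice; its pgf is the standard exp(µ(z − 1))):

```python
import math

def G(z, mu):
    """pgf of a Poisson(mu) offspring distribution: E(z**X) = exp(mu*(z - 1))."""
    return math.exp(mu * (z - 1.0))

mu = 1.5                   # illustrative offspring mean
theta = 0.0                # theta_0 = P(Z_0 = 0) = 0
thetas = []
for n in range(1, 11):
    theta = G(theta, mu)   # theta_n = P(extinct by generation n)
    thetas.append(theta)
    print(n, theta)        # theta_1 = G(0) = exp(-1.5), then increasing
```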
Probability of ultimate extinction
Let the probability of ultimate extinction be

θ := lim_{n→∞} θn ≤ 1.
We only consider the case where 0 < P(X = 0) < 1 since the other
two cases are trivial. If P(X = 0) = 1 then the process is certain to
die out by generation 1 so that θ = 1. If P(X = 0) = 0 then the
process cannot possibly die out and θ = 0.
Theorem
When 0 < P(X = 0) < 1 the probability of eventual extinction is the
smallest positive solution of z = G(z).
(See also the in-class example).
How do we compute θ?
Consider

G(θ) = Σ_{k=0}^∞ P(X = k) θ^k, with 0 < G(0) = P(X = 0) < 1.

We also have G(1) = 1 and G′(1) > 0. For θ > 0, we have G″(θ) > 0, so
G(θ) is a convex increasing function for θ ∈ [0, 1]. Therefore the
solutions of θ = G(θ) are determined by the slope of G(θ) at θ = 1, that
is, by G′(1) = µ. There are three cases:
If µ < 1 there is one solution, θ∗ = 1, so extinction is certain.
If µ > 1 there are two solutions, θ1∗ < 1 and θ2∗ = 1. As θn is
increasing, we want the smaller solution, so extinction is NOT
certain.
If µ = 1 the only solution is θ∗ = 1, so extinction is certain.
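For an offspring pmf with finite support (an illustrative choice below, not from the notes, with µ > 1), z = G(z) is a polynomial equation, and iterating z ← G(z) from 0 converges to its smallest positive root, the extinction probability:

```python
def G(z, p):
    """pgf G(z) = sum_k p_k z**k of an offspring pmf p (illustrative choice)."""
    return sum(pk * z**k for k, pk in enumerate(p))

p = [0.25, 0.25, 0.5]                 # p_0, p_1, p_2; mean mu = 1.25 > 1
mu = sum(k * pk for k, pk in enumerate(p))

theta = 0.0
for _ in range(200):                  # fixed-point iteration theta <- G(theta)
    theta = G(theta, p)

# Here z = G(z) reads 0.5*z**2 - 0.75*z + 0.25 = 0, with roots 0.5 and 1;
# the iteration converges to the smaller root.
print(theta)  # ≈ 0.5
```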
Exam
Information on the exam will be posted in due time on the course
website.
The exam will be in the form of a take-home paper.
The exam will be given on the material covered during the lectures
and in the lecture notes (up to and including page 30 - only the
definition of the Poisson process from page 30).
Additional mock exams and exercises with solutions on the
sections we covered will be posted on the website a few weeks
before the exam.
Please try to solve the exercise sheets 1 and 2 (in full), exams
2008, 2011 and 2012 (in full), questions (1) and (2) from exam
2009, questions (a), (b), (c) and (d) from exam 2010, and
questions (a) and (b) from mock exams 2008 and 2009.
Please email me if you have any questions or difficulties with
any of the material on the course website.
ANY QUESTIONS?