Markov Processes (FMSF15/MASC03)
Jimmy Olsson
Centre for Mathematical Sciences
Lund University, Sweden
Lecture 9
The cutting method
October 1, 2012
Outline
1. Last time
2. The cutting method
We are here ⟶ ●
1. Last time
2. The cutting method
Example: the M/M/1 queue
Assume that customers arrive at a service centre according to a Poisson process with intensity λ. These are taken care of by a server who needs an E(µ⁻¹)-distributed time (i.e. exponentially distributed with mean 1/µ) to help each customer. The service times are independent of the arrivals. If the server is occupied, an arriving customer lines up in a queue, which can be arbitrarily long. In this setting, let

X(t) def.= number of customers in the system at time t.

Here “system” refers to the queue as well as the service desk.
Problem
Motivate that {X(t); t ≥ 0} is a Markov process and find its
transition intensities.
Conclusion
In the previous two examples (the Poisson process, the M/M/1
queue), the lack of memory of the exponential distribution is
essential.
Computing the intensities directly was, evidently, not trivial. In practice, however, one simply views the system as being composed of two competing processes: an arrival process with intensity λ and a departure process with intensity µ (hence the notation “M/M/1”). The intensity graph can then be written down immediately (ad hoc), as in the sketch below.
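As a runnable illustration (not part of the original slides), the ad hoc recipe can be encoded numerically: q_{i,i+1} = λ for arrivals and q_{i,i−1} = µ for departures. The Python sketch below builds a truncated M/M/1 intensity matrix; the truncation level N and the example rates are assumptions made purely for illustration.

import numpy as np

def mm1_generator(lam, mu, N):
    """Truncated M/M/1 intensity matrix on the states 0, 1, ..., N.

    q[i, i+1] = lam (an arrival), q[i, i-1] = mu (a departure);
    each diagonal entry is set so that the row sums to zero.
    """
    Q = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            Q[i, i + 1] = lam   # arrival: i -> i + 1
        if i > 0:
            Q[i, i - 1] = mu    # departure: i -> i - 1
        Q[i, i] = -Q[i].sum()   # q_ii = -q_i
    return Q

Q = mm1_generator(lam=1.0, mu=1.5, N=5)   # example rates (assumed)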
Waiting times and jump probabilities
After having started in state i, the process moves, after an E(q_i⁻¹)-distributed time, to state j ≠ i with probability q_ij/q_i. By the Markov property, this is repeated; i.e. after an E(q_j⁻¹)-distributed time the process moves to state k ≠ j with probability q_jk/q_j. We obtain the following algorithm for the evolution of X, where {T_n}_{n=1}^∞ are the jump times.
let X̃_0 = X(0) ← i with probability p_i(0)
set T_0 ← 0
for n = 1, 2, 3, . . . do
    let Y_n ∼ E(q_{X̃_{n−1}}⁻¹)
    set T_n ← T_{n−1} + Y_n
    let X̃_n = X(T_n) ← j with probability q_{X̃_{n−1}, j} / q_{X̃_{n−1}}
end for
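For reference, here is a minimal Python transcription of the algorithm above (added here; it is not part of the original slides). The intensity matrix Q, the initial distribution p0 and the number of jumps are inputs the user is assumed to supply.

import numpy as np

def simulate_markov_process(Q, p0, n_jumps, rng=None):
    """Simulate the jump times T_n and the states X(T_n) of a Markov process.

    Q  : intensity matrix (each row sums to zero), Q[i, j] = q_ij for j != i
    p0 : initial distribution, p0[i] = p_i(0)
    """
    rng = np.random.default_rng() if rng is None else rng
    states = [rng.choice(len(p0), p=p0)]   # X(0) = i with probability p_i(0)
    times = [0.0]                          # T_0 = 0
    for _ in range(n_jumps):
        i = states[-1]
        q_i = -Q[i, i]                     # total intensity out of state i
        if q_i == 0:                       # absorbing state: no further jumps
            break
        times.append(times[-1] + rng.exponential(1.0 / q_i))   # Y_n ~ E(q_i^-1)
        p_jump = Q[i].astype(float)
        p_jump[i] = 0.0
        p_jump /= q_i                      # jump to j with probability q_ij / q_i
        states.append(rng.choice(len(p0), p=p_jump))
    return np.array(times), np.array(states)

Applied with the truncated M/M/1 generator sketched earlier, e.g. simulate_markov_process(Q, p0=[1.0] + [0.0] * 5, n_jumps=100), this produces one trajectory of the queue length.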
The embedded Markov chain
Obviously, the process X̃ = {X̃_n}_{n=0}^∞ defined in the algorithm above, where X̃_n is the state of X after the nth jump, forms a time-homogeneous Markov chain with transition probabilities

p̃_ij def.= 0 for i = j,   q_ij/q_i for i ≠ j.

Note that these transition probabilities are well defined, as ∑_{j∈X} p̃_ij = 1 for all i ∈ X.
The chain X̃, which is typically referred to as the embedded
Markov chain, provides a useful tool for simulating Markov
processes with given intensities.
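For a finite state space the whole transition matrix of the embedded chain can be read off from Q at once. A small Python sketch (assuming q_i > 0 for every state, so that no state is absorbing):

import numpy as np

def embedded_chain(Q):
    """Transition matrix of the embedded chain: p~_ij = q_ij / q_i, p~_ii = 0."""
    P = np.array(Q, dtype=float)
    np.fill_diagonal(P, 0.0)       # p~_ii = 0
    q = P.sum(axis=1)              # q_i = sum over j != i of q_ij
    return P / q[:, None]          # each row divided by q_i (requires q_i > 0)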
Stationary Markov processes
The following definitions are entirely analogous to those in the case
of Markov chains.
Definition (stationary Markov process)
A discrete Markov process X is said to be stationary if the absolute probability

p_i(t) = P(X(t) = i)

is independent of t ≥ 0 for all i ∈ X.
Definition (stationary distribution)
A probability distribution {π_i}_{i∈X} on X is said to be a stationary distribution (for X) if, for all i ∈ X,

p_i(0) = π_i  ⇒  p_i(t) = π_i for all t ≥ 0.
The global balance equations
We proved the necessity part of the following theorem.
Theorem (global balance)
Every stationary distribution {π_i}_{i∈X} of a Markov process X satisfies

∑_{i∈X} π_i q_{ij} = 0,  ∀j ∈ X,

or, in the case of a finite X, πQ = 0, under the constraint

∑_{i∈X} π_i = 1,   π_i ≥ 0, ∀i ∈ X.

Conversely, if a distribution {π_i}_{i∈X} satisfies these equations, then it is a stationary distribution.
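For a finite state space the theorem translates directly into a linear system: solve πQ = 0 under the normalization ∑_{i∈X} π_i = 1. Below is a minimal Python sketch; the least-squares formulation is just one convenient way of imposing the normalization and is not prescribed by the slides.

import numpy as np

def stationary_distribution(Q):
    """Solve pi Q = 0 together with sum(pi) = 1 for a finite intensity matrix Q."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])   # pi Q = 0  <=>  Q^T pi^T = 0, plus normalization row
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

Applied to the truncated M/M/1 generator sketched earlier, this recovers (up to truncation error) its stationary distribution.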
Local balance
Definition (local balance)
A probability distribution {π_i}_{i∈X} is said to satisfy local balance if

π_i q_{ij} = π_j q_{ji},  ∀(i, j) ∈ X².

The quantity π_i q_{ij} is often referred to as the flow from i to j. Hence: under local balance, the flow from state i to j equals that from j to i.
{π_i}_{i∈X} satisfies local balance ⇒ {π_i}_{i∈X} satisfies global balance; indeed, for every j ∈ X,

∑_{i∈X} π_i q_{ij} = ∑_{i∈X} π_j q_{ji} = π_j ∑_{i∈X} q_{ji} = 0,

where the first equality is local balance (l.b.) and the last one holds since each row of Q sums to zero.
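For a birth–death structure such as the M/M/1 queue, where the only nonzero intensities are q_{i,i+1} = λ and q_{i+1,i} = µ, local balance reduces to π_i λ = π_{i+1} µ, so the unnormalized weights π_i ∝ (λ/µ)^i satisfy it. The quick Python check below is an added illustration (example rates assumed; λ < µ is needed for the weights to be normalizable):

lam, mu = 1.0, 1.5                   # example rates (assumed), lam < mu
rho = lam / mu
for i in range(10):
    flow_up = rho**i * lam           # pi_i * q_{i, i+1}
    flow_down = rho**(i + 1) * mu    # pi_{i+1} * q_{i+1, i}
    assert abs(flow_up - flow_down) < 1e-12   # local balance holds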
Local balance (cont.)
Consequently, {π_i}_{i∈X} satisfies local balance ⇒ {π_i}_{i∈X} is a stationary distribution.
Warning: local balance is a strong condition; many Markov processes that have stationary distributions do not satisfy it (take, for instance, any process with a pair of states i and j such that q_{ij} > 0 and q_{ji} = 0).
We are here ⟶ ●
1. Last time
2. The cutting method
The cutting method
Theorem (the cutting method)
Let X be a discrete Markov process with state space X. A distribution {π_i}_{i∈X} on X satisfies the global balance equations if and only if, for all subsets A ⊆ X,

∑_{i∈A} ∑_{j∈Aᶜ} π_i q_{ij} = ∑_{j∈Aᶜ} ∑_{i∈A} π_j q_{ji},

under the constraint

∑_{i∈X} π_i = 1,   π_i ≥ 0, ∀i ∈ X.
Interpretation: “the total probability flow between any two complementary sets A and Aᶜ is the same in both directions”.
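To make the theorem concrete, the Python sketch below (an added illustration, not from the slides) checks the cut equation for every nonempty proper subset A of a small finite state space, given an intensity matrix Q and a candidate distribution π.

import itertools
import numpy as np

def cut_balanced(Q, pi, tol=1e-9):
    """Check the cutting-method equations for all cuts (A, A^c) of a finite state space."""
    n = Q.shape[0]
    states = range(n)
    for r in range(1, n):                          # nonempty proper subsets A
        for A in itertools.combinations(states, r):
            Ac = [j for j in states if j not in A]
            flow_out = sum(pi[i] * Q[i, j] for i in A for j in Ac)   # A -> A^c
            flow_in = sum(pi[j] * Q[j, i] for j in Ac for i in A)    # A^c -> A
            if abs(flow_out - flow_in) > tol:
                return False
    return True

By the theorem, cut_balanced(Q, pi) returns True precisely when the probability distribution π satisfies the global balance equations (up to the numerical tolerance).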
Example: Exercise 425
Consider a queueing system to which customers arrive as a Poisson
process with intensity λ. The system has two servers, A and B.
Each customer needs service from one of the servers, but which one
is unimportant. Service times are exponentially distributed, with
means 1/µA and 1/µB for servers A and B respectively. Here
1/µA < 1/µB , so server A is the faster one. If an arriving customer
finds both servers empty, he/she will therefore choose server A. If
an arriving customer finds both servers busy, he/she will join a
queue, waiting for service. This queue is common to both servers,
and the number of waiting spaces is infinite. When a server
becomes available and there are customers in the queue, the
customer first in line occupies the available server. Customers do
not interrupt their service to change server. Service times are independent of each other and of the arrival process. Compute the stationary probability that there are no customers in the system.
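A pen-and-paper solution can be sanity-checked numerically. The Python sketch below encodes one natural state space for the exercise (empty; one customer being served by A; one customer being served by B; and n ≥ 2 customers with both servers busy) as a truncated generator; this encoding, the truncation level and the example rates are assumptions for illustration and are not given on the slide.

import numpy as np

def two_server_generator(lam, muA, muB, N):
    """Truncated generator for the two-server system of Exercise 425.

    State indices: 0 = empty, 1 = one customer at A, 2 = one customer at B,
    and k >= 3 means k - 1 customers in the system (both servers busy).
    """
    Q = np.zeros((N, N))
    Q[0, 1] = lam                     # arrival to an empty system chooses the faster server A
    Q[1, 0], Q[1, 3] = muA, lam       # A finishes / a second customer arrives
    Q[2, 0], Q[2, 3] = muB, lam       # B finishes / a second customer arrives
    Q[3, 1], Q[3, 2] = muB, muA       # with two customers: B (resp. A) finishes first
    for k in range(3, N - 1):
        Q[k, k + 1] = lam             # arrival joins the common queue
        if k > 3:
            Q[k, k - 1] = muA + muB   # either server finishes; next in line takes its place
    if N > 4:
        Q[N - 1, N - 2] = muA + muB
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

Solving πQ = 0 for this matrix (for instance with the stationary_distribution helper sketched after the global balance theorem), the entry π_0 approximates the stationary probability of an empty system, which is the quantity asked for.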
Conclusion
Two remarks:
The cutting method provides a general and flexible tool for
solving problems of this type.
In the modeling step we made an important observation: if there are several “competing” events that lead to the same transition, the intensities for these events are added. This is related to the fact that the minimum of independent exponential times is again exponential, with intensity equal to the sum of the individual intensities (see Lecture 6; a quick numerical check follows below).
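The fact about minima can be checked by a short simulation (an added illustration; the rates are arbitrary example values): the minimum of independent exponential times with intensities µ_A and µ_B behaves like an exponential time with intensity µ_A + µ_B.

import numpy as np

rng = np.random.default_rng(0)
muA, muB = 2.0, 1.0                           # example intensities (assumed)
tA = rng.exponential(1 / muA, size=100_000)   # E(1/muA)-distributed times
tB = rng.exponential(1 / muB, size=100_000)   # E(1/muB)-distributed times
print(np.minimum(tA, tB).mean())              # close to 1 / (muA + muB) ≈ 0.333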
Next time
We will
consider ergodic theory of Markov processes.
reconsider the Poisson process and study, in detail, some of its interesting properties.
Well met!