Queuing Analysis:
Time Reversibility and Burke's Theorem
Hongwei Zhang
http://www.cs.wayne.edu/~hzhang
Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis.
Outline
Time-Reversal of Markov Chains
Reversibility
Truncating a Reversible Markov Chain
Burke’s Theorem
Queues in Tandem
Time-Reversed Markov Chains
{Xn: n = 0,1,…}: irreducible aperiodic Markov chain with transition probabilities Pij, where ∑_{j=0}^∞ Pij = 1, i = 0,1,...
Unique stationary distribution (πj > 0) if and only if the global balance equations (GBE) hold:
πj = ∑_{i=0}^∞ πi Pij, j = 0,1,...
Process in steady state:
Pr{Xn = j} = πj = lim_{n→∞} Pr{Xn = j | X0 = i}
Either the chain starts at n = −∞, that is {Xn: n = …,−1,0,1,…}, or the initial state is chosen according to the stationary distribution
How does {Xn} look "reversed" in time?
Time-Reversed Markov Chains
Define Yn = Xτ−n, for arbitrary τ > 0 => {Yn} is the reversed process
Proposition 1:
1. {Yn} is a Markov chain with transition probabilities:
P*ij = πj Pji / πi, i, j = 0,1,...
2. {Yn} has the same stationary distribution πj as the forward chain {Xn}
The reversed chain corresponds to the same process, looked at in the reversed-time direction
Time-Reversed Markov Chains
Proof of Proposition 1:
P*ij = P{Ym = j | Ym−1 = i, Ym−2 = i2, …, Ym−k = ik}
     = P{Xτ−m = j | Xτ−m+1 = i, Xτ−m+2 = i2, …, Xτ−m+k = ik}
     = P{Xn = j | Xn+1 = i, Xn+2 = i2, …, Xn+k = ik}   (setting n = τ − m)
     = P{Xn = j, Xn+1 = i, Xn+2 = i2, …, Xn+k = ik} / P{Xn+1 = i, Xn+2 = i2, …, Xn+k = ik}
     = [P{Xn+2 = i2, …, Xn+k = ik | Xn = j, Xn+1 = i} · P{Xn = j, Xn+1 = i}] / [P{Xn+2 = i2, …, Xn+k = ik | Xn+1 = i} · P{Xn+1 = i}]
     = P{Xn = j, Xn+1 = i} / P{Xn+1 = i}   (the two conditional factors are equal, by the Markov property)
     = P{Xn = j | Xn+1 = i} = P{Ym = j | Ym−1 = i}
Moreover,
P*ij = P{Xn+1 = i | Xn = j} P{Xn = j} / P{Xn+1 = i} = Pji πj / πi
{πj} is also stationary for the reversed chain:
∑_{i=0}^∞ πi P*ij = ∑_{i=0}^∞ πi (πj Pji / πi) = πj ∑_{i=0}^∞ Pji = πj
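Proposition 1 is easy to check numerically. The following sketch (with a made-up 3-state transition matrix) constructs P*ij = πj Pji / πi and confirms that P* is a stochastic matrix with the same stationary distribution as P:

```python
# Build the reversed chain P*_ij = pi_j * P_ji / pi_i for a small example
# chain and verify Proposition 1. The matrix P below is illustrative.

def stationary(P, iters=10_000):
    """Stationary distribution by power iteration on a row vector."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.2, 0.4]]

pi = stationary(P)
# Transition probabilities of the reversed chain.
P_star = [[pi[j] * P[j][i] / pi[i] for j in range(3)] for i in range(3)]

# Each row of P* sums to 1, and pi is also stationary for P*.
row_sums = [sum(row) for row in P_star]
pi_star = stationary(P_star)
```

Note that the row sums of P* equal 1 precisely because {πj} satisfies the GBE for the forward chain.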
Reversibility
Stochastic process {X(t)} is called reversible if (X(t1), X(t2),…, X(tn))
and (X(τ-t1), X(τ-t2),…, X(τ-tn)) have the same probability distribution,
for all τ, t1,…, tn
Markov chain {Xn} is reversible if and only if the transition
probabilities of forward and reversed chains are equal, i.e.,
Pij = Pij*
Detailed Balance Equations ↔ Reversibility:
πi Pij = πj Pji, i, j = 0,1,...
Reversibility – Discrete-Time Chains
Theorem 1: If there exists a set of positive numbers {πj},
that sum up to 1 and satisfy:
πi Pij = πj Pji, i, j = 0,1,...
Then:
1. {πj} is the unique stationary distribution
2. The Markov chain is reversible
Example: Discrete-time birth-death processes are
reversible, since they satisfy the DBE
Example: Birth-Death Process
One-dimensional Markov chain with transitions only between neighboring states: Pij = 0 if |i − j| > 1
[State diagram: states 0, 1, 2, …, n, n+1, … with transition probabilities P01, P10, …, Pn,n+1, Pn+1,n between neighbors, self-loops P00, …, Pn,n, and a cut between S = {0,1,…,n} and S^c]
Detailed Balance Equations (DBE):
πn Pn,n+1 = πn+1 Pn+1,n, n = 0,1,...
Proof: GBE with S = {0,1,…,n} gives:
∑_{j=0}^n ∑_{i=n+1}^∞ πj Pji = ∑_{j=0}^n ∑_{i=n+1}^∞ πi Pij ⇒ πn Pn,n+1 = πn+1 Pn+1,n
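The DBE give the stationary distribution of a birth-death chain by simple recursion. A sketch with illustrative transition probabilities, confirming that the DBE solution also satisfies the GBE:

```python
# Discrete-time birth-death chain on {0,1,2,3}: solve the DBE
# pi_n * P_{n,n+1} = pi_{n+1} * P_{n+1,n} by recursion and check the
# global balance equations. The probabilities below are illustrative.

up = [0.4, 0.3, 0.2]     # P_{n,n+1} for n = 0,1,2
down = [0.5, 0.5, 0.6]   # P_{n+1,n} for n = 0,1,2
n_states = 4

# Self-loop probabilities make each row sum to 1.
P = [[0.0] * n_states for _ in range(n_states)]
for n in range(3):
    P[n][n + 1] = up[n]
    P[n + 1][n] = down[n]
for n in range(n_states):
    P[n][n] = 1.0 - sum(P[n])

# DBE recursion: pi_{n+1} = pi_n * P_{n,n+1} / P_{n+1,n}, then normalize.
pi = [1.0]
for n in range(3):
    pi.append(pi[n] * up[n] / down[n])
total = sum(pi)
pi = [x / total for x in pi]

# Global balance: pi_j = sum_i pi_i * P_ij
gbe_residual = max(abs(pi[j] - sum(pi[i] * P[i][j] for i in range(n_states)))
                   for j in range(n_states))
```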
Time-Reversed Markov Chains (Revisited)
Theorem 2: Irreducible Markov chain with transition probabilities Pij.
If there exist:
A set of transition probabilities P*ij, with ∑j P*ij = 1, i ≥ 0, and
A set of positive numbers {πj}, that sum up to 1, such that
πi P*ij = πj Pji, i, j ≥ 0
Then:
P*ij are the transition probabilities of the reversed chain, and
{πj} is the stationary distribution of the forward and the reversed chains
Remark: Used to find the stationary distribution, by guessing the
transition probabilities of the reversed chain – even if the process is
not reversible
Continuous-Time Markov Chains
{X(t): −∞ < t < ∞}: irreducible Markov chain with transition rates qij, i ≠ j
Unique stationary distribution (pj > 0) if and only if the GBE hold:
pj ∑_{i≠j} qji = ∑_{i≠j} pi qij, j = 0,1,...
Process in steady state – e.g., started at t = −∞:
Pr{X(t) = j} = pj = lim_{t→∞} Pr{X(t) = j | X(0) = i}
If {πj} is the stationary distribution of the embedded discrete-time chain:
pj = (πj/νj) / ∑_i (πi/νi), νj ≡ ∑_{i≠j} qji, j = 0,1,...
Reversed Continuous-Time Markov Chains
Reversed chain {Y(t)}, with Y(t) = X(τ−t), for arbitrary τ > 0
Proposition 2:
1. {Y(t)} is a continuous-time Markov chain with transition rates:
q*ij = pj qji / pi, i, j = 0,1,..., i ≠ j
2. {Y(t)} has the same stationary distribution {pj} as the forward chain
Remark: The transition rate out of state i in the reversed chain is equal to the transition rate out of state i in the forward chain:
∑_{j≠i} q*ij = ∑_{j≠i} pj qji / pi = pi ∑_{j≠i} qij / pi = ∑_{j≠i} qij = νi, i = 0,1,...   (using the GBE)
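Proposition 2 and the remark can be checked numerically. A sketch with made-up rates for a 3-state continuous-time chain, using uniformization to get the stationary distribution:

```python
# Build the reversed chain's rates q*_ij = p_j * q_ji / p_i for a small
# CTMC and confirm that the total rate out of each state is the same in
# the forward and reversed chains. The rates below are illustrative.

states = [0, 1, 2]
# Forward transition rates q[i, j]; support is symmetric (q_ij > 0 <-> q_ji > 0).
q = {(0, 1): 2.0, (1, 0): 1.0,
     (1, 2): 3.0, (2, 1): 1.5,
     (0, 2): 0.5, (2, 0): 0.5}

nu = {i: sum(rate for (a, b), rate in q.items() if a == i) for i in states}

# Stationary distribution via the uniformized discrete chain P = I + Q/Lam.
Lam = 2 * max(nu.values())
P = [[q.get((i, j), 0.0) / Lam for j in states] for i in states]
for i in states:
    P[i][i] = 1.0 - nu[i] / Lam
p = [1.0 / 3] * 3
for _ in range(20_000):
    p = [sum(p[i] * P[i][j] for i in states) for j in states]

# Reversed-chain rates (Proposition 2).
q_star = {(i, j): p[j] * q[(j, i)] / p[i] for (i, j) in q}

out_forward = nu
out_reversed = {i: sum(rate for (a, b), rate in q_star.items() if a == i)
                for i in states}
```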
Reversibility – Continuous-Time Chains
Markov chain {X(t)} is reversible if and only if the transition rates of forward and reversed chains are equal, qij = q*ij, or equivalently
pi qij = pj qji, i, j = 0,1,..., i ≠ j
i.e., Detailed Balance Equations ↔ Reversibility
Theorem 3: If there exists a set of positive numbers {pj}, that sum up to 1 and satisfy:
pi qij = pj qji, i, j = 0,1,..., i ≠ j
Then:
1. {pj} is the unique stationary distribution
2. The Markov chain is reversible
Example: Birth-Death Process
Transitions only between neighboring states:
qi,i+1 = λi, qi,i−1 = µi, qij = 0 for |i − j| > 1
[State diagram: states 0, 1, 2, …, n, n+1, … with birth rates λ0, λ1, …, λn (n → n+1) and death rates µ1, µ2, …, µn+1 (n+1 → n); cut between S = {0,1,…,n} and S^c]
Detailed Balance Equations:
λn pn = µn+1 pn+1, n = 0,1,...
Proof: GBE with S = {0,1,…,n} gives:
∑_{j=0}^n ∑_{i=n+1}^∞ pj qji = ∑_{j=0}^n ∑_{i=n+1}^∞ pi qij ⇒ λn pn = µn+1 pn+1
Examples: M/M/1, M/M/c, M/M/∞
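For the M/M/1 special case (λn = λ, µn = µ) the DBE recursion reproduces the familiar geometric distribution. A sketch with illustrative rates and a truncated state space:

```python
# M/M/1: iterate the DBE lam * p_n = mu * p_{n+1} on a truncated state
# space and compare with the closed form p_n = (1 - rho) * rho**n.
# The rates are illustrative; N truncates the (infinite) state space.

lam, mu, N = 2.0, 5.0, 200
rho = lam / mu

p = [1.0]
for n in range(N):
    p.append(p[n] * lam / mu)      # DBE: p_{n+1} = (lam/mu) * p_n
total = sum(p)
p = [x / total for x in p]

geometric = [(1 - rho) * rho ** n for n in range(N + 1)]
max_err = max(abs(a - b) for a, b in zip(p, geometric))
```

With ρ = 0.4 the truncation error at N = 200 is negligible, so the two distributions agree to machine precision.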
Reversed Continuous-Time Markov Chains
(Revisited)
Theorem 4: Irreducible continuous-time Markov chain with transition rates qij. If there exist:
A set of transition rates q*ij, with ∑_{j≠i} q*ij = ∑_{j≠i} qij, i ≥ 0, and
A set of positive numbers {pj}, that sum up to 1, such that
pi q*ij = pj qji, i, j ≥ 0, i ≠ j
Then:
q*ij are the transition rates of the reversed chain, and
{pj} is the stationary distribution of the forward and the reversed chains
Remark: Used to find the stationary distribution, by guessing the transition rates of the reversed chain – even if the process is not reversible
Reversibility: Trees
Theorem 5: Irreducible Markov chain, with transition rates that satisfy qij > 0 ↔ qji > 0. Form a graph for the chain, where states are the nodes, and for each qij > 0 there is a directed arc i → j. Then, if the graph is a tree – contains no loops – the Markov chain is reversible
[State diagram: a tree of states 0–7, e.g., with arcs q01/q10, q12/q21, q23/q32, q16/q61, q67/q76 connecting the nodes]
Remarks:
Sufficient condition for reversibility
Generalization of the one-dimensional birth-death process
Kolmogorov’s Criterion (Discrete Chain)
Detailed balance equations determine whether a Markov chain is
reversible or not, based on stationary distribution and transition
probabilities
Should be able to derive a reversibility criterion based only on the
transition probabilities!
Theorem 6: A discrete-time Markov chain is reversible if and only if:
Pi1i2 Pi2i3 ⋯ Pin−1in Pini1 = Pi1in Pinin−1 ⋯ Pi3i2 Pi2i1
for any finite sequence of states i1, i2,…, in, and any n
Intuition: The probability of traversing any loop i1→i2→…→in→i1 is equal to the probability of traversing the same loop in the reverse direction i1→in→…→i2→i1
Kolmogorov’s Criterion (Continuous Chain)
Detailed balance equations determine whether a Markov chain is
reversible or not, based on stationary distribution and transition rates
Should be able to derive a reversibility criterion based only on the
transition rates!
Theorem 7: A continuous-time Markov chain is reversible if and only if:
qi1i2 qi2i3 ⋯ qin−1in qini1 = qi1in qinin−1 ⋯ qi3i2 qi2i1
for any finite sequence of states i1, i2,…, in, and any n
Intuition: The product of transition rates along any loop i1→i2→…→in→i1 is equal to the product of transition rates along the same loop traversed in the reverse direction i1→in→…→i2→i1
Kolmogorov’s Criterion (proof)
Proof of Theorem 6:
Necessity: If the chain is reversible, the DBE hold:
πi1 Pi1i2 = πi2 Pi2i1
πi2 Pi2i3 = πi3 Pi3i2
⋮
πin−1 Pin−1in = πin Pinin−1
πin Pini1 = πi1 Pi1in
Multiplying these equations side by side and canceling the π's:
Pi1i2 Pi2i3 ⋯ Pin−1in Pini1 = Pi1in Pinin−1 ⋯ Pi3i2 Pi2i1
Sufficiency: Fixing two states i1 = i and in = j, and summing over all states i2,…, in−1, we have
Pi,i2 Pi2i3 ⋯ Pin−1,j Pji = Pij Pj,in−1 ⋯ Pi3i2 Pi2,i ⇒ P^{n−1}ij Pji = Pij P^{n−1}ji
Taking the limit n → ∞:
lim_{n→∞} P^{n−1}ij · Pji = Pij · lim_{n→∞} P^{n−1}ji ⇒ πj Pji = Pij πi
Example: M/M/2 Queue with Heterogeneous Servers
M/M/2 queue. Servers A and B with service rates µA and µB respectively. When the system is empty, arrivals go to A with probability α and to B with probability 1 − α. Otherwise, the head of the queue takes the first free server.
Need to keep track of which server is busy when there is 1 customer in the system. Denote the two possible states by 1A and 1B.
[State diagram: states 0, 1A, 1B, 2, 3, …; rates αλ (0→1A), (1−α)λ (0→1B), λ (1A→2, 1B→2, and n→n+1 for n ≥ 2), µA (1A→0), µB (1B→0), µB (2→1A), µA (2→1B), and µA+µB (n+1→n for n ≥ 2)]
Reversibility: we only need to check the loop 0→1A→2→1B→0:
q0,1A q1A,2 q2,1B q1B,0 = αλ · λ · µA · µB
q0,1B q1B,2 q2,1A q1A,0 = (1 − α)λ · λ · µB · µA
Reversible if and only if α = 1/2
What happens when µA = µB and α ≠ 1/2?
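The loop check is a direct application of Kolmogorov's criterion (Theorem 7). A sketch, with illustrative rate values:

```python
# Kolmogorov's criterion on the loop 0 -> 1A -> 2 -> 1B -> 0 of the
# heterogeneous-server M/M/2 chain. Rates follow the slide; the numeric
# values are illustrative.

def loop_products(lam, muA, muB, alpha):
    """Products of rates along 0->1A->2->1B->0 and the reverse loop."""
    forward = (alpha * lam) * lam * muA * muB        # q_{0,1A} q_{1A,2} q_{2,1B} q_{1B,0}
    reverse = ((1 - alpha) * lam) * lam * muB * muA  # q_{0,1B} q_{1B,2} q_{2,1A} q_{1A,0}
    return forward, reverse

lam, muA, muB = 3.0, 2.0, 4.0
f_half, r_half = loop_products(lam, muA, muB, 0.5)  # alpha = 1/2: reversible
f_skew, r_skew = loop_products(lam, muA, muB, 0.7)  # alpha != 1/2: not reversible
```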
Example: M/M/2 Queue with Heterogeneous Servers (contd.)
[State diagram as before, with cuts S1, S2, S3 marked around states 0, {0, 1A, 1B}, and 1A]
Global balance across the cuts gives:
λ p0 = µA p1A + µB p1B
(µA + µB) p2 = λ (p1A + p1B)
(µA + λ) p1A = αλ p0 + µB p2
⇒
p1A = p0 (λ/µA) · [λ + α(µA + µB)] / (2λ + µA + µB)
p1B = p0 (λ/µB) · [λ + (1 − α)(µA + µB)] / (2λ + µA + µB)
p2 = p0 (λ²/(µA µB)) · [λ + (1 − α)µA + αµB] / (2λ + µA + µB)
pn = p2 (λ/(µA + µB))^{n−2}, n = 2,3,...
Normalizing:
p0 + p1A + p1B + ∑_{n=2}^∞ pn = 1, with ∑_{n=2}^∞ pn = p2 (µA + µB)/(µA + µB − λ)
⇒ p0 = [1 + (λ/µA)·[λ + α(µA + µB)]/(2λ + µA + µB) + (λ/µB)·[λ + (1 − α)(µA + µB)]/(2λ + µA + µB) + (µA + µB)/(µA + µB − λ) · (λ²/(µA µB))·[λ + (1 − α)µA + αµB]/(2λ + µA + µB)]^{−1}
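The closed-form ratios p1A/p0, p1B/p0, p2/p0 can be sanity-checked by plugging them back into the three balance equations. A sketch with illustrative parameter values:

```python
# Plug the closed-form expressions for p_1A, p_1B, p_2 (as multiples of
# p_0) back into the three cut equations and confirm they balance.
# Parameter values are illustrative.

lam, muA, muB, alpha = 3.0, 2.0, 4.0, 0.7
d = 2 * lam + muA + muB

p0 = 1.0   # ratios only; normalization is a separate step
p1A = p0 * (lam / muA) * (lam + alpha * (muA + muB)) / d
p1B = p0 * (lam / muB) * (lam + (1 - alpha) * (muA + muB)) / d
p2 = p0 * (lam ** 2 / (muA * muB)) * (lam + (1 - alpha) * muA + alpha * muB) / d

# Residuals of the three balance equations (should all be ~0).
r1 = lam * p0 - (muA * p1A + muB * p1B)
r2 = (muA + muB) * p2 - lam * (p1A + p1B)
r3 = (muA + lam) * p1A - (alpha * lam * p0 + muB * p2)
```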
Multidimensional Markov Chains
Theorem 8: If {X1(t)} and {X2(t)} are independent Markov chains, each reversible, then the vector-valued process {X(t)}, with X(t) = (X1(t), X2(t)):
is a Markov chain, and
is reversible
Multidimensional Chains:
Queueing system with two classes of customers, each having its own
stochastic properties – track the number of customers from each class
Study the “joint” evolution of two queueing systems – track the number
of customers in each system
Example: Two Independent M/M/1 Queues
Two independent M/M/1 queues. The arrival and service rates at
queue i are λi and µi respectively. Assume ρi= λi/µi<1.
{(N1(t), N2(t))} is a Markov chain.
Probability of n1 customers at queue 1, and n2 at queue 2, at
steady-state
p(n1, n2) = (1 − ρ1)ρ1^{n1} · (1 − ρ2)ρ2^{n2} = p1(n1) · p2(n2)
“Product-form” distribution
Generalizes for any number K of independent queues, M/M/1,
M/M/c, or M/M/∞. If pi(ni) is the stationary distribution of queue
i:
p(n1, n2, …, nK) = p1(n1) p2(n2) ⋯ pK(nK)
Example (contd.)
Stationary distribution:
p(n1, n2) = (1 − λ1/µ1)(λ1/µ1)^{n1} · (1 − λ2/µ2)(λ2/µ2)^{n2}
Detailed Balance Equations:
µ1 p(n1 + 1, n2) = λ1 p(n1, n2)
µ2 p(n1, n2 + 1) = λ2 p(n1, n2)
Verify that the Markov chain is reversible – Kolmogorov criterion
[State diagram: two-dimensional lattice of states (n1, n2) = 00, 01, …, 33, …; horizontal transitions with rates λ1 (right) and µ1 (left), vertical transitions with rates λ2 (up) and µ2 (down)]
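The Kolmogorov check is simple on this lattice: every loop decomposes into unit squares, so it suffices to compare the rate products around one square. A sketch with illustrative rates:

```python
# Kolmogorov's criterion for the lattice chain of two independent M/M/1
# queues, checked on the unit square
# (n1,n2) -> (n1+1,n2) -> (n1+1,n2+1) -> (n1,n2+1) -> (n1,n2).
# Rates are illustrative.

lam1, mu1, lam2, mu2 = 1.0, 2.0, 1.5, 4.0

# Product of rates around the square: right (lam1), up (lam2),
# left (mu1), down (mu2) -- and the same square traversed in reverse:
# up (lam2), right (lam1), down (mu2), left (mu1).
forward = lam1 * lam2 * mu1 * mu2
reverse = lam2 * lam1 * mu2 * mu1
```

The two products contain exactly the same four factors, which is why the criterion holds for any rates, i.e., the chain is always reversible.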
Truncation of a Reversible Markov Chain
Theorem 9: {X(t)} reversible Markov process with state space S, and stationary distribution {pj: j ∈ S}. Truncate it to a set E ⊂ S, such that the resulting chain {Y(t)} is irreducible. Then, {Y(t)} is reversible and has stationary distribution:
p̃j = pj / ∑_{k∈E} pk, j ∈ E
Remark: This is the conditional probability that, in steady state, the original process is at state j, given that it is somewhere in E
Proof: Verify that the DBE hold for {Y(t)}:
p̃j qji = p̃i qij ⇔ (pj / ∑_{k∈E} pk) qji = (pi / ∑_{k∈E} pk) qij ⇔ pj qji = pi qij, i, j ∈ E, i ≠ j
which holds because {X(t)} is reversible. Moreover,
∑_{j∈E} p̃j = ∑_{j∈E} pj / ∑_{k∈E} pk = 1
Example: Two Queues with Joint Buffer
The two independent M/M/1 queues of the previous example now share a common buffer of size B – an arrival that finds B customers waiting is blocked
State space restricted to E = {(n1, n2): (n1 − 1)^+ + (n2 − 1)^+ ≤ B}
Distribution of truncated chain:
p(n1, n2) = p(0,0) · ρ1^{n1} ρ2^{n2}, (n1, n2) ∈ E
Normalizing:
p(0,0) = [∑_{(n1,n2)∈E} ρ1^{n1} ρ2^{n2}]^{−1}
Theorem specifies the joint distribution up to the normalization constant
Calculation of the normalization constant is often tedious
[State diagram for B = 2: the truncated lattice of states 00 through 31 and 13, with rates λ1/µ1 horizontal and λ2/µ2 vertical]
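For small B the normalization constant is easy to compute by enumerating E. A sketch for B = 2 with illustrative rates:

```python
# Enumerate the truncated state space E = {(n1,n2): (n1-1)^+ + (n2-1)^+ <= B}
# for B = 2 and compute the normalization constant p(0,0).
# The rates are illustrative.

lam1, mu1, lam2, mu2, B = 1.0, 2.0, 1.0, 3.0, 2
rho1, rho2 = lam1 / mu1, lam2 / mu2

def pos(x):
    """Positive part x^+."""
    return max(x, 0)

# Each queue alone holds at most B + 1 customers (one in service, B waiting).
E = [(n1, n2)
     for n1 in range(B + 2)
     for n2 in range(B + 2)
     if pos(n1 - 1) + pos(n2 - 1) <= B]

Z = sum(rho1 ** n1 * rho2 ** n2 for (n1, n2) in E)
p00 = 1.0 / Z
p = {(n1, n2): p00 * rho1 ** n1 * rho2 ** n2 for (n1, n2) in E}
total = sum(p.values())
```

For B = 2, E has 13 states, matching the state diagram on the slide.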
Birth-death process
{X(t)} birth-death process with stationary distribution {pj}
Arrival epochs: points of increase of {X(t)}
Departure epochs: points of decrease of {X(t)}
{X(t)} completely determines the corresponding arrival and departure
processes
[Figure: sample path of X(t); upward jumps are arrival epochs, downward jumps are departure epochs]
Forward & reversed chains of M/M/* queues
Poisson arrival process: λj=λ, for all j
Birth-death process called a (λ, µj)-process
Examples: M/M/1, M/M/c, M/M/∞ queues
Poisson arrivals → LAA (lack of anticipation): for any time t, future arrivals are independent of {X(s): s ≤ t}
(λ, µj)-process at steady state is reversible: forward and reversed
chains are stochastically identical
=> Arrival processes of the forward and reversed chains are
stochastically identical
=> Arrival process of the reversed chain is Poisson with rate λ
+ “the arrival epochs of the reversed chain are the departure epochs of the
forward chain” => Departure process of the forward chain is Poisson
with rate λ
Forward & reversed chains of M/M/* queues (contd.)
[Figure: sample paths of the forward chain {X(t)} and the reversed chain {X(τ−t)}; the departure epochs of the forward chain are the arrival epochs of the reversed chain]
Reversed chain: arrivals after time t are independent of the chain history up to time t (LAA)
⇒ Forward chain: departures prior to time t and the future of the chain {X(s): s ≥ t} are independent
Burke’s Theorem
Theorem 10: Consider an M/M/1, M/M/c, or M/M/∞ system with arrival rate λ. Suppose that the system starts at steady-state. Then:
1. The departure process is Poisson with rate λ
2. At each time t, the number of customers in the system is independent of the departure times prior to t
Fundamental result for study of networks of M/M/* queues, where
output process from one queue is the input process of another
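A quick steady-state sanity check consistent with part 1 of the theorem: in an M/M/1 queue, the long-run departure rate ∑_{n≥1} pn µ equals the arrival rate λ. This checks only the rate, not the full Poisson property; the numbers are illustrative:

```python
# Steady-state M/M/1: the long-run departure rate equals lam.
# Rates are illustrative; N truncates the (negligible) geometric tail.

lam, mu, N = 2.0, 5.0, 500
rho = lam / mu
p = [(1 - rho) * rho ** n for n in range(N + 1)]

# Departures occur at rate mu whenever the system is non-empty.
departure_rate = sum(p[n] * mu for n in range(1, N + 1))
```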
Single-Server Queues in Tandem
[Diagram: Poisson(λ) arrivals → Station 1 (rate µ1) → Station 2 (rate µ2) → out]
Customers arrive at queue 1 according to a Poisson process with rate λ.
Service times are exponential with mean 1/µi; assume the service times of a customer in the two queues are independent.
Assume ρi = λ/µi < 1
What is the joint stationary distribution of N1 and N2 – the numbers of customers in the two queues?
Result: in steady state the queues are independent and
p(n1, n2) = (1 − ρ1)ρ1^{n1} · (1 − ρ2)ρ2^{n2} = p1(n1) · p2(n2)
Single-Server Queues in Tandem
[Diagram: Poisson(λ) arrivals → Station 1 (rate µ1) → Station 2 (rate µ2) → out]
Q1 is an M/M/1 queue. At steady state its departure process is Poisson with rate λ. Thus Q2 is also M/M/1.
Marginal stationary distributions:
p1(n1) = (1 − ρ1)ρ1^{n1}, n1 = 0,1,...
p2(n2) = (1 − ρ2)ρ2^{n2}, n2 = 0,1,...
To complete the proof: establish independence at steady state
Q1 at steady state: at time t, N1(t) is independent of the departures prior to t, which are the arrivals at Q2 up to t. Thus N1(t) and N2(t) are independent:
P{N1(t) = n1, N2(t) = n2} = P{N1(t) = n1} P{N2(t) = n2} = p1(n1) · P{N2(t) = n2}
Letting t → ∞ gives the joint stationary distribution:
p(n1, n2) = p1(n1) · p2(n2) = (1 − ρ1)ρ1^{n1} · (1 − ρ2)ρ2^{n2}
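The product form can also be verified directly against the global balance equations of the tandem CTMC (transitions: arrival at rate λ, transfer (n1, n2) → (n1−1, n2+1) at rate µ1, departure at rate µ2). A sketch with illustrative rates, checked over a finite window of states:

```python
# Verify that p(n1,n2) = (1-rho1) rho1^n1 (1-rho2) rho2^n2 satisfies the
# global balance equations of the tandem-queue CTMC. Rates are illustrative.

lam, mu1, mu2, N = 1.0, 2.0, 4.0, 30
rho1, rho2 = lam / mu1, lam / mu2

def p(n1, n2):
    """Product-form stationary probability."""
    return (1 - rho1) * rho1 ** n1 * (1 - rho2) * rho2 ** n2

max_residual = 0.0
for n1 in range(N):
    for n2 in range(N):
        # Outflow: arrival always possible; service at each non-empty queue.
        outflow = p(n1, n2) * (lam
                               + (mu1 if n1 > 0 else 0)
                               + (mu2 if n2 > 0 else 0))
        # Inflow: departure from queue 2, arrival, transfer from queue 1.
        inflow = p(n1, n2 + 1) * mu2
        if n1 > 0:
            inflow += p(n1 - 1, n2) * lam
        if n2 > 0:
            inflow += p(n1 + 1, n2 - 1) * mu1
        max_residual = max(max_residual, abs(outflow - inflow))
```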
Queues in Tandem
Theorem 11: Network consisting of K single-server queues in tandem. Service times at queue i are exponential with rate µi, independent of the service times at any queue j ≠ i. Arrivals at the first queue are Poisson with rate λ. The stationary distribution of the network is:
p(n1, …, nK) = ∏_{i=1}^K (1 − ρi)ρi^{ni}, ni = 0,1,...; i = 1,…,K
At steady state the queues are independent; the distribution of queue i is that of an isolated M/M/1 queue with arrival rate λ and service rate µi:
pi(ni) = (1 − ρi)ρi^{ni}, ni = 0,1,...
Are the queues independent if not in steady state? Are the stochastic processes {N1(t)} and {N2(t)} independent?
Queues in Tandem: State-Dependent Service Rates
Theorem 12: Network consisting of K queues in tandem. Service times at queue i are exponential with rate µi(ni) when there are ni customers in the queue – independent of the service times at any queue j ≠ i. Arrivals at the first queue are Poisson with rate λ. The stationary distribution of the network is:
p(n1, …, nK) = ∏_{i=1}^K pi(ni), ni = 0,1,...; i = 1,…,K
where {pi(ni)} is the stationary distribution of queue i in isolation with Poisson arrivals of rate λ
Examples: ./M/c and ./M/∞ queues
If queue i is ./M/∞, then:
pi(ni) = ((λ/µi)^{ni} / ni!) e^{−λ/µi}, ni = 0,1,...
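For the ./M/∞ station the state-dependent DBE λ pi(n) = (n+1)µi pi(n+1) is satisfied by the Poisson distribution above. A small numeric check with illustrative rates:

```python
# Stationary distribution of an M/M/inf station with Poisson(lam) input:
# Poisson with mean a = lam/mu. Check the state-dependent DBE
# lam * p(n) = (n+1) * mu * p(n+1). The rates are illustrative.
import math

lam, mu, N = 3.0, 2.0, 60
a = lam / mu

p = [a ** n / math.factorial(n) * math.exp(-a) for n in range(N + 1)]

# Death rate from state n+1 is (n+1)*mu: each of the n+1 busy servers
# completes at rate mu.
dbe_residual = max(abs(lam * p[n] - (n + 1) * mu * p[n + 1]) for n in range(N))
total = sum(p)
```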
Summary
Time-Reversal of Markov Chains
Reversibility
Truncating a Reversible Markov Chain
Multi-dimensional Markov chains
Burke’s Theorem
Queues in Tandem