
EE126 Discussion 9: Solutions
Jérôme Thai
April 17, 2014
Problem 1. Let (N (t), t ≥ 0) be a Poisson process of rate λ. Find
P (N (2) − N (1) = 1 | N (3) = 2, N (4) − N (1) = 2)
Solution. Let
X1 := N (1), X2 := N (2) − N (1), X3 := N (3) − N (2), X4 := N (4) − N (3)
Then X1 , X2 , X3 , X4 are independent and identically distributed random variables, with each of
them having a Poisson distribution with parameter λ. We are being asked to compute
P (X2 = 1 | X1 + X2 + X3 = 2, X2 + X3 + X4 = 2).
The conditioning event
{X1 + X2 + X3 = 2, X2 + X3 + X4 = 2}
can be realized with the following configurations for (X1 , X2 , X3 , X4 ) :
(2, 0, 0, 2), (1, 1, 0, 1), (1, 0, 1, 1), (0, 2, 0, 0), (0, 1, 1, 0), (0, 0, 2, 0)
with respective probabilities
(λ^4/4) e^(−4λ),  λ^3 e^(−4λ),  λ^3 e^(−4λ),  (λ^2/2) e^(−4λ),  λ^2 e^(−4λ),  (λ^2/2) e^(−4λ).
From this, since X2 = 1 occurs only in the configurations (1, 1, 0, 1) and (0, 1, 1, 0), the desired conditional probability is
(λ^2 + λ^3) / (2λ^2 + 2λ^3 + λ^4/4),
which can be simplified to
4(1 + λ) / (8 + 8λ + λ^2).
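As a quick sanity check (not part of the original solution), the conditional probability can also be estimated by Monte Carlo, simulating the four i.i.d. Poisson(λ) increments directly. The Python sketch below uses an arbitrary rate lam = 1.3; the variable names are illustrative only.

    # Monte Carlo check of Problem 1: estimate P(X2 = 1 | X1+X2+X3 = 2, X2+X3+X4 = 2)
    # for i.i.d. Poisson(lam) increments and compare with 4(1 + lam)/(8 + 8 lam + lam^2).
    import numpy as np

    rng = np.random.default_rng(0)
    lam = 1.3                                   # arbitrary rate for the check
    n_trials = 1_000_000

    X = rng.poisson(lam, size=(n_trials, 4))    # columns are X1, X2, X3, X4
    cond = (X[:, 0] + X[:, 1] + X[:, 2] == 2) & (X[:, 1] + X[:, 2] + X[:, 3] == 2)
    estimate = np.mean(X[cond, 1] == 1)

    closed_form = 4 * (1 + lam) / (8 + 8 * lam + lam ** 2)
    print(f"simulation: {estimate:.4f}   closed form: {closed_form:.4f}")

The two printed values should agree up to Monte Carlo error.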
Problem 2. Let (N (t), t ≥ 0) be a Poisson process with rate λ. Let Sn denote the time of the n-th
event. Find
(a) the pdf of Sn
(b) E[S5 ]
(c) E[S4 | N (1) = 2]
(d) E[N (4) − N (2) | N (1) = 3]
Solution.
(a) First consider S2 . We see that
f_{S_2}(x) = ∫_0^x f_{S_1}(y) λ e^(−λ(x−y)) dy
           = ∫_0^x λ e^(−λy) λ e^(−λ(x−y)) dy
           = ∫_0^x λ^2 e^(−λx) dy
           = λ^2 x e^(−λx).
Now, consider S3 . We have
f_{S_3}(x) = ∫_0^x f_{S_2}(y) λ e^(−λ(x−y)) dy
           = ∫_0^x λ^2 y e^(−λy) λ e^(−λ(x−y)) dy
           = ∫_0^x λ^3 y e^(−λx) dy
           = λ^3 (x^2/2) e^(−λx).
This suggests the following result:
f_{S_n}(x) = λ^n x^(n−1) e^(−λx) / (n−1)!.
Assume this is true for n. Then,
f_{S_{n+1}}(x) = ∫_0^x f_{S_n}(y) λ e^(−λ(x−y)) dy
               = ∫_0^x (λ^n y^(n−1)/(n−1)!) e^(−λy) λ e^(−λ(x−y)) dy
               = ∫_0^x (λ^(n+1) y^(n−1)/(n−1)!) e^(−λx) dy
               = λ^(n+1) (x^n/n!) e^(−λx),
which proves the result by induction.
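This closed form is the Erlang (Gamma(n, λ)) density. As an optional numerical check (not in the original solution), one can simulate S_n as a sum of n i.i.d. Exp(λ) interarrival times and compare the empirical density with the formula; the Python sketch below uses arbitrary values λ = 2 and n = 4.

    # Check the derived density f_{S_n}(x) = lam^n x^(n-1) e^(-lam x) / (n-1)!
    # against a simulation of S_n = sum of n i.i.d. Exp(lam) interarrival times.
    import numpy as np
    from math import factorial

    rng = np.random.default_rng(1)
    lam, n = 2.0, 4                          # arbitrary rate and index for the check
    samples = rng.exponential(1 / lam, size=(500_000, n)).sum(axis=1)

    x, width = 3.0, 0.05                     # estimate the density near x = 3
    empirical = np.mean(np.abs(samples - x) < width / 2) / width
    closed_form = lam**n * x**(n - 1) * np.exp(-lam * x) / factorial(n - 1)
    print(f"empirical: {empirical:.4f}   closed form: {closed_form:.4f}")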
(b) By linearity of expectation, E[S5 ] = 5/λ.
(c) Since exponentials and the Poisson process are memoryless,
E[S4 | N (1) = 2] = 1 + E[S2] = 1 + 2/λ.
(d) By independence of increments, the condition N (1) = 3 does not affect N (4) − N (2), so the answer is E[N (4) − N (2)] = 2λ.
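A short simulation (again not part of the original solution) illustrates the answer to (c): conditioned on N(1) = 2, the time of the fourth arrival averages 1 + 2/λ. The Python sketch below uses an arbitrary rate lam = 1.5 and illustrative variable names.

    # Estimate E[S4 | N(1) = 2] for a Poisson process of rate lam and compare
    # with the answer 1 + 2/lam obtained from memorylessness.
    import numpy as np

    rng = np.random.default_rng(2)
    lam = 1.5
    n_trials = 500_000

    # Generate enough interarrival times that S4 always exists.
    arrivals = np.cumsum(rng.exponential(1 / lam, size=(n_trials, 10)), axis=1)
    N1 = (arrivals <= 1.0).sum(axis=1)       # number of arrivals in [0, 1]
    S4 = arrivals[:, 3]                      # time of the 4th arrival

    estimate = S4[N1 == 2].mean()
    print(f"simulation: {estimate:.4f}   1 + 2/lam: {1 + 2/lam:.4f}")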
Figure 1: The transition diagram for Problem 13.8: states 0, 1, 2, 3, . . . , with arrival rate λ between consecutive states, service rate µ out of state 1, and service rate 2µ out of states i ≥ 2.
Problem 3. A queue has Poisson arrivals with rate λ. It has two servers that work in parallel.
When there are at least two customers in the queue, two are being served. When there is only one
customer, only one server is active. The service times are i.i.d. exp(µ).
(a) Argue that the queue length is a Markov Chain.
(b) Draw the state transition diagram.
(c) Find the minimum value of µ so that the queue is positive recurrent and solve the balance
equations.
Solution.
(a) The queue length is a Markov chain because the exponential distribution and the Poisson
process are memoryless.
(b) The transition diagram is shown in Figure 1.
(c) We write the detailed balance equations. Thus,
π(1) = (λ/µ) π(0),
π(i + 1) = (λ/(2µ)) π(i),  i ≥ 1.
Thus, π(i) = (λ/µ)(λ/(2µ))^(i−1) π(0) for i ≥ 1. Also, we know that
Σ_{i≥0} π(i) = 1.
Thus,
π(0) (1 + Σ_{i≥1} (λ/µ)(λ/(2µ))^(i−1)) = 1.
The geometric series converges if and only if λ < 2µ, so the queue is positive recurrent exactly when µ > λ/2. Summing the series and solving the equation, we have
π(0) = (2µ − λ)/(2µ + λ).
Then π(i) = ((2µ − λ)/(2µ + λ)) (λ/µ)(λ/(2µ))^(i−1) for i ≥ 1.
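As an optional numerical check (not in the original solution), one can solve the global balance equations of a truncated version of this chain and compare with the closed-form π(i). The Python sketch below assumes arbitrary rates λ = 1.0 and µ = 0.9 (so µ > λ/2) and a truncation level K = 200.

    # Solve pi Q = 0 for the chain truncated at state K and compare with the
    # closed-form pi(0) = (2 mu - lam)/(2 mu + lam),
    # pi(i) = pi(0) (lam/mu) (lam/(2 mu))**(i-1) for i >= 1.
    import numpy as np

    lam, mu, K = 1.0, 0.9, 200
    Q = np.zeros((K + 1, K + 1))
    for i in range(K):
        Q[i, i + 1] = lam                          # arrival
        Q[i + 1, i] = mu if i == 0 else 2 * mu     # service: mu from state 1, 2 mu above
    np.fill_diagonal(Q, -Q.sum(axis=1))

    # Stationary distribution: pi Q = 0 together with sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(K + 1)])
    b = np.zeros(K + 2); b[-1] = 1.0
    pi = np.linalg.lstsq(A, b, rcond=None)[0]

    pi0 = (2 * mu - lam) / (2 * mu + lam)
    closed = [pi0] + [pi0 * (lam / mu) * (lam / (2 * mu)) ** (i - 1) for i in range(1, 6)]
    print(np.round(pi[:6], 4), np.round(closed, 4))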
Figure 2: The transition diagram for Problem 13.10: states 0, 1, 2, 3, . . . , with arrival rate λ between consecutive states and service rate kµ out of state k (µ, 2µ, 3µ, 4µ, . . . ).
Problem 4. A continuous-time queue has Poisson arrivals with rate λ, and it is equipped with
infinitely many servers. The servers can work in parallel on multiple customers, but they are noncooperative in the sense that a single customer can only be served by one server. Thus, when there
are k customers in the queue, k servers are active. Suppose that the service time of each customer
is exponentially distributed with rate µ and they are i.i.d.
(a) Argue that the queue-length is a Markov chain. Draw the transition diagram of the Markov
chain.
(b) Prove that for all finite values of λ and µ the Markov chain is positive recurrent and find
the invariant distribution.
Solution.
(a) The queue length is a Markov chain because the exponential distribution and the Poisson
process are memoryless. The state transition diagram is shown in Figure 2.
(b) The balance equations are
π(0)λ = µπ(1), so that π(1) = (λ/µ)π(0),
π(1)(λ + µ) = π(0)λ + π(2)2µ,
which combined with the previous equation gives
π(1)λ = π(2)2µ, so that π(2) = (λ/(2µ))π(1) = (λ^2/(2!µ^2))π(0).
Continuing in this way, we find that
π(n) = (λ^n/(n!µ^n)) π(0),
so that, after choosing π(0) so that the probabilities add up to one,
π(n) = (ρ^n/n!) e^(−ρ) for all n ≥ 0, with ρ := λ/µ.
Thus, the Markov chain admits an invariant distribution for all λ > 0, µ > 0, so that it is
positive recurrent.
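As an optional check (not part of the original solution), simulating this M/M/∞ queue as a continuous-time Markov chain and recording the fraction of time spent in each state reproduces the Poisson(ρ) probabilities. The Python sketch below uses arbitrary rates λ = 2 and µ = 1.

    # Simulate the M/M/infinity queue-length chain and compare the long-run
    # fraction of time in each state with the Poisson(rho) pmf, rho = lam/mu.
    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(3)
    lam, mu = 2.0, 1.0
    rho = lam / mu

    T, t, k = 100_000.0, 0.0, 0          # time horizon, current time, queue length
    time_in_state = {}
    while t < T:
        rate = lam + k * mu              # total rate out of state k
        dt = rng.exponential(1 / rate)
        time_in_state[k] = time_in_state.get(k, 0.0) + dt
        t += dt
        # Next event: arrival with probability lam/rate, otherwise a departure.
        k = k + 1 if rng.random() < lam / rate else k - 1

    for n in range(5):
        empirical = time_in_state.get(n, 0.0) / t
        print(n, round(empirical, 4), round(rho**n * exp(-rho) / factorial(n), 4))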