UNIVERSITY OF LONDON
IMPERIAL COLLEGE OF SCIENCE, TECHNOLOGY
AND MEDICINE
Course: M3S4/M4S4
Setter: Hand
Checker: Crowder
Editor: McCoy
External: Kent
Date: December 2003
BSc and MSci EXAMINATIONS (MATHEMATICS)
MAY-JUNE 2004
This paper is also taken for the relevant examination for the Associateship
SOLUTIONS
M3S4/M4S4 APPLIED PROBABILITY
1. SOLUTION
(a) [Seen]
(i) [2]
$$p_n(t+h) = p_n(t)\,[1 - \lambda h + o(h)] + p_{n-1}(t)\,[\lambda h + o(h)] + o(h)$$
(ii) [2]
Simplifying and rearranging the above, we obtain
$$\frac{p_n(t+h) - p_n(t)}{h} = \lambda p_{n-1}(t) - \lambda p_n(t) + \frac{o(h)}{h}$$
so that, in the limit as $h \to 0$,
$$\frac{dp_n(t)}{dt} = \lambda p_{n-1}(t) - \lambda p_n(t) \qquad (1a)$$
In the special case of $n = 0$ the equation is
$$\frac{dp_0(t)}{dt} = -\lambda p_0(t) \qquad (1b)$$
(iii) [6]
Multiplying the equations in (1) by $s^n$ and summing over $n = 0, 1, 2, \ldots$ yields
$$\frac{d p_0(t)}{dt} + \sum_{n \ge 1} \frac{d p_n(t)}{dt}\, s^n = -\lambda\left[ p_0(t) + \sum_{n \ge 1} p_n(t)\, s^n \right] + \lambda \sum_{n \ge 1} p_{n-1}(t)\, s^n$$
Then, using $\Pi(s,t) = \sum_{n \ge 0} p_n(t)\, s^n$, this becomes
$$\frac{\partial \Pi(s,t)}{\partial t} = -\lambda (1 - s)\, \Pi(s,t)$$
(iv) [3]
Integrating this equation yields
$$\log \Pi(s,t) = -\lambda(1-s)t + A(s)$$
so that
$$\Pi(s,t) = B(s)\exp\{-\lambda(1-s)t\}$$
with $A(s) = \ln B(s)$, and using the condition given in the question that $\Pi(s,0) = 1$
we obtain $B(s) = 1$.
(v) [2]
This is the pgf of a Poisson distribution with parameter $\lambda t$.
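For completeness, expanding this pgf term by term recovers the Poisson probabilities directly:
$$\Pi(s,t) = e^{-\lambda t(1-s)} = e^{-\lambda t}\sum_{n \ge 0}\frac{(\lambda t)^n}{n!}\, s^n , \qquad \text{so that } p_n(t) = \frac{e^{-\lambda t}(\lambda t)^n}{n!} .$$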
(b) [5] [Seen similar]
The number of casualties which have occurred by time $t$ is a compound Poisson
process. In the lectures we derived that the expected number of casualties in such a
process is $E[S(t)] = \lambda \mu t$, where $\mu$ is the mean of the number of casualties at each
accident (full marks for either remembering this or deriving it). Since the mean of a
$G_0(q)$ distribution is $q/p$, we have that the mean number of casualties by time $t$
is $\lambda q t / p$.
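For reference, the compound Poisson mean follows in one line by conditioning on the number of accidents $N(t)$:
$$E[S(t)] = E\big[\, E[S(t) \mid N(t)] \,\big] = E[N(t)\,\mu] = \mu\, E[N(t)] = \lambda \mu t .$$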
2. SOLUTION
[Part seen]
(i) [5]
$$P(X = x) = \int_0^\infty P(X = x \mid T = t)\, f_T(t)\, dt = \int_0^\infty \frac{e^{-\lambda t}(\lambda t)^x}{x!}\,\mu e^{-\mu t}\, dt$$
$$= \frac{\mu \lambda^x}{x!}\int_0^\infty e^{-(\lambda + \mu)t}\, t^x\, dt = \frac{\mu \lambda^x}{x!}\cdot\frac{x!}{(\lambda + \mu)^{x+1}} = \frac{\mu \lambda^x}{(\lambda + \mu)^{x+1}}$$
for $x = 0, 1, 2, \ldots$.
This is a geometric distribution with parameter $\lambda/(\lambda+\mu)$. In the lectures we called it
a $G_0\!\left(\lambda/(\lambda+\mu)\right)$ distribution.
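As a check on the algebra, these probabilities sum to one:
$$\sum_{x \ge 0}\frac{\mu\lambda^x}{(\lambda+\mu)^{x+1}} = \frac{\mu}{\lambda+\mu}\sum_{x \ge 0}\left(\frac{\lambda}{\lambda+\mu}\right)^{x} = \frac{\mu}{\lambda+\mu}\cdot\frac{\lambda+\mu}{\mu} = 1 .$$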
(ii) [5]
We have a geometric distribution with $p = \lambda/(\lambda+\mu)$ and $q = 1 - p = \mu/(\lambda+\mu)$,
so that
$$E[X] = \sum_{x \ge 0} x\, p_x = \sum_{x \ge 0} x\, q p^x = qp\,\frac{d}{dp}\sum_{x \ge 0} p^x = qp\,\frac{d}{dp}\!\left(\frac{1}{1-p}\right) = \frac{qp}{(1-p)^2} = \frac{qp}{q^2} = \frac{p}{q} .$$
Likewise, the pgf is
$$\Pi(s) = \sum_{x \ge 0} s^x\, q p^x = q \sum_{x \ge 0} (sp)^x = \frac{q}{1 - sp} .$$
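The pgf gives a quick consistency check on the mean:
$$\Pi'(s) = \frac{qp}{(1 - sp)^2} , \qquad E[X] = \Pi'(1) = \frac{qp}{(1-p)^2} = \frac{p}{q} ,$$
in agreement with the direct calculation above.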
(iii) [5]
From the lectures, the probability that the population will die out is given by the
smallest positive root of $\Pi(\theta) = \theta$. Now, setting $q/(1 - p\theta) = \theta$ and using $q = 1 - p$, we obtain
$$p\theta^2 - \theta + q = 0$$
so that $(p\theta - q)(\theta - 1) = 0$, with solutions $\theta = q/p$ and $\theta = 1$.
If $q \ge p$ the smaller of these is 1, and if $q < p$ the smaller is $q/p$. In terms of $\lambda$ and
$\mu$, these are, respectively, 1 (when $\lambda \le \mu$) and, for $\lambda > \mu$, $\mu/\lambda$.
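One can check the smaller root directly, using $1 - q = p$:
$$\Pi\!\left(\frac{q}{p}\right) = \frac{q}{1 - p\,(q/p)} = \frac{q}{1 - q} = \frac{q}{p} .$$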
(iv) [5]
From the lectures, we know that the expected number born in generation $n$ is $m^n$,
where $m$ is the mean of the offspring probability distribution. Here $m = p/q = \lambda/\mu$, so
the expected number born in generation $n$ is $(\lambda/\mu)^n$. The expected number of births up to
and including the $n$th generation is thus
$$1 + m + m^2 + \cdots + m^n = \frac{1 - m^{n+1}}{1 - m} .$$
As $n \to \infty$ this converges to
$$\frac{1}{1 - m} = \frac{\mu}{\mu - \lambda} ,$$
provided $m = \lambda/\mu < 1$.
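For illustration only (values not taken from the question): if the offspring mean were $m = 1/2$, the extinction probability would be 1 and the expected total number of births would converge to $1/(1 - 1/2) = 2$.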
3. SOLUTION
(a) [Seen]
(i) [2] $\pi P = \pi$, with $\pi_i \ge 0$ for all $i$ and $\sum_i \pi_i = 1$.
(ii) [1] A set of states, C, which it is impossible to leave: $p_{ij} = 0$ if $i \in C$, $j \notin C$.
(iii) [2] Two states communicate if there is a non-zero probability of moving (perhaps in
more than one step) from either state to the other. A communicating class is a set of
communicating states.
Since, by definition, each state communicates with itself, each state does lie in a
communicating class. To prove the assertion we need to show that no state lies in more
than one such class. This follows from the fact that if a state s lies in two such classes,
A and B, then it is possible to get from any state in A to any state in B, and vice versa, via
s. Thus A and B form a single communicating class. An informal argument like this is
acceptable, or a more formal one if they give it.
(b) [Unseen]
(i) [3] [Answer not reproduced in this transcription.]
(ii) [2] Communicating classes: {0,1}, {2}, {3}. Classes {2} and {3} are closed since,
once entered, they cannot be left.
(iii) [2] Any distribution of the form $(0, 0, p, 1-p)$ will be a stationary distribution.
(c) [Unseen]
(i) [6] Using the equations describing the first two elements and the fourth in $\pi Q = \pi$ we
obtain the equations
$$0.5\pi_0 + 0.25\pi_1 = \pi_0$$
$$0.5\pi_0 + 0.5\pi_1 + 0.5\pi_2 = \pi_1$$
$$0.5\pi_3 = \pi_3$$
and combining these with
$$\sum_i \pi_i = 1$$
yields $\pi_0 = 0.25$, $\pi_1 = 0.5$, $\pi_2 = 0.25$, $\pi_3 = 0$.
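Substituting these values back into the three equations used above confirms the solution: $0.5(0.25) + 0.25(0.5) = 0.25 = \pi_0$, $0.5(0.25) + 0.5(0.5) + 0.5(0.25) = 0.5 = \pi_1$, and $0.5 \times 0 = 0 = \pi_3$.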
(ii) [2] The mean time between returns is $1/\pi_0 = 4$.
4. SOLUTION
(i) [2] [Seen] Traffic intensity is the ratio of the expected value of the service time
distribution to that of the inter-arrival time distribution. The traffic intensity of an
$M/M/1$ queue is $\rho = \lambda/\mu$.
(ii) [5] [Seen] For a steady state, we have
$$0 = \lambda p_{x-1} + \mu p_{x+1} - (\lambda + \mu) p_x$$
and
$$0 = \mu p_1 - \lambda p_0 .$$
From these, $p_1 = \rho p_0$, $p_2 = \rho p_1 = \rho^2 p_0$, and in general $p_x = \rho^x p_0$.
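The general pattern follows by rewriting the balance equation as a telescoping identity:
$$\mu p_{x+1} - \lambda p_x = \mu p_x - \lambda p_{x-1} = \cdots = \mu p_1 - \lambda p_0 = 0 ,$$
so that $p_{x+1} = (\lambda/\mu)\, p_x = \rho\, p_x$ for every $x$, and hence $p_x = \rho^x p_0$.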
(iii) [5] [Seen] For the $p_x$, $x = 0, 1, 2, \ldots$ to form a proper distribution, we must have
$$\sum_{x \ge 0} p_x = p_0 \sum_{x \ge 0} \rho^x = 1 .$$
This is only possible if $0 \le \rho < 1$, from which $p_0 = 1 - \rho$. The steady
state queue length distribution is thus geometric. In the lectures, we called it $G_0(\rho)$.
(iv) [8] [Unseen] The mean queue length is $\rho/(1-\rho)$, either from memory
or by deriving the mean of the geometric distribution in (iii). Thus $a = \rho/(1-\rho)$. By
the memoryless property of the exponential distribution, the mean of the service time
distribution is $1/\mu$. Hence $b = 1/\mu$. From these two equations we obtain
$\rho = a/(1+a)$ and $\lambda = \rho/b = \frac{a}{b(1+a)}$, so that the mean inter-arrival time is $1/\lambda = \frac{b(1+a)}{a}$.
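For illustration only (these numbers are not from the question): if the observed mean queue length were $a = 3$ and the mean service time $b = 2$ minutes, then $\rho = a/(1+a) = 3/4$ and the mean inter-arrival time would be $b(1+a)/a = 8/3$ minutes.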
5. SOLUTION
(a) [8] [Unseen] The equation is in Lagrange form with
$$f = 2st, \qquad g = s, \qquad h = 2t\,\Pi .$$
The auxiliary equations are
$$\frac{ds}{2st} = \frac{dt}{s} = \frac{d\Pi}{2t\,\Pi} .$$
The first equation gives $ds = 2t\,dt$, from which $c_1 = s - t^2$.
The second equation gives
$$\frac{ds}{s} = \frac{d\Pi}{\Pi} ,$$
from which $c_2 = \Pi/s$, so that $\Pi/s = \psi(s - t^2)$ for an arbitrary function $\psi$.
Hence $\Pi = s\,\psi(s - t^2)$.
Now using the condition that $\Pi = s e^s$ when $t = 0$ we obtain $s e^s = s\,\psi(s)$, or
$\psi(s) = e^s$.
Hence the solution is $\Pi(s,t) = s\, e^{s - t^2}$.
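As a check, reading the Lagrange form above as $f\,\partial\Pi/\partial s + g\,\partial\Pi/\partial t = h$, this solution satisfies both the equation and the boundary condition:
$$\frac{\partial\Pi}{\partial s} = (1+s)\, e^{s-t^2}, \qquad \frac{\partial\Pi}{\partial t} = -2st\, e^{s-t^2},$$
$$2st\,(1+s)\, e^{s-t^2} + s\,\big({-2st}\, e^{s-t^2}\big) = 2st\, e^{s-t^2} = 2t\,\Pi, \qquad \Pi(s,0) = s e^s .$$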
(b) [Seen similar]
(i) [2] The bacterial population is not wiped out at time $t$ if no drug treatment has occurred
by that time. Since the rate at which doses of treatment are administered is $\mu$, the time
interval between treatments is distributed as $M(\mu)$. Hence, P(no treatment by time $t$) =
$\exp(-\mu t)$, so that $P(Y(t) \ne 0) = \exp(-\mu t)$.
(ii) [2] Either: since no treatment has occurred by time $t$, the population behaves according
to a simple birth process, so that, from the lectures, its size at time $t$ has a geometric
distribution $G_1(e^{-\lambda t})$. Or, from the pgf given in the question, the distribution is a
geometric distribution $G_1(e^{-\lambda t})$. So,
$$P(Y(t) = y \mid Y(t) \ne 0) = e^{-\lambda t}\left(1 - e^{-\lambda t}\right)^{y-1} , \qquad y = 1, 2, \ldots$$
(iii) [5] The unconditional probability distribution of $Y(t)$ is given by
$$P(Y(t) = 0) = 1 - P(Y(t) \ne 0) = 1 - e^{-\mu t}$$
and when $y \ne 0$,
$$P(Y(t) = y) = P(Y(t) = y \mid Y(t) \ne 0)\, P(Y(t) \ne 0) = e^{-\lambda t}\left(1 - e^{-\lambda t}\right)^{y-1} e^{-\mu t} = e^{-(\lambda+\mu)t}\left(1 - e^{-\lambda t}\right)^{y-1} .$$
The pgf of $Y(t)$ is given by
$$\Pi(s,t) = \sum_{y \ge 0} P(Y(t) = y)\, s^y = 1 - e^{-\mu t} + e^{-(\lambda+\mu)t}\sum_{y \ge 1}\left(1 - e^{-\lambda t}\right)^{y-1} s^y = 1 - e^{-\mu t} + \frac{s\, e^{-(\lambda+\mu)t}}{1 - s\left(1 - e^{-\lambda t}\right)} .$$
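Setting $s = 1$ checks that these probabilities sum to one:
$$\Pi(1,t) = 1 - e^{-\mu t} + \frac{e^{-(\lambda+\mu)t}}{1 - (1 - e^{-\lambda t})} = 1 - e^{-\mu t} + e^{-\mu t} = 1 .$$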
(iv) [3] Differentiating $\Pi(s,t)$ with respect to $s$ yields
$$\frac{\partial \Pi(s,t)}{\partial s} = \frac{e^{-(\lambda+\mu)t}}{1 - s\left(1 - e^{-\lambda t}\right)} + \frac{s\, e^{-(\lambda+\mu)t}\left(1 - e^{-\lambda t}\right)}{\left[1 - s\left(1 - e^{-\lambda t}\right)\right]^2}$$
so that
$$E[Y(t)] = \left.\frac{\partial \Pi(s,t)}{\partial s}\right|_{s=1} = \frac{e^{-(\lambda+\mu)t}}{e^{-\lambda t}} + \frac{e^{-(\lambda+\mu)t}\left(1 - e^{-\lambda t}\right)}{e^{-2\lambda t}} = e^{-\mu t} + e^{(\lambda-\mu)t}\left(1 - e^{-\lambda t}\right) = e^{(\lambda-\mu)t} .$$
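This agrees with the direct argument
$$E[Y(t)] = E[Y(t) \mid Y(t) \ne 0]\, P(Y(t) \ne 0) = e^{\lambda t}\, e^{-\mu t} = e^{(\lambda-\mu)t} ,$$
since the conditional $G_1(e^{-\lambda t})$ distribution has mean $e^{\lambda t}$.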