Poisson Process
Consider a random process representing the number of occurrences of an event up to time $t$, i.e. over the time interval $(0, t]$. Such a process is called a counting process and we shall denote it by $\{N(t),\ t \geq 0\}$. Clearly $\{N(t),\ t \geq 0\}$ is a continuous-time, discrete-state process and any of its realizations is a non-decreasing function of time.
The counting process $\{N(t),\ t \geq 0\}$ is called a Poisson process with rate parameter $\lambda$ if

(i) $N(0) = 0$;

(ii) $N(t)$ is an independent increment process. Thus the increments $N(t_2) - N(t_1)$ and $N(t_4) - N(t_3)$, etc., over non-overlapping intervals are independent;

(iii) $P\{N(t + \Delta t) - N(t) = 1\} = \lambda\,\Delta t + o(\Delta t)$, where $o(\Delta t)$ denotes any function such that $\lim_{\Delta t \to 0} \dfrac{o(\Delta t)}{\Delta t} = 0$;

(iv) $P\{N(t + \Delta t) - N(t) \geq 2\} = o(\Delta t)$.
These assumptions are valid for many applications. Some typical examples are:

- the number of alpha particles emitted by a radioactive substance;
- the number of binary packets received at a switching node of a communication network;
- the number of cars arriving at a petrol pump during a particular interval of time.
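As a quick illustration, the following Python sketch (with assumed parameter values) simulates a counting process obeying assumptions (i)-(iv) by dividing $(0, t]$ into many small slots of width $\Delta t$ and letting an event occur in each slot independently with probability $\lambda\,\Delta t$.

```python
import numpy as np

# Illustrative sketch (assumed parameters): simulate a counting process N(t)
# satisfying (i)-(iv) on a fine grid of width dt, where each slot independently
# contains one event with probability lam*dt (conditions (iii)-(iv)).
rng = np.random.default_rng(0)
lam, t_end, dt = 2.0, 5.0, 1e-3          # rate, horizon, slot width (assumed)
n_slots = int(t_end / dt)

for path in range(3):                     # a few independent realizations
    events = rng.random(n_slots) < lam * dt   # one event per slot w.p. lam*dt
    N = np.cumsum(events)                 # N(t) sampled on the grid, N(0) = 0
    print(f"path {path}: N({t_end}) = {N[-1]}  (compare with lam*t = {lam*t_end})")
```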
Let us find the probability mass function of $N(t)$. Writing $\Delta N(t) = N(t + \Delta t) - N(t)$, for $n \geq 1$,

$$P\{N(t + \Delta t) = n\} = \text{probability of occurrence of } n \text{ events up to time } t + \Delta t$$
$$= P\{N(t) = n,\ \Delta N(t) = 0\} + P\{N(t) = n-1,\ \Delta N(t) = 1\} + P\{N(t) \leq n-2,\ \Delta N(t) \geq 2\}$$
$$= P\{N(t) = n\}\,(1 - \lambda\,\Delta t + o(\Delta t)) + P\{N(t) = n-1\}\,(\lambda\,\Delta t + o(\Delta t)) + P\{N(t) \leq n-2\}\, o(\Delta t).$$

Therefore

$$\lim_{\Delta t \to 0} \frac{P\{N(t + \Delta t) = n\} - P\{N(t) = n\}}{\Delta t} = -\lambda\, P\{N(t) = n\} + \lambda\, P\{N(t) = n-1\}$$

$$\Rightarrow\quad \frac{d}{dt} P\{N(t) = n\} = -\lambda\, P\{N(t) = n\} + \lambda\, P\{N(t) = n-1\} \qquad (1)$$
Equation (1) is a first-order linear differential equation with initial condition $P\{N(0) = n\} = 0$ for $n \geq 1$. This differential equation can be solved recursively.
First consider the problem of finding $P\{N(t) = 0\}$. From (1),
$$\frac{d}{dt} P\{N(t) = 0\} = -\lambda\, P\{N(t) = 0\}$$
so that, with the initial condition $P\{N(0) = 0\} = 1$,
$$P\{N(t) = 0\} = e^{-\lambda t}.$$
Next, to find $P\{N(t) = 1\}$:
$$\frac{d}{dt} P\{N(t) = 1\} = -\lambda\, P\{N(t) = 1\} + \lambda\, P\{N(t) = 0\} = -\lambda\, P\{N(t) = 1\} + \lambda e^{-\lambda t}$$
with initial condition $P\{N(0) = 1\} = 0$. Solving the above first-order linear differential equation we get
$$P\{N(t) = 1\} = \lambda t\, e^{-\lambda t}.$$
Now by mathematical induction we can show that
$$P\{N(t) = n\} = \frac{(\lambda t)^n e^{-\lambda t}}{n!}, \qquad n = 0, 1, 2, \ldots$$
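As a numerical cross-check, a small sketch (with assumed values of $\lambda$, $t$ and a step size) can integrate the recursion (1) with a forward-Euler step and compare the result with the closed-form pmf above.

```python
import math
import numpy as np

# Sketch: numerically integrate dP_n/dt = -lam*P_n + lam*P_{n-1}  (equation (1))
# and compare with the closed-form Poisson pmf (lam*t)^n e^{-lam*t} / n!.
lam, t_end, dt, n_max = 1.5, 2.0, 1e-4, 6      # assumed values
steps = int(t_end / dt)

P = np.zeros(n_max + 1)
P[0] = 1.0                                      # P{N(0)=0}=1, P{N(0)=n}=0 for n>=1
for _ in range(steps):                          # forward-Euler integration of (1)
    dP = -lam * P
    dP[1:] += lam * P[:-1]
    P = P + dt * dP

for n in range(n_max + 1):
    exact = (lam * t_end) ** n * math.exp(-lam * t_end) / math.factorial(n)
    print(f"n={n}:  ODE {P[n]:.5f}   closed form {exact:.5f}")
```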
Remark

(1) The parameter $\lambda$ is called the rate or intensity of the Poisson process. It can be shown that
$$P\{N(t_2) - N(t_1) = n\} = \frac{(\lambda (t_2 - t_1))^n\, e^{-\lambda (t_2 - t_1)}}{n!}.$$
Thus the probability of the increments depends only on the length of the interval $t_2 - t_1$ and not on the absolute times $t_2$ and $t_1$. Thus the Poisson process is a process with stationary increments.
(2) The independent and stationary increment properties help us to compute the joint probability mass function of $N(t)$. For example, for $t_1 < t_2$ and $n_2 \geq n_1$,
$$P\{N(t_1) = n_1,\ N(t_2) = n_2\} = P\{N(t_1) = n_1\}\; P\{N(t_2) = n_2 \mid N(t_1) = n_1\}$$
$$= P\{N(t_1) = n_1\}\; P\{N(t_2) - N(t_1) = n_2 - n_1\}$$
$$= \frac{(\lambda t_1)^{n_1} e^{-\lambda t_1}}{n_1!} \cdot \frac{(\lambda (t_2 - t_1))^{n_2 - n_1}\, e^{-\lambda (t_2 - t_1)}}{(n_2 - n_1)!}.$$
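For concreteness, here is a sketch (with assumed values $\lambda = 2$, $t_1 = 1$, $t_2 = 3$, $n_1 = 2$, $n_2 = 5$) evaluating the joint pmf by the formula above and cross-checking it by Monte Carlo using independent Poisson increments.

```python
import math
import numpy as np

# Sketch (assumed values): P{N(t1)=n1, N(t2)=n2} via the formula and by simulation.
lam, t1, t2, n1, n2 = 2.0, 1.0, 3.0, 2, 5

def pois(k, mean):
    return mean ** k * math.exp(-mean) / math.factorial(k)

formula = pois(n1, lam * t1) * pois(n2 - n1, lam * (t2 - t1))

rng = np.random.default_rng(1)
trials = 200_000
N1 = rng.poisson(lam * t1, trials)               # N(t1)
N2 = N1 + rng.poisson(lam * (t2 - t1), trials)   # N(t2) = N(t1) + independent increment
mc = np.mean((N1 == n1) & (N2 == n2))

print(f"formula {formula:.5f}   Monte Carlo {mc:.5f}")
```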
Mean, Variance and Covariance of the Poisson process

We observe that at any time $t > 0$, $N(t)$ is a Poisson random variable with parameter $\lambda t$. Therefore,
$$E\,N(t) = \lambda t$$
and
$$\operatorname{var} N(t) = \lambda t.$$
Thus both the mean and variance of a Poisson process vary linearly with time.

As $N(t)$ is a random process with independent increments, we can readily show that
$$C_N(t_1, t_2) = \operatorname{var}\!\big(N(\min(t_1, t_2))\big) = \lambda \min(t_1, t_2)$$
and
$$R_N(t_1, t_2) = C_N(t_1, t_2) + E\,N(t_1)\, E\,N(t_2) = \lambda \min(t_1, t_2) + \lambda^2 t_1 t_2.$$
[Figure: a typical realization of a Poisson process, a non-decreasing staircase function of $t$ that starts at $N(0) = 0$ and increases by 1 at each event instant.]
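In place of a plotted realization, the following sketch (assumed $\lambda$, $t_1$, $t_2$) simulates sample values using independent Poisson increments and checks the mean, variance and covariance formulas above.

```python
import numpy as np

# Sketch: estimate E N(t), var N(t) and C_N(t1,t2) by simulation and compare
# with lam*t, lam*t and lam*min(t1,t2).  Parameter values below are assumed.
rng = np.random.default_rng(2)
lam, t1, t2, trials = 3.0, 1.0, 2.5, 100_000

N1 = rng.poisson(lam * t1, trials)               # N(t1)
N2 = N1 + rng.poisson(lam * (t2 - t1), trials)   # N(t2) via an independent increment

print("E N(t1):    sim", N1.mean(),            " theory", lam * t1)
print("var N(t1):  sim", N1.var(),             " theory", lam * t1)
print("C_N(t1,t2): sim", np.cov(N1, N2)[0, 1], " theory", lam * min(t1, t2))
```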
Example: A petrol pump serves on the average 30 cars per hour. Find the probability that during a period of 5 minutes (i) no car comes to the station, (ii) exactly 3 cars come to the station and (iii) more than 3 cars come to the station.

Average arrival rate $\lambda$ = 30 cars/hr = 1/2 car/min.

(i) Probability of no car in 5 minutes:
$$P\{N(5) = 0\} = e^{-\frac{1}{2} \times 5} = e^{-2.5} \approx 0.0821$$

(ii)
$$P\{N(5) = 3\} = \frac{\left(\tfrac{1}{2} \times 5\right)^3}{3!}\, e^{-2.5} = \frac{(2.5)^3}{3!}\, e^{-2.5} \approx 0.2138$$

(iii)
$$P\{N(5) > 3\} = 1 - \sum_{n=0}^{3} P\{N(5) = n\} = 1 - e^{-2.5}\left(1 + 2.5 + \frac{2.5^2}{2!} + \frac{2.5^3}{3!}\right) \approx 0.2424$$
Binomial model: let $p$ = probability of a car coming in 1 minute = 1/2 and $n = 5$. Then
$$P(X = 0) = (1 - p)^5 = 0.5^5 \approx 0.031.$$
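The numbers above can be reproduced with a short sketch; only the stated rate of 30 cars/hour and the 5-minute interval are taken from the example.

```python
import math

# Petrol-pump example: lam = 1/2 car per minute, interval of 5 minutes.
lam, t = 0.5, 5.0
mean = lam * t                                      # = 2.5

def pois(k):
    return mean ** k * math.exp(-mean) / math.factorial(k)

p0 = pois(0)                                        # (i)   no car
p3 = pois(3)                                        # (ii)  exactly 3 cars
p_more_than_3 = 1 - sum(pois(k) for k in range(4))  # (iii) more than 3 cars
print(f"(i) {p0:.4f}   (ii) {p3:.4f}   (iii) {p_more_than_3:.4f}")

# Binomial model from the notes: p = 1/2 per one-minute trial, n = 5 trials.
p, n = 0.5, 5
print("binomial P(X=0) =", (1 - p) ** n)
```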
Inter-arrival Time and Waiting Time of the Poisson Process

[Figure: time axis showing the arrival instants $t_1, t_2, \ldots, t_{n-1}, t_n$, with the inter-arrival times $T_1, T_2, \ldots, T_{n-1}, T_n$ between successive arrivals.]
Let $T_n$ = the time elapsed between the $(n-1)$st event and the $n$th event. The random process $\{T_n,\ n = 1, 2, \ldots\}$ represents the inter-arrival times of the Poisson process.

$T_1$ = the time elapsed before the first event takes place. Clearly $T_1$ is a continuous random variable. Let us find the probability $P(\{T_1 > t\})$:
$$P(\{T_1 > t\}) = P(\{\text{0 events up to time } t\}) = e^{-\lambda t}$$
$$F_{T_1}(t) = 1 - P\{T_1 > t\} = 1 - e^{-\lambda t}$$
$$f_{T_1}(t) = \lambda e^{-\lambda t}, \qquad t \geq 0.$$
Similarly,
$$P(\{T_n > t\}) = P(\{\text{0 events occur in the interval } (t_{n-1}, t_{n-1} + t]\} \mid \{(n-1)\text{th event occurs at } t_{n-1}\})$$
$$= P(\{\text{0 events occur in the interval } (t_{n-1}, t_{n-1} + t]\}) = e^{-\lambda t}$$
so that
$$f_{T_n}(t) = \lambda e^{-\lambda t}, \qquad t \geq 0.$$
Thus the inter-arrival times of a Poisson process with parameter $\lambda$ are exponentially distributed, with
$$f_{T_n}(t) = \lambda e^{-\lambda t}, \qquad t \geq 0,\ n \geq 1.$$
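A sketch (assumed $\lambda$, horizon and slot width) that generates arrivals from the defining assumptions, as in the earlier grid simulation, and compares the empirical inter-arrival times with the exponential law:

```python
import numpy as np

# Sketch: build arrival times on a fine grid (per assumptions (iii)-(iv)),
# then compare the gaps between successive arrivals with Exp(lam).
rng = np.random.default_rng(3)
lam, t_end, dt = 2.0, 2000.0, 1e-3                # assumed values

slots = rng.random(int(t_end / dt)) < lam * dt    # one event per slot w.p. lam*dt
arrival_times = np.nonzero(slots)[0] * dt
gaps = np.diff(arrival_times)                     # inter-arrival times T_n

print("mean gap:  sim", gaps.mean(), " theory", 1 / lam)
t = 1.0
print("P(T > 1):  sim", np.mean(gaps > t), " theory", np.exp(-lam * t))
```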
Remark

We have seen that the inter-arrival times are identically distributed with the exponential pdf. Further, we can easily show that the inter-arrival times are independent. For example, given that the first event occurs at time $t_1$,
$$P(\{T_2 > t_2\} \mid \{T_1 = t_1\}) = P(\{\text{zero events occur in } (t_1, t_1 + t_2]\} \mid \{\text{one event occurs in } (0, t_1]\})$$
$$= P(\{\text{zero events occur in } (t_1, t_1 + t_2]\}) = e^{-\lambda t_2} = P(\{T_2 > t_2\}),$$
which does not depend on $t_1$. Hence $T_1$ and $T_2$ are independent, and
$$F_{T_1, T_2}(t_1, t_2) = F_{T_1}(t_1)\, F_{T_2}(t_2).$$
It is interesting to note that the converse of the above result is also true: if the inter-arrival times between successive events of a discrete-state process $\{N(t),\ t \geq 0\}$ are independent and exponentially distributed with mean $\frac{1}{\lambda}$, then $\{N(t),\ t \geq 0\}$ is a Poisson process with parameter $\lambda$.

The exponential distribution of the inter-arrival times indicates that the arrival process has no memory. Thus
$$P(\{T_n > t_0 + t_1\} \mid \{T_n > t_1\}) = P(\{T_n > t_0\}) \qquad \forall\, t_0, t_1 \geq 0.$$
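A very short numerical illustration of this memoryless property for the exponential survival function (values of $\lambda$, $t_0$, $t_1$ assumed):

```python
import math

# Memorylessness of the exponential inter-arrival time:
# P(T > t0 + t1 | T > t1) = S(t0 + t1)/S(t1) should equal P(T > t0) = S(t0).
lam = 2.0
S = lambda t: math.exp(-lam * t)               # survival function of Exp(lam)
for t0, t1 in [(0.5, 1.0), (1.0, 3.0), (2.0, 0.25)]:
    print(t0, t1, S(t0 + t1) / S(t1), S(t0))   # the two printed values coincide
```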
Another important quantity is the waiting time $W_n$. This is the time that elapses before the $n$th event occurs. Thus
$$W_n = \sum_{i=1}^{n} T_i.$$
How to find the first-order pdf of $W_n$ is left as an exercise. Note that $W_n$ is the sum of $n$ independent and identically distributed exponential random variables, so
$$E\,W_n = E\sum_{i=1}^{n} T_i = \sum_{i=1}^{n} E\,T_i = \frac{n}{\lambda}$$
and
$$\operatorname{var}(W_n) = \operatorname{var}\!\left(\sum_{i=1}^{n} T_i\right) = \sum_{i=1}^{n} \operatorname{var}(T_i) = \frac{n}{\lambda^2}.$$
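A sketch (assumed $\lambda$ and $n$) estimating $E\,W_n$ and $\operatorname{var}(W_n)$ by summing simulated exponential inter-arrival times and comparing with $n/\lambda$ and $n/\lambda^2$:

```python
import numpy as np

# Sketch: W_n as the sum of n i.i.d. Exp(lam) inter-arrival times.
rng = np.random.default_rng(4)
lam, n, trials = 2.0, 10, 100_000                      # assumed values

T = rng.exponential(scale=1 / lam, size=(trials, n))   # inter-arrival times T_1..T_n
W = T.sum(axis=1)                                      # waiting time W_n

print("E W_n:    sim", W.mean(), " theory", n / lam)
print("var W_n:  sim", W.var(),  " theory", n / lam**2)
```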
Example

The number of customers arriving at a service station is a Poisson process with a rate $\lambda = 10$ customers per minute.

(a) What is the mean inter-arrival time of the customers?

(b) What is the probability that the second customer will arrive more than 5 minutes after the first customer has arrived?

(c) What is the average waiting time before the 10th customer arrives?
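A sketch of the computations, assuming (a) asks for the mean inter-arrival time $E\,T_n$, (b) asks for $P(T_2 > 5\ \text{min})$, and (c) asks for $E\,W_{10}$; these readings of the questions are assumptions.

```python
import math

# Service-station example, lam = 10 customers/minute (interpretations assumed).
lam = 10.0
mean_interarrival = 1 / lam                    # (a) E T_n = 1/lam minutes
p_T2_gt_5 = math.exp(-lam * 5)                 # (b) P(T_2 > 5) = e^{-lam*5}
mean_W10 = 10 / lam                            # (c) E W_10 = n/lam minutes

print("(a)", mean_interarrival, "min")
print("(b)", p_T2_gt_5)
print("(c)", mean_W10, "min")
```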
Semi-random Telegraph signal

Consider a two-state random process $\{X(t)\}$ with the states $X(t) = 1$ and $X(t) = -1$. Suppose $X(t) = 1$ if the number of events of the Poisson process $N(t)$ in the interval $(0, t]$ is even, and $X(t) = -1$ if the number of events of the Poisson process $N(t)$ in the interval $(0, t]$ is odd. Such a process $\{X(t)\}$ is called the semi-random telegraph signal because the initial value $X(0)$ is always 1.
$$p_{X(t)}(1) = P(\{X(t) = 1\}) = P(\{N(t) = 0\}) + P(\{N(t) = 2\}) + \cdots$$
$$= e^{-\lambda t}\left(1 + \frac{(\lambda t)^2}{2!} + \cdots\right) = e^{-\lambda t} \cosh \lambda t$$
and
$$p_{X(t)}(-1) = P(\{X(t) = -1\}) = P(\{N(t) = 1\}) + P(\{N(t) = 3\}) + \cdots$$
$$= e^{-\lambda t}\left(\frac{\lambda t}{1!} + \frac{(\lambda t)^3}{3!} + \cdots\right) = e^{-\lambda t} \sinh \lambda t.$$
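A sketch checking these marginal probabilities by sampling $N(t)$ and looking at its parity ($\lambda$ and $t$ are assumed values):

```python
import math
import numpy as np

# Sketch: P{X(t)=1} = P{N(t) even} should equal e^{-lam*t} cosh(lam*t).
rng = np.random.default_rng(5)
lam, t, trials = 1.0, 0.7, 200_000                     # assumed values

N = rng.poisson(lam * t, trials)
p_plus_sim = np.mean(N % 2 == 0)                       # X(t)=+1 when N(t) is even
p_plus_theory = math.exp(-lam * t) * math.cosh(lam * t)
print("P{X(t)=1}:  sim", p_plus_sim, " theory", p_plus_theory)
print("P{X(t)=-1}: sim", 1 - p_plus_sim, " theory",
      math.exp(-lam * t) * math.sinh(lam * t))
```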
We can also find the conditional and joint probability mass functions. For example, for $t_1 < t_2$,
$$p_{X(t_1), X(t_2)}(1, 1) = P(\{X(t_1) = 1\})\; P(\{X(t_2) = 1\} \mid \{X(t_1) = 1\})$$
$$= e^{-\lambda t_1} \cosh \lambda t_1 \; P(\{N(t_2) \text{ is even}\} \mid \{N(t_1) \text{ is even}\})$$
$$= e^{-\lambda t_1} \cosh \lambda t_1 \; P(\{N(t_2) - N(t_1) \text{ is even}\} \mid \{N(t_1) \text{ is even}\})$$
$$= e^{-\lambda t_1} \cosh \lambda t_1 \; P(\{N(t_2) - N(t_1) \text{ is even}\})$$
$$= e^{-\lambda t_1} \cosh \lambda t_1 \; e^{-\lambda (t_2 - t_1)} \cosh \lambda (t_2 - t_1)$$
$$= e^{-\lambda t_2} \cosh \lambda t_1 \cosh \lambda (t_2 - t_1).$$
Similarly,
$$p_{X(t_1), X(t_2)}(1, -1) = e^{-\lambda t_2} \cosh \lambda t_1 \sinh \lambda (t_2 - t_1),$$
$$p_{X(t_1), X(t_2)}(-1, 1) = e^{-\lambda t_2} \sinh \lambda t_1 \sinh \lambda (t_2 - t_1),$$
$$p_{X(t_1), X(t_2)}(-1, -1) = e^{-\lambda t_2} \sinh \lambda t_1 \cosh \lambda (t_2 - t_1).$$
Mean, autocorrelation and autocovariance function of $X(t)$

$$E\,X(t) = 1 \cdot e^{-\lambda t} \cosh \lambda t + (-1) \cdot e^{-\lambda t} \sinh \lambda t = e^{-\lambda t}(\cosh \lambda t - \sinh \lambda t) = e^{-2\lambda t}$$
$$E\,X^2(t) = 1 \cdot e^{-\lambda t} \cosh \lambda t + 1 \cdot e^{-\lambda t} \sinh \lambda t = e^{-\lambda t}(\cosh \lambda t + \sinh \lambda t) = e^{-\lambda t} e^{\lambda t} = 1$$
$$\operatorname{var}(X(t)) = 1 - e^{-4\lambda t}.$$
For $t_1 < t_2$,
$$R_X(t_1, t_2) = E\,X(t_1) X(t_2)$$
$$= (1)(1)\, p_{X(t_1), X(t_2)}(1, 1) + (1)(-1)\, p_{X(t_1), X(t_2)}(1, -1) + (-1)(1)\, p_{X(t_1), X(t_2)}(-1, 1) + (-1)(-1)\, p_{X(t_1), X(t_2)}(-1, -1)$$
$$= e^{-\lambda t_2} \cosh \lambda t_1 \cosh \lambda (t_2 - t_1) - e^{-\lambda t_2} \cosh \lambda t_1 \sinh \lambda (t_2 - t_1) - e^{-\lambda t_2} \sinh \lambda t_1 \sinh \lambda (t_2 - t_1) + e^{-\lambda t_2} \sinh \lambda t_1 \cosh \lambda (t_2 - t_1)$$
$$= e^{-\lambda t_2} \cosh \lambda t_1 \big(\cosh \lambda (t_2 - t_1) - \sinh \lambda (t_2 - t_1)\big) + e^{-\lambda t_2} \sinh \lambda t_1 \big(\cosh \lambda (t_2 - t_1) - \sinh \lambda (t_2 - t_1)\big)$$
$$= e^{-\lambda t_2}\, e^{-\lambda (t_2 - t_1)} (\cosh \lambda t_1 + \sinh \lambda t_1)$$
$$= e^{-\lambda t_2}\, e^{-\lambda (t_2 - t_1)}\, e^{\lambda t_1}$$
$$= e^{-2\lambda (t_2 - t_1)}.$$
Similarly, for $t_1 > t_2$,
$$R_X(t_1, t_2) = e^{-2\lambda (t_1 - t_2)}$$
so that
$$R_X(t_1, t_2) = e^{-2\lambda |t_1 - t_2|}.$$
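A Monte Carlo sketch (assumed $\lambda$, $t_1$, $t_2$) estimating $R_X(t_1, t_2)$ from the parity of two independent Poisson increments and comparing with $e^{-2\lambda |t_2 - t_1|}$:

```python
import math
import numpy as np

# Sketch: R_X(t1,t2) = E X(t1) X(t2) for the semi-random telegraph signal.
rng = np.random.default_rng(6)
lam, t1, t2, trials = 1.0, 0.5, 1.5, 200_000          # assumed values

N1 = rng.poisson(lam * t1, trials)                    # N(t1)
N2 = N1 + rng.poisson(lam * (t2 - t1), trials)        # N(t2)
X1 = np.where(N1 % 2 == 0, 1, -1)                     # X(t) = +1 if N(t) even, else -1
X2 = np.where(N2 % 2 == 0, 1, -1)

print("R_X: sim", np.mean(X1 * X2), " theory", math.exp(-2 * lam * abs(t2 - t1)))
```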
Random Telegraph signal

Consider a two-state random process $\{Y(t)\}$ with the states $Y(t) = 1$ and $Y(t) = -1$. Suppose $P(\{Y(0) = 1\}) = \frac{1}{2}$ and $P(\{Y(0) = -1\}) = \frac{1}{2}$, and $Y(t)$ changes polarity with each occurrence of an event in a Poisson process of parameter $\lambda$. Such a random process $\{Y(t)\}$ is called a random telegraph signal and can be expressed as
$$Y(t) = A\,X(t)$$
where $\{X(t)\}$ is the semi-random telegraph signal and $A$ is a random variable independent of $X(t)$ with $P(\{A = 1\}) = \frac{1}{2}$ and $P(\{A = -1\}) = \frac{1}{2}$.
Clearly,
$$E\,A = (1)\,\frac{1}{2} + (-1)\,\frac{1}{2} = 0$$
and
$$E\,A^2 = (1)^2\,\frac{1}{2} + (-1)^2\,\frac{1}{2} = 1.$$
Therefore,
$$E\,Y(t) = E\,A X(t) = E\,A\; E\,X(t) = 0 \qquad (A \text{ and } X(t) \text{ are independent})$$
and
$$R_Y(t_1, t_2) = E\,A X(t_1)\, A X(t_2) = E\,A^2\; E\,X(t_1) X(t_2) = e^{-2\lambda |t_1 - t_2|}.$$
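Finally, a short sketch of the random telegraph signal $Y(t) = A\,X(t)$: the random sign $A$ drives the mean to zero while leaving the autocorrelation unchanged (parameter values assumed):

```python
import math
import numpy as np

# Sketch: Y(t) = A * X(t) with P(A=1)=P(A=-1)=1/2, A independent of X(t).
rng = np.random.default_rng(7)
lam, t1, t2, trials = 1.0, 0.5, 1.5, 200_000          # assumed values

A = rng.choice([-1, 1], size=trials)                  # random initial sign
N1 = rng.poisson(lam * t1, trials)
N2 = N1 + rng.poisson(lam * (t2 - t1), trials)
Y1 = A * np.where(N1 % 2 == 0, 1, -1)
Y2 = A * np.where(N2 % 2 == 0, 1, -1)

print("E Y(t1): sim", Y1.mean(), " theory 0")
print("R_Y:     sim", np.mean(Y1 * Y2),
      " theory", math.exp(-2 * lam * abs(t2 - t1)))
```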