Chapter 5 Special Discrete Distributions
§ 5.1 Bernoulli and Binomial Random Variables
• Definition: A random variable is called Bernoulli with
parameter p if its probability function is given by
p(x) = 1 - p = q   if x = 0
     = p           if x = 1
     = 0           otherwise.
* Note that E(X) = 0·P[X=0] + 1·P[X=1] = p and
Var(X) = E(X^2) - [E(X)]^2 = p - p^2.
• If n independent Bernoulli trials, each with probability of
success p, are performed, the number of successes is a
binomial random variable with parameters n and p, as the
simulation sketch below illustrates.
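* To see this concretely, here is a minimal Python sketch (the language and the parameter values n = 10, p = 0.3 are illustrative choices, not from the text) that simulates the success count and compares its empirical mean and variance with np and np(1-p):

    import random

    n, p, trials = 10, 0.3, 100_000  # illustrative parameters

    # Each sample: count successes in n independent Bernoulli(p) trials.
    counts = [sum(1 for _ in range(n) if random.random() < p)
              for _ in range(trials)]

    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials

    print(mean, n * p)           # both should be near 3.0
    print(var, n * p * (1 - p))  # both should be near 2.1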
• Theorem 5.1: Let X be the binomial random variable
with parameters n and p. Then p(x), the probability
function of X, is given by
p(x) = P{X = x} = C(n,x) p^x (1-p)^(n-x)   if x = 0, 1, ..., n
     = 0                                   otherwise.
Proof: See text.
• Definition: The function p(x) given above is called the
binomial probability function with parameters (n,p).
* Note that Σ_{x=0}^{n} p(x) = Σ_{x=0}^{n} C(n,x) p^x (1-p)^(n-x) = [p + (1-p)]^n = 1, as checked below.
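* A quick numerical check of this identity (the values n = 12, p = 0.4 are arbitrary):

    from math import comb

    n, p = 12, 0.4
    total = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))
    print(total)  # 1.0, up to floating-point rounding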
* Example 5.7: Given a binomial random variable X with
parameters n and p, find the value of p that maximizes P[X=k].
Solution: P[X=k] = C(n,k) p^k (1-p)^(n-k).
Differentiate this probability with respect to p and set the
derivative to zero to find the optimal p:
d[C(n,k) p^k (1-p)^(n-k)]/dp = 0
C(n,k)[k p^(k-1) (1-p)^(n-k) - (n-k) p^k (1-p)^(n-k-1)] = 0
k(1-p) - (n-k)p = 0
k - kp - np + kp = 0
k = np
p = k/n.
So the optimal p is just k/n. Also note that
d^2 P[X=k]/dp^2 < 0 at p = k/n, so at this optimal p the
probability is indeed a maximum.
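* The same conclusion can be confirmed numerically; the sketch below (with illustrative values n = 20, k = 7) scans a grid of p values and locates the maximizer near k/n:

    from math import comb

    n, k = 20, 7  # illustrative values

    def prob(p):
        # P[X = k] for a binomial(n, p) random variable
        return comb(n, k) * p**k * (1 - p)**(n - k)

    # Scan p over a fine grid and locate the maximizing value.
    grid = [i / 10_000 for i in range(1, 10_000)]
    best = max(grid, key=prob)
    print(best, k / n)  # best should be very close to k/n = 0.35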
• It can also be shown that for a binomial random
variable, p(x) is maximized at x = ⌊(n+1)p⌋, the largest
integer not exceeding (n+1)p.
Proof: Note that from text
p(x)/p(x-1) = 1 + [(n+1)p - x]/[x(1-p)].
Hence p(x) ≥ p(x-1) if x ≤ (n+1)p, and p(x-1) > p(x)
when x > (n+1)p. So the maximum of p(x) occurs at
x = ⌊(n+1)p⌋.
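* A short numerical check that the mode of p(x) is ⌊(n+1)p⌋ (the parameters n = 15, p = 0.42 are chosen arbitrarily):

    from math import comb, floor

    n, p = 15, 0.42  # illustrative parameters
    pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
    mode = max(range(n + 1), key=lambda x: pmf[x])
    print(mode, floor((n + 1) * p))  # both should print 6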
• Expectations and Variances of Binomial Random
Variables
* It can be easily shown that E(X) = np.
* Also, it can be shown that
E(X^2) = Σ_{x=0}^{n} x^2 C(n,x) p^x (1-p)^(n-x)
       = Σ_{x=0}^{n} (x^2 - x) C(n,x) p^x (1-p)^(n-x) + Σ_{x=0}^{n} x C(n,x) p^x (1-p)^(n-x)
       = Σ_{x=0}^{n} x(x-1) C(n,x) p^x (1-p)^(n-x) + np.
Now,
Σ_{x=0}^{n} x(x-1) C(n,x) p^x (1-p)^(n-x) = Σ_{x=2}^{n} x(x-1) [n!/(x!(n-x)!)] p^x (1-p)^(n-x)
 = n(n-1)p^2 Σ_{x=2}^{n} [(n-2)!/((x-2)!(n-x)!)] p^(x-2) (1-p)^(n-x)
 = n(n-1)p^2 = n^2 p^2 - np^2,
since the remaining sum is a binomial probability function summed over all its values and hence equals 1. Therefore
E(X^2) = n^2 p^2 - np^2 + np and Var(X) = E(X^2) - [E(X)]^2
= n^2 p^2 - np^2 + np - n^2 p^2 = np - np^2 = np(1-p) = npq.
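* Both formulas can be verified directly from the probability function; a small sketch with arbitrary parameters:

    from math import comb

    n, p = 9, 0.25  # arbitrary parameters
    pmf = {x: comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

    EX = sum(x * q for x, q in pmf.items())
    EX2 = sum(x**2 * q for x, q in pmf.items())
    print(EX, n * p)                     # both 2.25
    print(EX2 - EX**2, n * p * (1 - p))  # both 1.6875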
* Example 5.9 (Polya's Urn Model)
An urn contains w white chips and b blue chips. A chip is
drawn at random and is put back together with c more chips
of the same color. This procedure is repeated (n-1) more
times. Let W_i and B_i be the events that the i-th draw is a
white chip and a blue chip, respectively. Find P(W_i) and P(B_i).
Solution: Obviously,
P(W_1) = w/(w+b) and P(B_1) = b/(w+b).
We then want to find P(W_2) and P(B_2). First, find
P(W_2|W_1). Since the first chip is white, before the second
draw there are w+c white chips and b blue chips, hence
P(W_2|W_1) = (w+c)/(w+c+b). Similarly, since after a blue
first draw there are w white and b+c blue chips,
P(W_2|B_1) = w/(w+c+b). So,
P(W_2)
= P(W_1) P(W_2|W_1) + P(B_1) P(W_2|B_1)
= [w/(w+b)][(w+c)/(w+b+c)] + [b/(w+b)][w/(w+b+c)]
= w/(w+b) = P(W_1),
P(B_2) = 1 - P(W_2) = b/(w+b) = P(B_1).
Repeating this argument (or by induction),
P(W_i) = P(W_1) = w/(w+b),   i = 1, 2, ..., n,
P(B_i) = P(B_1) = b/(w+b),   i = 1, 2, ..., n.
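* The perhaps surprising conclusion that every draw is white with the same probability w/(w+b) can be checked by simulation; a minimal sketch with made-up values of w, b, c, and n:

    import random

    w, b, c, n, reps = 3, 5, 2, 6, 50_000  # illustrative values

    hits = [0] * n  # hits[i] counts runs in which draw i+1 was white
    for _ in range(reps):
        white, blue = w, b
        for i in range(n):
            if random.random() < white / (white + blue):
                hits[i] += 1
                white += c  # replace the chip plus c more of the same color
            else:
                blue += c

    print([round(h / reps, 3) for h in hits])  # each near w/(w+b) = 0.375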
* Example 5.10: Two proofreaders found r and m
misprints, respectively, in a book, and b misprints were
found by both. Suppose that the probabilities that they
notice any given misprint are p and q, respectively, and
that the two proofreaders work independently. Estimate
the number of misprints that neither of them noticed.
Solution: Let M be the total number of misprints in the book.
Assume the numbers of misprints found are approximately
their expected values, i.e.,
Mp ≈ r, Mq ≈ m, and (by independence) Mpq ≈ b. Then
M ≈ (Mp)(Mq)/(Mpq) = rm/b.
Since r + m - b misprints were noticed by at least one
proofreader, the number of unnoticed misprints is approximately
M - (r+m-b) = rm/b - r - m + b = (r-b)(m-b)/b.
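* A quick numeric illustration with hypothetical counts r = 50, m = 60, b = 30 (numbers chosen only for illustration):

    r, m, b = 50, 60, 30               # hypothetical counts
    M = r * m / b                      # estimated total misprints
    unnoticed = (r - b) * (m - b) / b
    print(M, unnoticed)                # 100.0 20.0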
§ 5.2 Poisson Random Variable
• Poisson approximation of the binomial random
variable: Let X be a binomial random variable with
parameters n and p, and write λ = np, so that
P[X=i] = [n(n-1)(n-2)...(n-i+1)/n^i] (λ^i/i!) (1 - λ/n)^(n-i).
As n → ∞ and p → 0 with np = λ held constant, we have
P[X=i] → e^(-λ) λ^i/i!.
• Definition: A discrete random variable X with possible
values 0, 1, 2, 3, ... is called Poisson with parameter λ,
λ > 0, if
P[X=i] = e^(-λ) λ^i/i!,   i = 0, 1, 2, 3, ....
* It can be shown that for a Poisson random variable X,
E(X) = λ and Var(X) = λ.
* Examples of Poisson random variables include
- number of defective fuses produced in a year
- number of customers in an hour
- number of telephone calls in a given period
* Examples 5.11 -- 5.14
* In practical problems, even for fairly small n, the
Poisson approximation of the binomial probability
function is usually very close, as the comparison sketch
below illustrates.
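* The quality of the approximation is easy to inspect; a sketch comparing the two probability functions for a modest n (the parameters n = 50, p = 0.04 are arbitrary):

    from math import comb, exp, factorial

    n, p = 50, 0.04   # modest n, small p
    lam = n * p       # λ = 2

    for i in range(6):
        binom = comb(n, i) * p**i * (1 - p)**(n - i)
        poisson = exp(-lam) * lam**i / factorial(i)
        print(i, round(binom, 4), round(poisson, 4))  # agree to ~2 decimals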
Poisson Process
• Let N(t) be the number of events that have occurred in
[0,t]. Then the set of random variables {N(t): t ≥ 0} is a
counting process.
• Definition: A counting process {N(t): t ≥ 0} is said to be
Poisson (a Poisson process) if N(0) = 0, it is stationary,
possesses independent increments, and satisfies
lim_{h→0} P{N(h) > 1}/h = 0.
* The last condition merely means that the probability that
two or more events occur in a small interval is negligible;
for stationary processes, this means that the simultaneous
occurrence of two or more events is impossible.
• Theorem 5.2: If {N(t): t ≥ 0} is a Poisson process, then
there exists a λ > 0 such that
P{N(t) = n} = (λt)^n e^(-λt)/n!.
That is, for all t > 0, N(t) is a Poisson random variable
with parameter λt. Hence E[N(t)] = λt and E[N(1)] = λ.
Sometimes λ is called the rate of the Poisson process.
* see Examples 5.15 -- 5.16
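* As an illustration of Theorem 5.2, the sketch below simulates a Poisson process using the standard fact that its interarrival times are exponential with rate λ (a characterization not proved in this section) and checks that the mean count over [0,t] is near λt; all numeric values are illustrative:

    import random
    from math import log

    lam, t, reps = 2.0, 1.5, 50_000  # illustrative rate and horizon

    def count_events(lam, t):
        # Count events in [0, t] by accumulating exponential(lam) gaps.
        total, k = 0.0, 0
        while True:
            total += -log(1.0 - random.random()) / lam  # exponential sample
            if total > t:
                return k
            k += 1

    samples = [count_events(lam, t) for _ in range(reps)]
    print(sum(samples) / reps, lam * t)  # empirical mean near λt = 3.0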
* Example 5.17 A fisherman catches fish at a Poisson rate
of two per hour. What is the probability that he catches
one fish by 10:30 am and three fish by noon if he starts at
10:00 am?
Solution: We want to find P{N(0.5) = 1 and N(2) - N(0.5) = 2}.
These two events are independent since the process has
independent increments, so we want to find
P = P{N(0.5) = 1} P{N(2) - N(0.5) = 2}.
Since a Poisson process is stationary, P{N(2) - N(0.5) = 2} =
P{N(1.5) = 2}. Thus
P = P{N(0.5) = 1} P{N(1.5) = 2} = [1^1 e^(-1)/1!][3^2 e^(-3)/2!]
= (9/2) e^(-4) ≈ 0.0824.
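* Evaluating this expression numerically:

    from math import exp, factorial

    p1 = 1**1 * exp(-1) / factorial(1)  # P{N(0.5) = 1}, parameter λt = 1
    p2 = 3**2 * exp(-3) / factorial(2)  # P{N(1.5) = 2}, parameter λt = 3
    print(p1 * p2)                      # ≈ 0.0824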
§ 5.3 Other Discrete Random Variables
Geometric Random Variables
• In a sequence of independent success-failure Bernoulli
trials, each with probability of success p, let X be the
number of trials until the first success occurs. Then X is a
geometric random variable with
P(X = n) = (1-p)^(n-1) p,   n = 1, 2, ....
* Note that Σ_{n=1}^{∞} (1-p)^(n-1) p = p [1/(1-(1-p))] = 1.
• Definition: The probability function shown above for
n = 1, 2, ... and 0 elsewhere is called geometric.
* E(X) = p Σ_{n=1}^{∞} n(1-p)^(n-1) = -p Σ_{n=1}^{∞} d(1-p)^n/dp
       = -p · d[(1-p)/p]/dp = -p · [-p - (1-p)]/p^2 = 1/p.
* E(X^2) = p Σ_{n=1}^{∞} n^2 (1-p)^(n-1) = p Σ_{n=1}^{∞} (n^2+n)(1-p)^(n-1) - (1/p)
         = p Σ_{n=1}^{∞} d^2(1-p)^(n+1)/dp^2 - (1/p) = p · d^2[(1-p)^2/p]/dp^2 - (1/p)
         = 2/p^2 - 1/p = (2-p)/p^2.
* Var(X) = E(X^2) - [E(X)]^2 = (2-p)/p^2 - 1/p^2 = (1-p)/p^2.
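* A numerical sanity check of E(X) = 1/p and Var(X) = (1-p)/p^2, truncating the infinite series where the tail is negligible (the value p = 0.3 is arbitrary):

    p, N = 0.3, 500  # truncate the series at N terms (tail is negligible)

    pmf = [(1 - p)**(n - 1) * p for n in range(1, N + 1)]
    EX = sum(n * q for n, q in enumerate(pmf, start=1))
    EX2 = sum(n**2 * q for n, q in enumerate(pmf, start=1))

    print(EX, 1 / p)                    # both 3.333...
    print(EX2 - EX**2, (1 - p) / p**2)  # both 7.777...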
* see Examples 5.19, 5.20.