Sums and sample means of independent random variables
• {Xi : i = 1, ..., n}: i.i.d. (independent and identically distributed) rvs
• Sn = X1 + ... + Xn
• Mn = (X1 + ... + Xn)/n
• As n → ∞, ?

Markov's inequality
• X is a nonnegative rv and a > 0:
  P(X ≥ a) ≤ E(X)/a
• Prf. ...
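As a quick numerical illustration (mine, not part of the slides), the sketch below estimates P(X ≥ a) from a sample of a nonnegative rv and compares it with Markov's bound E(X)/a; the helper name markov_bound_check is hypothetical.

```python
import random

def markov_bound_check(sample, a):
    """Empirical P(X >= a) vs Markov's bound E[X]/a (both from the same sample)."""
    n = len(sample)
    mean = sum(sample) / n
    tail = sum(1 for x in sample if x >= a) / n
    return tail, mean / a

# Nonnegative rv: X ~ Exponential(1), so E[X] = 1.
rng = random.Random(0)
sample = [rng.expovariate(1.0) for _ in range(50_000)]
for a in (1.0, 2.0, 4.0):
    tail, bound = markov_bound_check(sample, a)
    # Markov's inequality holds for the empirical distribution as well,
    # since mean >= a * (fraction of the sample that is >= a).
    assert tail <= bound
```

Note that the assertion holds exactly, not just approximately, because Markov's inequality applies to the empirical distribution of any nonnegative sample.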
Example
• X ~ U(0,4)

Chebyshev's inequality
• X is a rv with mean µ and variance σ². For k > 0:
  P(|X − µ| ≥ k) ≤ σ²/k²
• Prf. ...

Example
• X ~ U(0,4)

Weak Law of Large Numbers (WLLN)
• X1, ..., Xn are i.i.d. rvs with mean µ and variance σ². For ε > 0:
  lim(n→∞) P(|Mn − µ| ≥ ε) = 0, where Mn = (X1 + ... + Xn)/n.
• Prf. ...
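For the U(0,4) example, µ = 2 and σ² = 16/12 = 4/3, and the exact tail P(|X − 2| ≥ k) = 1 − k/2 for 0 < k ≤ 2 can be compared with Chebyshev's bound σ²/k². A minimal sketch (the helper names are mine):

```python
# X ~ U(0,4): mu = 2, sigma^2 = (4 - 0)^2 / 12 = 4/3
MU, VAR = 2.0, 16.0 / 12.0

def chebyshev_bound(k):
    """Chebyshev's upper bound on P(|X - mu| >= k), capped at 1."""
    return min(1.0, VAR / k**2)

def exact_tail(k):
    """Exact P(|X - 2| >= k) for X ~ U(0,4): the mass outside (2 - k, 2 + k)."""
    return max(0.0, 1.0 - k / 2.0)

for k in (0.5, 1.0, 1.5, 2.0):
    assert exact_tail(k) <= chebyshev_bound(k)  # valid, though often loose
```

At k = 1, for instance, the exact tail is 0.5 while the Chebyshev bound is 4/3 (capped at 1), showing how conservative the inequality can be.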
Example
• Probabilities and frequencies
• A: an event
• p = P(A): probability of event A
• Xi = 1 if A occurs, 0 otherwise
• Mn = (X1 + ... + Xn)/n
• E[Xi] = ?
• As n → ∞, Mn → ?

Example: Polling
• p: fraction of voters who support a particular candidate
• n: number of "randomly selected" voters
• P(|Mn − µ| ≥ ε) = ? If ε = 0.1 and n = 100: ?
• If we need P(|Mn − p| ≥ 0.01) < 0.05, n = ?
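One way to answer the last question is Chebyshev's inequality with the worst-case variance p(1 − p) ≤ 1/4: we need 1/(4nε²) ≤ 0.05 with ε = 0.01. A sketch of that calculation (helper names are mine; exact rational arithmetic avoids rounding at the boundary):

```python
import math
from fractions import Fraction

def worst_case_chebyshev(n, eps):
    """Chebyshev bound on P(|Mn - p| >= eps), maximized over p (p(1-p) <= 1/4)."""
    return Fraction(1, 4) / (n * eps**2)

def chebyshev_sample_size(eps, delta):
    """Smallest n for which the worst-case Chebyshev bound is <= delta."""
    return math.ceil(Fraction(1, 4) / (eps**2 * delta))

# With eps = 0.1 and n = 100, the bound is (1/4) / (100 * 0.01) = 1/4.
b = worst_case_chebyshev(100, Fraction(1, 10))  # Fraction(1, 4)

# For P(|Mn - p| >= 0.01) < 0.05, Chebyshev requires n = 50000 voters.
n_needed = chebyshev_sample_size(Fraction(1, 100), Fraction(1, 20))
```

This bound is conservative; the CLT approximation below gives a much smaller sample size for the same guarantee.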
Central Limit Theorem (CLT)
• X1, ..., Xn are independent identically distributed (iid) rvs with E(Xi) = µ and var(Xi) = σ². Then:
  Zn = (Sn − nµ)/(σ√n), E[Zn] = ? and var(Zn) = ?

Central Limit Theorem (CLT)
• X1, ..., Xn are independent identically distributed (iid) rvs with E(Xi) = µ and var(Xi) = σ². Then:
  Zn = (Sn − nµ)/(σ√n) = (X1 + ... + Xn − nµ)/(σ√n) → Z ~ N(0,1) as n → ∞,
  that is,
  P(Zn ≤ a) → (1/√(2π)) ∫₋∞^a e^(−x²/2) dx.
• Normal approximation by CLT...
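A simulation sketch of the statement above (my illustration, assuming Xi ~ U(0,1) with µ = 1/2 and σ² = 1/12): the empirical distribution of the standardized sum Zn is compared against the standard normal CDF Φ.

```python
import math, random

def phi(a):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

def standardized_sum(n, rng):
    """Zn = (Sn - n*mu) / (sigma*sqrt(n)) for Xi ~ U(0,1): mu = 1/2, sigma^2 = 1/12."""
    s = sum(rng.random() for _ in range(n))
    return (s - n * 0.5) / math.sqrt(n / 12.0)

rng = random.Random(0)
zs = [standardized_sum(30, rng) for _ in range(20_000)]
for a in (-1.0, 0.0, 1.0):
    empirical = sum(1 for z in zs if z <= a) / len(zs)
    # Even at n = 30, P(Zn <= a) is already close to Phi(a).
    assert abs(empirical - phi(a)) < 0.02
```

Sums of uniforms converge to normality very quickly, which is why n = 30 already matches Φ to within simulation noise here.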
Example
• 100 packages on a plane
• Wk: weight of kth package
• Wk ~ iid with U(5,50)
  (iid: independent and identically distributed)
• Approximate the probability that the total weight will exceed 3000.

Example
• A machine processes parts one at a time
• Pk: processing time of kth part
• Pk ~ iid with U(1,5)
• Approximate the probability that the number of parts processed within 320 time units, denote it by N320, is at least 100.
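Both questions reduce to the CLT approximation of a sum of 100 iid uniforms; for the second, note that {N320 ≥ 100} is the same event as {P1 + ... + P100 ≤ 320}. A sketch of the two calculations (helper names are mine):

```python
import math

def phi(a):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

def normal_tail_of_sum(n, mu, var, threshold):
    """CLT approximation of P(X1 + ... + Xn > threshold)."""
    z = (threshold - n * mu) / math.sqrt(n * var)
    return 1.0 - phi(z)

# Packages: Wk ~ U(5,50), mu = 27.5, var = 45^2/12; want P(S100 > 3000).
p_weight = normal_tail_of_sum(100, 27.5, 45**2 / 12, 3000)  # roughly 0.027

# Parts: Pk ~ U(1,5), mu = 3, var = 16/12; {N320 >= 100} = {P1+...+P100 <= 320}.
p_parts = 1.0 - normal_tail_of_sum(100, 3.0, 16 / 12, 320)  # roughly 0.958
```

The package sum has mean 2750 and standard deviation √16875 ≈ 129.9, so 3000 sits about 1.92 standard deviations above the mean.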
Normal approximation to Binomial (DeMoivre-Laplace theorem)
• Prop: Let X be a binomial rv based on n trials with success probability p. Then P(X ≤ x) = B(x; n, p) ≈ the area under the normal curve to the left of x + 0.5:
  P{X ≤ x} ≅ Φ((x + 0.5 − np)/√(np(1 − p)))
• or equivalently:
  P{k ≤ X ≤ l} ≅ Φ((l + 0.5 − np)/√(np(1 − p))) − Φ((k − 0.5 − np)/√(np(1 − p)))

Example: Polling
• p: fraction of voters who support a particular candidate
• n: number of "randomly selected" voters
• If ε = 0.1 and n = 100: P(|Mn − µ| ≥ ε) = ?
• If we need P(|Mn − p| ≥ 0.01) < 0.05, n = ?

Example
• X ~ binomial(36, 0.5)
• P(X ≤ 21) = 0.8785
• P(X = 19) = 0.1251
• Approximations?

Strong Law of Large Numbers (SLLN)
• X1, ..., Xn are independent identically distributed (iid) rvs with E(Xi) = µ. Then:
  (X1 + ... + Xn)/n → µ as n → ∞ with probability 1,
• or equivalently:
  P(lim(n→∞) (X1 + ... + Xn)/n = µ) = 1.
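The binomial(36, 0.5) example can be checked against the DeMoivre-Laplace formulas above; with np = 18 and √(np(1 − p)) = 3, the continuity-corrected approximations land very close to the exact values quoted on the slide. A sketch (helper names are mine):

```python
import math

def phi(a):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

def binom_cdf_approx(x, n, p):
    """DeMoivre-Laplace: P(X <= x) with the +0.5 continuity correction."""
    return phi((x + 0.5 - n * p) / math.sqrt(n * p * (1 - p)))

def binom_pmf_approx(k, n, p):
    """P(X = k) approximated as P(X <= k) - P(X <= k - 1)."""
    return binom_cdf_approx(k, n, p) - binom_cdf_approx(k - 1, n, p)

# X ~ binomial(36, 0.5); the slide's exact values: P(X <= 21) = 0.8785, P(X = 19) = 0.1251
approx_cdf = binom_cdf_approx(21, 36, 0.5)  # roughly 0.878
approx_pmf = binom_pmf_approx(19, 36, 0.5)  # roughly 0.125
```

Without the continuity correction the CDF approximation would use Φ(1) ≈ 0.841 instead of Φ(1.1667) ≈ 0.878, noticeably worse against the exact 0.8785.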
Example: Discrete time arrival process
• Ik = {2^k, 2^k + 1, ..., 2^(k+1) − 1}
• length(Ik) = 2^k
• During each interval Ik there is exactly one arrival; all times within an interval are equally likely.
• Arrival times within different intervals are independent.
• Yn = 1 if there is an arrival at time n, 0 otherwise
• lim(n→∞) P(Yn ≠ 0) = ?
• P(lim(n→∞) Yn ≠ 0) = ?

Example
• X1, X2, ...: a sequence of iid rvs, each with U(0,1) distribution
• Yn = min{X1, ..., Xn}
• Show that Yn → 0 in probability: lim(n→∞) P(Yn ≥ ε) = ? for ε > 0
• Show that Yn → 0 with probability 1: P(lim(n→∞) Yn ≠ 0) = ?
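For the min-of-uniforms example, Yn ≥ ε exactly when all n draws exceed ε, so P(Yn ≥ ε) = (1 − ε)^n → 0. The sketch below (my illustration) checks this exact tail against simulation:

```python
import random

def tail_exact(n, eps):
    """P(Yn >= eps) for Yn = min{X1,...,Xn}, Xi iid U(0,1): all n draws exceed eps."""
    return (1.0 - eps) ** n

rng = random.Random(0)
trials = 4_000
for n in (10, 50, 200):
    hits = sum(1 for _ in range(trials)
               if min(rng.random() for _ in range(n)) >= 0.05)
    # Empirical frequency matches (1 - eps)^n up to simulation noise.
    assert abs(hits / trials - tail_exact(n, 0.05)) < 0.03
```

Since Yn is nonincreasing in n, this geometric decay also delivers convergence with probability 1, not just in probability.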