204312 PROBABILITY AND RANDOM PROCESSES FOR COMPUTER ENGINEERS
Lecture 9: Chapters 6.1-6.5
1st Semester, 2007
Monchai Sopitkamon, Ph.D.

Outline: Chapter 6 Sums of RVs
Expected Values of Sums (6.1, Y&G)
PDF of the Sum of Two RVs (6.2, Y&G)
Moment Generating Functions (6.3, Y&G)
MGF of the Sums of Independent RVs (6.4, Y&G)
Random Sums of Independent RVs (6.5, Y&G)
Expected Values of Sums I (6.1)
The sum of RVs, X1 + ⋯ + Xn, appears in many probability theory applications.
Deriving the probability model of the sum is more difficult than analyzing the RVs themselves.
First, we are interested in the expected value of the sum rather than the sum itself.
For any set of RVs X1, …, Xn, the expected value of Wn = X1 + ⋯ + Xn is

$$E(W_n) = E(X_1) + E(X_2) + \cdots + E(X_n)$$
In other words, the expected value of the sum equals the sum of the expected values, whether or not X1, …, Xn are independent.

Expected Values of Sums II
For the variance of Wn, we have:
$$\mathrm{Var}(W_n) = \sum_{i=1}^{n} \mathrm{Var}(X_i) + 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \mathrm{Cov}(X_i, X_j)$$
When X1, …, Xn are uncorrelated, Cov(Xi, Xj) = 0 for i ≠ j, and the variance of the sum equals the sum of the variances:

$$\mathrm{Var}(W_n) = \mathrm{Var}(X_1) + \cdots + \mathrm{Var}(X_n)$$
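A minimal numerical check of these two identities, assuming NumPy; the three distributions below are illustrative choices, not from the slides:

```python
import numpy as np

# Sketch: check E(W) = sum of E(X_i) and, for uncorrelated (here independent)
# X_i, Var(W) = sum of Var(X_i), by simulation.
rng = np.random.default_rng(0)
n_samples = 1_000_000

x1 = rng.exponential(scale=2.0, size=n_samples)       # E = 2,  Var = 4
x2 = rng.uniform(0.0, 6.0, size=n_samples)            # E = 3,  Var = 3
x3 = rng.normal(loc=-1.0, scale=2.0, size=n_samples)  # E = -1, Var = 4

w = x1 + x2 + x3
print(w.mean())  # ~ 2 + 3 - 1 = 4
print(w.var())   # ~ 4 + 3 + 4 = 11
```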
Expected Values of Sums III
Ex. 6.1: X0, X1, X2, … is a sequence of RVs with expected values E(Xi) = 0 and covariances Cov(Xi, Xj) = 0.8^|i−j|. Find the expected value and variance of the RV Yi defined as the sum of three consecutive values of the random sequence: Yi = Xi + Xi−1 + Xi−2.
From the expectation equation,

$$E(Y_i) = E(X_i) + E(X_{i-1}) + E(X_{i-2}) = 0$$

Applying the variance equation, we obtain for each i:

$$\begin{aligned}
\mathrm{Var}(Y_i) &= \mathrm{Var}(X_i) + \mathrm{Var}(X_{i-1}) + \mathrm{Var}(X_{i-2}) + 2\,\mathrm{Cov}(X_i, X_{i-1}) + 2\,\mathrm{Cov}(X_i, X_{i-2}) + 2\,\mathrm{Cov}(X_{i-1}, X_{i-2}) \\
&= 3 \times 0.8^0 + 2 \times 0.8^1 + 2 \times 0.8^2 + 2 \times 0.8^1 \\
&= 3 \times 1 + 4 \times 0.8^1 + 2 \times 0.8^2 = 7.48
\end{aligned}$$
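As a Python sketch, the same number falls out mechanically from the covariance double sum of the variance formula (the helper `cov` is ours, not from the text):

```python
# Sketch: Var(Y_i) = sum_i Var(X_i) + 2 * sum_{i<j} Cov(X_i, X_j),
# with Cov(X_i, X_j) = 0.8 ** |i - j| for the three consecutive terms.
def cov(a, b):
    return 0.8 ** abs(a - b)

terms = [0, 1, 2]  # offsets of X_i, X_{i-1}, X_{i-2}
var_y = sum(cov(t, t) for t in terms)
var_y += 2 * sum(cov(terms[a], terms[b])
                 for a in range(len(terms))
                 for b in range(a + 1, len(terms)))
print(var_y)  # ~ 7.48
```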
PDF of the Sum of Two RVs I (6.2)

To find the PDF of the sum W = X + Y, first find the CDF FW(w) by integrating the joint PDF fX,Y(x, y) over the region X + Y ≤ w.
The PDF of W = X + Y is

$$f_W(w) = \int_{-\infty}^{\infty} f_{X,Y}(x, w-x)\,dx = \int_{-\infty}^{\infty} f_{X,Y}(w-y, y)\,dy$$

[Figure: the region X + Y ≤ w in the (X, Y) plane.]
PDF of the Sum of Two RVs II

Ex. 6.4: Find the PDF of W = X + Y when X and Y have the joint PDF

$$f_{X,Y}(x, y) = \begin{cases} 2 & 0 \le y \le 1,\ 0 \le x \le 1,\ x + y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

[Figure: the triangular support of fX,Y with the line y = w − x.]

Since 0 ≤ X + Y = W ≤ 1, fW(w) = 0 for w < 0 or w > 1. For 0 ≤ w ≤ 1,

$$f_W(w) = \int_0^w 2\,dx = 2w, \qquad 0 \le w \le 1$$

so

$$f_W(w) = \begin{cases} 2w & 0 \le w \le 1, \\ 0 & \text{otherwise.} \end{cases}$$
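A Monte Carlo sketch of this result, assuming NumPy; the reflection trick for sampling the triangle uniformly is our choice of method, not from the slides:

```python
import numpy as np

# Sketch: sample (X, Y) uniformly on the triangle x, y >= 0, x + y <= 1
# (density 2), then compare a histogram of W = X + Y against f_W(w) = 2w.
rng = np.random.default_rng(1)
u = rng.uniform(size=500_000)
v = rng.uniform(size=500_000)
flip = u + v > 1              # reflect points that land outside the triangle
x = np.where(flip, 1 - u, u)
y = np.where(flip, 1 - v, v)

w = x + y
hist, edges = np.histogram(w, bins=10, range=(0, 1), density=True)
centers = (edges[:-1] + edges[1:]) / 2
for c, h in zip(centers, hist):
    print(f"w={c:.2f}  empirical={h:.2f}  theory={2*c:.2f}")
```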
PDF of the Sum of Two RVs III

When X and Y are independent RVs, the PDF of W = X + Y is

$$f_W(w) = \int_{-\infty}^{\infty} f_X(w-y)\, f_Y(y)\,dy = \int_{-\infty}^{\infty} f_X(x)\, f_Y(w-x)\,dx$$

since the joint PDF factors as fX,Y(x, y) = fX(x) fY(y).
The combination of two univariate functions, fX(⋅) and fY(⋅), to produce a third function, fW(⋅), is called convolution.
When X and Y are independent integer-valued discrete RVs, the PMF of W = X + Y is a convolution:

$$P_W(w) = \sum_{k=-\infty}^{\infty} P_X(k)\, P_Y(w-k)$$
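As a sketch, `np.convolve` computes exactly this discrete convolution; the two PMFs below are arbitrary illustrations, not from the text:

```python
import numpy as np

# Sketch: for independent integer-valued RVs, the PMF of W = X + Y is the
# convolution of the PMFs. Both example PMFs start at value 0, so W does too.
px = np.array([0.25, 0.5, 0.25])  # X in {0, 1, 2}
py = np.array([0.5, 0.5])         # Y in {0, 1}

pw = np.convolve(px, py)          # W in {0, 1, 2, 3}
print(pw)        # [0.125 0.375 0.375 0.125]
print(pw.sum())  # 1.0
```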
PDF of the Sum of Two RVs IV

Ex. Quiz 6.2: Let X and Y be independent exponential RVs with expected values E(X) = 1/3 and E(Y) = 1/2. Find the PDF of W = X + Y.
Since, for any exponential RV X, E(X) = 1/λ, we have λ = 1/E(X). Therefore, RVs X and Y have PDFs

$$f_X(x) = \begin{cases} 3e^{-3x} & x \ge 0 \\ 0 & \text{otherwise} \end{cases} \qquad f_Y(y) = \begin{cases} 2e^{-2y} & y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$

PDF of the Sum of Two RVs V

Since X and Y are nonnegative, W = X + Y is nonnegative. The PDF of W = X + Y is

$$f_W(w) = \int_{-\infty}^{\infty} f_X(w-y)\, f_Y(y)\,dy = 6 \int_0^w e^{-3(w-y)}\, e^{-2y}\,dy$$

so

$$f_W(w) = 6 e^{-3w} \left[ e^{y} \right]_0^w = 6\left(e^{-2w} - e^{-3w}\right)$$

Since fW(w) = 0 for w < 0, a complete expression for the PDF of W is

$$f_W(w) = \begin{cases} 6e^{-2w}\left(1 - e^{-w}\right) & w \ge 0, \\ 0 & \text{otherwise.} \end{cases}$$
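A quick simulation check of this PDF, assuming NumPy (note that NumPy parameterizes exponentials by the scale 1/λ):

```python
import numpy as np

# Sketch: simulate W = X + Y for independent exponentials with E(X) = 1/3
# and E(Y) = 1/2, and compare against f_W(w) = 6(e^{-2w} - e^{-3w}).
rng = np.random.default_rng(2)
n = 1_000_000
w = rng.exponential(1/3, n) + rng.exponential(1/2, n)

hist, edges = np.histogram(w, bins=8, range=(0, 2), density=True)
centers = (edges[:-1] + edges[1:]) / 2
theory = 6 * (np.exp(-2 * centers) - np.exp(-3 * centers))
for c, h, t in zip(centers, hist, theory):
    print(f"w={c:.2f}  empirical={h:.3f}  theory={t:.3f}")
```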
Moment Generating Functions (6.3)
The PDF of the sum of independent RVs X1, …, Xn is a sequence of convolutions involving the PDFs f_{X_1}(x), f_{X_2}(x), …, f_{X_n}(x).
In probability theory, we use transform methods to replace the convolution of PDFs by multiplication of transforms.
This transform of a PDF or PMF is called the moment generating function (MGF).
For a RV X, the MGF of X is

$$\phi_X(s) = E\left(e^{sX}\right)$$

For a continuous RV X,

$$\phi_X(s) = \int_{-\infty}^{\infty} e^{sx} f_X(x)\,dx$$

For a discrete RV X,

$$\phi_X(s) = \sum_{x_i \in S_X} e^{s x_i} P_X(x_i)$$
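As a SymPy sketch, the definition can be applied directly; here we derive the exponential MGF that reappears in Ex. 6.5 (the integral converges for s < λ, a condition we sidestep with `conds='none'`):

```python
import sympy as sp

# Sketch: compute the MGF of an exponential(lambda) RV straight from the
# definition phi_X(s) = integral of e^{sx} f_X(x) dx over x >= 0.
s, x = sp.symbols('s x')
lam = sp.symbols('lambda', positive=True)

f_X = lam * sp.exp(-lam * x)                 # exponential PDF for x >= 0
phi_X = sp.integrate(sp.exp(s * x) * f_X, (x, 0, sp.oo), conds='none')
print(sp.simplify(phi_X))                    # lambda/(lambda - s)
```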
Moment Generating Functions II

[Table: MGFs for the families of RVs defined in previous chapters. Source: Probability and Stochastic Processes by Roy D. Yates and David J. Goodman, © 2005 John Wiley & Sons, Inc.]
Moment Generating Functions III

A RV X with MGF φX(s) has nth moment

$$E(X^n) = \left.\frac{d^n \phi_X(s)}{ds^n}\right|_{s=0}$$

Ex. 6.5: X is an exponential RV with MGF φX(s) = λ/(λ − s). What are the first and second moments of X? Write a general expression for the nth moment.
The first moment is the expected value:

$$E(X) = \left.\frac{d\phi_X(s)}{ds}\right|_{s=0} = \left.\frac{\lambda}{(\lambda - s)^2}\right|_{s=0} = \frac{1}{\lambda}$$

Moment Generating Functions IV

Ex. 6.5 (cont.): The second moment is the mean square value:

$$E(X^2) = \left.\frac{d^2 \phi_X(s)}{ds^2}\right|_{s=0} = \left.\frac{2\lambda}{(\lambda - s)^3}\right|_{s=0} = \frac{2}{\lambda^2}$$

Proceeding in this way, the nth moment of X is

$$E(X^n) = \left.\frac{d^n \phi_X(s)}{ds^n}\right|_{s=0} = \left.\frac{n!\,\lambda}{(\lambda - s)^{n+1}}\right|_{s=0} = \frac{n!}{\lambda^n}$$
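A SymPy check of Ex. 6.5, differentiating the MGF n times and evaluating at s = 0 (a sketch, not the text's own derivation):

```python
import sympy as sp

# Sketch: verify E(X^n) = n!/lambda^n by repeated differentiation of
# phi_X(s) = lambda/(lambda - s) at s = 0.
s = sp.symbols('s')
lam = sp.symbols('lambda', positive=True)
phi = lam / (lam - s)

for n in range(1, 4):
    moment = sp.diff(phi, s, n).subs(s, 0)
    print(f"E(X^{n}) =", sp.simplify(moment))  # n!/lambda**n
```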
Moment Generating Functions V

The MGF of a linear transformation of a RV X can be written in terms of φX(s):
The MGF of Y = aX + b is φY(s) = e^{sb} φX(as).
Ex. Quiz 6.3: RV K has PMF

$$P_K(k) = \begin{cases} 0.2 & k = 0, \ldots, 4, \\ 0 & \text{otherwise} \end{cases}$$

Use the MGF φK(s) to find the first, second, third and fourth moments of K.

Moment Generating Functions VI

The MGF of K is

$$\phi_K(s) = 0.2\left(1 + e^s + e^{2s} + e^{3s} + e^{4s}\right)$$

We find the moments by taking derivatives. The first derivative of φK(s) is

$$\frac{d\phi_K(s)}{ds} = 0.2\left(e^s + 2e^{2s} + 3e^{3s} + 4e^{4s}\right)$$

Evaluating the derivative at s = 0 yields the first moment,

$$E(K) = 0.2(1 + 2 + 3 + 4) = 2$$
Moment Generating Functions VII

To find higher-order moments, we continue to take derivatives:

$$E(K^2) = \left.\frac{d^2\phi_K(s)}{ds^2}\right|_{s=0} = 0.2(1 + 4 + 9 + 16) = 6$$

$$E(K^3) = \left.\frac{d^3\phi_K(s)}{ds^3}\right|_{s=0} = 0.2(1 + 8 + 27 + 64) = 20$$

$$E(K^4) = \left.\frac{d^4\phi_K(s)}{ds^4}\right|_{s=0} = 0.2(1 + 16 + 81 + 256) = 70.8$$
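The whole quiz can be checked in a few lines of SymPy (a sketch; the moment values match those above):

```python
import sympy as sp

# Sketch: build phi_K(s) = 0.2 * sum_{k=0}^{4} e^{sk} and read off the first
# four moments by repeated differentiation at s = 0.
s = sp.symbols('s')
phi_K = sp.Rational(1, 5) * sum(sp.exp(s * k) for k in range(5))

for n in range(1, 5):
    print(f"E(K^{n}) =", sp.diff(phi_K, s, n).subs(s, 0))
# E(K^1) = 2, E(K^2) = 6, E(K^3) = 20, E(K^4) = 354/5
```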
MGF of the Sums of Independent RVs (6.4)
MGFs are useful for computing sums of independent RVs, because if X and Y are independent, the MGF of W = X + Y is the product

$$\phi_W(s) = E\left(e^{s(X+Y)}\right) = E\left(e^{sX}\right) E\left(e^{sY}\right) = \phi_X(s)\,\phi_Y(s)$$

For a set of independent RVs X1, …, Xn, the moment generating function of W = X1 + ⋯ + Xn is

$$\phi_W(s) = \phi_{X_1}(s)\,\phi_{X_2}(s) \cdots \phi_{X_n}(s)$$

When X1, …, Xn are iid, each with MGF φ_{X_i}(s) = φX(s),

$$\phi_W(s) = \left[\phi_X(s)\right]^n$$
MGF of the Sums of Independent RVs II
Ex. 6.6: J and K are independent RVs with PMFs

$$P_J(j) = \begin{cases} 0.2 & j = 1, \\ 0.6 & j = 2, \\ 0.2 & j = 3, \\ 0 & \text{otherwise;} \end{cases} \qquad P_K(k) = \begin{cases} 0.5 & k = -1, \\ 0.5 & k = 1, \\ 0 & \text{otherwise.} \end{cases}$$

Find the MGF of M = J + K. What are E(M^3) and PM(m)?
J and K have moment generating functions

$$\phi_J(s) = 0.2e^s + 0.6e^{2s} + 0.2e^{3s}, \qquad \phi_K(s) = 0.5e^{-s} + 0.5e^s$$

Since J and K are independent, M = J + K has MGF

$$\phi_M(s) = \phi_J(s)\,\phi_K(s) = \left(0.2e^s + 0.6e^{2s} + 0.2e^{3s}\right)\left(0.5e^{-s} + 0.5e^s\right) = 0.1 + 0.3e^s + 0.2e^{2s} + 0.3e^{3s} + 0.1e^{4s}$$
MGF of the Sums of Independent RVs III

To find E(M^3), the third moment of M, we differentiate φM(s) three times:

$$E(M^3) = \left.\frac{d^3\phi_M(s)}{ds^3}\right|_{s=0} = \left.\left(0.3e^s + 0.2\,(2^3)\,e^{2s} + 0.3\,(3^3)\,e^{3s} + 0.1\,(4^3)\,e^{4s}\right)\right|_{s=0} = 16.4$$

The value of PM(m) at any m is the coefficient of e^{ms} in φM(s):

$$\phi_M(s) = E\left(e^{sM}\right) = \underbrace{0.1}_{P_M(0)} + \underbrace{0.3}_{P_M(1)}e^s + \underbrace{0.2}_{P_M(2)}e^{2s} + \underbrace{0.3}_{P_M(3)}e^{3s} + \underbrace{0.1}_{P_M(4)}e^{4s}$$

The complete expression for the PMF of M is

$$P_M(m) = \begin{cases} 0.1 & m = 0, 4, \\ 0.3 & m = 1, 3, \\ 0.2 & m = 2, \\ 0 & \text{otherwise.} \end{cases}$$
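As a numerical sketch of Ex. 6.6 (assuming NumPy), the PMF of M also falls out of a discrete convolution, and E(M^3) follows from the PMF:

```python
import numpy as np

# Sketch: convolve the PMFs of J (on {1, 2, 3}) and K (on {-1, 1}) to get the
# PMF of M = J + K on {0, ..., 4}; then compute the third moment from it.
pj = np.array([0.2, 0.6, 0.2])  # P_J over j = 1, 2, 3
pk = np.array([0.5, 0.0, 0.5])  # P_K over k = -1, 0, 1

pm = np.convolve(pj, pk)        # P_M over m = 0, 1, 2, 3, 4
m = np.arange(0, 5)
print(pm)                       # [0.1 0.3 0.2 0.3 0.1]
print((m**3 * pm).sum())        # ~ 16.4
```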
MGF of the Sums of Independent RVs IV
Other properties of sums of independent RVs:
- If X1, …, Xn are independent Poisson RVs, W = X1 + ⋯ + Xn is a Poisson RV. [The sum of independent Poisson RVs is a Poisson RV.]
- The sum of n independent Gaussian RVs, W = X1 + ⋯ + Xn, is a Gaussian RV.
- If X1, …, Xn are iid exponential (λ) RVs, then W = X1 + ⋯ + Xn has the Erlang PDF (see the numerical sketch after this list)

$$f_W(w) = \begin{cases} \dfrac{\lambda^n w^{n-1} e^{-\lambda w}}{(n-1)!} & w \ge 0, \\ 0 & \text{otherwise.} \end{cases}$$
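A simulation sketch of the Erlang property, assuming NumPy; n = 4 and λ = 2 are arbitrary choices:

```python
import numpy as np
from math import factorial

# Sketch: the sum of n iid exponential(lam) RVs should follow the Erlang PDF
# f_W(w) = lam^n w^{n-1} e^{-lam w} / (n-1)!. Compare against a histogram.
rng = np.random.default_rng(3)
n, lam = 4, 2.0
w = rng.exponential(1 / lam, size=(500_000, n)).sum(axis=1)

hist, edges = np.histogram(w, bins=6, range=(0, 6), density=True)
centers = (edges[:-1] + edges[1:]) / 2
theory = lam**n * centers**(n - 1) * np.exp(-lam * centers) / factorial(n - 1)
for c, h, t in zip(centers, hist, theory):
    print(f"w={c:.1f}  empirical={h:.3f}  theory={t:.3f}")
```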
Random Sums of Independent RVs (6.5)
Sometimes the number of terms in the sum of iid RVs is also a RV.
The resultant RV, R, is a random sum of iid RVs.
Given a RV N and a sequence of iid RVs X1, X2, …, let R = X1 + ⋯ + XN.

Random Sums of Independent RVs II

For example, if N buses arrive at a bus stop during a minute and the ith bus carries Ki people, the total number of people arriving is R = K1 + ⋯ + KN.
Or, if N packets are transmitted over a LAN segment in 1 minute, the number of successfully decoded packets in this time span is R = X1 + ⋯ + XN, where Xi is 1 if the ith packet is decoded correctly and 0 otherwise. Since N is random, R is not a usual binomial RV.
Random Sums of Independent RVs III
Ex. 6.12: Let {X1, X2, …} be a collection of iid RVs, each with MGF φX(s), and let N be a nonnegative integer-valued RV that is independent of {X1, X2, …}. The random sum R = X1 + ⋯ + XN has moment generating function

$$\phi_R(s) = \phi_N\left(\ln \phi_X(s)\right)$$
Ex. 6.9: The number of pages N in a fax transmission has a geometric PMF with expected value 1/q = 4. The number of bits K in a fax page also has a geometric distribution, with expected value 1/p = 10^5 bits, independent of the number of bits in any other page and of the number of pages. Find the MGF of B, the total number of bits in a fax transmission.
When the ith page has Ki bits, the total number of bits is the random sum B = K1 + ⋯ + KN. Thus φB(s) = φN(ln φK(s)).
From the table, we get

$$\phi_N(s) = \frac{q e^s}{1 - (1-q)e^s}, \qquad \phi_K(s) = \frac{p e^s}{1 - (1-p)e^s}$$

To compute φB(s), we replace s with ln φK(s) everywhere s appears in φN(s). Equivalently, we can replace e^s with φK(s) for every e^s in φN(s). This yields

$$\phi_B(s) = \frac{q\left(\dfrac{p e^s}{1 - (1-p)e^s}\right)}{1 - (1-q)\left(\dfrac{p e^s}{1 - (1-p)e^s}\right)} = \frac{pq\, e^s}{1 - (1-pq)e^s}$$
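The algebra can be verified with SymPy (a sketch; we substitute φK(s) for e^s exactly as described above):

```python
import sympy as sp

# Sketch: substituting phi_K(s) for e^s inside phi_N(s) should simplify
# to the MGF of a geometric(pq) RV.
s = sp.symbols('s')
p, q = sp.symbols('p q', positive=True)

es = sp.exp(s)
phi_K = p * es / (1 - (1 - p) * es)
phi_N = q * es / (1 - (1 - q) * es)

phi_B = phi_N.subs(es, phi_K)       # replace every e^s in phi_N by phi_K(s)
target = p * q * es / (1 - (1 - p * q) * es)
print(sp.simplify(phi_B - target))  # should print 0
```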
Random Sums of Independent RVs IV

By comparing φK(s) (or φN(s)) with φB(s), we see that B has the MGF of a geometric (pq = 1/(4×10^5)) RV with expected value 1/(pq) = 400,000 bits. Therefore, B has the geometric PMF (referring to the same table)

$$P_B(b) = \begin{cases} pq\,(1 - pq)^{b-1} & b = 1, 2, \ldots, \\ 0 & \text{otherwise.} \end{cases}$$

Random Sums of Independent RVs V

For the random sum of iid RVs R = X1 + ⋯ + XN,

$$E(R) = E(N)E(X)$$

$$\mathrm{Var}(R) = \underbrace{E(N)\,\mathrm{Var}(X)}_{\text{randomness of } X} + \underbrace{\mathrm{Var}(N)\,\left(E(X)\right)^2}_{\text{randomness of } N}$$
Random Sums of Independent RVs VI

Suppose N is deterministic: N = n, so μN = n and Var(N) = 0. Then the random sum R is the deterministic sum R = X1 + ⋯ + Xn, and Var(R) = n Var(X).
Suppose instead N is random but each Xi is a deterministic constant x. Then μX = x and Var(X) = 0, the random sum becomes R = Nx, and Var(R) = x^2 Var(N).
Note that N must be independent of the RVs X1, X2, …. That is, the number of terms in the random sum cannot depend on the actual values of the terms in the sum.
Ex. 6.10: Let X1, X2, … be a sequence of independent Gaussian (100, 10) RVs. If K is a Poisson (1) RV independent of X1, X2, …, find the expected value and variance of R = X1 + ⋯ + XK.
The Gaussian (100, 10) RVs have E(X) = 100, and the expected value E(K) of a Poisson RV equals λ = 1, so E(R) = E(K)E(X) = 100.
From Var(R) = E(N)Var(X) + Var(N)(E(X))^2 = E(K)Var(X) + Var(K)(E(X))^2, and since the variance Var(K) of a Poisson RV also equals λ = 1,
Var(R) = 1 × 10^2 + 1 × 100^2 = 100 + 10,000 = 10,100.
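A simulation sketch of Ex. 6.10, assuming NumPy; the Python loop over trials is for clarity, not speed:

```python
import numpy as np

# Sketch: K ~ Poisson(1) chooses how many Gaussian(100, 10) terms to add;
# the sample mean and variance of R should approach 100 and 10,100.
rng = np.random.default_rng(4)
trials = 200_000

k = rng.poisson(1.0, size=trials)
r = np.array([rng.normal(100.0, 10.0, size=n).sum() for n in k])
print(r.mean())  # ~ 100
print(r.var())   # ~ 10,100
```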
HW 6

Problems 6.1.3, 6.2.1, 6.3.2, 6.4.2, 6.5.1