Expected value of g(X, Y)

E(g(X, Y)) = Σ_x Σ_y g(x, y) p(x, y) if X and Y are discrete,
or E(g(X, Y)) = ∫∫ g(x, y) f(x, y) dx dy if X and Y are continuous.
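As a concrete illustration, a minimal Python sketch of the discrete formula; the joint pmf and the choice of g here are made up:

```python
# E(g(X, Y)) in the discrete case: sum g(x, y) * p(x, y) over the support.
p = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}  # hypothetical joint pmf

g = lambda x, y: (x + y) ** 2
Eg = sum(g(x, y) * prob for (x, y), prob in p.items())
print(Eg)  # 0.2*0 + 0.3*1 + 0.1*1 + 0.4*4 = 2.0
```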
Theorem: Let X and Y be two r.v.s and let a and b be constants. Then
E(aX + bY) = aE(X) + bE(Y), provided E(X) and E(Y) are finite.
Corollary: Let W1 , W2 , · · · , Wn be any r.v.s for which
E (Wi ) < ∞, i = 1, · · · , n and let a1 , a2 , · · · , an be constants. Then
E (a1 W1 + a2 W2 + · · · + an Wn ) =
a1 E (W1 ) + a2 E (W2 ) + · · · + an E (Wn ).
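Note that linearity requires no independence. A quick Monte Carlo sketch (numpy assumed; the dependent pair below is made up for illustration):

```python
# Linearity of expectation holds even for strongly dependent X and Y.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = x**2 + rng.normal(size=x.size)   # y depends strongly on x
a, b = 2.0, 5.0
print((a * x + b * y).mean(), a * x.mean() + b * y.mean())  # agree up to noise
```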
Example
3.9.4. A secretary puts n letters into n envelopes handed to her at random. How many people, on average, will receive their correct mail?
Let Xi be the indicator that the ith envelope receives its correct letter.
Then P(Xi = 1) = 1/n, P(Xi = 0) = 1 − 1/n, and E(Xi) = 1/n.
Let X be the number of correctly stuffed envelopes; then X = Σ_{i=1}^n Xi.
So E(X) = Σ_{i=1}^n E(Xi) = n · (1/n) = 1.
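A Monte Carlo sketch of this example (numpy assumed; n = 10 is an arbitrary choice — the answer is 1 for any n):

```python
# Shuffle n letters many times and count fixed points; average should be ~1.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 10, 100_000
perms = rng.permuted(np.tile(np.arange(n), (reps, 1)), axis=1)
matches = (perms == np.arange(n)).sum(axis=1)
print(matches.mean())  # ≈ 1
```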
E(XY)

E(XY) = Σ_x Σ_y xy p(x, y) if X and Y are discrete,
or E(XY) = ∫∫ xy f(x, y) dx dy if X and Y are continuous.
Theorem: If X and Y are independent, E(XY) = E(X)E(Y).
Proof (discrete case): E(XY) = Σ_x Σ_y xy p(x, y) = Σ_x Σ_y xy pX(x) pY(y) = (Σ_x x pX(x)) (Σ_y y pY(y)) = E(X)E(Y).
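A small numeric sketch of the theorem in the discrete case; the marginal pmfs below are made up:

```python
# For an independent pair, the joint pmf factors as pX(x) * pY(y),
# so E(XY) computed from the joint pmf equals E(X) * E(Y).
pX = {1: 0.2, 2: 0.5, 3: 0.3}
pY = {0: 0.6, 4: 0.4}

EXY = sum(x * y * px * py for x, px in pX.items() for y, py in pY.items())
EX = sum(x * px for x, px in pX.items())
EY = sum(y * py for y, py in pY.items())
print(EXY, EX * EY)  # both 3.36
```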
Exercises
3.9.1. Suppose r chips are drawn with replacement from an urn containing n chips numbered 1 to n. Let V be the sum of the numbers drawn. Find E(V).
3.9.3. Suppose that fX,Y(x, y) = (2/3)(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1. Find E(X + Y).
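For checking an answer to 3.9.1, a hedged simulation sketch (numpy assumed; n = 10 and r = 4 are arbitrary choices):

```python
# Draw r chips with replacement from 1..n, sum them, and average over many runs.
import numpy as np

rng = np.random.default_rng(2)
n, r, reps = 10, 4, 200_000
V = rng.integers(1, n + 1, size=(reps, r)).sum(axis=1)
print(V.mean())  # compare with your closed-form answer for E(V)
```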
Variance of a sum of r.v.s
Definition of covariance:
Cov(X, Y) = E[(X − µX)(Y − µY)] = E(XY) − µX µY.
Example
Given the joint pdf
f(x, y) = 24xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, x + y ≤ 1, and 0 otherwise,
fX(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^{1−x} 24xy dy = 12x(1 − x)², 0 ≤ x ≤ 1.
Similarly, fY(y) = 12y(1 − y)², 0 ≤ y ≤ 1, and µX = µY = 2/5.
E(XY) = ∫_0^1 ∫_0^{1−x} xy · 24xy dy dx = 8 ∫_0^1 x²(1 − x)³ dx = 2/15.
Thus Cov(X, Y) = 2/15 − (2/5)(2/5) = −2/75.
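A numerical cross-check of these moments (scipy assumed available):

```python
# Integrate over the triangle 0 <= x <= 1, 0 <= y <= 1 - x with dblquad.
from scipy.integrate import dblquad

f = lambda y, x: 24 * x * y  # joint pdf; dblquad integrates over y first

EX, _ = dblquad(lambda y, x: x * f(y, x), 0, 1, lambda x: 0, lambda x: 1 - x)
EY, _ = dblquad(lambda y, x: y * f(y, x), 0, 1, lambda x: 0, lambda x: 1 - x)
EXY, _ = dblquad(lambda y, x: x * y * f(y, x), 0, 1, lambda x: 0, lambda x: 1 - x)

print(EX, EY)          # both ≈ 0.4 = 2/5
print(EXY - EX * EY)   # ≈ -0.02667 = -2/75
```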
Theorem: Suppose X and Y are r.v.s with finite variances, and a and b are constants. Then
Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y).
Proof: Page 190.
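A quick simulation sketch of this identity (numpy assumed; the bivariate normal parameters and coefficients are arbitrary):

```python
# Check Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y) by Monte Carlo.
import numpy as np

rng = np.random.default_rng(3)
a, b = 2.0, -3.0
cov = [[1.0, 0.5], [0.5, 2.0]]          # Var(X)=1, Var(Y)=2, Cov(X,Y)=0.5
x, y = rng.multivariate_normal([0, 0], cov, size=1_000_000).T

lhs = np.var(a * x + b * y)
rhs = a**2 * np.var(x) + b**2 * np.var(y) + 2 * a * b * np.cov(x, y)[0, 1]
print(lhs, rhs)  # agree up to Monte Carlo error (exact value: 16)
```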
Example
3.9.8. For the joint pdf fX,Y(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, find Var(X + Y).
Note Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y).
fX(x) = ∫_0^1 (x + y) dy = x + 1/2, 0 ≤ x ≤ 1.
µX = E(X) = ∫_0^1 x(x + 1/2) dx = 7/12.
E(X²) = ∫_0^1 x²(x + 1/2) dx = 5/12, and
Var(X) = E(X²) − µX² = 5/12 − (7/12)² = 11/144.
By symmetry, µY = 7/12 and Var(Y) = 11/144.
Next, E(XY) = ∫_0^1 ∫_0^1 xy(x + y) dx dy = 1/3,
so Cov(X, Y) = E(XY) − µX µY = −1/144.
Finally, Var(X + Y) = 2 · 11/144 − 2 · 1/144 = 20/144 = 5/36.
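A numeric cross-check (scipy assumed), computing Var(X + Y) directly as E[(X + Y)²] − (E[X + Y])²:

```python
from scipy.integrate import dblquad

f = lambda y, x: x + y  # joint pdf on the unit square
m1, _ = dblquad(lambda y, x: (x + y) * f(y, x), 0, 1, 0, 1)
m2, _ = dblquad(lambda y, x: (x + y) ** 2 * f(y, x), 0, 1, 0, 1)
print(m2 - m1**2)  # ≈ 0.13889 = 5/36
```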
Variance of a sum of r.v.s
If W1, ···, Wn are r.v.s with finite variances, then
Var(Σ_{i=1}^n ai Wi) = Σ_{i=1}^n ai² Var(Wi) + 2 Σ_{i<j} ai aj Cov(Wi, Wj).
If W1, ···, Wn are independent, then
Var(Σ_{i=1}^n ai Wi) = Σ_{i=1}^n ai² Var(Wi).
In particular,
Var(W1 + W2 + ··· + Wn) = Var(W1) + Var(W2) + ··· + Var(Wn).
Example
3.9.11. Let X̄n be the mean of a random sample of n observations X1, ···, Xn with E(Xi) = µ, Var(Xi) = σ².
Then X̄n = (1/n)X1 + ··· + (1/n)Xn,
and Var(X̄n) = (1/n)² Var(X1) + ··· + (1/n)² Var(Xn) = σ²/n.
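A minimal simulation sketch of this shrinkage (numpy assumed; Exponential(1) draws, so σ² = 1, and n = 25 is an arbitrary choice):

```python
# The variance of the mean of n i.i.d. draws should be close to sigma^2 / n.
import numpy as np

rng = np.random.default_rng(4)
n, reps = 25, 200_000
means = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
print(means.var())  # ≈ 1/25 = 0.04
```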
Exercises
3.9.20. Let X ∼ Binomial(n, pX) and Y ∼ Binomial(m, pY). Assume X and Y are independent. Find E(W) and Var(W), where W = 4X + 6Y.
3.9.16. Let X and Y be r.v.s with f(x, y) = 1, −y ≤ x ≤ y, 0 ≤ y ≤ 1.
Show that Cov(X, Y) = 0 but X and Y are not independent.
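A hedged simulation sketch for 3.9.16 (numpy assumed): the marginal of Y is fY(y) = 2y on [0, 1], and X | Y = y is Uniform(−y, y), which gives a way to sample the pair.

```python
# Empirically: Cov(X, Y) ≈ 0, yet the constraint |X| <= Y makes X and Y dependent.
import numpy as np

rng = np.random.default_rng(5)
y = np.sqrt(rng.uniform(size=1_000_000))   # inverse-cdf sample of fY(y) = 2y
x = rng.uniform(-y, y)                     # conditional uniform on [-y, y]

print(np.cov(x, y)[0, 1])                  # ≈ 0: uncorrelated
# Not independent: |X| <= Y forces P(|X| > 0.5 | Y < 0.5) = 0.
print((np.abs(x) > 0.5).mean(),            # ≈ 0.25 overall
      (np.abs(x[y < 0.5]) > 0.5).mean())   # exactly 0 given Y < 0.5
```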