Math 151 Homework 9 Solutions (Winter 2015)
Problem 7.25
Observe that the event $\{N \ge n\}$ is the same as the event $\{X_1 \ge X_2 \ge \cdots \ge X_{n-1}\}$. Since the $X_i$ are independent, $f_{X_1,\dots,X_{n-1}}(x_1,\dots,x_{n-1}) = f(x_1)\cdots f(x_{n-1})$, where $f$ is the density function of $X_1$. So as in Problem 6.T.11 we get
\[
P(N \ge n) = \int_{-\infty}^{\infty}\int_{-\infty}^{x_1}\cdots\int_{-\infty}^{x_{n-2}} f(x_1)\cdots f(x_{n-1})\,dx_{n-1}\cdots dx_2\,dx_1 = \frac{1}{(n-1)!}
\]
Hence
\[
E[N] = \sum_{n=1}^{\infty} P(N \ge n) = \sum_{n=1}^{\infty} \frac{1}{(n-1)!} = e.
\]
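As a sanity check, the identity $E[N] = e$ can be verified by simulation. The sketch below is ours, not part of the solution; it assumes $N$ is the first index $n$ at which the descending pattern breaks (equivalently, $\{N \ge n\} = \{X_1 \ge \cdots \ge X_{n-1}\}$), and the helper name `sample_N` is our own.

```python
import math
import random

def sample_N() -> int:
    """Draw X1, X2, ... ~ Uniform(0,1) and return the first index n
    at which X_n exceeds X_{n-1}, i.e. the descending pattern breaks."""
    prev = random.random()
    n = 2
    while True:
        cur = random.random()
        if cur > prev:          # pattern X1 >= ... >= X_{n-1} just failed
            return n
        prev, n = cur, n + 1

random.seed(0)
trials = 200_000
estimate = sum(sample_N() for _ in range(trials)) / trials
print(estimate)  # should be close to e ≈ 2.71828
```

With 200,000 trials the standard error is about 0.002, so the estimate should agree with $e$ to roughly two decimal places.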
Problem 7.26
(a) Observe that $\{\max(X_1,\dots,X_n) < x\}$ and $\{X_i < x \text{ for all } 1 \le i \le n\} = \bigcap_{i=1}^{n}\{X_i < x\}$ are the same event. Using the independence of the $X_i$ we get
\[
P(\max(X_1,\dots,X_n) < x) = P\left(\bigcap_{i=1}^{n}\{X_i < x\}\right) = \prod_{i=1}^{n} P(X_i < x) = x^n
\]
for all 0 < x < 1. So
\[
E[\max(X_1,\dots,X_n)] = \int_0^{\infty} P(\max(X_1,\dots,X_n) \ge x)\,dx = \int_0^1 (1 - x^n)\,dx = \frac{n}{n+1}
\]
(b) Similarly, $\{\min(X_1,\dots,X_n) \ge x\}$ and $\{X_i \ge x \text{ for all } 1 \le i \le n\} = \bigcap_{i=1}^{n}\{X_i \ge x\}$ are the same event. Using the independence of the $X_i$ we get
\[
P(\min(X_1,\dots,X_n) \ge x) = P\left(\bigcap_{i=1}^{n}\{X_i \ge x\}\right) = \prod_{i=1}^{n} P(X_i \ge x) = (1-x)^n
\]
for all 0 < x < 1. So
\[
E[\min(X_1,\dots,X_n)] = \int_0^{\infty} P(\min(X_1,\dots,X_n) \ge x)\,dx = \int_0^1 (1-x)^n\,dx = \frac{1}{n+1}.
\]
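Both formulas are easy to confirm numerically. The quick Monte Carlo sketch below (ours, with $n = 4$ chosen arbitrarily) estimates $E[\max]$ and $E[\min]$ for i.i.d. Uniform$(0,1)$ samples.

```python
import random

random.seed(1)
n, trials = 4, 200_000
max_sum = min_sum = 0.0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    max_sum += max(xs)
    min_sum += min(xs)

e_max = max_sum / trials   # should be near n/(n+1) = 0.8
e_min = min_sum / trials   # should be near 1/(n+1) = 0.2
print(e_max, e_min)
```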
Problem 7.42
Label the women $W_1,\dots,W_{10}$ and the men $M_1,\dots,M_{10}$. Let $X_i$ be 1 or 0 according to whether the $i$th man is paired with a woman. Then $X = \sum_{i=1}^{10} X_i$ is the total number of pairs consisting of a man and a woman. Since $E[X_i] = P(X_i = 1) = \frac{10}{19}$ we get
\[
E[X] = \sum_{i=1}^{10} E[X_i] = \sum_{i=1}^{10} \frac{10}{19} = \frac{100}{19}
\]
Since $E[X_iX_j] = P(X_i = 1, X_j = 1) = P(X_i = 1 \mid X_j = 1)P(X_j = 1) = \frac{9}{17}\cdot\frac{10}{19}$, we get
\[
\operatorname{Var}(X) = E[X] + 2\sum_{i<j} E[X_iX_j] - (E[X])^2 = \frac{100}{19} + 2\binom{10}{2}\frac{9}{17}\cdot\frac{10}{19} - \left(\frac{100}{19}\right)^2 = \frac{16200}{6137}
\]
Now suppose $W_i$ and $M_i$ are married and let $Y_i$ be 1 or 0 according to whether $M_i$ is paired with $W_i$. Then $Y = \sum_{i=1}^{10} Y_i$ is the total number of pairs consisting of a married couple. Since $E[Y_i] = P(Y_i = 1) = \frac{1}{19}$ we get
\[
E[Y] = \sum_{i=1}^{10} E[Y_i] = \sum_{i=1}^{10} \frac{1}{19} = \frac{10}{19}
\]
Since $E[Y_iY_j] = P(Y_i = 1, Y_j = 1) = P(Y_i = 1 \mid Y_j = 1)P(Y_j = 1) = \frac{1}{17}\cdot\frac{1}{19}$, we get
\[
\operatorname{Var}(Y) = E[Y] + 2\sum_{i<j} E[Y_iY_j] - (E[Y])^2 = \frac{10}{19} + 2\binom{10}{2}\frac{1}{17}\cdot\frac{1}{19} - \left(\frac{10}{19}\right)^2 = \frac{3240}{6137}
\]
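All four answers can be checked by simulation. The sketch below (ours) assumes the 20 people are paired by a uniformly random matching, encoded here by shuffling the group and pairing consecutive entries; man $i$ is taken to be married to woman $i$.

```python
import random

random.seed(2)
trials = 100_000
xs, ys = [], []
people = list(range(20))          # 0-9 are men, 10-19 women; man i married to woman i+10
for _ in range(trials):
    random.shuffle(people)
    x = y = 0
    for k in range(10):           # pairs are consecutive entries of the shuffle
        a, b = people[2*k], people[2*k + 1]
        if (a < 10) != (b < 10):  # one man, one woman
            x += 1
            if abs(a - b) == 10:  # a married couple
                y += 1
    xs.append(x)
    ys.append(y)

mean_x = sum(xs) / trials
var_x = sum(v*v for v in xs) / trials - mean_x**2
mean_y = sum(ys) / trials
var_y = sum(v*v for v in ys) / trials - mean_y**2
print(mean_x, var_x, mean_y, var_y)
# should be near 100/19 ≈ 5.263, 16200/6137 ≈ 2.640, 10/19 ≈ 0.526, 3240/6137 ≈ 0.528
```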
Problem 7.44
Since $F_1 = F_2$ we can assume that all arrangements of the $n$ 1's and $m$ 2's are equally likely. As in Example 2k, let $I_i$ be 1 or 0 according to whether a run of 1's starts at the $i$th position. Then as in Example 2k we get $E[I_1] = \frac{n}{n+m}$ and $E[I_i] = \frac{nm}{(n+m)(n+m-1)}$ for $2 \le i \le n+m$. So
\[
E[R] = \sum_{i=1}^{n+m} E[I_i] = \frac{n}{n+m} + (n+m-1)\frac{nm}{(n+m)(n+m-1)} = \frac{n(m+1)}{n+m}
\]
Now, observe that for $2 \le i \le n+m$, $I_1I_i = 1$ only if the first and $i$th positions hold 1's and the $(i-1)$th holds a 2. Thus $E[I_1I_2] = 0$, and for $3 \le i \le n+m$
\[
E[I_1I_i] = \frac{n}{n+m}\cdot\frac{m}{n+m-1}\cdot\frac{n-1}{n+m-2} = \frac{nm(n-1)}{(n+m)(n+m-1)(n+m-2)}
\]
Similarly, for $2 \le i < j \le n+m$, $I_iI_j = 1$ only if the $i$th and $j$th positions hold 1's and the $(i-1)$th and $(j-1)$th hold 2's. Thus $E[I_iI_{i+1}] = 0$, and for $2 \le i < j \le n+m$ with $j \ne i+1$
\[
E[I_iI_j] = \frac{n}{n+m}\cdot\frac{m}{n+m-1}\cdot\frac{n-1}{n+m-2}\cdot\frac{m-1}{n+m-3} = \frac{nm(n-1)(m-1)}{(n+m)(n+m-1)(n+m-2)(n+m-3)}
\]
So
\begin{align*}
\sum_{1\le i<j\le n+m} E[I_iI_j] &= \sum_{2\le j\le n+m} E[I_1I_j] + \sum_{2\le i<j\le n+m} E[I_iI_j] \\
&= (n+m-2)\frac{nm(n-1)}{(n+m)(n+m-1)(n+m-2)} + \binom{n+m-2}{2}\frac{nm(n-1)(m-1)}{(n+m)(n+m-1)(n+m-2)(n+m-3)} \\
&= \frac{nm(n-1)}{(n+m)(n+m-1)} + \frac{nm(n-1)(m-1)}{2(n+m)(n+m-1)} \\
&= \frac{nm(n-1)(m+1)}{2(n+m)(n+m-1)}
\end{align*}
So
\[
E[R^2] = E[R] + 2\sum_{1\le i<j\le n+m} E[I_iI_j] = \frac{n(m+1)}{n+m} + \frac{nm(n-1)(m+1)}{(n+m)(n+m-1)} = \frac{n(m+1)(nm+n-1)}{(n+m)(n+m-1)}
\]
Hence
\[
\operatorname{Var}(R) = E[R^2] - (E[R])^2 = \frac{n(m+1)(nm+n-1)}{(n+m)(n+m-1)} - \left(\frac{n(m+1)}{n+m}\right)^2 = \frac{nm(n-1)(m+1)}{(n+m)^2(n+m-1)}
\]
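The mean and variance of $R$ (the number of runs of 1's) can be confirmed numerically. The sketch below is ours, with $n = 5$, $m = 7$ chosen arbitrarily.

```python
import random

random.seed(3)
n, m, trials = 5, 7, 100_000
vals = []
seq = [1]*n + [2]*m
for _ in range(trials):
    random.shuffle(seq)
    # R = number of runs of 1's = number of positions where a run of 1's starts
    r = sum(1 for i, v in enumerate(seq) if v == 1 and (i == 0 or seq[i-1] == 2))
    vals.append(r)

mean_r = sum(vals) / trials
var_r = sum(v*v for v in vals) / trials - mean_r**2
print(mean_r, var_r)
# should be near n(m+1)/(n+m) = 10/3 and nm(n-1)(m+1)/((n+m)^2 (n+m-1)) = 1120/1584
```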
Problem 7.48
(a) Since $X$ has a geometric distribution with success probability $1/6$, we deduce that $E[X] = 6$.
(b,c) Let us find $E[X \mid Y = k]$ for any $k \ge 1$. Observe that for any $j \le k-1$ we have
\[
P(X = j \mid Y = k) = \frac{1}{5}\left(\frac{4}{5}\right)^{j-1}
\]
and for $j \ge k+1$ we have
\[
P(X = j \mid Y = k) = \frac{1}{6}\left(\frac{5}{6}\right)^{j-1-k}\left(\frac{4}{5}\right)^{k-1}
\]
Since P (X = j|Y = j) = 0 we get
\begin{align*}
E[X \mid Y = k] &= \sum_{j=1}^{\infty} jP(X = j \mid Y = k)
= \sum_{j=1}^{k-1} j\,\frac{1}{5}\left(\frac{4}{5}\right)^{j-1} + \sum_{j=k+1}^{\infty} j\,\frac{1}{6}\left(\frac{5}{6}\right)^{j-1-k}\left(\frac{4}{5}\right)^{k-1} \\
&= \frac{1}{5}\sum_{j=1}^{k-1} j\left(\frac{4}{5}\right)^{j-1} + \frac{1}{6}\left(\frac{5}{6}\right)^{-k}\left(\frac{4}{5}\right)^{k-1}\left[\sum_{j=1}^{\infty} j\left(\frac{5}{6}\right)^{j-1} - \sum_{j=1}^{k} j\left(\frac{5}{6}\right)^{j-1}\right] \\
&= \frac{1}{5}\cdot\frac{(k-1)(4/5)^k - k(4/5)^{k-1} + 1}{(1/5)^2} + \frac{1}{6}\left(\frac{5}{6}\right)^{-k}\left(\frac{4}{5}\right)^{k-1}\left[\frac{1}{(1/6)^2} - \frac{k(5/6)^{k+1} - (k+1)(5/6)^k + 1}{(1/6)^2}\right] \\
&= 5 + 2\left(\frac{4}{5}\right)^{k-1},
\end{align*}
using $\sum_{j=1}^{m} jx^{j-1} = \frac{mx^{m+1} - (m+1)x^m + 1}{(1-x)^2}$ and $\sum_{j=1}^{\infty} jx^{j-1} = \frac{1}{(1-x)^2}$ in the third equality.
So $E[X \mid Y = 1] = 5 + 2 = 7$ and $E[X \mid Y = 5] = 5 + 2\left(\frac{4}{5}\right)^4 = 5.8192$.
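The conditional expectations can be checked by simulation. The sketch below is ours; it assumes, consistently with the probabilities used above, that a fair die is rolled repeatedly, $X$ is the roll on which the first 6 appears, and $Y$ is the roll on which the first 5 appears.

```python
import random

def roll_until_five_and_six():
    """Roll a fair die until both a 5 and a 6 have appeared;
    return (X, Y) = (roll of first 6, roll of first 5)."""
    x = y = 0
    i = 0
    while x == 0 or y == 0:
        i += 1
        d = random.randint(1, 6)
        if d == 6 and x == 0:
            x = i
        if d == 5 and y == 0:
            y = i
    return x, y

random.seed(4)
trials = 300_000
stats = {1: [0, 0], 5: [0, 0]}      # k -> [count of {Y = k}, total of X]
for _ in range(trials):
    x, y = roll_until_five_and_six()
    if y in stats:
        stats[y][0] += 1
        stats[y][1] += x

cond_mean = {k: total / count for k, (count, total) in stats.items()}
print(cond_mean)  # should be near 7 for k = 1 and near 5.8192 for k = 5
```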
Problem 7.53
Let Y be 1, 2 or 3 depending on the first choice of the prisoner and let X be the total number of
days until the prisoner reaches freedom. Then E[X|Y = 1] = 2 + E[X], E[X|Y = 2] = 4 + E[X]
and E[X|Y = 3] = 1. So
E[X] = E[X|Y = 1]P (Y = 1) + E[X|Y = 2]P (Y = 2) + E[X|Y = 3]P (Y = 3)
= (2 + E[X])(0.5) + (4 + E[X])(0.3) + (1)(0.2) = 0.8E[X] + 2.4
Solving for E[X] we get E[X] = 12.
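The answer $E[X] = 12$ is easy to confirm by direct simulation of the three doors; the sketch below is ours, using the probabilities and delays stated in the solution.

```python
import random

def escape_days() -> int:
    """Simulate the prisoner: door 1 (prob 0.5) returns him after 2 days,
    door 2 (prob 0.3) after 4 days, door 3 (prob 0.2) frees him after 1 day."""
    days = 0
    while True:
        u = random.random()
        if u < 0.5:
            days += 2
        elif u < 0.8:
            days += 4
        else:
            return days + 1

random.seed(5)
trials = 200_000
estimate = sum(escape_days() for _ in range(trials)) / trials
print(estimate)  # should be near 12
```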
Problem 7.69
Since $F_\lambda(x) = P(\lambda \le x) = 1 - e^{-x}$ we have $f_\lambda(x) = e^{-x}$. If $A$ is the number of accidents, then $p_{A\mid\lambda}(n\mid x) = e^{-x}\frac{x^n}{n!}$.
(a)
\[
P(A = 0) = \int_0^{\infty} p_{A\mid\lambda}(0\mid x)f_\lambda(x)\,dx = \int_0^{\infty} e^{-x}e^{-x}\,dx = \int_0^{\infty} e^{-2x}\,dx = \frac{1}{2}
\]
(b)
\begin{align*}
P(A = 3) &= \int_0^{\infty} p_{A\mid\lambda}(3\mid x)f_\lambda(x)\,dx = \int_0^{\infty} \frac{x^3}{3!}e^{-x}\,e^{-x}\,dx = \frac{1}{6}\int_0^{\infty} x^3e^{-2x}\,dx \\
&= \frac{1}{6}\left[-\left(\frac{x^3}{2} + \frac{3x^2}{4} + \frac{3x}{4} + \frac{3}{8}\right)e^{-2x}\right]_0^{\infty} = \frac{1}{6}\cdot\frac{3}{8} = \frac{1}{16}
\end{align*}
Now observe that
\[
f_{\lambda\mid A}(x\mid 0) = \frac{f_{A\mid\lambda}(0\mid x)f_\lambda(x)}{P(A = 0)} = \frac{e^{-x}e^{-x}}{\frac{1}{2}} = 2e^{-2x}
\]
So if B is the number of accidents in the next year then
\begin{align*}
P(B = 3\mid A = 0) &= \int_0^{\infty} P(B = 3\mid A = 0, \lambda = x)f_{\lambda\mid A}(x\mid 0)\,dx = \int_0^{\infty} \frac{x^3}{3!}e^{-x}\,2e^{-2x}\,dx \\
&= \frac{1}{3}\int_0^{\infty} x^3e^{-3x}\,dx = \frac{1}{3}\left[-\left(\frac{x^3}{3} + \frac{x^2}{3} + \frac{2x}{9} + \frac{2}{27}\right)e^{-3x}\right]_0^{\infty} = \frac{1}{3}\cdot\frac{2}{27} = \frac{2}{81}
\end{align*}
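These three answers can be checked by simulating the hierarchy: draw a rate $\lambda \sim \mathrm{Exp}(1)$ per driver, then draw yearly accident counts as Poisson$(\lambda)$ given that rate. The sketch below is ours; the small Poisson sampler is Knuth's product-of-uniforms method, adequate for the small rates involved.

```python
import math
import random

def poisson(lam: float) -> int:
    """Knuth's product-of-uniforms Poisson sampler (fine for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < limit:
            return k
        k += 1

random.seed(6)
trials = 400_000
count0 = count3 = b3 = 0
for _ in range(trials):
    lam = random.expovariate(1.0)      # this driver's accident rate
    a = poisson(lam)                   # accidents this year
    if a == 0:
        count0 += 1
        if poisson(lam) == 3:          # accidents next year, same rate
            b3 += 1
    elif a == 3:
        count3 += 1

p0, p3, pb3 = count0 / trials, count3 / trials, b3 / count0
print(p0, p3, pb3)  # should be near 1/2, 1/16 = 0.0625, 2/81 ≈ 0.0247
```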
Problem 7.72
(a) Let $V$ denote the coin's probability of landing heads. Then
\[
P(N \ge i) = \int_0^1 P(N \ge i \mid V = x)\,dx = \int_0^1 x^{i-1}\,dx = \frac{1}{i}.
\]
(b) $P(N = i) = P(N \ge i) - P(N \ge i + 1) = \frac{1}{i} - \frac{1}{i+1} = \frac{1}{i(i+1)}$.
(c) $E[N] = \sum_{i=1}^{\infty} iP(N = i) = \sum_{i=1}^{\infty} \frac{1}{i+1} = \infty$.
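The pmf in part (b) can be verified by simulation. The sketch below is ours; it assumes $N$ is the number of the flip on which the first tail appears, so that $\{N \ge i\}$ means the first $i-1$ flips were heads. Since $E[N] = \infty$, the flip loop is capped to keep the run time bounded (the cap only affects an event of negligible probability).

```python
import random

random.seed(7)
trials = 200_000
cap = 1_000_000                 # guard against the heavy tail, since E[N] is infinite
count = {1: 0, 2: 0, 3: 0}
for _ in range(trials):
    v = random.random()         # the coin's heads probability
    n = 1
    while random.random() < v and n < cap:   # keep flipping while heads come up
        n += 1
    if n in count:
        count[n] += 1

probs = {i: c / trials for i, c in count.items()}
print(probs)  # P(N = i) = 1/(i(i+1)): should be near 1/2, 1/6, 1/12
```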
Problem 7.75
Since the moment generating function uniquely determines the distribution of a random variable, $X$ has a Poisson distribution with parameter 2 and $Y$ has a binomial distribution with parameters $n = 10$ and $p = 3/4$.
(a) Since X and Y are non-negative and independent
P (X + Y = 2) = P (X = 2|Y = 0)P (Y = 0) + P (X = 1|Y = 1)P (Y = 1) + P (X = 0|Y = 2)P (Y = 2)
= P (X = 2)P (Y = 0) + P (X = 1)P (Y = 1) + P (X = 0)P (Y = 2)
\begin{align*}
&= e^{-2}\frac{2^2}{2!}\binom{10}{0}\left(\frac{3}{4}\right)^0\left(\frac{1}{4}\right)^{10} + e^{-2}\frac{2^1}{1!}\binom{10}{1}\left(\frac{3}{4}\right)^1\left(\frac{1}{4}\right)^{9} + e^{-2}\frac{2^0}{0!}\binom{10}{2}\left(\frac{3}{4}\right)^2\left(\frac{1}{4}\right)^{8} \\
&= \frac{467e^{-2}}{2^{20}}
\end{align*}
(b) Since X and Y are independent
P (XY = 0) = P ({X = 0} ∪ {Y = 0}) = P (X = 0) + P (Y = 0) − P (X = 0)P (Y = 0)
\[
= e^{-2} + \left(\frac{1}{4}\right)^{10} - e^{-2}\left(\frac{1}{4}\right)^{10}
\]
(c) Again since X and Y are independent
\[
E[XY] = E[X]E[Y] = M_X'(0)M_Y'(0) = (2)\left(10\cdot\frac{3}{4}\right) = 15
\]
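The arithmetic in (a)-(c) can be double-checked by evaluating the pmfs directly. The short script below is ours; it recomputes the three answers from the Poisson(2) and Binomial(10, 3/4) mass functions.

```python
import math

def poisson_pmf(k: int, lam: float) -> float:
    return math.exp(-lam) * lam**k / math.factorial(k)

def binom_pmf(k: int, n: int, p: float) -> float:
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# (a) P(X + Y = 2) with X ~ Poisson(2), Y ~ Bin(10, 3/4), X and Y independent
p_a = sum(poisson_pmf(2 - k, 2) * binom_pmf(k, 10, 0.75) for k in range(3))

# (b) P(XY = 0) by inclusion-exclusion on {X = 0} and {Y = 0}
p_b = (poisson_pmf(0, 2) + binom_pmf(0, 10, 0.75)
       - poisson_pmf(0, 2) * binom_pmf(0, 10, 0.75))

# (c) E[XY] = E[X] E[Y] for independent X and Y
e_xy = 2 * 10 * 0.75

print(p_a, p_b, e_xy)
```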
Problem 7.T.10
Since the $X_i$ are independent and identically distributed,
\[
E\!\left[\frac{X_j}{\sum_{i=1}^n X_i}\right] = E\!\left[\frac{X_s}{\sum_{i=1}^n X_i}\right]
\]
for any $1 \le j, s \le n$. So for any $1 \le j \le n$
\[
1 = E\!\left[\frac{\sum_{s=1}^n X_s}{\sum_{i=1}^n X_i}\right] = \sum_{s=1}^{n} E\!\left[\frac{X_s}{\sum_{i=1}^n X_i}\right] = nE\!\left[\frac{X_j}{\sum_{i=1}^n X_i}\right]
\]
Thus for any 1 ≤ j ≤ n
\[
E\!\left[\frac{X_j}{\sum_{i=1}^n X_i}\right] = \frac{1}{n}
\]
So
\[
E\!\left[\frac{\sum_{j=1}^k X_j}{\sum_{i=1}^n X_i}\right] = \sum_{j=1}^{k} E\!\left[\frac{X_j}{\sum_{i=1}^n X_i}\right] = \sum_{j=1}^{k}\frac{1}{n} = \frac{k}{n} \tag{1}
\]
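The symmetry argument behind (1) holds for any i.i.d. positive $X_i$; the quick simulation below (ours, with exponential $X_i$ and $n = 5$, $k = 2$ chosen arbitrarily) illustrates it.

```python
import random

random.seed(8)
n, k, trials = 5, 2, 200_000
total = 0.0
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(n)]  # any iid positive Xi will do
    total += sum(xs[:k]) / sum(xs)                    # ratio of partial sum to full sum

estimate = total / trials
print(estimate)  # should be near k/n = 0.4
```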
Problem 7.T.11
Let $A_i$ be the event that outcome $i$ never occurs and let $I_i$ be the indicator of this event. Then $X = \sum_{i=1}^{r} I_i$ is the number of outcomes that never occur. Thus by linearity of expectation we get
\[
E[X] = \sum_{i=1}^{r} E[I_i] = \sum_{i=1}^{r} P(A_i) = \sum_{i=1}^{r} (1 - P_i)^n \tag{2}
\]
Use the method of Lagrange multipliers to minimize $f(P_1,\dots,P_r) = \sum_{i=1}^{r} (1-P_i)^n$ under the constraint $g(P_1,\dots,P_r) = P_1 + \cdots + P_r = 1$. Since $\nabla g = (1,\dots,1)$ is a constant vector and $\nabla f = \lambda\nabla g$, we deduce that $n(1-P_i)^{n-1}$ is constant for $1 \le i \le r$. Thus $P_1 = \cdots = P_r$, and therefore $P_i = 1/r$ for $i = 1,\dots,r$.
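The conclusion that the uniform $P_i = 1/r$ minimizes $\sum_i (1-P_i)^n$ can also be seen numerically. The sketch below (ours, with $r = 3$, $n = 4$ chosen arbitrarily) does a crude grid search over the probability simplex instead of Lagrange multipliers.

```python
r, n = 3, 4

def f(ps):
    """Expected number of outcomes that never occur, E[X] = sum (1 - P_i)^n."""
    return sum((1 - p)**n for p in ps)

# crude grid search over the probability simplex, step 0.01
step = 100
best = min(
    ((i / step, j / step, (step - i - j) / step)
     for i in range(step + 1) for j in range(step + 1 - i)),
    key=f,
)
print(best, f(best), f((1/3, 1/3, 1/3)))
# the grid minimizer should sit next to (1/3, 1/3, 1/3), where f = 3(2/3)^4 = 48/81
```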