1. (WMS 2.7) 2.9, 2.17, 2.25, 2.43, 2.57, 2.75, 2.77, 2.79, 2.87, 2.95, 2.111, 2.115, 2.125, 2.137, 2.151,
2.155, 2.156, 2.159, 2.173
2. (WMS 2.8)
(a)
(b)
Answer: 9+36-3=42
(c)
Answer: 36-3=33
Answer: 60-36-6=18
3. (WMS 2.16)
(a)
(b)
Answer: .333
(c)
Answer:
1/3+1/15 = .4
(d)
Answer:
1/3+1/16 = .396
Answer: .204
4. (WMS 2.30)
Answer: Ω = {ABC, ACB, BAC, BCA, CAB, CBA}
A random guess will have the best wine 1st or 2nd 4/6 times.
5. (WMS 2.38)
Answer: (4)(3)(4)(5) = 240
6. (WMS 2.46) Ten teams must be matched in the first round of a tournament. How many different ways can the 5 game matchups be made?
Answer: C(10,2) C(8,2) C(6,2) C(4,2) = 113400
7. (WMS 2.56) A student knows how to answer 6 out of 10 potential test problems. The test will consist
of 5 of the 10 chosen at random. What is the probability that the student knows them all?
Answer: Method 1: (6/10)(5/9)(4/8)(3/7)(2/6) = .024
Method 2: C(6,5)/C(10,5) = .024
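A quick check (my addition) that both methods agree, using exact rational arithmetic:

```python
from math import comb
from fractions import Fraction

# Method 1: sequential conditional probabilities of drawing known problems
p1 = Fraction(6, 10) * Fraction(5, 9) * Fraction(4, 8) * Fraction(3, 7) * Fraction(2, 6)
# Method 2: choose all 5 test problems from the 6 the student knows
p2 = Fraction(comb(6, 5), comb(10, 5))
assert p1 == p2 == Fraction(1, 42)
print(float(p1))  # ~0.0238, i.e. .024 to three decimals
```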
8. (WMS 2.69)
Answer:
C(n,k) + C(n,k−1) = n!/(k!(n − k)!) + n!/((k − 1)!(n − k + 1)!)
                  = n!(n − k + 1)/(k!(n − k + 1)!) + n!k/(k!(n − k + 1)!)
                  = (n + 1)!/(k!(n + 1 − k)!)
                  = C(n+1,k)
9. (WMS 2.74)
(a)
Answer: P (A|D) = 2/3 ≠ P (A), so not indep
(b)
Answer: P (B|D) = .3 = P (B), so indep
(c)
Answer: P (C|D) = 1/30 ≠ P (C), so not indep
10. (WMS 2.82) If A ⊂ B and P (A), P (B) > 0, show that P (B|A) = 1 and P (A|B) = P (A)/P (B).
Answer: Since A ⊂ B, we know P (A) = P (A ∩ B) = P (A)P (B|A), so P (B|A) = 1. The second equation follows from the definition of conditional probability: P (A|B) = P (A ∩ B)/P (B) = P (A)/P (B).
11. (WMS 2.90)
(a)
Answer: 1 − (49/50)2 = .0396
(b)
Answer: no, it would be 1 − (49/50)50 = .636
12. (WMS 2.94) Two smoke detectors have the following probabilities of detecting smoke: P (A) = .95,
P (B) = .9, P (A ∩ B) = .88. Find the probability P (A ∪ B), that the smoke will be detected.
Answer: P (A ∪ B) = .95 + .9 − .88 = .97
13. (WMS 2.104)
Answer:
P (A^c ∩ B^c) = P ((A ∪ B)^c) = 1 − P (A ∪ B) = 1 − (P (A) + P (B) − P (A ∩ B)) ≥ 1 − P (A) − P (B)
14. (WMS 2.118)
Answer: The only way not to make it is if the first 4 donors don't match. So 1 − (.6)⁴ = .8704
15. (WMS 2.130)
Answer: P (S|L) = .22, P (S|L^c) = .14, and P (L) = .0004. Bayes rule says
P (L|S) = P (S|L)P (L) / (P (S|L)P (L) + P (S|L^c)P (L^c)) = (.22)(.0004) / ((.22)(.0004) + (.14)(.9996)) = .0006284
16. (WMS 2.146)
Answer: (12/51)(11/50)(10/49)(9/48) = .00198
17. (WMS 2.158) A bowl contains w white and b black balls. One ball is selected, its color noted, and returned to the bowl along with n additional balls of the same color. Then another ball is selected, and it turns out to be black. Find the conditional probability that the first ball was white.
Answer:
p(w1|b2) = p(b2|w1)p(w1) / (p(b2|w1)p(w1) + p(b2|b1)p(b1))
         = [b/(w+b+n) · w/(w+b)] / [b/(w+b+n) · w/(w+b) + (b+n)/(w+b+n) · b/(w+b)]
         = bw / (bw + (b+n)b)
         = w / (w+b+n)
18. (WMS 2.170)
Answer: 3/7
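The Bayes computation above can be verified against the closed form with exact rational arithmetic (the (w, b, n) triples below are arbitrary test values, my addition):

```python
from fractions import Fraction

def p_white_first_given_black_second(w, b, n):
    # Bayes' rule for the urn scheme: condition on the second draw being black
    p_w1 = Fraction(w, w + b)               # first ball white
    p_b1 = Fraction(b, w + b)               # first ball black
    p_b2_given_w1 = Fraction(b, w + b + n)
    p_b2_given_b1 = Fraction(b + n, w + b + n)
    num = p_b2_given_w1 * p_w1
    return num / (num + p_b2_given_b1 * p_b1)

# agrees with the closed form w/(w+b+n)
for w, b, n in [(3, 4, 2), (1, 1, 5), (10, 2, 7)]:
    assert p_white_first_given_black_second(w, b, n) == Fraction(w, w + b + n)
```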
19. Suppose a team has a 55% chance of winning a particular game in a best of n series. What is the
smallest value of n (must be odd) such that the better team has a 95% chance of winning the series?
Answer: numerically find the smallest odd n with P (X ≥ (n + 1)/2) ≥ .95, where X ∼ Binom(n, .55); this gives n = 269
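The search described above can be done by brute force over odd n (a sketch, my addition):

```python
from math import comb

def p_win_series(n, p=0.55):
    # P(better team wins a best-of-n series) = P(Binom(n, p) >= (n+1)/2)
    need = (n + 1) // 2
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

n = 1
while p_win_series(n) < 0.95:
    n += 2  # only odd series lengths
print(n)  # 269
```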
20. (WMS) 3.1, 3.9, 3.12, 3.15, 3.23, 3.41, 3.59, 3.71, 3.82, 3.87
21. (WMS 3.6) Balls numbered 1-5 are in a bowl. Two are selected. Find prob dists for
(a) The larger number.
Answer:
x      1   2   3   4   5
P(x)   0  .1  .2  .3  .4
(b) The sum.
Answer:
x      3   4   5   6   7   8   9
P(x)  .1  .1  .2  .2  .2  .1  .1
22. (WMS 3.10)
Answer: p(y) = (.2)(.8)y−1 for y ≥ 1
23. (WMS 3.14)
Answer: µ = 7.9, σ = 2.175, prob in µ ± 2σ or [4, 12] is .96
24. (WMS 3.16)
Answer: This is a uniform distribution with p(y) = 1/n for y ∈ {1, . . . , n}. Thus µ = (n+1)/2 and
σ² = E[Y²] − µ² = (n+1)(2n+1)/6 − ((n+1)/2)² = (n+1)(4n + 2 − 3n − 3)/12 = (n² − 1)/12
25. (WMS 3.19)
Answer: solve (C − 15)(.98) + (C − 1015)(.02) = 50 to get C = 85
26. (WMS 3.34)
Answer: µ = 13 and σ 2 = 41
27. (WMS 3.40)
Answer: .109, .9994, .8441, .5886
28. (WMS 3.48)
Answer: exactly 4: .328, at least 1: .99999, if n ≥ 3 then there is at least .999 probability of
detection
29. (WMS 3.55)
Answer:
E[Y (Y − 1)(Y − 2)] = Σ_{k=0}^{n} C(n,k) p^k q^{n−k} k(k − 1)(k − 2)
                    = n(n − 1)(n − 2)p³ Σ_{k=3}^{n} C(n−3, k−3) p^{k−3} q^{n−k}
                    = n(n − 1)(n − 2)p³ Σ_{k=0}^{n−3} C(n−3, k) p^k q^{n−3−k}
                    = n(n − 1)(n − 2)p³
So E[Y³] = E[Y (Y − 1)(Y − 2)] + 3E[Y²] − 2E[Y] = n(n − 1)(n − 2)p³ + 3(n²p² + npq) − 2np
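The final identity can be verified exactly for a small binomial (n = 7, p = 3/10 are arbitrary test values, my addition):

```python
from fractions import Fraction
from math import comb

n, p = 7, Fraction(3, 10)
q = 1 - p

# direct third moment from the binomial pmf
e_y3 = sum(Fraction(k**3) * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# closed form: E[Y^3] = n(n-1)(n-2)p^3 + 3(n^2 p^2 + npq) - 2np
closed = n*(n - 1)*(n - 2)*p**3 + 3*(n**2*p**2 + n*p*q) - 2*n*p
assert e_y3 == closed
```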
30. (WMS 3.58)
Answer: µ = .4 and σ 2 = .36, so 3E[Y 2 ] + E[Y ] + 2 = 3(σ 2 + µ2 ) + µ + 2 = 3.96
31. (WMS 3.70)
Answer: p(3) = (.8)²(.2) = .128; P (X > 10) = (.8)¹⁰ = .1074
32. (WMS 3.77) If Y ∼ G(p), show that P (Y odd) = p/(1 − q²).
Answer: P (Y odd) = p(1 + q² + q⁴ + · · · ) = p/(1 − q²)
33. (WMS) 3.123, 3.135, 4.15, 4.31, 4.56
34. (WMS 3.122)
Answer: P (x ≤ 3) = .082; P (x ≥ 2) = .9927; P (x = 5) = .1277
35. (WMS 3.138)
Answer:
E[Y (Y − 1)] = Σ_{y=0}^{∞} y(y − 1) λ^y e^{−λ}/y! = λ² Σ_{y=2}^{∞} λ^{y−2} e^{−λ}/(y − 2)! = λ²
So σ² = E[Y²] − E[Y]² = E[Y (Y − 1)] + E[Y] − E[Y]² = λ² + λ − λ² = λ
36. (WMS 3.142) Let Y ∼ Poisson(λ). Show that p(y)/p(y − 1) = λ/y, and thus p(y) increases as long as y < λ, and then decreases thereafter.
Answer:
p(y)/p(y − 1) = (λ^y e^{−λ}/y!) / (λ^{y−1} e^{−λ}/(y − 1)!) = λ/y
37. (WMS 3.143) Find the mode of X ∼ P oisson(5.3).
Answer: by the previous problem, p(x) is maximized at x = 5
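A direct numeric check of the mode (my addition):

```python
from math import exp, factorial

lam = 5.3
pmf = lambda y: lam**y * exp(-lam) / factorial(y)

# the ratio p(y)/p(y-1) = lam/y exceeds 1 exactly while y < lam,
# so the pmf peaks at floor(lam) = 5
mode = max(range(30), key=pmf)
print(mode)  # 5
```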
38. (WMS 4.8)
Answer: k = 6, F (1) − F (.4) = .648, same, F (.4)/F (.8) = .393, same
39. (WMS 4.10)
Answer: .8646
40. (WMS 4.17)
Answer: 1.5; F (y) = (y³ + y²)/2; F (−1) = 0, F (0) = 0, F (1) = 1; F (.5) = .1875; (1 − F (.5))/(1 − F (.25)) = .8456
41. (WMS 4.21)
Answer: µ = .708 and σ 2 = .0487
42. (WMS 4.26) If Y is a random variable with PDF f (y), prove that E[aY + b] = aE[Y ] + b and
V (aY + b) = a2 V (Y ).
Answer:
E(aY + b) = ∫ (ay + b)f (y) dy = a ∫ yf (y) dy + b ∫ f (y) dy = aE[Y] + b
V (aY + b) = E[((aY + b) − (aE[Y] + b))²] = E[a²(Y − E[Y])²] = a²E[(Y − E[Y])²] = a²V (Y)
43. (WMS 4.43) Let A be the area of a circle with radius r distributed uniformly on [0, 1]. Find the mean
and variance of A.
Answer: E[A] = E[πr²] = π ∫₀¹ r² dr = π/3 and E[A²] = π² ∫₀¹ r⁴ dr = π²/5, so V (A) = π²(1/5 − 1/9)
44. (WMS 4.44)
Answer: k = 1/4 and F (y) = ∫_{−2}^{y} 1/4 dt = y/4 + 1/2
45. (WMS) 3.147, 3.161, 4.63, 4.75, 4.79, 4.161, 4.179
46. (WMS 3.148)
Answer: E[Y] = 1/p; E[Y²] = (p + 2q)/p²; V (Y) = q/p²
47. (WMS 3.150)
Answer: geometric with p = .3
48. (WMS 3.156) Suppose the mgf of Y is m(t).
(a) Find m(0).
Answer: 1
(b) If W = 3Y , show that the mgf of W is m(3t).
Answer: E[e^{t(3Y)}] = E[e^{(3t)Y}] = m(3t)
(c) If X = Y − 2, show that the mgf of X is e^{−2t} m(t).
Answer: E[e^{(Y −2)t}] = e^{−2t}E[e^{tY}] = e^{−2t}m(t)
49. (WMS 3.158) If W = aY + b, show that mW (t) = e^{tb} mY (at).
Answer: E[e^{(aY +b)t}] = e^{bt}E[e^{(at)Y}] = e^{bt}mY (at)
50. (WMS 4.62)
Answer: F (1) − F (−1) = .68; F (√3.84146) − F (−√3.84146) = .95
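Both values can be checked without tables using the error function from Python's standard library (my addition; note √3.84146 ≈ 1.96):

```python
from math import erf, sqrt

# standard normal CDF via the error function
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))

print(round(Phi(1) - Phi(-1), 4))                          # 0.6827
print(round(Phi(sqrt(3.84146)) - Phi(-sqrt(3.84146)), 4))  # 0.95
```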
51. (WMS 4.74)
Answer: 1 − F (72) = .8413; F −1 (.9) = 85.69; F −1 (.719) = 81.48; 1 − F (5 + F −1 (.25)) = .4369; (1 −
F (84))/(1 − F (72)) = .1886
52. (WMS 4.77)
Answer: .758; 22.2
53. (WMS 4.80) Suppose Y ∼ N (µ, σ). Observe Y , and construct a rectangle with dimensions |Y | and
|3Y |. Find the expected value for the area of this rectangle.
Answer: E[3Y 2 ] = 3E[Y 2 ] = 3(σ 2 + µ2 )
54. (WMS 4.88)
Answer: 1 − F (3) = .2865; F (3) − F (2) = .1481
55. (WMS 4.90)
Answer: 1 − F (5)10 = .7354
56. (WMS 4.92)
Answer: E[C] = 1100; V (C) = 2920000
57. (WMS 4.141)
Answer: f (x) = 1/(θ₂ − θ₁) for x ∈ [θ₁, θ₂], and m(t) = (e^{θ₂t} − e^{θ₁t})/(t(θ₂ − θ₁))
58. (WMS 4.142)
Answer: mY (t) = (e^t − 1)/t
mW (t) = mY (at) = (e^{at} − 1)/(at), so W ∼ UNIF(0, a)
mX (t) = mY (−at) = (1 − e^{−at})/(at), so X ∼ UNIF(−a, 0)
mV (t) = e^{tb} mY (at) = (e^{(a+b)t} − e^{bt})/(at), so V ∼ UNIF(b, a + b)
59. (WMS 4.168) If the number of arrivals N in the time interval [0, t] has the distribution N ∼ P OIS(λt),
then find the distribution for T , the waiting time for the first arrival.
Answer: 1 − F (t) = P [N = 0] = (λt)⁰e^{−λt}/0! = e^{−λt}; therefore F (t) = 1 − e^{−λt} and f (t) = λe^{−λt}, which is exponential with β = 1/λ
60. (WMS 4.174)
Answer: 1 − F (.25) = .6065
61. (WMS 4.184) If f (x) = 12 e−|x| , for x ∈ R, find the mgf and E[X].
Answer: Assuming t is close enough to zero,
m(t) = ∫_R e^{tx} (1/2)e^{−|x|} dx = (1/2) ∫_{−∞}^{0} e^{(t+1)x} dx + (1/2) ∫_{0}^{∞} e^{(t−1)x} dx = (1/2)(1/(t + 1) + 1/(1 − t)) = 1/(1 − t²)
m′(t) = 2t/(1 − t²)²
So E[X] = m′(0) = 0.
62. (WMS 3.187, 3.194, 3.204, 3.210, 3.214, 4.85, 4.101, 4.109, 4.123, 4.127)
63. (WMS 4.102)
Answer: .44592, 11.722
64. (WMS 4.104)
Answer: the prob of one piece failing is .86466, and then binomcdf(3, .86466, 1) = .05
65. (WMS 4.105)
Answer: µ = 3.2, σ 2 = 6.4, P (x > 4) = .28955
66. (WMS 4.110)
Answer: α = 3, β = 12 , so µ = 1.5, σ = .75
67. (WMS 4.111)
Answer: y^a f (y) is the pdf for Gam(α + a, β) but scaled by β^a Γ(α + a)/Γ(α);
for a = 1, we get E[Y] = αβ;
for a = 1/2, we get √β Γ(α + 1/2)/Γ(α);
for a = −1, we get 1/(β(α − 1));
for a = −1/2, we get Γ(α − 1/2)/(√β Γ(α));
for a = −2, we get 1/(β²(α − 1)(α − 2))
68. (WMS 4.118)
Answer: omit
69. (WMS 4.124)
Answer: ∫_{.4}^{1} (12y² − 12y³) dy = [4y³ − 3y⁴]_{.4}^{1} = .8208
70. (WMS 4.130)
Answer: follow p.196 to find E[Y 2 ], and then σ 2 = E[Y 2 ] − E[Y ]2
71. (WMS 4.133)
Answer: α = 3, β = 5 so c = 1/B(3, 5) = 105, µ = 3/8, σ = .16137, and P (Y > µ + 2σ) = .029749
72. Suppose X has a beta distribution with support [1, 4] instead of [0, 1]. If α = 3 and β = 5, do a change of variables to find
(a) the PDF
Answer: use the change of variables y = (x − 1)/3, so x = 3y + 1, to get f (x) = (1/(3B(3, 5))) ((x − 1)/3)² (1 − (x − 1)/3)⁴ for x ∈ [1, 4]
(b) the mean
Answer: 3(3/8) + 1 = 2.125
(c) the median
Answer: 3(.36412) + 1 = 2.0923
(d) the mode
Answer: set f ′(x) = 0 to get x = 2
(e) the standard deviation
Answer: V (3Y + 1) = 9V (Y), so σ = .484
(f) the probability that x is within one standard deviation of the mean
Answer: use betacdf in Octave to get .65653
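A numeric check of parts (b), (e), and (f) without Octave, integrating the Beta(3, 5) density directly (midpoint rule; my addition, a sketch rather than the original method):

```python
from math import sqrt

# Beta(3,5) density on [0,1]: B(3,5) = 2!4!/7! = 1/105, so the constant is 105
f = lambda y: 105 * y**2 * (1 - y)**4

def integrate(g, a, b, m=20000):
    h = (b - a) / m
    return h * sum(g(a + (i + 0.5) * h) for i in range(m))  # midpoint rule

mean_y = integrate(lambda y: y * f(y), 0, 1)
var_y = integrate(lambda y: (y - mean_y)**2 * f(y), 0, 1)
mean_x, sd_x = 3 * mean_y + 1, 3 * sqrt(var_y)   # X = 3Y + 1
print(round(mean_x, 3), round(sd_x, 3))          # 2.125 0.484

# P(|X - mean| < sd) maps back to a y-interval on [0, 1]
lo, hi = (mean_x - sd_x - 1) / 3, (mean_x + sd_x - 1) / 3
p = integrate(f, lo, hi)
print(round(p, 2))  # 0.66, matching betacdf's .65653 to two decimals
```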
73. (WMS 5.3, 5.7, 5.9, 5.21, 5.25, 5.39, 5.51, 5.53, 5.67, 5.75, 5.77, 5.97, 5.103, 5.133, 5.151)
74. (WMS 5.4)
Answer: F (1, 2) = 1
75. (WMS 5.14)
Answer: ∫₀^{.5} ∫_{y₁}^{1−y₁} 6y₁²y₂ dy₂ dy₁ = 1/32
76. (WMS 5.15)
77. (WMS 5.22)
78. (WMS 5.32)
79. (WMS 5.33)
80. (WMS 5.38)
Answer: f (x, y) = 1/x
81. (WMS 5.48)
82. (WMS 5.58)
Answer: they can’t be independent if the support isn’t rectangular
83. (WMS 5.59)
84. (WMS 5.62)
Answer: Σ_{k=0}^{∞} p²(q²)^k = p²/(1 − q²)
85. (WMS 5.64)
Answer: sketch the regions - the ratio of areas is (3/4)/(5/6) = .9
86. (WMS 5.68)
87. (WMS 5.72)
88. (WMS 5.82)
89. (WMS 5.84)
90. (WMS 5.88)
Answer: 6(1/6 + 1/5 + 1/4 + 1/3 + 1/2 + 1) = 14.7
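Assuming WMS 5.88 is the expected number of die rolls needed to see all six faces (the printed sum is the coupon-collector form 6·H₆), the exact value is easy to confirm (my addition):

```python
from fractions import Fraction

# coupon-collector expectation: 6 * (1/1 + 1/2 + ... + 1/6) = 6 * H_6
expected = 6 * sum(Fraction(1, k) for k in range(1, 7))
print(float(expected))  # 14.7
```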
91. (WMS 5.92)
92. (WMS 5.94)
Answer: if Cov(X, Y ) = 0 then Cov(X + Y, X − Y ) = V (X) − V (Y ) and ρ = (V (X) − V (Y ))/(V (X) + V (Y ))
93. (WMS 5.102)
94. (WMS 5.106)
Answer: 27/80
95. (WMS 5.108)
96. (WMS 5.110)
97. (WMS 5.115)
98. (WMS 5.136)
99. (WMS 5.142)
100. (WMS 5.146)
101. (WMS 5.149)
102. (WMS 5.153)
Answer: Let X be the number of eggs laid, and Y the number hatched. Then
µY = Ex [Ey [Y |X]] = Ex [pX] = pλ
σY2 = Ex [VY (Y |X)] + Vx (EY [Y |X]) = Ex [p(1 − p)X] + Vx (pX) = pEx [X] − p2 Ex [X] + p2 Vx (X) = pλ
103. (WMS 6.1, 6.28, 6.31, 6.49, 6.61, 6.95, 6.107)
104. (WMS 6.2)
105. (WMS 6.6)
106. (WMS 6.11)
107. (WMS 6.20)
108. (WMS 6.30)
109. (WMS 6.34)
110. (WMS 6.37)
111. (WMS 6.42)
112. (WMS 6.48)
Answer: Let X, Y be std normal, and U² = X² + Y². Then W = U² has a χ² dist with 2 dof, so fW (w) = (1/2)e^{−w/2}. Then fU (u) = (1/2)e^{−u²/2}(2u) = u e^{−u²/2}
113. (WMS 6.52)
114. (WMS 6.56)
115. (WMS 6.96)
116. (WMS 6.102)
117. (WMS 6.110)
118. (WMS 7.11)
119. (WMS 7.12)
120. (WMS 7.15)
121. (WMS 7.42)
122. (WMS 7.49)
123. (WMS 7.59)
124. (WMS 7.73)
125. (WMS 7.87)
126. (WMS 9.73)
127. (WMS 9.83)
128. A squirrel has hidden acorns in one of two patches of land, but she can’t remember for sure which one.
She is 80% sure they are buried in “Patch A.”
If she digs for t hours in the correct patch, the probability she finds at least one acorn is 1 − (.4)t . If
she digs in the wrong patch, she definitely won’t find any acorns.
Of course, squirrels are intelligent Bayesian creatures. She decides to dig in “Patch A” as long as there
is at least a 40% chance that the acorns are in “Patch A.” How many hours will she dig in “Patch A”
before giving up and switching to “Patch B”?
Answer: after t hours of unsuccessful digging in Patch A, P (A|t) = .8(.4^t)/(.8(.4^t) + .2(1)); set this equal to .4 to get t = 1.96 hours
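Solving the equation in closed form (my addition):

```python
from math import log

# .8(.4^t)/(.8(.4^t) + .2) = .4  =>  .48(.4^t) = .08  =>  .4^t = 1/6
t = log(1 / 6) / log(0.4)
print(round(t, 2))  # 1.96
```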
129. Suppose you know X has a Poisson distribution, but don’t know the parameter λ. You have a prior
distribution for λ that is uniform on [0, 3].
(a) You take one observation and get x1 = 3.
i. What is the MLE estimate for λ?
Answer: maximize f (3) ∝ e^{−λ}λ³ to get λ = 3
ii. Find the Bayesian posterior distribution for λ, and the Bayesian estimator λ̂ = E[λ].
Answer: P1(λ|x1) = ((1/3)λ³e^{−λ}/3!) / ∫₀³ (1/3)λ³e^{−λ}/3! dλ, and λ̂ = ∫₀³ λ⁴e^{−λ} dλ / ∫₀³ λ³e^{−λ} dλ = 2.095
(b) You take another observation and get x2 = 5. Now given both observations,
i. What is the MLE estimate for λ?
Answer: maximize f (3)f (5) ∝ e^{−2λ}λ⁸ to get λ = 4
ii. Find the Bayesian posterior distribution for λ, and the Bayesian estimator λ̂ = E[λ].
Answer: P1(λ|x1, x2) = ((1/3)λ⁸e^{−2λ}/(3!5!)) / ∫₀³ (1/3)λ⁸e^{−2λ}/(3!5!) dλ, and λ̂ = ∫₀³ λ⁹e^{−2λ} dλ / ∫₀³ λ⁸e^{−2λ} dλ = 2.472
iii. Find a two-tailed 90% Bayesian confidence interval for λ.
Answer: the CDF for the posterior is complicated - do trial and error integration to get
[1.677, 2.9625]
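Both posterior-mean integrals above can be evaluated numerically (midpoint rule; my addition, one way to do the "trial and error integration"):

```python
from math import exp

def integrate(g, a, b, m=20000):
    h = (b - a) / m
    return h * sum(g(a + (i + 0.5) * h) for i in range(m))  # midpoint rule

# posterior mean after x1 = 3 (the uniform prior truncates the support to [0, 3])
lam1 = integrate(lambda t: t**4 * exp(-t), 0, 3) / integrate(lambda t: t**3 * exp(-t), 0, 3)
# posterior mean after x1 = 3 and x2 = 5
lam2 = integrate(lambda t: t**9 * exp(-2 * t), 0, 3) / integrate(lambda t: t**8 * exp(-2 * t), 0, 3)
print(round(lam1, 3), round(lam2, 3))  # 2.095 2.472
```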
(c) Suppose your prior distribution is uniform on [0, M ] (you just did M = 3). Using x1 and x2, find limM→∞ λ̂, where λ̂ is the Bayesian estimator for λ.
Answer: it becomes a gamma distribution with α = 9 and β = .5, so the limit is 4.5.
130. Suppose X has a normal distribution with unknown mean µ but known variance 9. Your prior distribution for µ is N (17, 5). If you take n observations of X, and the sample mean is x̄ = 14, find the Bayesian estimator µ̂ as a function of n.
Answer: x̄ ∼ N (µ, 3/√n), so the posterior distribution is P1(µ) ∝ exp(−(µ − 17)²/50 − (µ − 14)²/(18/n)), which indicates a normal distribution with mean (9 · 17 + 25n · 14)/(9 + 25n), i.e., the precision-weighted mean ((1/25)17 + (n/9)14)/((1/25) + (n/9))
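The weighted-mean form is the standard conjugate-normal update, which can be sketched as a function of n (my addition):

```python
# posterior mean for a normal mean with known variance (conjugate normal prior);
# prior N(17, 5) i.e. variance 25, sampling variance 9/n, sample mean xbar = 14
def mu_hat(n, prior_mean=17, prior_var=25, xbar=14, sigma2=9):
    w_prior, w_data = 1 / prior_var, n / sigma2   # precisions
    return (w_prior * prior_mean + w_data * xbar) / (w_prior + w_data)

print(round(mu_hat(0), 6))    # 17.0 -- no data, just the prior mean
print(round(mu_hat(100), 3))  # 14.011 -- shrinking toward xbar as n grows
```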