Theory of Probability - Brett Bernstein
Homework 8 Solutions
Due Monday, July 20th at the beginning of class
1. (DeGroot) A civil engineer is studying a left-turn lane that is long enough to hold 4
cars. Let X be the number of cars in the lane at the end of a randomly chosen red
light. The engineer believes that the probability that X = x is C(x + 1)(8 − x) for
x = 0, . . . , 4 (the possible values of X) where C > 0 is the same for all x.
(a) Find the PMF of X.
(b) Find the probability that X will be at least 3.
Solution.
(a) Solving, we have
\[
C\left[(1)(8) + (2)(7) + (3)(6) + (4)(5) + (5)(4)\right] = 80C,
\]
so C = 1/80. Thus p_X(x) = (x + 1)(8 − x)/80 for x = 0, . . . , 4, and 0 otherwise.
(b) The answer is
\[
P(X \geq 3) = p_X(3) + p_X(4) = \frac{4 \cdot 5 + 5 \cdot 4}{80} = \frac{1}{2}.
\]
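As a quick numerical check of (a) and (b), not part of the assigned solution, the normalizing constant and P(X ≥ 3) can be recomputed exactly with Python's `fractions` module; the variable names are illustrative:

```python
from fractions import Fraction

# Unnormalized masses (x + 1)(8 - x) for x = 0, ..., 4.
weights = [(x + 1) * (8 - x) for x in range(5)]
C = Fraction(1, sum(weights))                 # normalizing constant
pmf = {x: C * w for x, w in zip(range(5), weights)}

print(C)                  # 1/80
print(sum(pmf.values()))  # 1 (the PMF sums to one)
print(pmf[3] + pmf[4])    # P(X >= 3) = 1/2
```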
2. A group of n people all have distinct heights. They are waiting in a straight line at the
bank (one person in front of the other), with all orderings of the people equally likely.
A person can see ahead if they are taller than everyone in front of them.
(a) What is the probability that the ith person in line can see ahead (where the first
person is at the front of the line, the second is behind the first, etc)?
(b) What is the expected number of people in line that can see ahead? [Hint: Linearity
of expectation.]
Solution.
(a) The ith person in line can see ahead if and only if they are the tallest of the first i people, which has probability 1/i. For a more explicit calculation, let S denote the space of all n! possible orderings, each equally likely. The number of orderings with the ith person able to see ahead is
\[
\binom{n}{i}(i - 1)!(n - i)! = \frac{n!}{i},
\]
as there are \binom{n}{i} choices for the first i people, (i − 1)! orderings of them that have the ith person tallest, and (n − i)! orderings of the remaining n − i people. Dividing by |S| = n! gives probability 1/i.
(b) By linearity of expectation, the expected number is
\[
\sum_{i=1}^{n} \frac{1}{i}.
\]
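The 1/i probability and the harmonic-sum expectation can be checked by simulation; the helper `count_can_see` and the choice n = 6 below are illustrative, not part of the solution:

```python
import random
from fractions import Fraction

def count_can_see(order):
    """Count people taller than everyone in front of them."""
    tallest_so_far = float('-inf')
    count = 0
    for h in order:            # walk the line from front to back
        if h > tallest_so_far:
            count += 1
            tallest_so_far = h
    return count

n = 6
harmonic = sum(Fraction(1, i) for i in range(1, n + 1))  # exact expected count

random.seed(0)
trials = 20000
total = sum(count_can_see(random.sample(range(n), n)) for _ in range(trials))
print(float(harmonic))   # 2.45 for n = 6
print(total / trials)    # simulated mean, close to 2.45
```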
3. (DeGroot) Suppose that when a machine is adjusted properly 50 percent of the items
made are high quality, and 50 percent are medium quality. If the machine is adjusted
improperly, then 25 percent are of high quality, and 75 percent are of medium quality. Assume the machine is adjusted improperly 10 percent of the time. [As usual,
conditional on the adjustment of the machine, the items are independent.]
(a) The machine is adjusted, and then you look at 5 produced items. If 4 out of 5
are high quality, what is the chance it was adjusted properly?
(b) After the 5 above were produced, you look at another item, and find it is of
medium quality (making a total of 6 observed items). What is your new posterior
probability that the machine was adjusted properly?
Solution.
(a) Let A be the event the machine is adjusted properly, and E the event that 4 out
of 5 are of high quality. Then
\[
P(A|E) = \frac{P(E|A)P(A)}{P(E|A)P(A) + P(E|A^c)P(A^c)}
= \frac{\binom{5}{4}(.5)^5(.9)}{\binom{5}{4}(.5)^5(.9) + \binom{5}{4}(.25)^4(.75)(.1)}
= \frac{96}{97} \approx .99.
\]
As all of the \binom{5}{4} coefficients cancel, we see that saying 4 out of 5 are high quality, or specifying a particular sequence like HHHHM or MHHHH, gives the same conditional probability. This reflects the idea that when collecting independent samples and updating beliefs, the order of the samples carries no information.
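A sketch verifying the posterior 96/97 exactly (variable names are illustrative; the priors .9 and .1 are from the problem statement):

```python
from fractions import Fraction
from math import comb

p_A = Fraction(9, 10)                                        # P(properly adjusted)
like_A = comb(5, 4) * Fraction(1, 2) ** 5                    # P(4 of 5 high | A)
like_Ac = comb(5, 4) * Fraction(1, 4) ** 4 * Fraction(3, 4)  # P(4 of 5 high | A^c)

posterior = like_A * p_A / (like_A * p_A + like_Ac * (1 - p_A))
print(posterior)   # 96/97
```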
(b) We solve this in two different ways.
i. For our first solution, let A denote the event the machine is adjusted properly,
and F the event we have 4 out of 5 high quality, followed by a medium quality.
This gives
\[
P(A|F) = \frac{P(F|A)P(A)}{P(F|A)P(A) + P(F|A^c)P(A^c)}
= \frac{\binom{5}{4}(.5)^6(.9)}{\binom{5}{4}(.5)^6(.9) + \binom{5}{4}(.25)^4(.75)^2(.1)}
= \frac{64}{65} \approx .984.
\]
As stated earlier, we could have also just used any sequence with 4 H’s (for
high quality) and 2 M’s (for medium quality) as the binomial coefficients just
cancel.
ii. For our second solution, let P_E be the probability measure defined by P_E(B) = P(B|E), where E is defined as in part (a). Let G denote the event of getting
a medium quality item. Then we have
\[
P_E(A|G) = \frac{P_E(G|A)P_E(A)}{P_E(G|A)P_E(A) + P_E(G|A^c)P_E(A^c)}
= \frac{(.5)\frac{96}{97}}{(.5)\frac{96}{97} + (.75)\frac{1}{97}}
= \frac{64}{65} \approx .984.
\]
Note that P_E(G|A) = P(G|A) because, conditional on the machine's adjustment, the items are independent.
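One can also verify numerically that the one-shot update on all six items agrees with updating the part (a) posterior 96/97 by a single medium-quality item; the values below all come from the problem statement:

```python
from fractions import Fraction

p_A = Fraction(9, 10)

# One-shot update on all six items (4 high then 2 medium; binomial
# coefficients cancel, so a single sequence suffices).
like_A = Fraction(1, 2) ** 6
like_Ac = Fraction(1, 4) ** 4 * Fraction(3, 4) ** 2
one_shot = like_A * p_A / (like_A * p_A + like_Ac * (1 - p_A))

# Sequential: start from the part (a) posterior, update on one medium item.
prior = Fraction(96, 97)
seq = (Fraction(1, 2) * prior
       / (Fraction(1, 2) * prior + Fraction(3, 4) * (1 - prior)))
print(one_shot, seq)   # both 64/65
```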
4. Suppose we have two coins, one with probability p1 of getting heads, and the other
probability p2 . We flip each coin once. Letting X denote the total number of heads,
compute the PMF pX (k) for all k ∈ R.
Solution. For k \notin \{0, 1, 2\} we have p_X(k) = 0. Otherwise,
\[
p_X(0) = (1 - p_1)(1 - p_2), \quad
p_X(1) = p_1(1 - p_2) + (1 - p_1)p_2, \quad
p_X(2) = p_1 p_2.
\]
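The PMF can be checked by brute-force enumeration of the four outcomes; the sample values of p_1 and p_2 below are arbitrary:

```python
from itertools import product
from fractions import Fraction

def pmf_total_heads(p1, p2):
    """PMF of total heads from two independent flips, by enumeration."""
    pmf = {0: Fraction(0), 1: Fraction(0), 2: Fraction(0)}
    for f1, f2 in product([0, 1], repeat=2):       # all four outcomes
        prob = (p1 if f1 else 1 - p1) * (p2 if f2 else 1 - p2)
        pmf[f1 + f2] += prob
    return pmf

p1, p2 = Fraction(1, 3), Fraction(1, 4)            # illustrative values
pmf = pmf_total_heads(p1, p2)
print(pmf[1])                                      # p1(1-p2) + (1-p1)p2 = 5/12
print(pmf[1] == p1 * (1 - p2) + (1 - p1) * p2)     # True
```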
5. Suppose we flip n coins each obtaining heads with probability 0 < p < 1.
(a) Give the sample space and probability measure.
(b) Let Xk denote the indicator random variable for the kth flip being heads (i.e., 1
if heads, 0 if tails). Show the Xk , k = 1, . . . , n are independent random variables.
Solution.
(a) Let S be defined by
\[
S = \{(f_1, \ldots, f_n) : f_i \in \{H, T\}\}.
\]
We use the general finite sample space where
\[
P(\{(f_1, \ldots, f_n)\}) = p^{\#H}(1 - p)^{\#T},
\]
with \#H and \#T denoting the number of heads and tails in (f_1, \ldots, f_n).
(b) Note that
\[
P(X_1 = x_1, \ldots, X_n = x_n) = p^{\#H}(1 - p)^{\#T} = \prod_{i=1}^{n} P(X_i = x_i)
\]
for any x_1, \ldots, x_n. Thus they are independent. We could have also written the formula above as
\[
p^{\sum_{i=1}^{n} x_i}(1 - p)^{n - \sum_{i=1}^{n} x_i}.
\]
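The factorization can be verified exhaustively for a small n; the values n = 4 and p = 2/5 below are illustrative:

```python
from itertools import product
from fractions import Fraction

n, p = 4, Fraction(2, 5)   # illustrative values

for xs in product([0, 1], repeat=n):               # every outcome (x_1, ..., x_n)
    heads = sum(xs)
    joint = p ** heads * (1 - p) ** (n - heads)    # p^{#H} (1-p)^{#T}
    prod_marginals = Fraction(1)
    for x in xs:
        prod_marginals *= p if x == 1 else 1 - p   # P(X_i = x_i)
    assert joint == prod_marginals
print("joint equals product of marginals for all", 2 ** n, "outcomes")
```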
6. Let n be a fixed positive integer, and let X be a random variable that takes on the
values 1, . . . , n. Suppose pX (k) = Ck for k = 1, . . . , n, where C > 0 is the same for
all k.
(a) Determine C.
(b) What is E[X]?
Solution.
(a) Solving for C we have
\[
1 = \sum_{k=1}^{n} Ck = \frac{Cn(n + 1)}{2} \implies C = \frac{2}{n(n + 1)}.
\]
(b) We have
\[
E[X] = \sum_{k=1}^{n} \frac{2k^2}{n(n + 1)} = \frac{2n(n + 1)(2n + 1)}{6n(n + 1)} = \frac{2n + 1}{3}.
\]
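A quick exact check of both answers for a sample value of n (the choice n = 10 is arbitrary):

```python
from fractions import Fraction

n = 10
C = Fraction(2, n * (n + 1))                  # part (a): C = 2 / (n(n+1))
pmf = {k: C * k for k in range(1, n + 1)}
mean = sum(k * pk for k, pk in pmf.items())   # part (b): E[X]

print(sum(pmf.values()))   # 1 (the PMF sums to one)
print(mean)                # (2n + 1)/3 = 7 for n = 10
```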