
Solutions exam Probability III
October, 2016
Problem 1
Consider a probability space (Ω, F, P) with P(A) < 1 for all A ∈ F \ {Ω}. Let X and Y be
F-measurable random variables. Let n and m be integers, strictly larger than 1.
Assume that X only takes values in a set S_1 = {x_1, x_2, · · · , x_n} of n distinct elements and Y only
takes values in a set S_2 = {y_1, y_2, · · · , y_m} of m distinct elements. Assume that P(X = x_i) > 0
for all 1 ≤ i ≤ n and P(Y = y_j) > 0 for all 1 ≤ j ≤ m.
(a) Define σ(X), the σ-algebra generated by X.
(4p)
Solution: The σ-algebra generated by the random variable X is the smallest σ-algebra A such
that X is A-measurable. Equivalently, σ(X) = {{ω; X(ω) ∈ B}; B a Borel set}.
Assume that X and Y are independent.
(b) Show that σ(X) ∩ σ(Y ) = {Ω, ∅}.
Hint: Show that if A ∈ σ(X) and A ∈ σ(Y ) then P(A) is either 0 or 1.
(4p)
Solution: Assume that the set A is in both σ(X) and σ(Y). By the definition of σ(X) and σ(Y),
there are Borel sets B_X and B_Y such that A = {ω; X(ω) ∈ B_X} and A = {ω; Y(ω) ∈ B_Y}. By
independence, P(X ∈ B_X, Y ∈ B_Y) = P(X ∈ B_X)P(Y ∈ B_Y). But also by definition P(X ∈
B_X) = P({ω : X(ω) ∈ B_X}) = P(A), and similarly P(Y ∈ B_Y) = P(A); moreover P(X ∈ B_X, Y ∈ B_Y) = P(A ∩ A) = P(A).
Therefore, P(A) = P(A)², so P(A) is either 0 or 1.
Because X and Y are F-measurable, σ(X) ⊆ F and σ(Y) ⊆ F. By assumption, Ω is the only element of F with
probability 1, and therefore Ω^c = ∅ is the only element of F with probability 0. So, A is either
Ω or ∅, which finishes the answer.
(c) Let σ(X, Y) be the σ-algebra generated by X and Y. How many elements does σ(X) have,
and how many elements does σ(X, Y) have?
(4p)
Solution: σ(X) is generated by the partition P_X with elements
{ω; X(ω) = x_i} for i = 1, 2, · · · , n, which is a partition with n blocks. Furthermore, every element
of σ(X) is a union of elements of P_X, and all unions of elements of P_X are in σ(X) by
the definition of a σ-algebra. So σ(X) corresponds to the power set of P_X, which has 2^n elements.
For σ(X, Y), note that for no i ∈ {1, 2, · · · , n} and j ∈ {1, 2, · · · , m} is {ω; X(ω) =
x_i} ∩ {ω; Y(ω) = y_j} empty, since that would imply 0 < P(X = x_i)P(Y = y_j) = P(X =
x_i, Y = y_j) = 0, a contradiction. So σ(X, Y) is generated by a partition with n·m blocks,
and hence σ(X, Y) has 2^{nm} elements.
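Remark (not part of the exam solution): the counting argument can be checked by brute force for small n and m, by closing the generating partition under unions. A minimal Python sketch, where the sample space is identified with S_1 × S_2 and all names are ours:

    from itertools import product, combinations

    def sigma_algebra_size(blocks):
        # Every event in the sigma-algebra generated by a partition is a
        # union of blocks, so the count should be 2**len(blocks).
        events = set()
        for r in range(len(blocks) + 1):
            for combo in combinations(blocks, r):
                events.add(frozenset().union(*combo))
        return len(events)

    n, m = 3, 2
    omega = list(product(range(n), range(m)))  # identify omega with (X(omega), Y(omega))
    part_X = [frozenset(w for w in omega if w[0] == i) for i in range(n)]
    part_XY = [frozenset({w}) for w in omega]  # blocks {X = x_i, Y = y_j}
    print(sigma_algebra_size(part_X))   # 8  = 2**3
    print(sigma_algebra_size(part_XY))  # 64 = 2**6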
Problem 2
Let Z_1, Z_2, · · · be independent and identically distributed random variables with density given
by f_{Z_1}(x) = (1/2)e^{−|x|} for x ∈ R.
Define X_n = Σ_{k=1}^{n} Z_k.
a) Show that ψ_{Z_1}(t) = 1/(1 − t²) is the moment generating function of Z_1 and provide ψ_{X_n}(t), the
moment generating function of X_n.
(4p)
Solution: For |t| < 1 we have

ψ_{Z_1}(t) = E[e^{tZ_1}] = ∫_{−∞}^{∞} (1/2)e^{−|x|} e^{tx} dx = ∫_{−∞}^{0} (1/2)e^{x} e^{tx} dx + ∫_{0}^{∞} (1/2)e^{−x} e^{tx} dx = 1/(2(1 + t)) + 1/(2(1 − t)) = 1/(1 − t²)

and

ψ_{X_n}(t) = E[e^{t Σ_{k=1}^{n} Z_k}] = E[∏_{k=1}^{n} e^{tZ_k}] = ∏_{k=1}^{n} E[e^{tZ_k}] = (E[e^{tZ_1}])^n = (1/(1 − t²))^n.

Here we have used independence for the third equality, the identical distribution of the Z_k's for
the fourth equality, and ψ_{Z_1}(t) = 1/(1 − t²) for the final equality.
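As an optional sanity check (not asked for in the exam): f_{Z_1} is the standard Laplace density, so a Monte Carlo estimate of E[e^{tZ_1}] can be compared with 1/(1 − t²). A sketch in Python using numpy's Laplace sampler; t is kept below 1/2 so that the estimator also has finite variance:

    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.laplace(loc=0.0, scale=1.0, size=1_000_000)  # density (1/2)e^{-|x|}

    for t in (0.1, 0.25, 0.4):
        mc = np.exp(t * z).mean()      # Monte Carlo estimate of E[e^{tZ_1}]
        exact = 1.0 / (1.0 - t * t)    # psi_{Z_1}(t) computed above
        print(t, mc, exact)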
In what follows you may use without proof that the characteristic function of Z_1 is given by
φ_{Z_1}(t) = ψ_{Z_1}(it) = 1/(1 + t²).
Let N be a random variable satisfying P(N = n) = (λ^n/n!) e^{−λ} for n = 0, 1, · · · , i.e. N is Poisson
distributed with mean λ. Assume that N is independent of Z_1, Z_2, · · · . Let Y = X_N = Σ_{k=1}^{N} Z_k.
b) Show that the characteristic function of Y is given by
φ_Y(t) = exp(−λ t²/(1 + t²)).
(4p)
Solution: From a) we know that φ_{X_n}(t) = ψ_{X_n}(it) = 1/(1 + t²)^n. So,

φ_Y(t) = E[e^{it Σ_{k=1}^{N} Z_k}] = Σ_{n=0}^{∞} P(N = n) E[e^{it Σ_{k=1}^{N} Z_k} | N = n] = Σ_{n=0}^{∞} P(N = n) E[e^{it Σ_{k=1}^{n} Z_k}]
= Σ_{n=0}^{∞} (λ^n/n!) e^{−λ} φ_{X_n}(t) = e^{−λ} Σ_{n=0}^{∞} (λ/(1 + t²))^n / n! = exp(−λ(1 − 1/(1 + t²))) = exp(−λ t²/(1 + t²)).
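Again purely as an optional check: one can simulate Y = Σ_{k=1}^{N} Z_k directly and compare the empirical characteristic function with exp(−λ t²/(1 + t²)). A sketch; λ = 2 is an arbitrary demo value:

    import numpy as np

    rng = np.random.default_rng(1)
    lam, reps = 2.0, 20_000
    counts = rng.poisson(lam, size=reps)                       # one N per replication
    y = np.array([rng.laplace(size=k).sum() for k in counts])  # Y = sum of N Laplace terms

    for t in (0.5, 1.0, 2.0):
        emp = np.exp(1j * t * y).mean()             # empirical E[e^{itY}]
        exact = np.exp(-lam * t * t / (1 + t * t))
        print(t, emp.real, exact)                   # imaginary part is ~0 by symmetry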
c) Show that (2λ)−1/2 Y converges in distribution to a standard normal distribution as λ → ∞.
(4p)
Solution: By b) we have φ_{(2λ)^{−1/2} Y}(t) = φ_Y((2λ)^{−1/2} t) = exp(−λ t²/(2λ + t²)). As λ → ∞, we have
exp(−λ t²/(2λ + t²)) → exp(−t²/2), which is the characteristic function of a standard normal distribution.
We know that convergence of characteristic functions implies convergence in distribution.
So we are done.
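Numerically, the limit is also visible: for large λ the standardized variable W = (2λ)^{−1/2} Y has variance close to 1 (since Var(Y) = λ E[(Z_1)²] = 2λ) and approximately Gaussian tail probabilities. A rough sketch, with λ = 200 chosen only for the demonstration:

    import numpy as np

    rng = np.random.default_rng(2)
    lam, reps = 200.0, 20_000
    counts = rng.poisson(lam, size=reps)
    y = np.array([rng.laplace(size=k).sum() for k in counts])
    w = y / np.sqrt(2 * lam)             # (2*lambda)**(-1/2) * Y
    print(w.var())                       # close to 1
    print((np.abs(w) <= 1.96).mean())    # close to 0.95 for a standard normal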
Problem 3
Assume in all subproblems that Z, Y, X, X_1, X_2, · · · are random variables defined on the same
probability space (Ω, F, P).
a) Show that the following statements are equivalent:
• E(|Y|) < ∞,
• for all ε > 0, there exists δ > 0 such that E(|Y| 1(A)) < ε for all events A ∈ F satisfying
P(A) < δ.
(6p)
Solution: Suppose that for all ε > 0 there exists δ = δ(ε) > 0 such that E(|Y| 1(A)) < ε for all
A satisfying P(A) < δ. Fix ε > 0 and let x > 0 be such that P(|Y| > x) < δ(ε). Then for all y > x we
have

E(|Y|) = E(|Y| 1(|Y| ≤ y)) + E(|Y| 1(|Y| > y)) ≤ y P(|Y| ≤ y) + ε ≤ y + ε < ∞.

If on the other hand E(|Y|) < ∞, then E(|Y| 1(|Y| > x)) → 0 as x → ∞. So, there exists y such that
E(|Y| 1(|Y| > y)) < ε/2. Now note that

E(|Y| 1(A)) = E(|Y| 1(A ∩ {|Y| > y})) + E(|Y| 1(A ∩ {|Y| ≤ y}))
≤ E(|Y| 1(|Y| > y)) + y P(A ∩ {|Y| ≤ y}) ≤ ε/2 + y P(A).

The statement follows by choosing δ = ε/(2y).
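The step E(|Y| 1(|Y| > y)) → 0 used above can be illustrated numerically; here with Y exponential(1), a distribution chosen only for this demonstration (for Exp(1) the tail expectation is exactly (y + 1)e^{−y}):

    import numpy as np

    rng = np.random.default_rng(3)
    samples = rng.exponential(scale=1.0, size=1_000_000)  # E|Y| = 1 < infinity
    for y in (1.0, 5.0, 10.0):
        tail = (samples * (samples > y)).mean()   # estimates E(|Y| 1(|Y| > y))
        print(y, tail, (y + 1) * np.exp(-y))      # decreases to 0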
b) Let X_n → X in probability as n → ∞ and let Z be a non-negative random variable with
E[Z] < ∞, such that |X_n| ≤ Z for all n. Show that X_n → X in mean as n → ∞.
(6p)
Solution: X_n → X in probability means P(|X_n − X| ≥ κ) → 0 for all κ > 0. Therefore, X_n → X in probability and
|X_n| ≤ Z imply that P(|X| ≥ Z + κ) = 0 for all κ > 0, and therefore |X| is almost surely less
than or equal to Z. This implies that |X_n − X| ≤ 2Z almost surely, which in turn
implies that E(|X_n − X|) < ∞.
Observe further that

E(|X_n − X|) = E(|X_n − X| 1(|X_n − X| > κ)) + E(|X_n − X| 1(|X_n − X| ≤ κ))
≤ κ + E(|X_n − X| 1(|X_n − X| > κ)) ≤ κ + E(2Z 1(|X_n − X| > κ)).

Fix ε > 0 and apply part (a) with 2Z in the role of Y (note E(2Z) < ∞): there is a δ > 0 such that
E(2Z 1(A)) < ε whenever P(A) < δ. Since P(|X_n − X| > κ) → 0, there is an integer N such that for all n > N we have
P(|X_n − X| > κ) < δ, and hence

E(|X_n − X|) ≤ κ + ε.

Sending κ and ε to 0 gives the desired result.
Problem 4
Consider a supercritical Galton-Watson branching process {Z_0, Z_1, Z_2, · · · } with Z_0 = 1. That
is, let {X_{ij}}_{i=0,1,2,··· ; j=1,2,···} be independent and identically distributed with expectation m > 1
and variance σ² < ∞. Define Z_0 = 1 and Z_{k+1} = Σ_{j=1}^{Z_k} X_{kj}, for k ≥ 0.
a) Show that for all n ≥ 1

E[Z_n] = m^n   and   E[(Z_n)²] = m² E[(Z_{n−1})²] + m^{n−1} σ².

Deduce from this (e.g. by induction) that for n ≥ 1,

E[(Z_n)²] = m^{2n} + σ² Σ_{k=1}^{n} m^{n−k} m^{2(k−1)} = m^{2n} + σ² m^{n−1} (m^n − 1)/(m − 1).
(4p)
Solution: First, E[Z_n] = E[E[Z_n | Z_{n−1}]] = E[m Z_{n−1}] = m E[Z_{n−1}], so by induction E[Z_n] = m^n.
For the second moment,

E[(Z_n)²] = E[E[(Z_n)² | Z_{n−1}]] = E[E[(Σ_{j=1}^{Z_{n−1}} X_{n−1,j})(Σ_{k=1}^{Z_{n−1}} X_{n−1,k}) | Z_{n−1}]]
= E[Σ_{j=1}^{Z_{n−1}} E[(X_{n−1,j})² | Z_{n−1}]] + 2 E[E[Σ_{j=1}^{Z_{n−1}−1} Σ_{k=j+1}^{Z_{n−1}} X_{n−1,j} X_{n−1,k} | Z_{n−1}]]
= E[Z_{n−1}](σ² + m²) + E[Z_{n−1}(Z_{n−1} − 1)] m² = m^{n−1} σ² + m² E[(Z_{n−1})²].
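A hedged simulation check of both moment formulas; the offspring law is taken here to be Poisson(m) (so σ² = m), an arbitrary choice satisfying m > 1 and σ² < ∞. The sketch uses that a sum of z iid Poisson(m) variables is Poisson(mz):

    import numpy as np

    rng = np.random.default_rng(4)
    m, n_gen, reps = 1.5, 5, 100_000
    z = np.ones(reps, dtype=np.int64)
    for _ in range(n_gen):
        z = rng.poisson(m * z)  # next generation: sum of z iid Poisson(m) offspring

    sigma2 = m  # Poisson(m) offspring variance
    print(z.mean(), m ** n_gen)  # E[Z_n] = m^n
    print((z.astype(float) ** 2).mean(),
          m ** (2 * n_gen) + sigma2 * m ** (n_gen - 1) * (m ** n_gen - 1) / (m - 1))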
b) Show that W_n = m^{−n} Z_n converges almost surely to a random variable W as n → ∞. (4p)
Solution: E[W_{n+1} | Z_0, Z_1, · · · , Z_n] = m^{−(n+1)} E[Z_{n+1} | Z_n] = m^{−(n+1)} m Z_n = m^{−n} Z_n = W_n. Also
E[|W_n|] = E[W_n] = E[W_0] = 1 < ∞. So, W_0, W_1, · · · is a martingale with respect to the filtration
generated by Z_0, Z_1, · · · . Furthermore, we have E[(W_n)²] = m^{−2n} E[(Z_n)²] = 1 + σ² (1 − m^{−n})/(m² − m) →
1 + σ² (m² − m)^{−1} < ∞. Here we have used that m > 1. The martingale convergence theorem
now gives the statement of the question.
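On a single simulated trajectory one can watch W_n = m^{−n} Z_n stabilize (same hypothetical Poisson(m) offspring law as in the sketch above; on extinction W_n simply freezes at 0):

    import numpy as np

    rng = np.random.default_rng(5)
    m, z = 1.5, 1
    for n in range(1, 21):
        z = int(rng.poisson(m * z))  # next generation size (0 stays 0)
        print(n, z / m ** n)         # W_n fluctuates early, then settles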
c) Show that as n → ∞,

(Σ_{i=1}^{n} Z_i)/(Σ_{j=0}^{n−1} Z_j) − m = (Σ_{i=1}^{n} (Z_i − m Z_{i−1}))/(Σ_{j=0}^{n−1} Z_j) → 0 a.s.

on {ω; Σ_{k=0}^{∞} Z_k(ω) = ∞}.
(4p)
Solution: Use Theorem 21 of the cheat sheet, with S_n = Σ_{i=1}^{n} (Z_i − m Z_{i−1}) and f(x) = max(1, x).
First observe that S_n is a martingale, because E[Z_i − m Z_{i−1} | Z_0, · · · , Z_{i−1}] = 0 for all i. Then, using that
E[(Z_i − m Z_{i−1})(Z_j − m Z_{j−1})] = 0 for i ≠ j, so that Z_i − m Z_{i−1} and Z_j − m Z_{j−1}
are uncorrelated, we obtain for all n

E[(S_n)²] = Σ_{i=1}^{n} E[(Z_i − m Z_{i−1})²] = Σ_{i=1}^{n} σ² E[Z_{i−1}] < ∞.

Note that

⟨S⟩_n = Σ_{k=1}^{n} E((S_k − S_{k−1})² | F_{k−1}) = Σ_{k=1}^{n} Var(Z_k | Z_{k−1}) = σ² Σ_{k=1}^{n} Z_{k−1},

so Σ_{j=0}^{n−1} Z_j = σ^{−2} ⟨S⟩_n, and thus (Σ_{i=1}^{n} (Z_i − m Z_{i−1}))/(Σ_{j=0}^{n−1} Z_j) = σ² S_n/⟨S⟩_n. On the event
{Σ_{k=0}^{∞} Z_k = ∞} we have ⟨S⟩_n → ∞, so ⟨S⟩_n = f(⟨S⟩_n) eventually, and applying the second part of Theorem 21
gives the result.
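A quick numerical look at the statement, conditioning on survival by simply retrying until the process is alive at generation n (Poisson(m) offspring law again assumed only for the demo):

    import numpy as np

    rng = np.random.default_rng(6)
    m, n_gen = 1.5, 25
    while True:
        zs = [1]
        for _ in range(n_gen):
            zs.append(int(rng.poisson(m * zs[-1])))
        if zs[-1] > 0:  # crude conditioning on {sum Z_k = infinity}
            break
    print(sum(zs[1:]) / sum(zs[:-1]), m)  # ratio should be close to m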
Problem 5
In a dark cloakroom there are K coats belonging to K people. Assume that K > 1. Each
of these K people picks a coat at random (such that all matchings of coats with people have
equal probability). People who pick their own coat leave, the others put the coats back in the
cloakroom and repeat the procedure. Let S_n be the number of people that are present at the
n-th round (so, S_1 = K) and let N = min{n; S_n = 0}. Let F be the filtration generated by
{S_1, S_2, · · · }. Let M_n = S_n + min(n, N).
a) Let X_n be the number of people that find their coat in round n. Show that if S_n ≥ 2, then
E(X_n) = 1 and Var(X_n) = 1.
(4p)
Hint: Write, for S_n = k, X_n = Σ_{j=1}^{k} I_j, where I_j is 1 if the j-th person gets his or her own coat
and 0 otherwise. Similarly, write

(X_n)² = (Σ_{i=1}^{k} I_i)(Σ_{j=1}^{k} I_j) = Σ_{i=1}^{k} (I_i)² + 2 Σ_{i=1}^{k−1} Σ_{j=i+1}^{k} I_i I_j.
Solution: Using the hint, we obtain E[X_n] = E[Σ_{i=1}^{S_n} I_i] = Σ_{i=1}^{S_n} E[I_i] = Σ_{i=1}^{S_n} P(I_i = 1).
Here P(I_i = 1) = 1/S_n for all i (every coat has the same probability of ending up in person i's hands).
So, E[X_n] = 1.
Similarly,

E[(X_n)²] = Σ_{i=1}^{S_n} E[(I_i)²] + 2 Σ_{i=1}^{S_n−1} Σ_{j=i+1}^{S_n} E[I_i I_j] = Σ_{i=1}^{S_n} E[I_i] + 2 Σ_{i=1}^{S_n−1} Σ_{j=i+1}^{S_n} P(I_i = 1, I_j = 1).

The first sum equals 1, by the computation above. The double sum has S_n(S_n − 1)/2 terms
and P(I_i = 1, I_j = 1) = (1/S_n) × (1/(S_n − 1)), where the first factor is the probability that person i picks the right
coat out of S_n coats and the second factor is the probability that, after i picked the right coat, person j
picks the right coat as well from the remaining S_n − 1 coats. Therefore, the double sum equals 1/2,
so E[(X_n)²] = 1 + 2 × (1/2) = 2. The proof is finished by observing that Var(X_n) = E[(X_n)²] − (E[X_n])² = 1.
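For what it is worth, E(X_n) = Var(X_n) = 1 is the classical fixed-point count of a uniform random permutation, and is easy to confirm by simulation:

    import numpy as np

    rng = np.random.default_rng(7)
    k, reps = 6, 200_000
    fixed = np.array([(rng.permutation(k) == np.arange(k)).sum() for _ in range(reps)])
    print(fixed.mean(), fixed.var())  # both close to 1 for k >= 2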
b) Show that M_n is an F-martingale and compute E(N).
Hint: Note that S_n = 1 is not possible for any n and use an optional stopping theorem.
(4p)
Solution: E(|M_n|) ≤ K + n < ∞. Furthermore, if n ≥ N, then S_n = 0 and min(n, N) = N,
so it is trivial that E(M_{n+1} | F_n) = N = M_n on that event. If on the other hand n < N, then S_n ≥ 2 and we have

E(M_{n+1} | F_n) = E(S_{n+1} + n + 1 | F_n) = E(S_n − X_n + n + 1 | F_n) = M_n − 1 + 1 = M_n,

using E(X_n | F_n) = 1 from part (a). So, M_n is a martingale.
Furthermore, note that the maximal jump size is at most K − 1, and the probability that all coats
end up with the right people at the n-th trial is at least 1/K! (which implies that N is stochastically
smaller than a geometrically distributed random variable with expectation K!, which is finite). We
can use optional stopping theorem III to obtain that E(N) = E(S_N + N) = E(S_1 + 1) = K + 1.
c) Show that (M_n)² + S_n is an F-martingale and compute Var(N).
(4p)
Solution: E[(S_n + min(n, N))² + S_n] ≤ (K + n)² + K < ∞. Assume that n < N. Then

E[(M_{n+1})² + S_{n+1} | F_n] = E[(S_{n+1} + n + 1)² + S_{n+1} | F_n] = E[(S_n − X_n + n + 1)² + S_n − X_n | F_n]
= (M_n)² − 2M_n E(X_n − 1 | F_n) + E[(X_n − 1)² | F_n] + S_n − E[X_n | F_n]
= (M_n)² − 2M_n × 0 + Var(X_n) + S_n − 1 = (M_n)² + S_n,

as required. The n ≥ N case is trivial.
We can use optional stopping theorem III again to obtain that E(N²) = E[(M_N)² + S_N] = E[(M_1)² + S_1] = (K + 1)² + K. Therefore,

Var(N) = E(N²) − [E(N)]² = (K + 1)² + K − (K + 1)² = K.
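Both answers, E(N) = K + 1 and Var(N) = K, can be checked by simulating the whole procedure; a sketch (the helper below is our own, not part of the exam):

    import numpy as np

    rng = np.random.default_rng(8)

    def rounds_until_empty(K):
        # Simulate the cloakroom: each round is a uniform permutation of the
        # remaining coats; people with a fixed point leave.
        s, n = K, 0
        while s > 0:
            n += 1
            s -= int((rng.permutation(s) == np.arange(s)).sum())
        return n

    K, reps = 5, 50_000
    ns = np.array([rounds_until_empty(K) for _ in range(reps)])
    print(ns.mean(), K + 1)  # E(N) = K + 1
    print(ns.var(), K)       # Var(N) = K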