Solutions to HW Assignment 1
Course: Theory of Probability II
University of Texas at Austin
Instructor: Mihai Sîrbu
Semester: Spring 2016
Problem 1.1. Let (Ω, F, {Fn }n≥0 , P) be a filtered probability space and T a stopping time.
(1) Show that
FT := {A ∈ F|A ∩ {T = n} ∈ Fn (∀) n}
is a sigma-field.
(2) If {Xn }n is an adapted process, then the process {Xn∧T }n is also adapted, and the random
variable XT I{T <∞} is FT -measurable.
Solution:
(1) This item is just an easy application of the definition of a sigma-algebra: for instance, if A ∈ FT, then Ac ∩ {T = n} = {T = n} \ (A ∩ {T = n}) ∈ Fn for every n, so FT is closed under complementation; closure under countable unions is checked in the same way.
(2) Let B be any Borel measurable set of R (or, in general, any measurable set of the state space).
Then we have
{Xn∧T ∈ B} = [∪_{k=0}^{n−1} ({T = k} ∩ {Xk ∈ B})] ∪ [{T ≥ n} ∩ {Xn ∈ B}] ∈ Fn.
In addition
{XT I{T <∞} ∈ B} ∩ {T = n} = {Xn ∈ B} ∩ {T = n} ∈ Fn
for each n, so
{XT I{T <∞} ∈ B} ∈ FT
by the definition of FT .
Problem 1.2. Let 0 ≤ S ≤ T be two stopping times with respect to the discrete-time filtration
(Fn)n≥0. Let A ∈ FS. Show that the random time T′ defined by
T′ = S 1A + T 1Ac
is also a stopping time.
Solution: We have that
{T′ = n} = ({S = n} ∩ A) ∪ ({S ≤ n} ∩ {T = n} ∩ Ac) ∈ Fn,
since {S = n} ∩ A ∈ Fn (because A ∈ FS) and
{S ≤ n} ∩ Ac = {S ≤ n} \ ({S ≤ n} ∩ A) ∈ Fn.
(Here we used that {T = n} ⊂ {S ≤ n}, since S ≤ T.)
Problem 1.3.
(1) (Doob’s decomposition). Let {Xn , Fn } be a submartingale. Show that it
can be decomposed uniquely as
Xn = Mn + An , n = 0, 1, . . .
where {Mn , Fn } is a martingale and the process {An }n is increasing, predictable with
respect to the filtration {Fn }n and A0 = 0. Find the processes M and A in terms of the
process X.
(2) Let Xn = Σ_{k=1}^n 1Bk for Bn ∈ Fn. What is the Doob decomposition for Xn?
Solution: 1. Use the decomposition relation for n + 1 and for n and subtract term by term to get
Xn+1 − Xn = (Mn+1 − Mn) + (An+1 − An).
Taking conditional expectation with respect to Fn and using the martingale property for M and
the fact that A is predictable we obtain
E[Xn+1 − Xn |Fn ] = An+1 − An .
Now
An = A0 + (A1 − A0 ) + · · · + (An − An−1 ) = E[X1 − X0 |F0 ] + · · · + E[Xn − Xn−1 |Fn−1 ].
We can also see that
Mn+1 − Mn = Xn+1 − Xn − (An+1 − An ) = Xn+1 − E[Xn+1 |Fn ]
and since M0 = X0 we have
Mn = X0 + (X1 − E[X1 |F0 ]) + · · · + (Xn − E[Xn |Fn−1 ]).
So far we have only shown uniqueness, but the existence of this decomposition is easily proven
by taking these relations as definitions for M and A.
2. Since Xn − Xn−1 = 1Bn and E[1Bn|Fn−1] = P(Bn|Fn−1), we conclude that
An = Σ_{i=1}^n P(Bi|Fi−1)
and
Mn = Xn − An = Σ_{i=1}^n (1Bi − P(Bi|Fi−1)).
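This decomposition is easy to check numerically. Below is a minimal simulation sketch (our own illustration, not part of the assignment), assuming the concrete choice Bk = {the k-th flip of a p-coin shows heads}, so that P(Bk|Fk−1) = p and An = np:

    import numpy as np

    # Doob decomposition of X_n = sum_{k<=n} 1_{B_k}, assuming
    # B_k = {k-th flip of a p-coin shows heads}, so P(B_k | F_{k-1}) = p.
    rng = np.random.default_rng(0)
    p, n_steps, n_paths = 0.3, 50, 100_000

    flips = rng.random((n_paths, n_steps)) < p     # the indicators 1_{B_k}
    X = np.cumsum(flips, axis=1)                   # X_n
    A = p * np.arange(1, n_steps + 1)              # A_n = np, the predictable part
    M = X - A                                      # martingale part, M_0 = 0

    # E[M_n] should vanish for every n, up to Monte Carlo error.
    print(np.abs(M.mean(axis=0)).max())            # small, e.g. ~1e-3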
Problem 1.4. Let Y1 , Y2 , . . . be non-negative i.i.d. with E[Yn ] = 1 and P(Yn = 1) < 1.
(1) Show that Xn = ∏_{k=1}^n Yk is a martingale.
(2) Use the martingale convergence theorem to conclude that Xn → 0 a.s.
(3) Use the SLLN to show that
log(Xn)/n → c < 0, a.s.
Solution: 1. E[Xn+1|Fn] = E[XnYn+1|Fn] = XnE[Yn+1|Fn] = Xn, since Yn+1 is independent of
the sigma-algebra Fn generated by X1, . . . , Xn.
3. We have that log(Xn) = log(Y1) + · · · + log(Yn). The hypotheses ensure that (log Y)⁺ ∈ L¹
(since log y ≤ y − 1) and that E[log Y] < log E[Y] = 0 by Jensen (it CAN be −∞; the inequality
is strict because log is strictly concave and P(Y = 1) < 1). If the expectation is finite, then we
can apply the SLLN. If the expectation is −∞, we can TRUNCATE log Yn at a negative level −C
and then let C → ∞ to conclude that
(log(Y1) + · · · + log(Yn))/n → E[log Y] ∈ [−∞, 0).
2. Since log(Xn)/n → c < 0, we get Xn → 0 a.s.
One can also prove this part directly, using the martingale convergence theorem for the non-negative
martingale Xn, concluding that Xn → X∞ ≥ 0 a.s. Now, if X∞ > 0 then Yn = Xn/Xn−1 → 1, an
event of probability zero (since the Yn are i.i.d. with P(Yn = 1) < 1, Borel–Cantelli shows they
stay a fixed distance away from 1 infinitely often), and we obtain a contradiction.
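Both conclusions are easy to see in a quick simulation. A minimal sketch (our own illustration), assuming the concrete choice of Yn uniform on {1/2, 3/2}, so that E[Yn] = 1 while E[log Yn] = log(3/4)/2 ≈ −0.144:

    import numpy as np

    # Product martingale X_n = Y_1 * ... * Y_n with Y uniform on {1/2, 3/2}:
    # E[Y] = 1, but E[log Y] = 0.5*log(3/4) < 0, so X_n -> 0 a.s.
    rng = np.random.default_rng(1)
    n_steps, n_paths = 2000, 10

    Y = rng.choice([0.5, 1.5], size=(n_paths, n_steps))
    logX = np.cumsum(np.log(Y), axis=1)       # log X_n, path by path

    print(logX[:, -1] / n_steps)   # each entry close to 0.5*log(0.75) ~ -0.144
    print(np.exp(logX[:, -1]))     # X_n itself: numerically indistinguishable from 0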
Problem 1.5. (An example of a Uniformly Integrable Martingale which is not in H1 ). Consider
Ω = N, F = 2^N and the probability measure P defined by
P[k] = 2^{−k}, k = 1, 2, . . . .
Consider now the random variable K such that K(k) = k, k = 1, 2 . . . .
(1) find an explicit example of a random variable Y : Ω → [1, ∞) such that E[Y ] < ∞ and
E[Y K] = ∞.
(2) consider the filtration
Fn = σ ({1}, {2}, . . . , {n − 1}, {n, n + 1, . . . }) .
Find an expression for Xn = E[Y |Fn ].
(3) Show that the martingale (Xn )n is not in H1 (note that the martingale is UI by the way it
is defined).
Solution:
(1) We can define Y(k) = 2^k/k². Then E[Y] = Σ_{k≥1} 2^{−k} · 2^k/k² = Σ_{k≥1} 1/k² < ∞,
while E[Y K] = Σ_{k≥1} 1/k = ∞. (Strictly speaking, to ensure Y ≥ 1 one should take
Y(k) = max(2^k/k², 1), which changes neither computation in an essential way.)
(2) It is clear (the formal proof is not very short though!) that
Xn(k) = Y(k), k = 1, 2, . . . , n − 1,
and
Xn(n) = Xn(n + 1) = · · · = 2^{−1}Y(n) + 2^{−2}Y(n + 1) + · · ·
(3) We see that X*(n) ≥ Xn(n) for every point n ∈ Ω, where X* := supm |Xm|, so
E[X*] ≥ Σ_{n=1}^∞ 2^{−n} Xn(n) = Σ_{n=1}^∞ 2^{−n} Σ_{k=n}^∞ 2^{n−k−1} · 2^k/k²
= (1/2) Σ_{n=1}^∞ Σ_{k=n}^∞ 1/k² = (1/2) Σ_{k=1}^∞ 1/k = ∞,
so the martingale is not in H¹.
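A quick numerical sanity check of the three series above (our own illustration): the partial sums for E[Y] stabilize at π²/6, while those for E[Y K] and for the lower bound on E[X*] grow like the logarithm of the cutoff:

    import numpy as np

    # Partial sums of the three series from Problem 1.5, with Y(k) = 2^k / k^2.
    for K in (10**2, 10**4, 10**6):
        k = np.arange(1, K + 1, dtype=float)
        print(K,
              (1 / k**2).sum(),      # E[Y]: converges to pi^2/6 ~ 1.6449
              (1 / k).sum(),         # E[Y K]: grows like log K
              (1 / (2 * k)).sum())   # lower bound for E[X*]: ~ 0.5 log K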
Problem 1.6.
(1) Suppose (Xn)n is a submartingale such that E[|Xn+1 − Xn| |Fn] ≤ B a.s. for
each n, where B is a constant. Show that if N is a stopping time such that E[N] < ∞, then
(Xn∧N)n is u.i., so, according to the optional sampling theorem, E[X0] ≤ E[XN].
(2) (Wald I): Let ξ1, ξ2, . . . be i.i.d. such that ξi ∈ L¹. If Sn = ξ1 + · · · + ξn and N is a stopping
time such that E[N] < ∞, then
E[SN] = E[N]E[ξ].
Please note that this implies that the first hitting time of level one for the symmetric
random walk has INFINITE expectation.
Solution: 1. supn |Xn∧N| ≤ |X0| + |X1 − X0| + · · · + |XN − XN−1| = |X0| + Σ_{n=0}^∞ |Xn+1 − Xn| 1{N>n}.
Taking into account that {N > n} = {N ≤ n}^c ∈ Fn, we can condition each term on Fn to conclude that
E[supn |Xn∧N|] ≤ E[|X0|] + Σ_{n=0}^∞ E[E[|Xn+1 − Xn| |Fn] 1{N>n}] ≤ E[|X0|] + B Σ_{n=0}^∞ P(N > n) = E[|X0|] + B E[N] < ∞.
This means that the sequence (Xn∧N)n is dominated by an integrable random variable, hence u.i.
2. Define the process Mn = Sn − nE[ξ]. We have Mn+1 − Mn = ξn+1 − E[ξn+1], so E[Mn+1 −
Mn|Fn] = 0, and
E[|Mn+1 − Mn| |Fn] ≤ E[|ξ − E[ξ]|] < ∞.
This means that M is a martingale satisfying the assumption of Part 1 (applied to both M and
−M), so E[MN] = E[M0] = 0, which means E[SN] = E[N]E[ξ].
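Wald's identity is pleasant to verify by Monte Carlo. A minimal sketch (our own illustration), assuming a biased walk with P(ξ = +1) = 0.6, so E[ξ] = 0.2, and N the first hitting time of level 1; then SN = 1 always, and the identity forces E[N] = 1/0.2 = 5:

    import numpy as np

    # Monte Carlo check of Wald I: E[S_N] = E[N] E[xi] for a biased +-1 walk
    # with P(xi = +1) = 0.6 and N = first hitting time of level 1.
    rng = np.random.default_rng(2)
    n_paths = 50_000
    hitting_times = np.empty(n_paths)

    for i in range(n_paths):
        s, n = 0, 0
        while s < 1:
            s += 1 if rng.random() < 0.6 else -1
            n += 1
        hitting_times[i] = n

    print(hitting_times.mean())    # ~ 5.0, as predicted by E[N] = E[S_N]/E[xi]

Running the same loop with 0.5 in place of 0.6 still terminates a.s., but the sample mean of N keeps growing as n_paths increases, consistent with the INFINITE expectation noted in the statement.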
Problem 1.7. (Quadratic variation of square integrable martingales). Consider a square integrable
martingale, by which we mean that X0 = 0 and Xn ∈ L2 for each n. We can define the square
bracket (or the quadratic variation) of the martingale by
[X, X]n = Σ_{k=1}^n (Xk − Xk−1)².
At the same time, we define the angle bracket, or the predictable quadratic variation process ⟨X, X⟩,
as the predictable increasing process in the Doob decomposition
X² = M + ⟨X, X⟩.
Show that
(1) ⟨X, X⟩n = Σ_{k=1}^n E[(Xk − Xk−1)²|Fk−1], and the predictable process ⟨X, X⟩ is the compensator in the Doob decomposition of [X, X].
(2) let T be a stopping time (possibly unbounded, even infinite). Show that
E[⟨X, X⟩T] = E[[X, X]T].
(note that the random variables above are actually well defined, since both processes are nondecreasing)
(3) show that
E[sup_{0≤k≤T} |Xk|²] ≤ 4E[⟨X, X⟩T] = 4E[[X, X]T].
(4) assume that
E[⟨X, X⟩T] < ∞.
Then the processes
(Xn∧T)² − [X, X]n∧T, n = 0, 1, 2 . . . ,
(Xn∧T)² − ⟨X, X⟩n∧T, n = 0, 1, 2 . . . ,
are uniformly integrable martingales. In particular, this means that X∞ is well defined on
the set {T = ∞} for such a stopping time T and
E[(XT)²] = E[⟨X, X⟩T] = E[[X, X]T].
(5) show that X∞ = limn Xn exists and is finite on {⟨X, X⟩∞ < ∞}.
Solution:
(1) Using the “fundamental property of martingales”, we can easily check that X² − [X, X] is a
martingale: indeed, (Xn+1)² − (Xn)² − (Xn+1 − Xn)² = 2Xn(Xn+1 − Xn), which has zero
conditional expectation given Fn. Therefore, the predictable compensator of X² (in the
Doob decomposition) is the same as the predictable compensator of [X, X]. Writing now the
explicit representation for the predictable compensator of [X, X] (Problem 1.3) we get the answer.
(2) according to the item above, [X, X] − ⟨X, X⟩ is a martingale. In particular, for each fixed
n we have
E[⟨X, X⟩n∧T] = E[[X, X]n∧T].
We can now let n → ∞ and use Monotone Convergence.
(3) We apply the Maximal Inequality for the stopped process X^T up to time n, and use the
martingale property of the process X² − [X, X] stopped at T between time 0 and n. After
that we let n go to infinity.
(4) using the previous items, we see that both random variables in each difference are bounded
by something integrable, namely (Xn∧T)² ≤ sup_{0≤k≤T} |Xk|² (integrable by item (3)) and
[X, X]n∧T ≤ [X, X]T, ⟨X, X⟩n∧T ≤ ⟨X, X⟩T (integrable by item (2)), as in class for continuous martingales.
(5) let Tk = inf{n | ⟨X, X⟩n+1 ≥ k}. Because ⟨X, X⟩ is predictable, Tk is a stopping time
(indeed, {Tk ≤ n} = {⟨X, X⟩n+1 ≥ k} ∈ Fn). We also have that
⟨X, X⟩Tk ≤ k and {⟨X, X⟩∞ < k} ⊂ {Tk = ∞}.
According to part 4, X∞ = limn Xn is well defined on {Tk = ∞}, so the conclusion follows
by letting k → ∞.
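These identities can be sanity-checked by simulation. The sketch below (our own illustrative construction) uses a martingale transform Xn = Σ_{k≤n} Hkξk with a bounded predictable integrand Hk = 1/(1 + |Xk−1|) and centered steps ξk, for which [X, X]n = Σ_k Hk²ξk² and ⟨X, X⟩n = E[ξ²] Σ_k Hk²; the sample means of (Xn)², [X, X]n and ⟨X, X⟩n should all agree:

    import numpy as np

    # Martingale transform X_n = sum_k H_k xi_k with predictable
    # H_k = 1/(1 + |X_{k-1}|) and xi_k uniform on {-2, -1, 1, 2}
    # (E[xi] = 0, E[xi^2] = 2.5). Then [X,X]_n = sum_k H_k^2 xi_k^2,
    # <X,X>_n = 2.5 * sum_k H_k^2, and E[X_n^2] = E[[X,X]_n] = E[<X,X>_n].
    rng = np.random.default_rng(3)
    n_steps, n_paths = 20, 200_000

    X = np.zeros(n_paths)
    sq_bracket = np.zeros(n_paths)      # [X,X]_n, pathwise
    angle_bracket = np.zeros(n_paths)   # <X,X>_n, the predictable compensator
    for _ in range(n_steps):
        H = 1.0 / (1.0 + np.abs(X))     # predictable: a function of X_{k-1}
        xi = rng.choice([-2.0, -1.0, 1.0, 2.0], size=n_paths)
        X += H * xi
        sq_bracket += (H * xi) ** 2
        angle_bracket += 2.5 * H ** 2

    # Three estimates of the same number, up to Monte Carlo error:
    print((X ** 2).mean(), sq_bracket.mean(), angle_bracket.mean())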
Problem 1.8. (Wald II) Use the notation of exercise 9.4.2 and assume that E[ξ] = 0 and E[ξ²] =
σ² < ∞. Use exercise 9.5.3 to show that if N is a stopping time such that E[N] < ∞ then
E[(SN)²] = σ²E[N].
Solution: Since E[ξ] = 0, it is an easy exercise to show that (Sn)n is a martingale. Denote
Yn = (Sn)² − nσ². We have that
Yn+1 − Yn = (Sn+1)² − (Sn)² − σ² = (Sn + ξn+1)² − (Sn)² − σ² = 2Snξn+1 + (ξn+1)² − σ²,
so
E[Yn+1 − Yn|Fn] = 2SnE[ξn+1] + E[(ξn+1)²] − σ² = 0,
which means that Y is a martingale. This shows that the process An = nσ² is the predictable
quadratic variation ⟨S, S⟩ of the martingale S. We can now apply the previous problem to finish
the proof.
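As with Wald I, a Monte Carlo check is immediate. A minimal sketch (our own illustration), assuming the symmetric ±1 walk (σ² = 1) and N the first exit time from (−5, 5); then (SN)² = 25 always, so the identity predicts E[N] = 25:

    import numpy as np

    # Monte Carlo check of Wald II: E[S_N^2] = sigma^2 E[N] for the
    # symmetric +-1 walk with N = first exit time from (-5, 5).
    rng = np.random.default_rng(4)
    n_paths = 20_000
    exit_times = np.empty(n_paths)

    for i in range(n_paths):
        s, n = 0, 0
        while abs(s) < 5:
            s += 1 if rng.random() < 0.5 else -1
            n += 1
        exit_times[i] = n

    print(exit_times.mean())    # ~ 25.0 = E[S_N^2] / sigma^2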
Problem 1.9. (de la Vallée Poussin criterion) A set of random variables (Xi)i∈I is u.i. if and only
if there exists a function ψ : [0, ∞) → [0, ∞) such that limx→∞ ψ(x)/x = ∞ and
sup_{i∈I} E[ψ(|Xi|)] < ∞.
The function ψ can be chosen to be convex.
Solution: One implication is quite easy. More precisely, if there exists such a function ψ, then,
for each ε > 0 there exists M(ε) such that
ψ(x)/x ≥ 1/ε, x ≥ M(ε).
Now, for any M ≥ M(ε) we have
E[|Xi| 1{|Xi|≥M}] ≤ εE[ψ(|Xi|) 1{|Xi|≥M}] ≤ ε sup_{i∈I} E[ψ(|Xi|)].
For the other implication, let us recall that, if ψ is (piecewise) C¹ and ψ(0) = 0, then
E[ψ(|X|)] = ∫_0^∞ ψ′(x)P(|X| ≥ x)dx.
We also have that
E[|X| 1{|X|≥M}] = ∫_0^∞ P(|X| 1{|X|≥M} ≥ x)dx ≥ ∫_M^∞ P(|X| ≥ x)dx.
Using the definition of uniform integrability and the last two displays, we can therefore choose an
increasing sequence an ↗ ∞ (with an increasing very slowly) such that
sup_{i∈I} Σ_{n=0}^∞ an ∫_n^{n+1} P(|Xi| ≥ x)dx < ∞.
We can now define the piecewise-C¹ function ψ by ψ(0) = 0 and ψ′(x) = an for x ∈ (n, n + 1),
and finish the proof: ψ(x)/x → ∞ because an ↗ ∞, while E[ψ(|Xi|)] = Σ_{n=0}^∞ an ∫_n^{n+1} P(|Xi| ≥ x)dx
is bounded uniformly in i ∈ I by the display above.
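For example, plugging ψ(x) = x^p into the easy implication recovers the standard fact that, for any fixed p > 1, a family of random variables bounded in L^p is uniformly integrable.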
Problem 1.10. (backward martingales) Let (Ω, F, P) be a probability space and {Fn}n≥0 a “decreasing filtration”, which means that Fn+1 ⊂ Fn. Let {Xn, Fn}n be a backward martingale, which
means that Xn ∈ L¹(Ω, Fn, P), n ≥ 0, and Xn+1 = E[Xn|Fn+1]. Show that Xn converges a.s. and
in L¹ to some variable X∞ and that X∞ = E[Xn|F∞], where F∞ = ∩nFn.
Note: you already know from Probability I that backward martingales converge a.s., so the
point is to see that the convergence holds in L¹ as well.
Solution: As already seen in Probability I (and mentioned above), we know that
Xn → X∞ ∈ F∞ , a.s.
Since Xn = E[X0|Fn], we know (for example from Hw 1) that (Xn)n is a UI sequence. This implies
that X∞ is integrable and that Xn → X∞ in L¹.
Now, let A ∈ F∞ . Let also k ≥ n. Since Xk = E[Xn |Fk ], we have
E[Xn 1A ] = E[Xk 1A ].
Letting k → ∞ we obtain
E[Xn 1A ] = E[X∞ 1A ], (∀)A ∈ F∞ ,
or
X∞ = E[Xn |F∞ ].
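A standard example to keep in mind: if ξ1, ξ2, . . . are i.i.d. and integrable, Sn = ξ1 + · · · + ξn, and Fn = σ(Sn, Sn+1, . . . ), then Xn = Sn/n is a backward martingale, and the convergence above (combined with the Hewitt–Savage zero-one law, which makes F∞ trivial here) yields the strong law of large numbers.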
Problem 1.11. (backward submartingales) Let (Ω, F, P) be a probability space and {Fn}n≥0 a
“decreasing filtration” as above. Let {Xn, Fn}n be a backward submartingale, which means that
Xn ∈ L¹(Ω, Fn, P), n ≥ 0, and Xn+1 ≤ E[Xn|Fn+1]. Show that {Xn}n is u.i. if and only if
inf_n E[Xn] > −∞.
Solution: We can use a (backward) upcrossing inequality to conclude that there exists an
F∞-measurable X∞ such that
Xn → X∞ a.s.
The positive part Xn⁺ satisfies Xn⁺ ≤ E[X0⁺|Fn] (by Jensen and iteration), so (Xn⁺)n is UI;
splitting each Xn into Xn⁺ − Xn⁻, it is therefore enough to show that E[Xn] ↘ E[X∞] (then
E[Xn⁻] = E[Xn⁺] − E[Xn] → E[X∞⁻], and Scheffé's lemma upgrades the a.s. convergence of Xn⁻
to L¹ convergence, hence u.i.).
Actually, just using Fatou for the negative part (since the positive part converges in L¹) we have
that E[X∞] ≥ limn E[Xn], the limit being finite by assumption. Now, for k ≥ n we have
Xk ≤ E[Xn|Fk]. Letting k → ∞ and using the previous problem, we have that
X∞ ≤ E[Xn|F∞],
so E[X∞] ≤ E[Xn] for every n, hence E[Xn] ↘ E[X∞], finishing this implication. For the converse,
simply note that a u.i. family is bounded in L¹, so inf_n E[Xn] > −∞.