Math 561: Theory of Probability (Spring 2016)
Lecture 17: Stopping time and conditional expectation
Lecturer: Partha S. Dey
Scribe: Amir Taghvaei <[email protected]>
Definition 17.1. A (discrete) filtration is an increasing sequence of σ-fields,
F1 ⊆ F2 ⊆ · · ·
E.g., let X1, X2, . . . be i.i.d. r.v.s. Then Fn := σ(X1, . . . , Xn) is a filtration.
Definition 17.2. A sequence of events (An)n≥1 is adapted to the filtration (Fn)n≥1 if
An ∈ Fn, ∀n ≥ 1.
A sequence of r.v.s (Xn)n≥1 is adapted to the filtration (Fn)n≥1 if
σ(Xn) ⊆ Fn, ∀n ≥ 1.
Definition 17.3. A sequence of events (An)n≥1 is predictable w.r.t. the filtration (Fn)n≥1 if
An+1 ∈ Fn, ∀n ≥ 1.
A sequence of r.v.s (Xn)n≥1 is predictable w.r.t. the filtration (Fn)n≥1 if
σ(Xn+1) ⊆ Fn, ∀n ≥ 1.
Definition 17.4. A stopping time w.r.t. the filtration (Fn)n≥1 is a r.v. T : Ω → N s.t.
{T = n} ∈ Fn, ∀n ≥ 1. (17.1)
Example 17.5. Let X1, X2, . . . be i.i.d. r.v.s and Sn := X1 + · · · + Xn. Then
TA = inf{n | Sn ∈ A} (17.2)
is a stopping time (hitting time) with respect to the filtration Fn := σ(X1, . . . , Xn).
Proof.
{T = n} = {Sn ∈ A; Si ∉ A, i = 1, . . . , n − 1} ∈ σ(X1, . . . , Xn) = Fn. (17.3)
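The hitting time above can be sketched in code. The following is a minimal Python illustration (the function name, the ±1 steps, and the target set are assumptions for the example, not part of the lecture); the key point is that the function decides {T = n} by scanning X1, . . . , Xn only, never looking ahead.

```python
import random

def hitting_time(steps, A):
    """Return T_A = inf{n >= 1 : S_n in A} for the partial sums
    S_n = X_1 + ... + X_n, or None if the walk never enters A."""
    s = 0
    for n, x in enumerate(steps, start=1):
        s += x          # s is now S_n
        if s in A:
            return n    # decided using X_1, ..., X_n alone
    return None

# Simple random walk with +/-1 steps; A = {3}, i.e. hit level 3.
random.seed(0)
steps = [random.choice([-1, 1]) for _ in range(10_000)]
print(hitting_time(steps, {3}))
```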
Example 17.6. Let X1, . . . , XN be i.i.d. r.v.s. Then
T = min{i | Xi = max(X1, . . . , XN)} (17.4)
is not a stopping time w.r.t. Fn := σ(X1, . . . , Xn).
Proof.
{T = n} = {X1, . . . , Xn−1 < Xn; Xn+1, . . . , XN ≤ Xn} ∉ σ(X1, . . . , Xn), (17.5)
since this event depends on the future values Xn+1, . . . , XN.
Example 17.7. Consider Example 17.5 with Fn := σ(X1, . . . , Xn+1). Then the sequence of events ({TA = n})n≥1 is predictable.
Lemma 17.8. T is a stopping time w.r.t. the filtration (Fn) if and only if
{T ≤ n} ∈ Fn, ∀n, or equivalently {T > n} ∈ Fn, ∀n. (17.6)
Lemma 17.9. If T and S are two stopping times w.r.t. the same filtration (Fn), then S + T, S ∨ T and S ∧ T are stopping times.
Proof. For S + T:
{S + T = n} = ∪_{k=0}^{n} ({S = k} ∩ {T = n − k}) ∈ Fn, (17.7)
since {S = k} ∈ Fk ⊆ Fn and {T = n − k} ∈ Fn−k ⊆ Fn. For S ∨ T,
{S ∨ T = n} = ({S = n} ∩ {T ≤ n}) ∪ ({T = n} ∩ {S ≤ n}) (17.8)
and use Lemma 17.8. Similarly for S ∧ T,
{S ∧ T = n} = ({S = n} ∩ {T ≥ n}) ∪ ({T = n} ∩ {S ≥ n}), (17.9)
where {T ≥ n} = {T ≤ n − 1}^c ∈ Fn−1 ⊆ Fn by Lemma 17.8.
Example 17.10. Consider the setup in Example 17.5. Then
TA ∧ TB = TA∪B,
while in general only TA ∨ TB ≤ TA∩B, since the walk may visit A and B at different times without ever visiting A ∩ B.
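A quick simulation makes the identity TA ∧ TB = TA∪B and the inequality TA ∨ TB ≤ TA∩B concrete. The sketch below (target sets and walk parameters are illustrative assumptions) encodes "never hit" as infinity so the comparisons still make sense.

```python
import random

def hitting_time(steps, A):
    """T_A = inf{n >= 1 : S_n in A} for S_n = X_1 + ... + X_n;
    the infimum of the empty set is represented by float('inf')."""
    s = 0
    for n, x in enumerate(steps, start=1):
        s += x
        if s in A:
            return n
    return float("inf")

random.seed(1)
A, B = {-2, 5}, {3, 5}
for _ in range(1000):
    steps = [random.choice([-1, 1]) for _ in range(200)]
    TA, TB = hitting_time(steps, A), hitting_time(steps, B)
    # min of hitting times is exactly the hitting time of the union
    assert min(TA, TB) == hitting_time(steps, A | B)
    # max of hitting times only bounds the hitting time of the intersection
    assert max(TA, TB) <= hitting_time(steps, A & B)
```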
Definition 17.11. Given σ-fields G ⊆ F and a r.v. X ∈ L1(Ω, F, P), we define E(X|G) to be a r.v. Y s.t.:
1. Y is G-measurable.
2. E(X1A) = E(Y 1A), ∀A ∈ G, or equivalently E((X − Y)Z) = 0 for all bounded G-measurable Z.
Lemma 17.12. Conditional expectation is unique (up to a.s. equality).
Proof. Suppose Y1, Y2 are conditional expectations of X given G. Take W := Y1 − Y2, which is G-measurable. Then
E(W 1A) = 0, ∀A ∈ G.
Take A = {W > ε} ∈ G. Then
0 = E(W 1{W >ε}) ≥ εP(W > ε) ⇒ P(W > ε) = 0.
Also take A = {W < −ε} ∈ G. Then
0 = E(W 1{W <−ε}) ≤ −εP(W < −ε) ⇒ P(W < −ε) = 0.
Therefore P(W ∈ [−ε, ε]) = 1 for all ε > 0. Hence P(W = 0) = 1, so Y1 = Y2 a.s.
The following theorem and lemma will be useful to prove the existence of the conditional expectation.
Theorem 17.13 (Lebesgue–Radon–Nikodym). Let µ and λ be two σ-finite positive measures on (Ω, F) such that µ ≪ λ (µ is absolutely continuous w.r.t. λ, i.e. λ(A) = 0 ⇒ µ(A) = 0). Then there exists a measurable function f ≥ 0, written f = dµ/dλ, s.t.
µ(A) = ∫_A f(x) λ(dx), ∀A ∈ F.
Lemma 17.14. Let K ⊆ H be a closed subspace of a Hilbert space H. Then for all X ∈ H there exists a unique decomposition X = Y + Z s.t. Y ∈ K and Z ∈ K⊥.
Lemma 17.15. Conditional expectation exists.
Here are several proofs of this lemma.
Proof. (Measure theory) Assume X ≥ 0. To apply Theorem 17.13, let λ := P and µ(A) := ∫_A X dP for all A ∈ G. Then λ, µ are positive finite measures on (Ω, G) (µ is finite since X ∈ L1) and µ ≪ λ. By Theorem 17.13 there exists f s.t.
µ(A) = ∫_A f dP ⇒ E(X1A) = E(f 1A), ∀A ∈ G.
So f is the conditional expectation. In general write X = X+ − X−; then
E(X | G) = E(X+ | G) − E(X− | G).
Proof. (Functional analysis) First assume X ∈ L2(F) = H (a Hilbert space). Since L2(G) is a closed subspace, by Lemma 17.14 there exists a unique decomposition X = Y + Z s.t. Y ∈ L2(G) and Z ∈ L2(G)⊥. So
E((X − Y)W) = 0, ∀W ∈ L2(G) ⇒ Y = E(X|G).
This proof can be generalized to X ∈ L1 by approximation.
Proof. (“Hands-on” proof) Assume |G| < ∞. Then G = σ(C1, . . . , Ck) with Ci ∩ Cj = ∅ for i ≠ j and ∪_{i=1}^{k} Ci = Ω. Any G-measurable r.v. Z is of the form
Z = Σ_{i=1}^{k} ai 1Ci.
Then the conditional expectation is
E(X|G) = Y = Σ_{i=1}^{k} (E(X1Ci)/E(1Ci)) 1Ci,
since for all G-measurable r.v.s Z = Σ_{i=1}^{k} ai 1Ci,
E(XZ) = Σ_{i=1}^{k} ai E(X1Ci),
E(Y Z) = Σ_{i=1}^{k} ai Σ_{j=1}^{k} (E(X1Cj)/E(1Cj)) E(1Cj 1Ci) = Σ_{i=1}^{k} ai E(X1Ci),
using that E(1Cj 1Ci) = 0 for j ≠ i and E(1Ci 1Ci) = E(1Ci).
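The finite-partition formula Y = Σ_i (E(X1Ci)/E(1Ci)) 1Ci translates directly into code. Below is a small Python sketch under the empirical (uniform-over-samples) measure, where E(X1Ci)/E(1Ci) becomes the average of X over cell Ci; the function name and the sample data are assumptions for the illustration.

```python
import random

def cond_exp(xs, labels):
    """E(X | G) for G generated by a finite partition: on each cell
    C_i the conditional expectation equals E(X 1_{C_i}) / E(1_{C_i}),
    which for the empirical measure is the average of X over C_i."""
    cells = {}
    for x, c in zip(xs, labels):
        cells.setdefault(c, []).append(x)
    means = {c: sum(v) / len(v) for c, v in cells.items()}
    return [means[c] for c in labels]  # Y, constant on each cell

# Check the defining property E(XZ) = E(YZ) for a G-measurable Z.
random.seed(2)
xs = [random.gauss(0, 1) for _ in range(10)]
labels = [i % 3 for i in range(10)]     # partition into 3 cells
ys = cond_exp(xs, labels)
a = {0: 2.0, 1: -1.0, 2: 0.5}           # Z = sum_i a_i 1_{C_i}
lhs = sum(x * a[c] for x, c in zip(xs, labels))   # n * E(XZ)
rhs = sum(y * a[c] for y, c in zip(ys, labels))   # n * E(YZ)
assert abs(lhs - rhs) < 1e-9
```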
The generalization to |G| = ∞ is discussed in the next lecture.