Theoretical Tutorial Session 1

Xiaoming Song
Department of Mathematics
Drexel University
July 25, 2016
Basic properties of Brownian motion

1. Self-similarity: For any $a > 0$, the process $\{a^{-1/2} B_{at},\, t \ge 0\}$ is also a Brownian motion.

Proof: Let $Y_t = a^{-1/2} B_{at}$. The process $\{Y_t,\, t \ge 0\}$ is a Gaussian process with zero mean and continuous trajectories, so it suffices to show that for any $0 \le s \le t < +\infty$ the covariance is $E(Y_t Y_s) = s$, which is proved as follows:
$$E(Y_t Y_s) = E\bigl(a^{-1/2} B_{at}\, a^{-1/2} B_{as}\bigr) = a^{-1} E(B_{at} B_{as}) = a^{-1}(as) = s.$$
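The scaling identity can be checked by direct simulation: sampling the Gaussian pair $(B_{as}, B_{at})$ and verifying that $Y_t = a^{-1/2} B_{at}$ has the Brownian covariance $E(Y_t Y_s) = \min(s, t)$. A minimal sketch (the constants, sample size, and tolerance are arbitrary choices, not part of the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Check self-similarity: if B is a Brownian motion, then Y_t = a^{-1/2} B_{at}
# should again have the Brownian covariance E(Y_t Y_s) = min(s, t).
a, s, t = 4.0, 0.5, 1.0
n = 500_000

B_as = rng.normal(0.0, np.sqrt(a * s), size=n)               # B_{as} ~ N(0, as)
B_at = B_as + rng.normal(0.0, np.sqrt(a * (t - s)), size=n)  # independent increment

Y_s = a ** -0.5 * B_as
Y_t = a ** -0.5 * B_at
est = np.mean(Y_s * Y_t)
print(est)  # close to min(s, t) = 0.5
```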
2. For any $h > 0$, the process $\{B_{t+h} - B_h,\, t \ge 0\}$ is a Brownian motion.

Proof: Let $Y_t = B_{t+h} - B_h$. The process $\{Y_t,\, t \ge 0\}$ is a Gaussian process with zero mean and continuous trajectories. Next, we calculate its covariance function: for any $0 \le s \le t < +\infty$,
$$\begin{aligned}
E(Y_t Y_s) &= E\bigl((B_{t+h} - B_h)(B_{s+h} - B_h)\bigr)\\
&= E(B_{t+h} B_{s+h}) - E(B_h B_{s+h}) - E(B_{t+h} B_h) + E(B_h B_h)\\
&= (s+h) - h - h + h = s.
\end{aligned}$$
Therefore, the process $\{B_{t+h} - B_h,\, t \ge 0\}$ is a Brownian motion.
3. The process $\{-B_t,\, t \ge 0\}$ is a Brownian motion.

Proof: This process is a Gaussian process with zero mean and continuous trajectories, and for any $0 \le s \le t < +\infty$,
$$E\bigl((-B_t)(-B_s)\bigr) = E(B_t B_s) = s.$$
Thus, the process $\{-B_t,\, t \ge 0\}$ is a Brownian motion.
Chebyshev's inequality

If $\xi$ is a random variable with $E(|\xi|^p) < \infty$, then for any $\lambda > 0$,
$$P(|\xi| > \lambda) \le \frac{E(|\xi|^p)}{\lambda^p}.$$
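A quick numerical illustration of the inequality for a standard normal $\xi$ with $p = 2$ (the distribution, sample size, and $\lambda$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Empirically compare P(|xi| > lam) with E(|xi|^p) / lam^p for xi ~ N(0,1).
xi = rng.normal(size=1_000_000)
p, lam = 2, 1.5

lhs = np.mean(np.abs(xi) > lam)           # empirical P(|xi| > lam)
rhs = np.mean(np.abs(xi) ** p) / lam**p   # empirical E(|xi|^p) / lam^p
print(lhs, rhs)  # lhs should not exceed rhs
```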
Borel-Cantelli lemma

Lemma. Let $A_1, A_2, A_3, \ldots$ be a sequence of events in a probability space. If $\sum_{n=1}^{\infty} P(A_n) < \infty$, then the probability that infinitely many of them occur is 0, that is,
$$P(\limsup_{n\to\infty} A_n) = 0.$$
Note that
$$\limsup_{n\to\infty} A_n = \bigcap_{n=1}^{\infty} \bigcup_{k=n}^{\infty} A_k.$$
Strong law of large numbers

Theorem. If $\xi_1, \xi_2, \xi_3, \ldots$ is a sequence of iid random variables and $\mu = E(\xi_1) < \infty$, then
$$\frac{\xi_1 + \xi_2 + \cdots + \xi_n}{n} = \frac{\sum_{i=1}^n \xi_i}{n} \to \mu$$
almost surely as $n \to \infty$.
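The theorem can be visualised by tracking the running average of one long iid sample; a minimal sketch with $\mathrm{Exp}(1)$ samples, so $\mu = 1$ (the distribution and sequence length are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Running averages of a single long iid Exp(1) sequence settle near mu = 1.
xi = rng.exponential(scale=1.0, size=1_000_000)
running_avg = np.cumsum(xi) / np.arange(1, xi.size + 1)
print(running_avg[-1])  # close to mu = 1
```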
4. Almost surely $\lim_{t\to\infty} \frac{B_t}{t} = 0$, and the process
$$X_t = \begin{cases} t B_{1/t}, & t > 0;\\ 0, & t = 0 \end{cases}$$
is a Brownian motion.

Proof: We first show that $\lim_{n\to\infty} \frac{B_n}{n} = 0$, which follows from the strong law of large numbers since
$$\frac{B_n}{n} = \frac{\sum_{i=1}^{n} (B_i - B_{i-1})}{n}$$
and $B_1 - B_0, B_2 - B_1, \ldots, B_n - B_{n-1}, \ldots$ are iid with law $N(0,1)$.
Then for any $t > 0$ with $t \in [n, n+1)$ for some $n$, we have
$$\left|\frac{B_t}{t} - \frac{B_n}{n}\right| = \left|\frac{B_t - B_n}{t} + B_n\Bigl(\frac{1}{t} - \frac{1}{n}\Bigr)\right| \le \frac{\sup_{s\in[n,n+1]} |B_s - B_n|}{t} + |B_n|\left|\frac{1}{t} - \frac{1}{n}\right| \le \frac{\sup_{s\in[n,n+1]} |B_s - B_n|}{n} + \frac{|B_n|}{n}. \tag{1}$$
Note that $\lim_{n\to\infty} \frac{B_n}{n} = 0$ implies
$$\lim_{n\to\infty} \frac{|B_n|}{n} = 0. \tag{2}$$
Since the random variables
$$\Bigl\{\sup_{s\in[n-1,n]} |B_s - B_{n-1}|,\ n \ge 1\Bigr\}$$
are iid with the same law as the integrable random variable $\sup_{s\in[0,1]} |B_s|$, using the strong law of large numbers again we can show that
$$\frac{\sum_{i=1}^{n} \sup_{s\in[i-1,i]} |B_s - B_{i-1}|}{n} \to E\Bigl(\sup_{s\in[0,1]} |B_s|\Bigr) < \infty$$
almost surely, as $n \to \infty$.
Then the above results yield
$$\begin{aligned}
\frac{\sup_{s\in[n,n+1]} |B_s - B_n|}{n}
&= \frac{\sum_{i=1}^{n+1} \sup_{s\in[i-1,i]} |B_s - B_{i-1}|}{n} - \frac{\sum_{i=1}^{n} \sup_{s\in[i-1,i]} |B_s - B_{i-1}|}{n}\\
&= \frac{n+1}{n}\cdot\frac{\sum_{i=1}^{n+1} \sup_{s\in[i-1,i]} |B_s - B_{i-1}|}{n+1} - \frac{\sum_{i=1}^{n} \sup_{s\in[i-1,i]} |B_s - B_{i-1}|}{n}\\
&\to E\Bigl(\sup_{s\in[0,1]} |B_s|\Bigr)(1) - E\Bigl(\sup_{s\in[0,1]} |B_s|\Bigr) = 0, 
\end{aligned} \tag{3}$$
as $n \to \infty$.

The fact that
$$\frac{|B_t|}{t} \le \left|\frac{B_t}{t} - \frac{B_n}{n}\right| + \frac{|B_n|}{n},$$
together with (1), (2) and (3), implies
$$\lim_{t\to\infty} \frac{B_t}{t} = 0.$$
For the process $X$, using $\lim_{t\to\infty} \frac{B_t}{t} = 0$ almost surely (which gives continuity at $t = 0$, since $X_t = t B_{1/t} = \frac{B_{1/t}}{1/t} \to 0$ as $t \to 0^+$), we can prove that the trajectories are continuous almost surely.

The process $X$ is a Gaussian process with zero mean. For any $0 < s \le t < \infty$ (the case $s = 0$ being trivial), we have
$$E(X_t X_s) = E\bigl((t B_{1/t})(s B_{1/s})\bigr) = ts\, E(B_{1/t} B_{1/s}) = ts\,(1/t) = s.$$
Therefore, $X$ is also a Brownian motion.
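The time-inversion covariance can also be checked by sampling the Gaussian pair $(B_{1/t}, B_{1/s})$ directly and verifying $E(X_t X_s) = \min(s,t)$; a minimal sketch (times and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)

# Check that X_t = t * B_{1/t} has the Brownian covariance min(s, t).
s, t = 0.5, 2.0  # so 1/t = 0.5 < 1/s = 2.0
n = 500_000

B_inv_t = rng.normal(0.0, np.sqrt(1 / t), size=n)                    # B_{1/t}
B_inv_s = B_inv_t + rng.normal(0.0, np.sqrt(1 / s - 1 / t), size=n)  # B_{1/s}

X_s = s * B_inv_s
X_t = t * B_inv_t
est = np.mean(X_s * X_t)
print(est)  # close to min(s, t) = 0.5
```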
Fatou's lemma

Lemma. Let $\xi_1, \xi_2, \xi_3, \ldots$ be a sequence of non-negative random variables on a probability space $(\Omega, \mathcal F, P)$; then
$$E(\liminf_{n\to\infty} \xi_n) \le \liminf_{n\to\infty} E(\xi_n).$$

Let $\xi_1, \xi_2, \xi_3, \ldots$ be a sequence of non-negative random variables on a probability space $(\Omega, \mathcal F, P)$. If there exists a non-negative integrable random variable $\eta$ such that $\xi_n \le \eta$ for all $n$, then
$$\limsup_{n\to\infty} E(\xi_n) \le E(\limsup_{n\to\infty} \xi_n).$$
Hewitt-Savage 0-1 law

Theorem. Let $\xi_1, \xi_2, \xi_3, \ldots$ be a sequence of independent and identically-distributed (iid) random variables taking values in a set $E$. Then any event whose occurrence or non-occurrence is determined by the values of these random variables, and whose occurrence or non-occurrence is unchanged by finite permutations of the indices, has probability either 0 or 1.
5. $\limsup_{t\to\infty} \frac{B_t}{\sqrt t} = \infty$ and $\liminf_{t\to\infty} \frac{B_t}{\sqrt t} = -\infty$.

Proof: It suffices to show that $\limsup_{n\to\infty} \frac{B_n}{\sqrt n} = \infty$ and $\liminf_{n\to\infty} \frac{B_n}{\sqrt n} = -\infty$. For any positive constant $C$, we have
$$\begin{aligned}
P\Bigl(\limsup_{n\to\infty} \frac{B_n}{\sqrt n} \ge C\Bigr)
&\ge P\Bigl(\limsup_{n\to\infty} \Bigl\{\frac{B_n}{\sqrt n} > C\Bigr\}\Bigr)\\
&\ge \limsup_{n\to\infty} P\Bigl(\frac{B_n}{\sqrt n} > C\Bigr) && \text{(Fatou's lemma)}\\
&= \limsup_{n\to\infty} P(B_1 > C) && \text{(scaling property)}\\
&> 0.
\end{aligned}$$
Let $\xi_n = B_n - B_{n-1}$, $n \ge 1$. Then $\xi_1, \xi_2, \xi_3, \ldots$ is a sequence of iid random variables. Note that the event
$$\limsup_{n\to\infty} \Bigl\{\frac{B_n}{\sqrt n} > C\Bigr\} = \limsup_{n\to\infty} \Bigl\{\frac{\sum_{k=1}^{n} \xi_k}{\sqrt n} > C\Bigr\}$$
is unchanged by finite permutations of the indices. Hence, by the Hewitt-Savage 0-1 law together with the last inequality on the previous page, we have
$$P\Bigl(\limsup_{n\to\infty} \frac{B_n}{\sqrt n} \ge C\Bigr) = 1.$$
Since $C$ is arbitrary, we can show that $\limsup_{n\to\infty} \frac{B_n}{\sqrt n} = \infty$. Since $\{-B_t,\, t \ge 0\}$ is also a Brownian motion, we can show that
$$\liminf_{n\to\infty} \frac{B_n}{\sqrt n} = -\limsup_{n\to\infty} \frac{-B_n}{\sqrt n} = -\infty.$$
Remark: From Exercise 5, we obtain the following results:
$$P(\limsup_{t\to\infty} B_t = +\infty) = 1, \tag{4}$$
and
$$P(\liminf_{t\to\infty} B_t = -\infty) = 1. \tag{5}$$
Furthermore,
$$P(\limsup_{t\to\infty} B_t = +\infty,\ \liminf_{t\to\infty} B_t = -\infty) = 1. \tag{6}$$
6. Show that for each fixed $t \ge 0$,
$$P\Bigl(\omega \in \Omega : \limsup_{h\to 0^+} \frac{B_{t+h} - B_t}{h} = \infty\Bigr) = 1.$$

Proof: Given this Brownian motion $B$, we construct another Brownian motion $X$ as defined in Exercise 4:
$$X_t = \begin{cases} t B_{1/t}, & t > 0;\\ 0, & t = 0. \end{cases}$$
Then for any $t \ge 0$, we obtain
$$\limsup_{n\to\infty} \frac{B_{t+\frac1n} - B_t}{\frac1n} = \limsup_{n\to\infty} \frac{B_{\frac1n} - B_0}{\frac1n} \quad \text{(Exercise 2)} \ \ge\ \limsup_{n\to\infty} \sqrt n\, B_{\frac1n} = \limsup_{n\to\infty} \frac{X_n}{\sqrt n} = \infty \quad \text{(previous exercise)}.$$
7. Almost surely the paths of $B$ are not differentiable at any point $t \ge 0$; more precisely, the set
$$\Bigl\{\omega \in \Omega : \text{for each } t \in [0,\infty),\ \text{either } \limsup_{h\to 0^+} \frac{B_{t+h} - B_t}{h} = \infty \ \text{ or } \ \liminf_{h\to 0^+} \frac{B_{t+h} - B_t}{h} = -\infty\Bigr\}$$
contains an event $A \in \mathcal F$ with $P(A) = 1$.

Proof: Denote
$$D^+ B_t = \limsup_{h\to 0^+} \frac{B_{t+h} - B_t}{h} \quad\text{and}\quad D_+ B_t = \liminf_{h\to 0^+} \frac{B_{t+h} - B_t}{h}.$$
It suffices to show the result on the interval $[0,1]$. For any fixed integers $j \ge 1$ and $k \ge 1$, define
$$A_{jk} = \bigcup_{t\in[0,1]} \bigcap_{h\in[0,1/k]} \{\omega \in \Omega : |B_{t+h}(\omega) - B_t(\omega)| \le jh\}.$$
Note that $A_{jk}$ is not necessarily an event, that is, possibly $A_{jk} \notin \mathcal F$. Note also that
$$\{\omega \in \Omega : -\infty < D_+ B_t(\omega) \le D^+ B_t(\omega) < \infty, \text{ for some } t \in [0,1]\} = \bigcup_{j=1}^{\infty} \bigcup_{k=1}^{\infty} A_{jk}.$$
Then the proof of the result will be complete if, for each $j$ and $k$, we can find an event $C \in \mathcal F$ with $P(C) = 0$ such that $A_{jk} \subseteq C$.
For any fixed $j$ and $k$, we consider a sample path $\omega \in A_{jk}$, that is, there exists $t \in [0,1]$ with
$$|B_{t+h}(\omega) - B_t(\omega)| \le jh \quad \text{for all } h \in [0, 1/k].$$
Let $n$ be an integer with $n \ge 4k$. Then there exists some $i$, $1 \le i \le n$, such that $\frac{i-1}{n} \le t < \frac{i}{n}$. It is then easy to verify that, for $l = 1, 2, 3$,
$$\frac{i+l}{n} - t \le \frac{l+1}{n} \le \frac{1}{k}.$$
Thus,
$$\Bigl|B_{\frac{i+1}{n}}(\omega) - B_{\frac{i}{n}}(\omega)\Bigr| \le \Bigl|B_{\frac{i+1}{n}}(\omega) - B_t(\omega)\Bigr| + \Bigl|B_{\frac{i}{n}}(\omega) - B_t(\omega)\Bigr| \le \frac{2j}{n} + \frac{j}{n} = \frac{3j}{n},$$
$$\Bigl|B_{\frac{i+2}{n}}(\omega) - B_{\frac{i+1}{n}}(\omega)\Bigr| \le \Bigl|B_{\frac{i+2}{n}}(\omega) - B_t(\omega)\Bigr| + \Bigl|B_{\frac{i+1}{n}}(\omega) - B_t(\omega)\Bigr| \le \frac{3j}{n} + \frac{2j}{n} = \frac{5j}{n},$$
and
$$\Bigl|B_{\frac{i+3}{n}}(\omega) - B_{\frac{i+2}{n}}(\omega)\Bigr| \le \Bigl|B_{\frac{i+3}{n}}(\omega) - B_t(\omega)\Bigr| + \Bigl|B_{\frac{i+2}{n}}(\omega) - B_t(\omega)\Bigr| \le \frac{4j}{n} + \frac{3j}{n} = \frac{7j}{n}.$$
Now denote
$$C_i^{(n)} = \bigcap_{l=1}^{3} \Bigl\{\omega \in \Omega : \Bigl|B_{\frac{i+l}{n}}(\omega) - B_{\frac{i+l-1}{n}}(\omega)\Bigr| \le \frac{(2l+1)j}{n}\Bigr\}.$$
Then we can see that $C_i^{(n)}$ is an event in $\mathcal F$, and $A_{jk} \subseteq \bigcup_{i=1}^{n} C_i^{(n)}$ for all $n \ge 4k$.
Note that the random variables
$$\sqrt n\,\Bigl(B_{\frac{i+l}{n}} - B_{\frac{i+l-1}{n}}\Bigr), \quad l = 1, 2, 3,$$
are iid with law $N(0,1)$. Note also that if a random variable $\xi$ has law $N(0,1)$, then
$$P(|\xi| \le x) \le x, \quad \text{for any } x \ge 0,$$
since the standard normal density is bounded by $\frac{1}{\sqrt{2\pi}} < \frac12$. Therefore, we have
$$P(C_i^{(n)}) \le \frac{3j}{\sqrt n}\cdot\frac{5j}{\sqrt n}\cdot\frac{7j}{\sqrt n} = \frac{105 j^3}{n^{3/2}}, \quad i = 1, 2, \ldots, n.$$
Then
$$P\Bigl(\bigcup_{i=1}^{n} C_i^{(n)}\Bigr) \le \frac{105 j^3}{\sqrt n}.$$
Take
$$C = \bigcap_{n=4k}^{\infty} \bigcup_{i=1}^{n} C_i^{(n)} \in \mathcal F.$$
Then $A_{jk} \subseteq C$, and it also holds that
$$P(C) = P\Bigl(\bigcap_{n=4k}^{\infty} \bigcup_{i=1}^{n} C_i^{(n)}\Bigr) \le \inf_{n \ge 4k} P\Bigl(\bigcup_{i=1}^{n} C_i^{(n)}\Bigr) = 0,$$
which completes the proof.
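The roughness behind this proof is easy to see numerically: along grids of mesh $1/n$, the largest one-step difference quotient of a simulated path grows roughly like $\sqrt{n}$ instead of stabilising, as it would for a differentiable function. A minimal sketch (grid sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Largest difference quotient |B_{(i+1)/n} - B_{i/n}| / (1/n) of a simulated
# path on [0, 1], for increasingly fine grids: it keeps growing with n.
quotients = []
for n in (100, 10_000, 1_000_000):
    increments = rng.normal(0.0, np.sqrt(1.0 / n), size=n)  # B increments
    quotients.append(np.max(np.abs(increments)) * n)
print(quotients)  # increasing in n
```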
Properties of the conditional expectation

8. Linearity:
$$E(aX + bY \mid \mathcal G) = a E(X \mid \mathcal G) + b E(Y \mid \mathcal G).$$

Proof: $a E(X \mid \mathcal G) + b E(Y \mid \mathcal G)$ is $\mathcal G$-measurable. For any $A \in \mathcal G$,
$$\begin{aligned}
\int_A E(aX + bY \mid \mathcal G)\,dP &= \int_A (aX + bY)\,dP = a\int_A X\,dP + b\int_A Y\,dP\\
&= a\int_A E(X \mid \mathcal G)\,dP + b\int_A E(Y \mid \mathcal G)\,dP\\
&= \int_A \bigl(a E(X \mid \mathcal G) + b E(Y \mid \mathcal G)\bigr)\,dP,
\end{aligned}$$
which implies
$$E(aX + bY \mid \mathcal G) = a E(X \mid \mathcal G) + b E(Y \mid \mathcal G).$$
9. $E(E(X \mid \mathcal G)) = E(X)$.

Proof: In the definition of the conditional expectation, let $A = \Omega$; then
$$E(E(X \mid \mathcal G)) = \int_\Omega E(X \mid \mathcal G)\,dP = \int_\Omega X\,dP = E(X).$$
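On a finite partition, $E(X \mid \mathcal G)$ is just the block average of $X$, so property 9 can be checked exactly; a minimal sketch where $\mathcal G$ is generated by a discrete variable $Z$ (the distributions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)

# Discrete sanity check of E(E(X|G)) = E(X): condition X on the partition
# generated by Z by replacing X on each block {Z = z} with its block average.
n = 1_000_000
Z = rng.integers(0, 4, size=n)   # generates the sigma-field G
X = rng.normal(size=n) + Z       # X correlated with Z

cond_exp = np.empty(n)           # pathwise E(X|G)
for z in range(4):
    mask = Z == z
    cond_exp[mask] = X[mask].mean()

print(X.mean(), cond_exp.mean())  # the two sample means agree
```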
10. If $X$ and $\mathcal G$ are independent, then $E(X \mid \mathcal G) = E(X)$.

Proof: It is obvious that $E(X)$ is $\mathcal G$-measurable. For any $A \in \mathcal G$,
$$\begin{aligned}
\int_A E(X \mid \mathcal G)\,dP &= \int_A X\,dP = E(1_A X)\\
&= E(1_A)E(X) && (X \text{ and } 1_A \text{ are independent})\\
&= P(A)E(X) = \int_A E(X)\,dP.
\end{aligned}$$
Thus, it implies that $E(X \mid \mathcal G) = E(X)$.
11. If $X$ is $\mathcal G$-measurable, then $E(X \mid \mathcal G) = X$.

Proof: The proof follows from the fact that $X$ is $\mathcal G$-measurable and the definition of the conditional expectation: for any $A \in \mathcal G$,
$$\int_A E(X \mid \mathcal G)\,dP = \int_A X\,dP.$$
Dominated convergence theorem

Theorem. Let $\xi_1, \xi_2, \xi_3, \ldots$ be a sequence of random variables. If the sequence $\{\xi_n\}$ converges almost surely to a random variable $\xi$, and there exists a positive and integrable random variable $\eta$ such that $|\xi_n| \le \eta$ for all $n$, then
$$\lim_{n\to\infty} E(|\xi_n - \xi|) = 0,$$
which also implies
$$\lim_{n\to\infty} E(\xi_n) = E(\xi).$$
12. If $Y$ is bounded and $\mathcal G$-measurable, then
$$E(XY \mid \mathcal G) = Y E(X \mid \mathcal G).$$

Proof: If $Y = 1_B$ is an indicator function, where $B \in \mathcal G$, then for any $A \in \mathcal G$,
$$\int_A E(YX \mid \mathcal G)\,dP = \int_A 1_B X\,dP = \int_{A\cap B} X\,dP = \int_{A\cap B} E(X \mid \mathcal G)\,dP = \int_A 1_B E(X \mid \mathcal G)\,dP,$$
which implies $E(XY \mid \mathcal G) = Y E(X \mid \mathcal G)$.

By the linearity of the conditional expectation, we know that the result is true if $Y$ is a simple function (a linear combination of indicator functions).
If $Y$ is a bounded $\mathcal G$-measurable function, then there exists a sequence of simple functions $Y_n$ such that $|Y_n| \le |Y|$ and
$$\lim_{n\to\infty} Y_n = Y \quad \text{almost surely}.$$
Then, by the dominated convergence theorem, we can show that for any $A \in \mathcal G$,
$$\int_A E(YX \mid \mathcal G)\,dP = \int_A YX\,dP = \lim_{n\to\infty} \int_A Y_n X\,dP = \lim_{n\to\infty} \int_A Y_n E(X \mid \mathcal G)\,dP = \int_A Y E(X \mid \mathcal G)\,dP,$$
which proves $E(XY \mid \mathcal G) = Y E(X \mid \mathcal G)$.
13. Given two $\sigma$-fields $\mathcal B \subset \mathcal G$, then
$$E(E(X \mid \mathcal B) \mid \mathcal G) = E(E(X \mid \mathcal G) \mid \mathcal B) = E(X \mid \mathcal B).$$

Proof: Since $\mathcal B \subset \mathcal G$, we know that $E(X \mid \mathcal B)$ is $\mathcal G$-measurable. Then by Exercise 11, we obtain
$$E(E(X \mid \mathcal B) \mid \mathcal G) = E(X \mid \mathcal B).$$
For any $A \in \mathcal B \subset \mathcal G$, we have
$$\int_A E(E(X \mid \mathcal G) \mid \mathcal B)\,dP = \int_A E(X \mid \mathcal G)\,dP = \int_A X\,dP = \int_A E(X \mid \mathcal B)\,dP,$$
which implies
$$E(E(X \mid \mathcal G) \mid \mathcal B) = E(X \mid \mathcal B).$$
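The tower property can also be verified exactly on nested finite partitions: conditioning on a fine partition and then on a coarser one gives the same result as conditioning on the coarser one directly. A minimal sketch (partitions and distributions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)

# Check E(E(X|G)|B) = E(X|B) for B ⊂ G: G is generated by Z (4 blocks),
# B by Z mod 2 (2 coarser blocks).
n = 400_000
Z = rng.integers(0, 4, size=n)
X = rng.normal(size=n) + Z

def cond_exp(values, labels):
    """Pathwise E(values | sigma(labels)): block averages over label classes."""
    out = np.empty_like(values)
    for lab in np.unique(labels):
        mask = labels == lab
        out[mask] = values[mask].mean()
    return out

fine = cond_exp(X, Z)             # E(X|G)
two_step = cond_exp(fine, Z % 2)  # E(E(X|G)|B)
one_step = cond_exp(X, Z % 2)     # E(X|B)
diff = np.max(np.abs(two_step - one_step))
print(diff)  # essentially zero
```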
14. Let $X$ and $Z$ be such that
(i) $Z$ is $\mathcal G$-measurable;
(ii) $X$ is independent of $\mathcal G$.
Suppose that $E(|h(X, Z)|) < \infty$. Then
$$E(h(X, Z) \mid \mathcal G) = E(h(X, z))\big|_{z=Z}.$$

Proof: Denote $g(z) = E(h(X, z))$ and $\mu_X(E) = P(X \in E)$ for $E \in \mathcal B(\mathbb R)$. For any $A \in \mathcal G$, we denote
$$Y = 1_A; \qquad \mu_{Y,Z}(F) = P((Y, Z) \in F), \quad F \in \mathcal B(\mathbb R^2).$$
Then,
$$\begin{aligned}
\int_A E(h(X, Z) \mid \mathcal G)\,dP &= \int_A h(X, Z)\,dP = E(Y h(X, Z))\\
&= \int_{\mathbb R^3} y\, h(x, z)\, \mu_X(dx)\, \mu_{Y,Z}(dy, dz)\\
&= \int_{\mathbb R^2} y\, g(z)\, \mu_{Y,Z}(dy, dz)\\
&= E(Y g(Z)) = E(1_A g(Z)) = \int_A g(Z)\,dP\\
&= \int_A E(h(X, z))\big|_{z=Z}\,dP.
\end{aligned}$$
Properties of stopping times

15. $S \vee T$ and $S \wedge T$ are stopping times.

Proof: For all $t \ge 0$, we know that $\{S \le t\} \in \mathcal F_t$ and $\{T \le t\} \in \mathcal F_t$. Hence,
$$\{S \vee T \le t\} = \{S \le t\} \cap \{T \le t\} \in \mathcal F_t,$$
and
$$\{S \wedge T \le t\} = \{S \le t\} \cup \{T \le t\} \in \mathcal F_t.$$
16. Given a stopping time $T$,
$$\mathcal F_T = \{A : A \cap \{T \le t\} \in \mathcal F_t \text{ for all } t \ge 0\}$$
is a $\sigma$-field.

Proof: It is obvious that $\Omega \in \mathcal F_T$ and that $\mathcal F_T$ is closed under countable intersections. We only need to show that $\mathcal F_T$ is closed under complements. If $A \in \mathcal F_T$, then $A \cap \{T \le t\} \in \mathcal F_t$, and hence
$$A^c \cap \{T \le t\} = (A \cup \{T > t\})^c = \bigl((A \cap \{T \le t\}) \cup \{T > t\}\bigr)^c \in \mathcal F_t.$$
17. $S \le T \Rightarrow \mathcal F_S \subset \mathcal F_T$.

Proof: If $A \in \mathcal F_S$, then for any $t \ge 0$ we have $A \cap \{S \le t\} \in \mathcal F_t$. Since $S \le T$, we also have $\{T \le t\} \subset \{S \le t\}$. Thus
$$A \cap \{T \le t\} = A \cap \{S \le t\} \cap \{T \le t\} \in \mathcal F_t,$$
which implies $A \in \mathcal F_T$.
18. Let $\{X_t,\, t \ge 0\}$ be a continuous and adapted process. The hitting time of a set $A \subset \mathbb R$ is defined by
$$T_A = \inf\{t \ge 0 : X_t \in A\}.$$
Then, if $A$ is open or closed, $T_A$ is a stopping time.

Proof: If $A$ is an open set, then for any $t > 0$ we have
$$\{T_A < t\} = \bigcup_{r \in \mathbb Q^+,\, r < t} \{X_r \in A\} \in \mathcal F_t.$$
By Exercise 1 in Professor Nualart's lecture notes, we can show that $T_A$ is a stopping time.
If $A$ is a closed set, then for any $t \ge 0$ we have
$$\{T_A \ge t\} = \bigcap_{r \in \mathbb Q^+,\, r < t} \{X_r \notin A\} \in \mathcal F_t.$$
Hence $\{T_A < t\} \in \mathcal F_t$. By Exercise 1 in Professor Nualart's lecture notes, we can show that $T_A$ is a stopping time.
19. Let $X_t$ be an adapted stochastic process with right-continuous paths, and let $T < \infty$ be a stopping time. Then the random variable
$$X_T(\omega) = X_{T(\omega)}(\omega)$$
is $\mathcal F_T$-measurable.

Proof: For any $n \in \mathbb N$, define
$$T_n = \sum_{i=0}^{\infty} \frac{i+1}{2^n}\, 1_{\{\frac{i}{2^n} < T \le \frac{i+1}{2^n}\}}.$$
For any $t \ge 0$,
$$\{T_n \le t\} = \bigcup_{i:\ \frac{i+1}{2^n} \le t} \Bigl\{T \le \frac{i+1}{2^n}\Bigr\} \in \mathcal F_t.$$
Then $T_n$ is a stopping time for each $n$, and moreover $T_n \downarrow T$.
Then it suffices to show that for any open set $A \subset \mathbb R$,
$$\{X_{T_n} \in A\} \cap \{T_n \le t\} \in \mathcal F_t, \quad t \ge 0.$$
In fact,
$$\{X_{T_n} \in A\} \cap \{T_n \le t\} = \bigcup_{k:\ \frac{k}{2^n} \le t} \Bigl(\bigl\{X_{\frac{k}{2^n}} \in A\bigr\} \cap \Bigl\{\frac{k-1}{2^n} < T \le \frac{k}{2^n}\Bigr\}\Bigr) \in \mathcal F_t.$$
Since $T_n \downarrow T$ and the process $X$ is right-continuous, we can let $n$ go to $\infty$ and obtain
$$\{X_T \in A\} \cap \{T \le t\} = \lim_{n\to\infty} \{X_{T_n} \in A\} \cap \{T_n \le t\} \in \mathcal F_t.$$
Application to Brownian hitting times

Let $B_t$ be a Brownian motion. Fix $a \in \mathbb R$ and consider the hitting time
$$\tau_a = \inf\{t \ge 0 : B_t = a\}.$$

Proposition. If $a < 0 < b$, then
$$P(\tau_a < \tau_b) = \frac{b}{b-a}.$$

Proof: From (4) and (5), we can get $\tau_a < \infty$ and $\tau_b < \infty$ almost surely. Exercise 18 implies that $\tau_a$ and $\tau_b$ are two stopping times, and hence Exercise 15 implies that $\tau_a \wedge \tau_b$ is also a stopping time.
For any $t \ge 0$, $\tau_a \wedge \tau_b \wedge t$ is a bounded stopping time. Then by the optional stopping theorem, we have
$$E(B_{\tau_a \wedge \tau_b \wedge t}) = E(B_0) = 0.$$
Since
$$a \le B_{\tau_a \wedge \tau_b \wedge t} \le b, \quad \forall t \ge 0,$$
and
$$\lim_{t\to\infty} B_{\tau_a \wedge \tau_b \wedge t} = B_{\tau_a \wedge \tau_b}$$
almost surely, by the dominated convergence theorem we have
$$E(B_{\tau_a \wedge \tau_b}) = E\bigl(\lim_{t\to\infty} B_{\tau_a \wedge \tau_b \wedge t}\bigr) = \lim_{t\to\infty} E(B_{\tau_a \wedge \tau_b \wedge t}) = 0.$$
The random variable $B_{\tau_a \wedge \tau_b}$ takes only the two values $a$ and $b$, and its distribution is given by
$$B_{\tau_a \wedge \tau_b} = \begin{cases} a, & \text{with probability } P(\tau_a < \tau_b),\\ b, & \text{with probability } P(\tau_a > \tau_b). \end{cases}$$
Hence,
$$E(B_{\tau_a \wedge \tau_b}) = a P(\tau_a < \tau_b) + b P(\tau_a > \tau_b) = a P(\tau_a < \tau_b) + b\bigl(1 - P(\tau_a < \tau_b)\bigr) = 0.$$
Solving this equation for $P(\tau_a < \tau_b)$, we obtain
$$P(\tau_a < \tau_b) = \frac{b}{b-a}.$$
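The ruin probability $b/(b-a)$ can be checked by Monte Carlo, approximating Brownian motion by a fine Gaussian random walk (the barriers, step size $dt$, and path count are arbitrary choices, and the discretisation introduces a small bias):

```python
import numpy as np

rng = np.random.default_rng(7)

# Estimate P(tau_a < tau_b) for a < 0 < b by stepping many walks with
# increments N(0, dt) until each leaves (a, b).
a, b = -1.0, 2.0
dt = 1e-3
n_paths = 20_000

x = np.zeros(n_paths)
active = np.ones(n_paths, dtype=bool)
hit_a = np.zeros(n_paths, dtype=bool)
while active.any():
    idx = np.flatnonzero(active)
    x[idx] += rng.normal(0.0, np.sqrt(dt), size=idx.size)
    hit_a[idx] |= x[idx] <= a
    active[idx] = (a < x[idx]) & (x[idx] < b)

est = hit_a.mean()
print(est)  # close to b / (b - a) = 2/3
```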
Proposition. Let $T = \inf\{t \ge 0 : B_t \notin (a, b)\}$, where $a < 0 < b$. Then
$$E(T) = -ab.$$

Proof: In fact, $T = \tau_a \wedge \tau_b < \infty$ almost surely. Using that $B_t^2 - t$ is a martingale, we get, by the optional stopping theorem,
$$E\bigl(B_{T\wedge t}^2 - (T \wedge t)\bigr) = 0,$$
that is,
$$E(B_{T\wedge t}^2) = E(T \wedge t). \tag{7}$$
Since $T \wedge t \uparrow T$ as $t \uparrow \infty$, by the monotone convergence theorem we get
$$E(T) = E\bigl(\lim_{t\to\infty} (T \wedge t)\bigr) = \lim_{t\to\infty} E(T \wedge t). \tag{8}$$
Since $B_{T\wedge t}^2$ is bounded for all $t \ge 0$ and $B_{T\wedge t}^2 \to B_T^2$ almost surely as $t \to \infty$, using the dominated convergence theorem together with (7) and (8), we have
$$E(B_T^2) = E\bigl(\lim_{t\to\infty} B_{T\wedge t}^2\bigr) = \lim_{t\to\infty} E(B_{T\wedge t}^2) = E(T).$$
From the previous Proposition, we get
$$E(B_T^2) = a^2\,\frac{b}{b-a} + b^2\Bigl(1 - \frac{b}{b-a}\Bigr) = -ab.$$
Therefore,
$$E(T) = -ab.$$
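The identity $E(T) = -ab$ admits the same kind of random-walk check, now recording the exit time of each path (parameters are arbitrary choices, and the time-discretisation slightly biases the estimate):

```python
import numpy as np

rng = np.random.default_rng(8)

# Estimate E(T) for the exit time T of (a, b) via a fine random walk.
a, b = -1.0, 2.0
dt = 1e-3
n_paths = 5_000

x = np.zeros(n_paths)
steps = np.zeros(n_paths)
active = np.ones(n_paths, dtype=bool)
while active.any():
    idx = np.flatnonzero(active)
    x[idx] += rng.normal(0.0, np.sqrt(dt), size=idx.size)
    steps[idx] += 1
    active[idx] = (a < x[idx]) & (x[idx] < b)

est = np.mean(steps) * dt
print(est)  # close to -a*b = 2
```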
Additional exercises

1. Let $Z$ be a Gaussian random variable with law $N(0, \sigma^2)$. From the expression
$$E(e^{\lambda Z}) = e^{\frac12 \lambda^2 \sigma^2},$$
deduce the following formulas for the moments of $Z$:
$$E(Z^{2k}) = \frac{(2k)!}{2^k\, k!}\,\sigma^{2k}, \quad k = 1, 2, \ldots,$$
$$E(Z^{2k-1}) = 0, \quad k = 1, 2, \ldots.$$
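The even-moment formula is easy to verify by simulation before deriving it; a minimal sketch for one choice of $\sigma$ and $k$ (both arbitrary):

```python
import math
import numpy as np

rng = np.random.default_rng(9)

# Compare the empirical moment E(Z^{2k}) with (2k)!/(2^k k!) * sigma^{2k}.
sigma, k = 1.5, 2
Z = rng.normal(0.0, sigma, size=2_000_000)

empirical = np.mean(Z ** (2 * k))
formula = math.factorial(2 * k) / (2**k * math.factorial(k)) * sigma ** (2 * k)
print(empirical, formula)  # E(Z^4) = 3 * sigma^4 = 15.1875
```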
2. Let $\{B_t,\, t \in [0,1]\}$ be a standard Brownian motion. Show that
$$\xi = \int_0^1 B_t\,dt$$
is a well-defined random variable. Calculate its first and second moments.