Notes

ICAPM Research School
Analysis and Probability
Université Félix Houphouët Boigny
Abidjan, March 17-28, 2014

Modeste N'ZI
LMAI, E-mail: [email protected]
Contents

1 Generalities on stochastic processes
  1.1 Probability spaces and random variables
  1.2 Stochastic processes
  1.3 Gaussian spaces and Gaussian processes
  1.4 Gaussian measures

2 Brownian Motion
  2.1 Brief history
  2.2 Definition and some properties
  2.3 Construction of Brownian motion
    2.3.1 Canonical Brownian motion
    2.3.2 Construction via Gaussian measure
  2.4 Path properties
  2.5 Brownian Motion and Martingales
    2.5.1 Filtrations
    2.5.2 Stopping times
    2.5.3 Martingales

3 Stochastic integrals
  3.1 Itô's integral
  3.2 Itô's formula
Chapter 1

Generalities on stochastic processes

1.1 Probability spaces and random variables
A probability space is a triplet $(\Omega, \mathcal{F}, P)$ where $\Omega$ is a set, $\mathcal{F}$ is a $\sigma$-algebra on $\Omega$, and $P$ is a probability measure on $(\Omega, \mathcal{F})$, that is, a positive measure with total mass one. This mathematical object is a model for a random experiment whose outcome cannot be told exactly in advance. The set $\Omega$ stands for the collection of all possible outcomes of the experiment. The $\sigma$-algebra $\mathcal{F}$ is the collection of all events that may occur during the realization of the experiment. An event $F$ is said to occur if the outcome of the experiment happens to belong to $F$. The number $P(F)$ is called the probability of the event $F$; it quantifies the chance of occurrence of the event $F$.
Let $(E, \mathcal{E})$ be a measurable space. A random variable defined on $(\Omega, \mathcal{F}, P)$ with values in $(E, \mathcal{E})$ is a mapping $X : \Omega \to E$ which is measurable relative to $\mathcal{F}$ and $\mathcal{E}$, that is, $X^{-1}(A) = \{\omega \in \Omega : X(\omega) \in A\}$ is an event for all $A \in \mathcal{E}$.

Let $X$ be an $E$-valued random variable. The image of the measure $P$ by $X$ is called the probability law, or the distribution, of $X$ and is denoted by $P_X$. Thus,
$$P_X(A) = P(X^{-1}(A)) = P(X \in A), \qquad \text{for all } A \in \mathcal{E}.$$

The random variables $X_1, \ldots, X_n$ defined on $(\Omega, \mathcal{F}, P)$ with values respectively in $(E_1, \mathcal{E}_1), \ldots, (E_n, \mathcal{E}_n)$ are said to be independent if the probability law of the $n$-tuple $(X_1, \ldots, X_n)$ is the product measure $\otimes_{i=1}^n P_{X_i}$, that is, for all $A_i \in \mathcal{E}_i$, $i = 1, \ldots, n$,
$$P(X_1 \in A_1, \ldots, X_n \in A_n) = \prod_{i=1}^n P(X_i \in A_i).$$
Thus, saying that the random variables $X_1, \ldots, X_n$ are independent means that the $\sigma$-algebras $\sigma(X_i)$ generated respectively by the $X_i$ are independent, that is, for any events $A_i \in \sigma(X_i)$, $i = 1, \ldots, n$, the events $A_1, \ldots, A_n$ are independent.
1.2 Stochastic processes

Let $T$ be a set, for example $T = \mathbb{N}, \mathbb{R}, \mathbb{R}_+, \mathbb{R}^d$. A collection $(X_t)_{t \in T}$ of random variables taking values in $(E, \mathcal{E})$ is called a stochastic process with state space $(E, \mathcal{E})$ and parameter set $T$.

Let $E^T$ be the set of all maps from $T$ to $E$. $E^T$ is endowed with the product $\sigma$-algebra $\bigotimes_{t \in T} \mathcal{E}$. The map
$$X : \Omega \to E^T, \qquad \omega \mapsto (X_t(\omega))_{t \in T} \qquad (1.1)$$
is a random variable taking values in $\big(E^T, \bigotimes_{t \in T} \mathcal{E}\big)$.

For any $\omega \in \Omega$, $X(\omega)$ is called a trajectory of the stochastic process $(X_t)_{t \in T}$. The distribution of the map $X$ is called the probability law of the stochastic process $(X_t)_{t \in T}$ and is denoted by $P_X$.

Let
$$\mathcal{C} = \left\{\prod_{t \in T} A_t \;:\; A_t \in \mathcal{E} \text{ and } \mathrm{Card}\{t \in T : A_t \neq E\} < \infty\right\}$$
denote the set of finite dimensional cylinders of $E^T$.

Since $\bigotimes_{t \in T} \mathcal{E}$ is generated by $\mathcal{C}$ and $\mathcal{C}$ is stable under finite intersections, the monotone class theorem implies that the probability law of the stochastic process $(X_t)_{t \in T}$ is determined by its values on $\mathcal{C}$. For any cylinder $C = \prod_{t \in T} A_t$ there exists a finite number of indexes $t_1, \ldots, t_n$ such that $A_{t_i} \neq E$, $i = 1, \ldots, n$. Therefore
$$P_X(C) = P\big((X_{t_1}, \ldots, X_{t_n}) \in A_{t_1} \times \cdots \times A_{t_n}\big).$$
Thus, the probability law of the stochastic process $(X_t)_{t \in T}$ is determined by its finite dimensional probability laws, that is, the probability laws of the random vectors $(X_{t_1}, \ldots, X_{t_n})$, $t_1, \ldots, t_n \in T$.

A stochastic process may also be regarded as a map, denoted again by $X$, defined on $\Omega \times T$ with values in $E$. More precisely,
$$X : \Omega \times T \to E, \qquad (\omega, t) \mapsto X_t(\omega). \qquad (1.2)$$
Let $(X_t)_{t \in T}$ and $(Y_t)_{t \in T}$ be two $E$-valued stochastic processes defined on $(\Omega, \mathcal{F}, P)$. Then one is said to be a modification of the other if, for any $t \in T$,
$$P(X_t = Y_t) = 1.$$
They are said to be indistinguishable if
$$P(X_t = Y_t \text{ for all } t \in T) = 1.$$

Remark 1 If two stochastic processes are indistinguishable, then they are modifications of each other. In general, the converse implication is not true, but it holds if $T$ is countable.
From now on, $T \subset \mathbb{R}$. If the map $X$ in (1.2) is measurable relative to $\mathcal{F} \otimes \mathcal{B}(T)$ and $\mathcal{E}$, where $\mathcal{B}(T)$ stands for the Borel $\sigma$-algebra on $T$, we say that the stochastic process $(X_t)_{t \in T}$ is measurable.

Let $(X_t)_{t \in T}$ be an $\mathbb{R}^d$-valued stochastic process. If for any $\omega \in \Omega$ the trajectory
$$X(\omega) : T \to \mathbb{R}^d, \qquad t \mapsto X_t(\omega)$$
is continuous, we say that $(X_t)_{t \in T}$ is a continuous stochastic process.

Remark 2 If two stochastic processes are continuous and modifications of each other, then they are indistinguishable.

Let $C = C(T, \mathbb{R}^d)$ stand for the set of $\mathbb{R}^d$-valued continuous functions defined on $T$. The trajectories of an $\mathbb{R}^d$-valued continuous stochastic process are in $C$. If $C$ is endowed with the topology of uniform convergence on compact subsets of $T$, then the map in (1.1) is measurable relative to $\mathcal{F}$ and $\mathcal{B}(C)$, where $\mathcal{B}(C)$ is the Borel $\sigma$-algebra of $C$, and the probability law of the stochastic process $(X_t)_{t \in T}$ is a probability measure on $(C, \mathcal{B}(C))$.
1.3 Gaussian spaces and Gaussian processes

Definition 3 A Gaussian space is a closed linear subspace of $L^2(\Omega, \mathcal{F}, P)$ formed by centered Gaussian random variables.

For example, if $X = (X_1, \ldots, X_n)$ is an $\mathbb{R}^n$-valued centered Gaussian vector, then the linear subspace generated by $\{X_1, \ldots, X_n\}$ is a Gaussian space.

Definition 4 A real-valued stochastic process $(X_t)_{t \in T}$ is a Gaussian process if its finite dimensional probability laws are Gaussian, that is, for any $t_1, \ldots, t_n \in T$, $(X_{t_1}, \ldots, X_{t_n})$ is a Gaussian random vector.
If $(X_t)_{t \in T}$ is a Gaussian process, then the linear set generated by the coordinates of $(X_t)_{t \in T}$,
$$\mathrm{Vec}(X) = \left\{\sum_{i=1}^n u_i X_{t_i} \;:\; n \geq 1,\ u_i \in \mathbb{R},\ t_i \in T,\ i \leq n\right\},$$
is formed by Gaussian random variables.

Proposition 5 Let $(X_n)_{n \geq 0}$ be a sequence of real-valued Gaussian random variables with $E(X_n) = m_n$ and $\mathrm{var}(X_n) = \sigma_n^2$. Assume that $(X_n)_{n \geq 0}$ converges in law to a random variable $X$. Then
(i) $X$ is a real-valued Gaussian random variable with $E(X) = m = \lim_{n \to \infty} m_n$ and $\mathrm{var}(X) = \sigma^2 = \lim_{n \to \infty} \sigma_n^2$;
(ii) if $(X_n)_{n \geq 0}$ converges in probability to $X$, then the convergence holds in $L^p$ for every $1 \leq p < \infty$.

Proof. Exercise.
In view of Proposition 5, $\overline{\mathrm{Vec}(X)}^{L^2}$, the closure of $\mathrm{Vec}(X)$ in $L^2$, is a closed linear subspace of $L^2(\Omega, \mathcal{F}, P)$ whose elements are also Gaussian random variables.

Definition 6 Let $X = (X_t)_{t \in T}$ be a Gaussian process. The Gaussian space
$$H^X = \overline{\mathrm{Vec}\big(X_t - E(X_t),\ t \in T\big)}^{L^2}$$
is said to be associated to the stochastic process $X$.

Remark 7 If the stochastic process $X$ is centered, then $H^X$ is the Gaussian space generated by the coordinates of $X$, that is, the smallest Gaussian space containing all the coordinates of $X$.

Exercise 8 Let $H$ be a Gaussian space and let $(H_i)_{i \in I}$ be a family of linear subspaces of $H$. Prove that the subspaces $H_i$, $i \in I$, are orthogonal if and only if the $\sigma$-algebras $\sigma(H_i)$, $i \in I$, are independent.
1.4 Gaussian measures

Definition 9 Let $\mu$ be a $\sigma$-finite measure on a measurable space $(E, \mathcal{E})$. A Gaussian measure on $(E, \mathcal{E})$ of intensity $\mu$ is an isometry $G$ from $L^2(E, \mathcal{E}, \mu)$ onto a Gaussian space.

For any $f \in L^2(E, \mathcal{E}, \mu)$, $G(f)$ is a centered Gaussian random variable with variance
$$E\big((G(f))^2\big) = \|G(f)\|^2_{L^2(\Omega, \mathcal{F}, P)} = \|f\|^2_{L^2(E, \mathcal{E}, \mu)}.$$
In particular, if $A \in \mathcal{E}$ with $\mu(A) < \infty$, then the law of $G(1_A)$ is $N(0, \mu(A))$. From now on we put $G(A) = G(1_A)$.
Exercise 10 Let $\mu$ be a $\sigma$-finite measure on a measurable space $(E, \mathcal{E})$.
1) Prove that there exists a Gaussian measure on $(E, \mathcal{E})$ of intensity $\mu$.
2) Let $G$ be a Gaussian measure of intensity $\mu$.
a) Prove that if $A_1, \ldots, A_n \in \mathcal{E}$ are disjoint with $\mu(A_i) < \infty$, $i = 1, \ldots, n$, then $G(A_1), \ldots, G(A_n)$ are independent Gaussian random variables.
b) Let $A \in \mathcal{E}$ with $\mu(A) < \infty$ be such that there exists a countable partition $A_1, A_2, \ldots$ of $A$ into elements of $\mathcal{E}$. Prove that the series $\sum_{i=1}^\infty G(A_i)$ converges in $L^2(\Omega, \mathcal{F}, P)$ and that
$$G(A) = \sum_{i=1}^\infty G(A_i), \qquad \text{a.s.}$$
Chapter 2

Brownian Motion

2.1 Brief history
Brownian motion was discovered in 1822 by the British botanist Robert Brown. Observing the displacement of pollen grains suspended in a liquid, he noticed that the movement of the grains was very irregular. This irregularity is due to collisions of the grains with the particles of the liquid, leading to a dispersion (diffusion) of the grains.

In 1900, Bachelier, interested in the fluctuations of stock prices, published the first quantitative work on Brownian motion. Einstein (1905), studying molecular dynamics, highlighted Brownian motion and derived its transition density. A rigorous mathematical study of Brownian motion began with the works of Wiener (1923, 1924), who gave the first existence proof. One of the deepest studies of Brownian motion is due to P. Lévy (1939, 1948), who provided a construction by interpolation and derived some fine properties of first passage times and sample paths of Brownian motion. Nowadays numerous works on Brownian motion can be found in the literature, for example the stochastic calculus of K. Itô (1958), which is a very useful tool in various fields such as finance, economics, biology, partial differential equations, etc.
2.2 Definition and some properties

Definition 11 A stochastic process $B = (B_t)_{t \geq 0}$ with state space $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$ is called a Brownian motion if it satisfies:

(i) $B$ is a continuous process;
(ii) $B_0 = 0$ a.s.;
(iii) $B$ has independent increments, that is, for any $0 \leq t_0 < \cdots < t_n$, the increments $B_{t_0}, B_{t_1} - B_{t_0}, \ldots, B_{t_n} - B_{t_{n-1}}$ are independent random variables;
(iv) for any $s < t$, $B_t - B_s$ has the Gaussian distribution $N(0, t - s)$.

Remark 12 a) Condition (iv) implies that Brownian motion has stationary increments, which means that the law of $B_t - B_s$ depends on $t$ and $s$ only through the difference $t - s$.
b) Condition (iii) is equivalent to the following one: for any $s < t$, $B_t - B_s$ is independent of the $\sigma$-algebra generated by $\{B_u : u \leq s\}$, which is denoted by $\mathcal{F}^B_s$.
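As a numerical illustration (added to these notes, not part of the original text), the sketch below builds a discrete Brownian path directly from Definition 11: independent centered Gaussian increments whose variance equals the time step. The horizon, the number of steps and the function name are arbitrary choices.

```python
import numpy as np

def brownian_path(T=1.0, n=1000, rng=None):
    """Simulate (B_{t_0}, ..., B_{t_n}) on the grid t_i = i*T/n.

    The increments B_{t_i} - B_{t_{i-1}} are independent N(0, T/n) random
    variables, which is exactly conditions (ii)-(iv) of Definition 11."""
    rng = np.random.default_rng() if rng is None else rng
    increments = rng.normal(0.0, np.sqrt(T / n), size=n)
    return np.concatenate(([0.0], np.cumsum(increments)))

# Example: one path on [0, 1]; the terminal value is approximately N(0, 1).
path = brownian_path()
print(path[-1])
```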
Proposition 13 A stochastic process $B$ is a Brownian motion if and only if $B$ is a real-valued continuous centered Gaussian process with covariance function
$$E(B_t B_s) = \min(s, t). \qquad (2.1)$$

Proof. Assume that $B$ is a Brownian motion. Then $B$ is a continuous process. Since the increments of $B$ are independent and have Gaussian distributions, by using a change of variables argument one can prove that for any $0 = t_0 < \cdots < t_n$, $(B_{t_1}, \ldots, B_{t_n})$ has a Gaussian distribution with density function
$$f(x_1, \ldots, x_n) = \frac{1}{(2\pi)^{n/2} \sqrt{t_1 (t_2 - t_1) \cdots (t_n - t_{n-1})}} \exp\left(-\frac{1}{2} \sum_{i=1}^n \frac{(x_i - x_{i-1})^2}{t_i - t_{i-1}}\right),$$
where $x_0 = 0$.

Since, if $(X_1, \ldots, X_n)$ is a Gaussian random vector, then for any permutation $\pi$ of $\{1, \ldots, n\}$ the vector $(X_{\pi(1)}, \ldots, X_{\pi(n)})$ is again a Gaussian random vector, it follows that for any $t_1, \ldots, t_n \in \mathbb{R}_+$, $(B_{t_1}, \ldots, B_{t_n})$ is a Gaussian random vector.

In view of (iv) and (ii), $E(B_t) = 0$. Thus $B$ is a continuous centered Gaussian process. For any $s < t$,
$$E(B_s B_t) = E\big(B_s (B_t - B_s) + B_s^2\big) = E\big(B_s (B_t - B_s)\big) + E(B_s^2) = E(B_s) E(B_t - B_s) + s = 0 + s = \min(s, t),$$
where the last two equalities follow from properties (iii) and (iv).

Conversely, assume that $B$ is a continuous centered Gaussian process with the covariance function given by (2.1). Then $E(B_0^2) = 0$ and, for any $s < t < u$,
$$E\big((B_t - B_s)^2\big) = E(B_t^2) + E(B_s^2) - 2 E(B_s B_t) = t + s - 2 \min(s, t) = t - s$$
and
$$E\big((B_t - B_s)(B_u - B_t)\big) = 0.$$
Since the increments are jointly Gaussian, uncorrelated increments are independent. Therefore $B$ satisfies conditions (i)-(iv) of Definition 11.
Proposition 14 Assume that $B$ is a Brownian motion. Then the following hold:

a) (Symmetry) The stochastic process $(-B_t)_{t \geq 0}$ is a Brownian motion.

b) (Scaling) $X = (a^{-1/2} B_{at})_{t \geq 0}$ is a Brownian motion for any fixed real number $a > 0$.

c) (Time inversion) Let us put $Y_0 = 0$ and $Y_t = t B_{1/t}$ for $t > 0$. Then $Y = (Y_t)_{t \geq 0}$ is a Brownian motion.

d) For any $t_0 \geq 0$, $B^{(t_0)} = (B_{t + t_0} - B_{t_0})_{t \geq 0}$ is a Brownian motion independent of the $\sigma$-algebra $\mathcal{F}^B_{t_0}$.

e) (Reversibility) For any fixed $T > 0$, $Z = (B_{T - t} - B_T)_{t \in [0, T]}$ is a Brownian motion on $[0, T]$.

Proof. It is not difficult to prove that $-B$, $X$, $(Y_t)_{t > 0}$, $B^{(t_0)}$ and $Z$ are centered continuous Gaussian processes with covariance functions given by (2.1). It remains to prove the continuity of $Y$ at $0$, which is equivalent to
$$\lim_{t \to \infty} \frac{B_t}{t} = 0, \qquad \text{a.s.}$$
Let $n \geq 0$ be such that $n < t \leq n + 1$. We have
$$\left|\frac{B_t}{t}\right| \leq \frac{|B_n|}{n} + \frac{1}{n} |B_t - B_n| \leq \frac{|B_n|}{n} + \frac{1}{n} \sup_{0 \leq s \leq 1} |B_{n+s} - B_n|.$$
Since
$$B_n = \sum_{i=1}^n (B_i - B_{i-1})$$
and the random variables $B_i - B_{i-1}$, $i \geq 1$, are independent and identically distributed, the strong law of large numbers implies that $\frac{B_n}{n}$ goes a.s. to $0$ as $n \to \infty$. Now, by virtue of Kolmogorov's inequality (see [?]), for any $\varepsilon > 0$,
$$P\left(\frac{1}{n} \sup_{0 \leq s \leq 1} |B_{n+s} - B_n| \geq \varepsilon\right) \leq \frac{1}{n^2 \varepsilon^2} E\big((B_{n+1} - B_n)^2\big) = \frac{1}{n^2 \varepsilon^2}.$$
Since $\sum 1/n^2$ is finite, the Borel-Cantelli lemma implies that $\frac{1}{n} \sup_{0 \leq s \leq 1} |B_{n+s} - B_n|$ goes a.s. to $0$ as $n \to \infty$. Hence $B_t / t \to 0$ a.s. as $t \to \infty$, which ends the proof.
2.3 Construction of Brownian motion

We begin with a theorem due to Kolmogorov and Čentsov (1956) which makes it possible to prove the existence of a continuous version of a stochastic process.

Theorem 15 Suppose that a stochastic process $X = (X_t)_{t \in [0, T]}$ satisfies the condition
$$E(|X_t - X_s|^\alpha) \leq C |t - s|^{1 + \beta}, \qquad 0 \leq s, t \leq T,$$
for some positive constants $\alpha$, $\beta$ and $C$. Then there exists a continuous modification $\tilde{X} = (\tilde{X}_t)_{t \in [0, T]}$ of $X$ which is locally Hölder continuous with exponent $\gamma$ for every $\gamma \in (0, \beta/\alpha)$, i.e.
$$P\left(\omega : \sup_{\substack{0 < t - s < h(\omega) \\ s, t \in [0, T]}} \frac{|\tilde{X}_t(\omega) - \tilde{X}_s(\omega)|}{|t - s|^\gamma} \leq \delta\right) = 1,$$
where $h(\omega)$ is an a.s. positive random variable and $\delta > 0$ is an appropriate constant.

Proof. See [?].
2.3.1 Canonical Brownian motion

The law of Brownian motion is called the Wiener measure and will be denoted by $\mathbb{W}$. Let $C$ stand for the space of continuous real-valued functions defined on $\mathbb{R}_+$. The Wiener measure is the probability measure on $(C, \mathcal{B}(C))$ characterized by: for any $0 = t_0 < t_1 < \cdots < t_n$ and $A_0, A_1, \ldots, A_n$ in $\mathcal{B}(\mathbb{R})$,
$$\mathbb{W}\big(w \in C : (w(t_0), w(t_1), \ldots, w(t_n)) \in A_0 \times A_1 \times \cdots \times A_n\big) = 1_{A_0}(0) \int_{A_1 \times \cdots \times A_n} \frac{1}{(2\pi)^{n/2} \sqrt{t_1 (t_2 - t_1) \cdots (t_n - t_{n-1})}} \exp\left(-\frac{1}{2} \sum_{i=1}^n \frac{(x_i - x_{i-1})^2}{t_i - t_{i-1}}\right) dx_1 \cdots dx_n,$$
where $x_0 = 0$.

Let us consider the canonical probability space $(C, \mathcal{B}(C), \mathbb{W})$ and define a stochastic process $W$ by
$$W_t(w) = w(t), \qquad t \geq 0.$$
This process is called the coordinate mapping process. It is not difficult to prove that $W$ is a Brownian motion.

2.3.2 Construction via Gaussian measure

Let $G$ be a Gaussian measure with intensity the Lebesgue measure on $\mathbb{R}_+$. Let us put
$$B_t = G([0, t]), \qquad t \geq 0.$$
It is not difficult to prove that $B$ is a centered Gaussian process with covariance function given by (2.1). Moreover, we have
$$E\big(|B_t - B_s|^4\big) = 3 |t - s|^2, \qquad \text{for any } s, t \geq 0.$$
Therefore, in view of Theorem 15, there exists a continuous modification of $B$, which is a Brownian motion.
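A remark added to these notes (not in the original text): Theorem 15 also quantifies the regularity obtained here. With $\alpha = 4$ and $\beta = 1$, the paths of the continuous modification are locally Hölder continuous of every order $\gamma < \beta/\alpha = 1/4$. More generally, since $B_t - B_s$ is $N(0, |t - s|)$,
$$E\big(|B_t - B_s|^{2n}\big) = \frac{(2n)!}{2^n n!} \, |t - s|^n, \qquad n \geq 1,$$
so applying Theorem 15 with $\alpha = 2n$ and $\beta = n - 1$ gives local Hölder continuity of every order $\gamma < \frac{n-1}{2n}$, hence of every order $\gamma < 1/2$.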
2.4 Path properties

Theorem 16 (Blumenthal's zero-one law) Let $\mathcal{F}^B_{0+} = \bigcap_{s > 0} \mathcal{F}^B_s$. Then every event in $\mathcal{F}^B_{0+}$ has probability $0$ or $1$.

Proof. Let $g : \mathbb{R}^k \to \mathbb{R}$ be a bounded continuous function. For any $A \in \mathcal{F}^B_{0+}$ and $0 < t_1 < \cdots < t_k$, in view of a continuity argument and the Lebesgue dominated convergence theorem, we have
$$E\big(1_A g(B_{t_1}, \ldots, B_{t_k})\big) = \lim_{\varepsilon \to 0} E\big(1_A g(B_{t_1} - B_\varepsilon, \ldots, B_{t_k} - B_\varepsilon)\big).$$
If $\varepsilon < t_1$, then $B_{t_1} - B_\varepsilon, \ldots, B_{t_k} - B_\varepsilon$ are independent of the $\sigma$-algebra $\mathcal{F}^B_\varepsilon$. Since $\mathcal{F}^B_{0+} \subset \mathcal{F}^B_\varepsilon$, these random variables are also independent of the $\sigma$-algebra $\mathcal{F}^B_{0+}$. It follows that
$$E\big(1_A g(B_{t_1}, \ldots, B_{t_k})\big) = \lim_{\varepsilon \to 0} E\big(1_A g(B_{t_1} - B_\varepsilon, \ldots, B_{t_k} - B_\varepsilon)\big) = \lim_{\varepsilon \to 0} P(A) \, E\big(g(B_{t_1} - B_\varepsilon, \ldots, B_{t_k} - B_\varepsilon)\big) = P(A) \, E\big(g(B_{t_1}, \ldots, B_{t_k})\big).$$
Therefore, for every $t_1, \ldots, t_k > 0$, $\mathcal{F}^B_{0+}$ is independent of $\sigma(B_{t_1}, \ldots, B_{t_k})$. Now, since $\sigma(B_t, t > 0)$ is generated by sets of the form $(B_{t_1} \in A_1, \ldots, B_{t_k} \in A_k)$, $t_i > 0$ and $A_i \in \mathcal{B}(\mathbb{R})$, $i = 1, \ldots, k$, we deduce that $\mathcal{F}^B_{0+}$ is independent of $\sigma(B_t, t > 0)$. It is clear that $\sigma(B_t, t > 0) = \sigma(B_t, t \geq 0)$, because $B_0$ is the pointwise limit of $B_t$ as $t \to 0$. Finally, since $\mathcal{F}^B_{0+} \subset \sigma(B_t, t \geq 0)$, we derive that $\mathcal{F}^B_{0+}$ is independent of itself, that is, every event in $\mathcal{F}^B_{0+}$ has probability $0$ or $1$.
Corollary 17 We have, a.s., for any $\varepsilon > 0$,
$$\sup_{0 \leq s \leq \varepsilon} B_s > 0, \qquad \inf_{0 \leq s \leq \varepsilon} B_s < 0.$$

Proof. Let $(\varepsilon_p)$ be a sequence of positive real numbers decreasing to $0$ and set
$$A_p = \left\{\sup_{0 \leq s \leq \varepsilon_p} B_s > 0\right\} \quad \text{and} \quad A = \bigcap_p A_p.$$
Since $(A_p)$ is a decreasing sequence, we have $A \in \mathcal{F}^B_{0+}$. We have
$$P(A_p) \geq P(B_{\varepsilon_p} > 0) = \frac{1}{2}, \qquad P(A) = \lim_{p \to \infty} P(A_p) \geq \frac{1}{2}.$$
Therefore, by virtue of Theorem 16, we conclude that $P(A) = 1$. It follows that for every $\varepsilon > 0$,
$$P\left(\sup_{0 \leq s \leq \varepsilon} B_s > 0\right) = 1.$$
By using the symmetry of Brownian motion, we deduce that
$$P\left(\inf_{0 \leq s \leq \varepsilon} B_s < 0\right) = 1.$$
Exercise 18 Prove the following assertions.

1) $\lim_{\lambda \to 0} P\left(\sup_{0 \leq s \leq 1} B_s > \lambda\right) = 1$.

2) For every $\lambda > 0$,
$$P\left(\sup_{0 \leq s \leq 1} B_s > \lambda\right) = P\left(\sup_{0 \leq s \leq 1/\lambda^2} B_s > 1\right).$$

3) $P\left(\sup_{s \geq 0} B_s > 1\right) = P\left(\inf_{s \geq 0} B_s < -1\right) = 1$.

4) For every $C > 0$,
$$P\left(\sup_{s \geq 0} B_s > C\right) = P\left(\inf_{s \geq 0} B_s < -C\right) = 1.$$

5) $P(T_a < \infty) = 1$, where for any $a \in \mathbb{R}$ we put $T_a = \inf\{t \geq 0 : B_t = a\}$ (with the convention $\inf \emptyset = +\infty$). $T_a$ is called the first passage time at $a$.

6) Almost surely,
$$\limsup_{t \to \infty} B_t = +\infty \quad \text{and} \quad \liminf_{t \to \infty} B_t = -\infty.$$
Proposition 19 (Quadratic variation of Brownian motion) Let $0 = t^n_0 < t^n_1 < \cdots < t^n_{p_n} = t$ be a sequence of partitions of $[0, t]$ with mesh going to $0$, that is, $\sup_{1 \leq i \leq p_n} (t^n_i - t^n_{i-1}) \to 0$ as $n \to \infty$. Then
$$\lim_{n \to \infty} \sum_{i=1}^{p_n} (B_{t^n_i} - B_{t^n_{i-1}})^2 = t,$$
where the convergence holds in $L^2$.

Proof. Let $\Delta = \{t^n_0, t^n_1, \ldots, t^n_{p_n}\}$ be a partition of $[0, t]$. We put
$$V(\Delta) = \sum_{i=1}^{p_n} (B_{t^n_i} - B_{t^n_{i-1}})^2.$$
We have
$$E\big((V(\Delta))^2\big) = \sum_{i=1}^{p_n} E\big[(B_{t^n_i} - B_{t^n_{i-1}})^4\big] + 2 \sum_{1 \leq i < j \leq p_n} E\big[(B_{t^n_i} - B_{t^n_{i-1}})^2 (B_{t^n_j} - B_{t^n_{j-1}})^2\big]$$
$$= 3 \sum_{i=1}^{p_n} (t^n_i - t^n_{i-1})^2 + 2 \sum_{1 \leq i < j \leq p_n} (t^n_i - t^n_{i-1})(t^n_j - t^n_{j-1}) = 2 \sum_{i=1}^{p_n} (t^n_i - t^n_{i-1})^2 + \left[\sum_{i=1}^{p_n} (t^n_i - t^n_{i-1})\right]^2 = 2 \sum_{i=1}^{p_n} (t^n_i - t^n_{i-1})^2 + t^2.$$
Now,
$$E(V(\Delta)) = \sum_{i=1}^{p_n} (t^n_i - t^n_{i-1}) = t.$$
Therefore
$$E\big((V(\Delta) - t)^2\big) = 2 \sum_{i=1}^{p_n} (t^n_i - t^n_{i-1})^2 \leq 2 \sup_{1 \leq i \leq p_n} (t^n_i - t^n_{i-1}) \, t.$$
The right hand side converges to $0$ as $|\Delta| = \sup_{1 \leq i \leq p_n} (t^n_i - t^n_{i-1})$ goes to $0$.
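A quick numerical illustration of Proposition 19 (added here; not part of the original notes): along a fine uniform partition of $[0, t]$, the sum of squared increments of a simulated path concentrates around $t$, with an $L^2$ error of the order of the square root of the mesh.

```python
import numpy as np

rng = np.random.default_rng(0)
t, n = 2.0, 2**18                     # horizon and number of partition intervals
increments = rng.normal(0.0, np.sqrt(t / n), size=n)

quadratic_variation = np.sum(increments**2)
# E[(V - t)^2] = 2 * sum (t_i - t_{i-1})^2 = 2 * t * mesh, so the result is close to t.
print(quadratic_variation)
```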
Corollary 20 Brownian motion has no finite variation on any interval.

Proof. Exercise.

2.5 Brownian Motion and Martingales

Martingale theory is a very useful tool in the study of stochastic processes. In particular, it is a fundamental notion in Itô's stochastic calculus. In this course we only give an introduction to martingale theory; the reader may see [?] for details.

2.5.1 Filtrations

Definition 21 A filtration $\mathbb{F} = (\mathcal{F}_t : t \geq 0)$ is an increasing family of sub-$\sigma$-algebras of $\mathcal{F}$, that is, for every $s \leq t$, $\mathcal{F}_s \subset \mathcal{F}_t$.

Let $X = (X_t)_{t \geq 0}$ be a stochastic process. We put $\mathcal{F}^X_t = \sigma(X_s : s \leq t)$. Then $\mathbb{F}^X$ is a filtration, called the natural filtration of $X$ or the filtration generated by $X$.

Heuristically, we may regard a filtration $\mathbb{F}$ as a flow of information, with $\mathcal{F}_t$ representing all the information accumulated until time $t$. For example, $\mathcal{F}^X_t$ contains all the information regarding the past of $X$ and the present value $X_t$.

Let us set
$$\mathcal{F}_{t+} = \bigcap_{s > t} \mathcal{F}_s.$$
It is clear that $(\mathcal{F}_{t+} : t \geq 0)$ is a filtration which is finer than the filtration $(\mathcal{F}_t : t \geq 0)$, that is, $\mathcal{F}_t \subset \mathcal{F}_{t+}$ for every $t \geq 0$.
Heuristically, $\mathcal{F}_{t+}$ carries the same information as $\mathcal{F}_t$ plus the information immediately after time $t$. For instance, let $X = (X_t)_{t \geq 0}$ be a stochastic process representing the smooth random motion of a particle, that is, $t \mapsto X_t(\omega)$ is a smooth function for every $\omega \in \Omega$. The $\sigma$-algebra $\mathcal{F}^X_{t+}$ contains the information available by time $t$ plus the velocity $V_t = \lim_{\varepsilon \to 0} (X_{t+\varepsilon} - X_t)/\varepsilon$, the acceleration, and so on.

Definition 22 A filtration $\mathbb{F} = (\mathcal{F}_t : t \geq 0)$ is said to be right-continuous if for every $t \geq 0$, $\mathcal{F}_t = \mathcal{F}_{t+}$.

It is not difficult to see that $(\mathcal{F}_{t+} : t \geq 0)$ is a right-continuous filtration. From now on, $\mathbb{F} = (\mathcal{F}_t : t \geq 0)$ is a fixed filtration.

Definition 23 A stochastic process $X = (X_t)_{t \geq 0}$ is $\mathbb{F}$-adapted, or adapted relative to the filtration $\mathbb{F}$, if for every $t \geq 0$ the random variable $X_t$ is $\mathcal{F}_t$-measurable.

Saying that a stochastic process is adapted means that the process does not anticipate the future. It is clear that any stochastic process is adapted relative to its natural filtration.
2.5.2 Stopping times

Definition 24 A random time $\tau : \Omega \to [0, \infty]$ is an $\mathbb{F}$-stopping time if for every $t \geq 0$ the event $\{\tau \leq t\}$ belongs to $\mathcal{F}_t$.

Heuristically, a stopping time can be viewed as the time of occurrence of a particular event: the information accumulated by time $t$ is sufficient to decide whether this event has occurred by time $t$ or not.

For instance, let $X = (X_t)_{t \geq 0}$ be a real-valued, right-continuous and $\mathbb{F}$-adapted stochastic process. For every closed set $F$ we put
$$\tau_F = \inf\{t \geq 0 : X_t \in F\}.$$
The random time $\tau_F$, called the hitting time of the set $F$, is an $\mathbb{F}$-stopping time.

If we put $\sigma_F = \sup\{t \leq 1 : X_t \in F\}$, then $\sigma_F$ is not a stopping time in general, because for $t < 1$ the information in $\mathcal{F}_t$ is not sufficient to decide whether $\sigma_F \leq t$ or not.

Every constant random time $\tau \equiv t_0$ is a stopping time, and for every sequence $(\tau_n)_{n \geq 0}$ of stopping times, $\inf_n \tau_n$, $\sup_n \tau_n$ and $\lim_n \tau_n$ (if the limit exists) are stopping times.

For every $\mathbb{F}$-stopping time $\tau$, we consider the set
$$\mathcal{F}_\tau = \{A \in \mathcal{F} : A \cap \{\tau \leq t\} \in \mathcal{F}_t \text{ for every } t \geq 0\},$$
called the past until $\tau$. It is easy to prove that $\mathcal{F}_\tau$ is a $\sigma$-algebra and that $\mathcal{F}_\tau \subset \mathcal{F}_\sigma$ if $\sigma$ is another $\mathbb{F}$-stopping time with $\tau \leq \sigma$. If $\tau \equiv t$ then $\mathcal{F}_\tau = \mathcal{F}_t$.

Exercise 25 a) Let $\tau$ be a stopping time and $V$ a random variable. Prove that $V$ is $\mathcal{F}_\tau$-measurable if and only if $V 1_{\{\tau \leq t\}}$ is $\mathcal{F}_t$-measurable for every $t \in \mathbb{R}_+$.
b) Prove that if $\sigma$ and $\tau$ are stopping times, then $\mathcal{F}_\sigma \cap \mathcal{F}_\tau = \mathcal{F}_{\sigma \wedge \tau}$ and, for any $\mathcal{F}_\sigma$-measurable random variable $V$, the following random variables are $\mathcal{F}_{\sigma \wedge \tau}$-measurable:
$$V 1_{\{\sigma \leq \tau\}}, \qquad V 1_{\{\sigma = \tau\}}, \qquad V 1_{\{\sigma < \tau\}};$$
in particular, the events $\{\sigma \leq \tau\}$, $\{\sigma = \tau\}$ and $\{\sigma < \tau\}$ belong to $\mathcal{F}_{\sigma \wedge \tau}$.
2.5.3 Martingales

Definition 26 A stochastic process $M = (M_t)_{t \geq 0}$ is called an $\mathbb{F}$-submartingale ($\mathbb{F}$-supermartingale, $\mathbb{F}$-martingale) if

(i) $M$ is $\mathbb{F}$-adapted;
(ii) for every $t \geq 0$, $M_t$ is integrable;
(iii) for every $s \leq t$, $M_s \leq E(M_t \mid \mathcal{F}_s)$ (resp. $M_s \geq E(M_t \mid \mathcal{F}_s)$, $M_s = E(M_t \mid \mathcal{F}_s)$).

Remark 27 A stochastic process $M = (M_t)_{t \geq 0}$ is an $\mathbb{F}$-supermartingale if and only if $-M$ is an $\mathbb{F}$-submartingale, and an $\mathbb{F}$-martingale if and only if it is both an $\mathbb{F}$-submartingale and an $\mathbb{F}$-supermartingale.

Examples of martingales

1) Let $X = (X_t)_{t \geq 0}$ be an $\mathbb{F}$-adapted stochastic process with independent increments. If for every $t \geq 0$, $X_t$ is integrable, then $(X_t - E(X_t))_{t \geq 0}$ is an $\mathbb{F}$-martingale. In particular, a Brownian motion $B = (B_t)_{t \geq 0}$ is an $\mathbb{F}^B$-martingale.
2) Let $B = (B_t)_{t \geq 0}$ be a Brownian motion. Then $(B_t^2 - t)_{t \geq 0}$ is an $\mathbb{F}^B$-martingale. Indeed, for every $s \leq t$,
$$E(B_t^2 - B_s^2 \mid \mathcal{F}^B_s) = E\big((B_t - B_s)^2 + 2 B_s B_t - 2 B_s^2 \mid \mathcal{F}^B_s\big) = E\big((B_t - B_s)^2\big) + 2 B_s E(B_t \mid \mathcal{F}^B_s) - 2 B_s^2 = t - s + 2 B_s^2 - 2 B_s^2 = t - s.$$

3) Let $B = (B_t)_{t \geq 0}$ be a Brownian motion. For every $\lambda \in \mathbb{R}$, put $B^\lambda_t = \exp\big(\lambda B_t - \frac{\lambda^2}{2} t\big)$, $t \geq 0$. Then $B^\lambda$ is an $\mathbb{F}^B$-martingale, called the exponential martingale. (Exercise)

It is easy to see from Jensen's inequality that if $M$ is an $\mathbb{F}$-martingale then, for every convex function $\varphi : \mathbb{R} \to \mathbb{R}$ such that $\varphi(M_t)$ is integrable, $(\varphi(M_t))_{t \geq 0}$ is an $\mathbb{F}$-submartingale.
Theorem 28 (Doob’s stopping theorem). Let M be a right-continuous F martingale.
For every bounded stopping times and such that
; M and M are
integrable and
M = E(M jF ):
The equality being replaced by
(resp. an F submartingale)
(resp.
) if M is an F supermartingale
Proof. See
As a consequence of Doob’s stopping Theorem, if M an F martingale
then for every bounded stopping time ;
E(M ) = E(M0 ):
(2.2)
If is not bounded then (2.2) may be false. For instance, let us consider Tb
the hitting time of a point b by a Brownian motion B that is Tb = infft
0 : Bt = bg: Since B has continuous trajectories, we have BTb = b: So
b = E(BTb ) 6= E(B0 ) = 0: We will prove later that Tb is almost surely …nite
and not bounded.
There exist a general version of Doob’s stopping theorem which doesn’t
need boundness of stopping times.
18
Theorem 29 (Doob’s stopping theorem for uniformly integrable martingale).
Let M be a right-continuous F martingale and uniformly integrable. For
every bounded stopping times and such that
; M and M are
integrable and
M = E(M jF ):
Proof. see
The following proposition shows that a martingale stopped at a stopping time remains a martingale relative to the same filtration.

Proposition 30 Let $M = (M_t)_{t \geq 0}$ be an $\mathbb{F}$-martingale and $\tau$ an $\mathbb{F}$-stopping time. Then the stopped process $M^\tau = (M_{\tau \wedge t})_{t \geq 0}$ is an $\mathbb{F}$-martingale.

Proof. First of all, let us remark that, in view of Doob's stopping theorem, for every $s \leq t$ we have
$$E(M_{\tau \wedge t} \mid \mathcal{F}_{\tau \wedge s}) = M_{\tau \wedge s}.$$
This equality proves that $M^\tau$ is a martingale relative to the filtration $\{\mathcal{F}_{\tau \wedge s} : s \geq 0\}$, but this is not what we want. Our aim is to prove that
$$E(M_{\tau \wedge t} \mid \mathcal{F}_s) = M_{\tau \wedge s}. \qquad (2.3)$$
To this end, let us note that it is not difficult to prove that for any integrable random variable $Z$,
$$1_{\{\tau > s\}} E(Z \mid \mathcal{F}_{\tau \wedge s}) = 1_{\{\tau > s\}} E(Z \mid \mathcal{F}_s).$$
Therefore, we have
$$1_{\{\tau > s\}} M_{\tau \wedge s} = 1_{\{\tau > s\}} E(M_{\tau \wedge t} \mid \mathcal{F}_{\tau \wedge s}) = 1_{\{\tau > s\}} E(M_{\tau \wedge t} \mid \mathcal{F}_s). \qquad (2.4)$$
Now,
$$1_{\{\tau \leq s\}} E(M_{\tau \wedge t} \mid \mathcal{F}_s) = E(1_{\{\tau \leq s\}} M_{\tau \wedge t} \mid \mathcal{F}_s) = E(1_{\{\tau \leq s\}} M_{\tau \wedge s} \mid \mathcal{F}_s) = 1_{\{\tau \leq s\}} M_{\tau \wedge s}, \qquad (2.5)$$
where we have used the equality $1_{\{\tau \leq s\}} M_{\tau \wedge t} = 1_{\{\tau \leq s\}} M_{\tau \wedge s}$ and the fact that $1_{\{\tau \leq s\}} M_{\tau \wedge s}$ is $\mathcal{F}_s$-measurable. Combining (2.4) and (2.5), we deduce (2.3).
Corollary 31 For every $a, b > 0$, let us put
$$\tau = \inf\{t \geq 0 : B_t \notin \; ]-a, b[ \; \}.$$
We have
$$P(T_b < T_{-a}) = P(B_\tau = b) = \frac{a}{a+b}, \qquad P(T_{-a} < T_b) = P(B_\tau = -a) = \frac{b}{a+b}.$$

Proof. Let us put $M_t = B_t^2 - t$ and $\tau_n = \tau \wedge n$. Since $\tau_n$ is a bounded stopping time and $M$ is a martingale, we have
$$E(M_{\tau_n}) = E(M_0) = 0.$$
It follows that
$$E(B_{\tau_n}^2) = E(\tau_n).$$
Now, since for every $n \geq 0$, $|B_{\tau_n}| \leq a \vee b$, we have
$$E(\tau_n) \leq a^2 \vee b^2.$$
So, by applying the monotone convergence theorem, we derive that $E(\tau) < \infty$ and therefore $\tau$ is a.s. finite.

By virtue of Doob's stopping theorem, we have
$$E(B_{\tau_n}) = E(B_0) = 0.$$
By letting $n \to \infty$, the Lebesgue dominated convergence theorem implies that
$$E(B_\tau) = 0.$$
Therefore,
$$0 = E(B_\tau) = b \, P(B_\tau = b) - a \, P(B_\tau = -a).$$
Combining this with $P(B_\tau = b) + P(B_\tau = -a) = 1$, we deduce that
$$P(B_\tau = b) = \frac{a}{a+b}, \qquad P(B_\tau = -a) = \frac{b}{a+b}.$$
Now it is clear that
$$\{B_\tau = b\} = \{T_b < T_{-a}\} \quad \text{and} \quad \{B_\tau = -a\} = \{T_{-a} < T_b\}.$$
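A Monte Carlo check of Corollary 31 (an illustration added to these notes; the step size and sample size are arbitrary): simulate discretized paths until they leave $]-a, b[$ and compare the empirical frequency of exiting through $b$ with $a/(a+b)$.

```python
import numpy as np

def exits_through_b(a, b, dt=1e-3, rng=None):
    """Simulate a discretized Brownian path started at 0 until it leaves ]-a, b[."""
    rng = np.random.default_rng() if rng is None else rng
    x = 0.0
    while -a < x < b:
        x += rng.normal(0.0, np.sqrt(dt))
    return x >= b

rng = np.random.default_rng(1)
a, b, n_paths = 1.0, 2.0, 2000
freq = np.mean([exits_through_b(a, b, rng=rng) for _ in range(n_paths)])
print(freq, a / (a + b))   # both close to 1/3
```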
Exercise 32 For every $a, b > 0$, let us put
$$\tau = \inf\{t \geq 0 : B_t \notin \; ]-a, b[ \; \}.$$
Compute the expectation of $\tau$.

Now, we deal with the strong Markov property of Brownian motion.
Theorem 33 Let $\tau$ be a stopping time. Assume that $P(\tau < \infty) > 0$ and put, for every $t \geq 0$,
$$B^{(\tau)}_t = 1_{\{\tau < \infty\}} \, (B_{\tau + t} - B_\tau).$$
Then, under the conditional probability $P(\,\cdot \mid \tau < \infty)$, the stochastic process $B^{(\tau)}$ is a Brownian motion independent of the $\sigma$-algebra $\mathcal{F}_\tau$.

Proof. The proof given here is taken from [?]. We begin with the case where $\tau < \infty$ a.s. Let us note that we just have to prove that, for every $A \in \mathcal{F}_\tau$, $0 \leq t_1 < \cdots < t_p$ and $F$ a nonnegative bounded continuous function defined on $\mathbb{R}^p$, we have
$$E\big(1_A F(B^{(\tau)}_{t_1}, \ldots, B^{(\tau)}_{t_p})\big) = P(A) \, E\big(F(B_{t_1}, \ldots, B_{t_p})\big). \qquad (2.6)$$
Indeed, if $A = \Omega$ in formula (2.6), then this formula shows that $B^{(\tau)}$ is a Brownian motion. Formula (2.6) also implies that for every $0 \leq t_1 < \cdots < t_p$ the random vector $(B^{(\tau)}_{t_1}, \ldots, B^{(\tau)}_{t_p})$ is independent of $\mathcal{F}_\tau$, so by a monotone class argument it follows that $B^{(\tau)}$ is independent of $\mathcal{F}_\tau$.

For any integer $n \geq 1$, $[\tau]_n$ stands for the smallest real number of the form $k 2^{-n}$ greater than or equal to $\tau$, with $[\tau]_n = \infty$ if $\tau = \infty$. In order to prove (2.6), we remark that
$$F(B^{(\tau)}_{t_1}, \ldots, B^{(\tau)}_{t_p}) = \lim_{n \to \infty} F(B^{([\tau]_n)}_{t_1}, \ldots, B^{([\tau]_n)}_{t_p}). \qquad (2.7)$$
It follows from (2.7) and the dominated convergence theorem that
$$E\big(1_A F(B^{(\tau)}_{t_1}, \ldots, B^{(\tau)}_{t_p})\big) = \lim_{n \to \infty} E\big(1_A F(B^{([\tau]_n)}_{t_1}, \ldots, B^{([\tau]_n)}_{t_p})\big) = \lim_{n \to \infty} \sum_{k=0}^{\infty} E\big(1_A 1_{\{(k-1)2^{-n} < \tau \leq k 2^{-n}\}} F(B_{k2^{-n} + t_1} - B_{k2^{-n}}, \ldots, B_{k2^{-n} + t_p} - B_{k2^{-n}})\big).$$
Now, for every $A \in \mathcal{F}_\tau$, the event $A \cap \{(k-1)2^{-n} < \tau \leq k 2^{-n}\}$ is $\mathcal{F}_{k2^{-n}}$-measurable. In view of the simple Markov property (see Proposition 14 d)), we have
$$E\big(1_{A \cap \{(k-1)2^{-n} < \tau \leq k2^{-n}\}} F(B_{k2^{-n}+t_1} - B_{k2^{-n}}, \ldots, B_{k2^{-n}+t_p} - B_{k2^{-n}})\big) = P\big(A \cap \{(k-1)2^{-n} < \tau \leq k2^{-n}\}\big) \, E\big(F(B_{t_1}, \ldots, B_{t_p})\big).$$
It suffices to sum over $k$ to conclude.

In the case where $P(\tau = \infty) > 0$, the same arguments lead to
$$E\big(1_{A \cap \{\tau < \infty\}} F(B^{(\tau)}_{t_1}, \ldots, B^{(\tau)}_{t_p})\big) = P(A \cap \{\tau < \infty\}) \, E\big(F(B_{t_1}, \ldots, B_{t_p})\big). \qquad (2.8)$$
The conclusion follows directly from equality (2.8).
A very important property of the trajectories of Brownian motion is the reflection principle, which is a consequence of the strong Markov property proved above.

Theorem 34 For every $t \geq 0$, let us put $S_t = \sup_{u \leq t} B_u$. Then, for every $a \geq 0$ and $b \leq a$, we have
$$P(S_t \geq a, \, B_t \leq b) = P(B_t \geq 2a - b).$$
In particular, $S_t$ and $|B_t|$ have the same probability law.

Proof. Let us note that $T_a < \infty$ a.s. (see Exercise 18). We have
$$P(S_t \geq a, \, B_t \leq b) = P(T_a \leq t, \, B_t \leq b) = P(T_a \leq t, \, B_t - B_{T_a} \leq b - a) = P\big(T_a \leq t, \, B^{(T_a)}_{t - T_a} \leq b - a\big).$$
Since $B^{(T_a)}$ is a Brownian motion independent of $\mathcal{F}_{T_a}$, this process is in particular independent of $T_a$. Therefore $(T_a, B^{(T_a)})$ and $(T_a, -B^{(T_a)})$ have the same law. Let $H = \{(s, w) \in \mathbb{R}_+ \times C(\mathbb{R}_+, \mathbb{R}) : s \leq t, \ w(t - s) \leq b - a\}$. The above probability is equal to
$$P\big((T_a, B^{(T_a)}) \in H\big) = P\big((T_a, -B^{(T_a)}) \in H\big) = P\big(T_a \leq t, \, -B^{(T_a)}_{t - T_a} \leq b - a\big) = P(T_a \leq t, \, B_t \geq 2a - b) = P(B_t \geq 2a - b),$$
because $\{B_t \geq 2a - b\} \subset \{T_a \leq t\}$.
To prove that $S_t$ and $|B_t|$ have the same probability law, we note that
$$P(S_t \geq a) = P(S_t \geq a, \, B_t \leq a) + P(S_t \geq a, \, B_t > a) = 2 P(B_t \geq a) = P(|B_t| \geq a).$$

Exercise 35 Prove that $T_a$ has the same law as $\dfrac{a^2}{X^2}$, where $X$ follows the law $N(0, 1)$. Give the density of $T_a$.
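Returning to Theorem 34, the following small simulation (added to these notes, not part of the original text) compares the empirical distributions of the running maximum $S_t$ and of $|B_t|$; the discrete grid only approximates the true supremum, so the agreement improves as the step size decreases.

```python
import numpy as np

rng = np.random.default_rng(2)
t, n, n_paths = 1.0, 2000, 5000
increments = rng.normal(0.0, np.sqrt(t / n), size=(n_paths, n))
paths = np.cumsum(increments, axis=1)

running_max = np.maximum(paths.max(axis=1), 0.0)   # S_t = sup_{u <= t} B_u (with B_0 = 0)
abs_endpoint = np.abs(paths[:, -1])                # |B_t|

a = 0.5
print(np.mean(running_max >= a), np.mean(abs_endpoint >= a))   # both estimate P(|B_t| >= a)
```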
We end this chapter by giving the definition of multidimensional Brownian motion.

Definition 36 An $\mathbb{R}^d$-valued process $B = (B^1_t, \ldots, B^d_t)_{t \geq 0}$ is called a $d$-dimensional Brownian motion started from $0$ if its components $B^1, \ldots, B^d$ are independent real Brownian motions started from $0$.
Chapter 3

Stochastic integrals

In the preceding chapter we proved that Brownian motion does not have finite variation on any interval. Therefore it is not possible to define an integral with respect to $B$, for fixed $\omega$, as a Stieltjes integral. We will define Itô's stochastic integral, which uses an $L^2$ framework.

3.1 Itô's integral
Definition 37 A stochastic process $(\phi_t)_{t \geq 0}$ is said to be progressively measurable if for every $t \geq 0$ the map $(\omega, s) \mapsto \phi_s(\omega)$, defined on $\Omega \times [0, t]$ with values in $\mathbb{R}$, is $\mathcal{F}_t \otimes \mathcal{B}([0, t])$-measurable.

Let us introduce some spaces:
$$\mathcal{M}^2(\mathbb{R}_+) = \left\{\phi \text{ progressively measurable such that } E\left(\int_{\mathbb{R}_+} \phi_t^2 \, dt\right) < \infty\right\},$$
$$\mathcal{M}^2([0, T]) = \left\{\phi \text{ progressively measurable such that } E\left(\int_0^T \phi_t^2 \, dt\right) < \infty\right\},$$
$$\mathcal{M}^2 = \bigcap_{T \geq 0} \mathcal{M}^2([0, T]).$$
It is not difficult to see that $\mathcal{M}^2(\mathbb{R}_+)$ and $\mathcal{M}^2([0, T])$ are Hilbert spaces when endowed respectively with the inner products
$$\langle \phi, \psi \rangle = E\left(\int_{\mathbb{R}_+} \phi_t \psi_t \, dt\right), \qquad \langle \phi, \psi \rangle = E\left(\int_0^T \phi_t \psi_t \, dt\right).$$
We begin by defining the stochastic integral of a simple process. We call a simple process a real-valued process $(\phi_t)_{t \geq 0}$ of the form
$$\phi_t(\omega) = \sum_{i=0}^{n-1} X_i(\omega) \, 1_{]t_i, t_{i+1}]}(t),$$
with $0 = t_0 < t_1 < t_2 < \cdots < t_n$ and $X_i$ a square integrable random variable which is $\mathcal{F}_{t_i}$-measurable. Let $\mathcal{E}$ stand for the set of all simple processes.

For every $\phi \in \mathcal{E}$, we define the stochastic integral of $\phi$ by
$$\int_{\mathbb{R}_+} \phi_t \, dB_t = \sum_{i=0}^{n-1} X_i \, [B_{t_{i+1}} - B_{t_i}].$$
It is clear that the map $\phi \mapsto \int_{\mathbb{R}_+} \phi_t \, dB_t$ is linear. Let us prove that it is an isometry from $\mathcal{E}$ to $L^2(\Omega)$. We have
$$E\left[\left(\int_{\mathbb{R}_+} \phi_t \, dB_t\right)^2\right] = \sum_{i=0}^{n-1} E\Big(X_i^2 \, E\big([B_{t_{i+1}} - B_{t_i}]^2 \mid \mathcal{F}_{t_i}\big)\Big) + 2 \sum_{i < j} E\Big(X_i [B_{t_{i+1}} - B_{t_i}] X_j \, E\big([B_{t_{j+1}} - B_{t_j}] \mid \mathcal{F}_{t_j}\big)\Big) = E\left(\int_{\mathbb{R}_+} \phi_t^2 \, dt\right) = \|\phi\|^2_{\mathcal{M}^2(\mathbb{R}_+)},$$
since the conditional expectations in the cross terms vanish.

Proposition 38 $\mathcal{E}$ is dense in $\mathcal{M}^2(\mathbb{R}_+)$.

Proof. See [?].
Now, one can prove that the map $\mathcal{E} \ni \phi \mapsto \int_{\mathbb{R}_+} \phi_t \, dB_t \in L^2(\Omega)$ admits a unique extension as an isometry from $\mathcal{M}^2(\mathbb{R}_+)$ to $L^2(\Omega)$. This extension is also denoted by $\int_{\mathbb{R}_+} \phi_t \, dB_t$ and is called the stochastic integral of $\phi \in \mathcal{M}^2(\mathbb{R}_+)$. The isometry property implies:

Theorem 39 For every $\phi, \psi \in \mathcal{M}^2(\mathbb{R}_+)$, we have
$$E\left(\int_{\mathbb{R}_+} \phi_t \, dB_t\right) = 0, \qquad E\left[\left(\int_{\mathbb{R}_+} \phi_t \, dB_t\right)^2\right] = E\left(\int_{\mathbb{R}_+} \phi_t^2 \, dt\right), \qquad E\left(\int_{\mathbb{R}_+} \phi_t \, dB_t \int_{\mathbb{R}_+} \psi_t \, dB_t\right) = E\left(\int_{\mathbb{R}_+} \phi_t \psi_t \, dt\right).$$
Remark 40 If $\phi$ is a deterministic function in $\mathcal{M}^2(\mathbb{R}_+)$, that is, $\phi \in L^2(\mathbb{R}_+, \mathcal{B}(\mathbb{R}_+), dt)$, then $\int_{\mathbb{R}_+} \phi_t \, dB_t$ is the Wiener integral of $\phi$, defined as the element of the Gaussian space $H^B$ with variance $\int_{\mathbb{R}_+} \phi_t^2 \, dt$. If $G$ is a Gaussian measure with intensity the Lebesgue measure, then $\int_{\mathbb{R}_+} \phi_t \, dB_t$ is also defined as $G(\phi)$. It is clear that the two definitions coincide.
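To make the construction concrete, here is a small simulation (added to these notes; the grid, horizon and sample size are arbitrary) that evaluates the stochastic integral of the simple process $\phi_t = \sum_i B_{t_i} 1_{]t_i, t_{i+1}]}(t)$, i.e. the left-point sums $\sum_i B_{t_i}(B_{t_{i+1}} - B_{t_i})$, and checks the identities of Theorem 39 by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n, n_paths = 1.0, 500, 20000
dt = T / n

increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)

# phi_t = B_{t_i} on ]t_i, t_{i+1}]: a simple process, since X_i = B_{t_i} is F_{t_i}-measurable.
stochastic_integral = np.sum(B[:, :-1] * increments, axis=1)   # sum_i X_i (B_{t_{i+1}} - B_{t_i})
time_integral = np.sum(B[:, :-1]**2 * dt, axis=1)              # int_0^T phi_t^2 dt

print(np.mean(stochastic_integral))                               # approx 0
print(np.mean(stochastic_integral**2), np.mean(time_integral))    # both approx T^2/2 = 0.5
```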
If $\phi \in \mathcal{M}^2$, then we define the stochastic integral of $\phi$ on $[0, t]$ by
$$\int_0^t \phi_u \, dB_u = \int_{\mathbb{R}_+} 1_{]0, t]}(u) \, \phi_u \, dB_u.$$
We put
$$\int_s^t \phi_u \, dB_u = \int_{\mathbb{R}_+} 1_{]s, t]}(u) \, \phi_u \, dB_u.$$
One can prove that the stochastic process $\left(\int_0^t \phi_u \, dB_u\right)_{t \geq 0}$ has a continuous version. From now on, we consider this continuous version.
Proposition 41 (i) Let us put $M_t = \int_0^t \phi_u \, dB_u$. The stochastic process $M$ is a continuous martingale.
(ii) $\left(M_t^2 - \int_0^t \phi_u^2 \, du\right)_{t \geq 0}$ is a continuous martingale.

Proof. Let $(t_i)_i$ be a sequence of subdivisions of $[s, t]$ and $(X_i)_i$ random variables such that $X_i$ is $\mathcal{F}_{t_i}$-measurable and square integrable, with
$$\int_s^t \phi_u \, dB_u = \lim \sum_i X_i [B_{t_{i+1}} - B_{t_i}] \quad \text{in } L^2(\Omega).$$
$M$ is $\mathbb{F}$-adapted and square integrable. Now, in $L^2(\Omega)$, we have
$$E\left(\int_s^t \phi_u \, dB_u \,\Big|\, \mathcal{F}_s\right) = \lim \sum_i E\big(X_i [B_{t_{i+1}} - B_{t_i}] \mid \mathcal{F}_s\big) = \lim \sum_i E\Big(X_i \, E\big([B_{t_{i+1}} - B_{t_i}] \mid \mathcal{F}_{t_i}\big) \,\Big|\, \mathcal{F}_s\Big) = 0.$$
Since $E(M_t^2 - M_s^2 \mid \mathcal{F}_s) = E\big((M_t - M_s)^2 \mid \mathcal{F}_s\big)$, to prove that $\left(M_t^2 - \int_0^t \phi_u^2 \, du\right)_{t \geq 0}$ is a martingale it is sufficient to establish that
$$E\big((M_t - M_s)^2 \mid \mathcal{F}_s\big) = E\left(\int_s^t \phi_u^2 \, du \,\Big|\, \mathcal{F}_s\right).$$
We have
$$E\left[\left(\int_s^t \phi_u \, dB_u\right)^2 \Big|\, \mathcal{F}_s\right] = E\left[\lim_n \Big(\sum_i X_i [B_{t_{i+1}} - B_{t_i}]\Big)^2 \,\Big|\, \mathcal{F}_s\right] = \lim_n E\left[\Big(\sum_i X_i [B_{t_{i+1}} - B_{t_i}]\Big)^2 \,\Big|\, \mathcal{F}_s\right]$$
$$= \lim_n \left(\sum_i E\Big(E\big(X_i^2 [B_{t_{i+1}} - B_{t_i}]^2 \mid \mathcal{F}_{t_i}\big) \,\Big|\, \mathcal{F}_s\Big) + 2 \sum_{i < j} E\Big(E\big(X_i X_j [B_{t_{i+1}} - B_{t_i}] [B_{t_{j+1}} - B_{t_j}] \mid \mathcal{F}_{t_j}\big) \,\Big|\, \mathcal{F}_s\Big)\right)$$
$$= \lim_n \sum_i E\big(X_i^2 (t_{i+1} - t_i) \mid \mathcal{F}_s\big) = E\left(\int_s^t \phi_u^2 \, du \,\Big|\, \mathcal{F}_s\right).$$
3.2 Itô's formula

Let us recall the fundamental theorem of differential calculus. Let $x : \mathbb{R}_+ \to \mathbb{R}$ and $\varphi : \mathbb{R} \to \mathbb{R}$ be deterministic functions of class $C^1$. Then we have
$$\varphi(x(t)) = \varphi(x(0)) + \int_0^t \varphi'(x(s)) x'(s) \, ds = \varphi(x(0)) + \int_0^t \varphi'(x(s)) \, dx(s).$$
The aim of this section is to extend this formula to stochastic calculus.

Let us consider the following example. Let $t_i = it/n$, $i = 0, \ldots, n$, be a subdivision of $[0, t]$. We have
$$B_t^2 = \sum_{i=1}^n [B_{t_i}^2 - B_{t_{i-1}}^2] = 2 \sum_{i=1}^n B_{t_{i-1}} [B_{t_i} - B_{t_{i-1}}] + \sum_{i=1}^n [B_{t_i} - B_{t_{i-1}}]^2.$$
By letting $n \to \infty$, we obtain
$$B_t^2 = 2 \int_0^t B_s \, dB_s + t.$$
Therefore the fundamental theorem of differential calculus does not work in the stochastic case.
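A numerical aside added to these notes (not part of the original text): the identity above is specific to evaluating the integrand at the left endpoint of each interval, which is the choice built into Itô's integral. Evaluating at the right endpoint changes the limit by the quadratic variation $t$.

```python
import numpy as np

rng = np.random.default_rng(4)
t, n = 1.0, 200000
dB = rng.normal(0.0, np.sqrt(t / n), size=n)
B = np.concatenate(([0.0], np.cumsum(dB)))

left_sum = np.sum(B[:-1] * dB)    # -> (B_t^2 - t) / 2  (Ito choice)
right_sum = np.sum(B[1:] * dB)    # -> (B_t^2 + t) / 2  (differs by the quadratic variation)

print(2 * left_sum + t, 2 * right_sum - t, B[-1]**2)   # all three approximately equal
```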
Let $C^2_b$ denote the set of real-valued functions $\varphi$ defined on $\mathbb{R}$ such that $\varphi$ and its first and second derivatives are bounded.

Theorem 42 For every $\varphi \in C^2_b$, we have a.s.
$$\varphi(B_t) = \varphi(B_0) + \int_0^t \varphi'(B_s) \, dB_s + \frac{1}{2} \int_0^t \varphi''(B_s) \, ds, \qquad \forall t \geq 0.$$

Proof. Let $t_i = it/n$, $i = 0, \ldots, n$, be a subdivision of $[0, t]$. Taylor's formula applied to $\varphi$ leads to
$$\varphi(B_t) = \varphi(B_0) + \sum_{i=1}^n \big(\varphi(B_{t_i}) - \varphi(B_{t_{i-1}})\big) = \varphi(B_0) + \sum_{i=1}^n \varphi'(B_{t_{i-1}}) (B_{t_i} - B_{t_{i-1}}) + \frac{1}{2} \sum_{i=1}^n \varphi''(B_{\theta_i}) [B_{t_i} - B_{t_{i-1}}]^2,$$
with $\theta_i = \theta_i(n, \omega) \in \; ]t_{i-1}, t_i[$.

By the definition of the stochastic integral, we have
$$\sum_{i=1}^n \varphi'(B_{t_{i-1}}) (B_{t_i} - B_{t_{i-1}}) \to \int_0^t \varphi'(B_s) \, dB_s, \qquad \text{as } n \to \infty.$$
Now, let us put
$$U_n = \sum_{i=1}^n \varphi''(B_{\theta_i}) [B_{t_i} - B_{t_{i-1}}]^2, \qquad V_n = \sum_{i=1}^n \varphi''(B_{t_{i-1}}) [B_{t_i} - B_{t_{i-1}}]^2, \qquad W_n = \sum_{i=1}^n \varphi''(B_{t_{i-1}}) [t_i - t_{i-1}].$$
We have
$$E|U_n - V_n| \leq E\left(\sup_i \big|\varphi''(B_{\theta_i}) - \varphi''(B_{t_{i-1}})\big| \sum_{i=1}^n [B_{t_i} - B_{t_{i-1}}]^2\right) \leq \left[E\left(\sup_i \big|\varphi''(B_{\theta_i}) - \varphi''(B_{t_{i-1}})\big|^2\right)\right]^{1/2} \left[E\left(\Big(\sum_{i=1}^n [B_{t_i} - B_{t_{i-1}}]^2\Big)^2\right)\right]^{1/2}.$$
Therefore, by letting $n \to \infty$ and thanks to the Lebesgue dominated convergence theorem and Proposition 19, we have $E|U_n - V_n| \to 0$. We have
$$E|V_n - W_n|^2 = E\left[\sum_{i=1}^n \varphi''(B_{t_{i-1}}) \big([B_{t_i} - B_{t_{i-1}}]^2 - (t_i - t_{i-1})\big)\right]^2 = \sum_{i=1}^n E\left|\varphi''(B_{t_{i-1}}) \big([B_{t_i} - B_{t_{i-1}}]^2 - (t_i - t_{i-1})\big)\right|^2$$
$$\leq \sup(\varphi'')^2 \sum_{i=1}^n E\left|[B_{t_i} - B_{t_{i-1}}]^2 - (t_i - t_{i-1})\right|^2 = 2 \sup(\varphi'')^2 \sum_{i=1}^n (t_i - t_{i-1})^2.$$
Therefore, by letting $n \to \infty$, $E|V_n - W_n|^2 \to 0$. Since
$$W_n = \sum_{i=1}^n \varphi''(B_{t_{i-1}}) [t_i - t_{i-1}] \to \int_0^t \varphi''(B_s) \, ds,$$
we conclude.
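A numerical check of Theorem 42 (added to these notes; the choice $\varphi = \cos$, which belongs to $C^2_b$, and the grid are arbitrary): the right-hand side of Itô's formula, with the stochastic integral approximated by left-point sums, reproduces $\varphi(B_t)$ up to discretization error.

```python
import numpy as np

rng = np.random.default_rng(5)
t, n = 1.0, 200000
ds = t / n
dB = rng.normal(0.0, np.sqrt(ds), size=n)
B = np.concatenate(([0.0], np.cumsum(dB)))

phi = np.cos
dphi = lambda x: -np.sin(x)       # phi'
d2phi = lambda x: -np.cos(x)      # phi''

rhs = (phi(0.0)
       + np.sum(dphi(B[:-1]) * dB)           # int_0^t phi'(B_s) dB_s  (left-point sums)
       + 0.5 * np.sum(d2phi(B[:-1])) * ds)   # (1/2) int_0^t phi''(B_s) ds

print(rhs, phi(B[-1]))   # the two values agree up to discretization error
```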
Exercise 43 For every progressively measurable process $\phi$ such that $\int_0^t \phi_s^2 \, ds < \infty$, $t \geq 0$, we put
$$Z_t = \exp\left(\int_0^t \phi_s \, dB_s - \frac{1}{2} \int_0^t \phi_s^2 \, ds\right).$$
Prove that the stochastic process $Z$ is a martingale.
Exercise 44 Let $X = (X_t)_{t \geq 0}$ be an Itô process, that is,
$$X_t = X_0 + \int_0^t \sigma_s \, dB_s + \int_0^t b_s \, ds,$$
where $\sigma, b \in \mathcal{M}^2$ and $X_0$ is a square integrable, $\mathcal{F}_0$-measurable random variable. Prove that for every $\varphi \in C^2_b$,
$$\varphi(X_t) = \varphi(X_0) + \int_0^t \varphi'(X_s) \sigma_s \, dB_s + \int_0^t \varphi'(X_s) b_s \, ds + \frac{1}{2} \int_0^t \varphi''(X_s) \sigma_s^2 \, ds. \qquad (3.1)$$
Formula (3.1) may be written in differential form as
$$d\varphi(X_t) = \varphi'(X_t) \, dX_t + \frac{1}{2} \varphi''(X_t) \, d\langle X \rangle_t,$$
where
$$\langle X \rangle_t = \int_0^t \sigma_s^2 \, ds.$$