Initial enlargement of filtrations and entropy of
Poisson compensators
Stefan Ankirchner and Jakub Zwierz
“Enlargement of Filtrations and Applications to Finance and Insurance”
Jena, May 31 - June 4, 2010
Beyond Hypothesis (H’)
Continuous embedding of continuous martingales
Hypothesis (H’)
F = (Ft ) a filtration
G ⊃ F an enlargement
Hypothesis (H’):
every F-martingale is a G-semimartingale.
Question: Does the enlargement preserve integrability?
For example, for which $p, r \ge 1$ is every $\mathcal{H}^p(F)$-martingale an $\mathcal{S}^r(G)$-semimartingale?
$\mathcal{S}^p$ norms for semimartingales
Recall the definition of the $\|\cdot\|_{\mathcal{S}^p}$ norm:

Definition
X a special F-semimartingale with canonical F-decomposition $X = M + A$. Let $1 \le p < \infty$. Then
\[
\|X\|_{\mathcal{S}^p(F)} := \Big\| [M,M]_\infty^{1/2} + \int_0^\infty |dA_s| \Big\|_{L^p}.
\]
$\mathcal{S}^p(F)$ is the set of all F-semimartingales with $\|X\|_{\mathcal{S}^p(F)} < \infty$.
$\mathcal{H}^p(F)$ is the set of all F-martingales with $\|X\|_{\mathcal{S}^p(F)} < \infty$.
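A quick worked example (my illustration, not from the slides): let $M_t = N_{t\wedge T} - (t\wedge T)$ be a compensated standard Poisson process stopped at $T$. All jumps have size one, so $[M,M]_\infty = N_T$, and $A \equiv 0$ in the canonical decomposition. Hence
\[
\|M\|_{\mathcal{S}^p(F)} = \big\| N_T^{1/2} \big\|_{L^p} = \big( E[N_T^{p/2}] \big)^{1/p} < \infty,
\]
so $M \in \mathcal{H}^p(F)$ for every $p \ge 1$, since a Poisson random variable has moments of all orders.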
Hypothesis $(H'_{p,r})$

Suppose (H') is satisfied. If $M$ is an F-martingale and $\psi$ is G-predictable and bounded, then $(\psi \cdot M)$ is defined with respect to G.

Hypothesis $(H'_{p,r})$: there exists a constant $C_{p,r}$ such that for every F-martingale $M$ and every bounded G-predictable $\psi$,
\[
\|\psi \cdot M\|_{\mathcal{S}^r(G)} \le C_{p,r}\, E\Big[ \Big( \int_0^\infty \psi_s^2 \, d\langle M, M\rangle_s \Big)^{p/2} \Big]^{1/p}.
\]
Initial enlargements by finite partitions
let F be a filtration such that every F-martingale is continuous,
let $A_1, A_2, \ldots$ be a countable measurable partition,
let $G = (\mathcal{G}_t)$ be the initial enlargement
\[
\mathcal{G}_t = \bigcap_{s>t} \big( \mathcal{F}_s \vee \sigma(A_1, A_2, \ldots) \big).
\]
Theorem (Yor 1985, LNM 1118)
Let $p, \gamma > 0$ and $r \ge 1$ be such that $\frac{1}{r} = \frac{1}{p} + \frac{1}{2\gamma}$. Then Hypothesis $(H'_{p,r})$ is satisfied iff
\[
\sum_{n \ge 1} P(A_n) \Big( \log \frac{1}{P(A_n)} \Big)^{\gamma} < \infty.
\]
Special case $\gamma = 1$: then
\[
\sum_{n \ge 1} P(A_n) \log \frac{1}{P(A_n)}
\]
is the absolute entropy of the partition $A_1, A_2, \ldots$
Corollary
If the partition's absolute entropy is finite, then the embedding $\mathcal{H}^2(F) \to \mathcal{S}^1(G)$, $X \mapsto X$, is continuous.
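As a numerical illustration (the geometric partition $P(A_n) = 2^{-n}$ is a made-up example, not from the slides), the sums appearing in Yor's criterion are easy to evaluate; for this partition they are finite for every $\gamma$:

```python
import math

# Entropy-type sums from Yor's criterion, for the illustrative geometric
# partition P(A_n) = 2^{-n} (a made-up example, not from the slides).
def entropy_sum(probs, gamma=1.0):
    """Compute sum_n P(A_n) * (log(1/P(A_n)))**gamma over the given probabilities."""
    return sum(p * math.log(1.0 / p) ** gamma for p in probs if p > 0)

probs = [2.0 ** -n for n in range(1, 200)]   # truncation of the full series
print(entropy_sum(probs, gamma=1.0))  # absolute entropy: 2*log(2) ~ 1.3863
print(entropy_sum(probs, gamma=2.0))  # still finite, so the criterion also holds for gamma = 2
```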
Arbitrary initial enlargements
let F be a filtration such that every F-martingale is continuous,
let $G = (\mathcal{G}_t)$ be an arbitrary initial enlargement by a random variable $G$:
\[
\mathcal{G}_t = \bigcap_{s>t} \big( \mathcal{F}_s \vee \sigma(G) \big).
\]
Theorem (A., Dereich, Imkeller, Sém. de Prob. XL, 2007)
Let $p, \gamma > 0$ and $r \ge 1$ be such that $\frac{1}{r} = \frac{1}{p} + \frac{1}{2\gamma}$. Then Hypothesis $(H'_{p,r})$ is satisfied iff the generalized mutual information satisfies $I^p(\sigma(G), \mathcal{F}_\infty) < \infty$.
Information drifts
Definition
A G-predictable process $\alpha$ is called a G-information drift of an F-martingale $M$ if
\[
M_t - \int_0^t \alpha_s \, d\langle M, M\rangle_s
\]
is a G-local martingale.
Theorem (A., Dereich, Imkeller, ’07)
If $I(\sigma(G), \mathcal{F}_\infty) < \infty$, then every F-martingale has a square-integrable G-information drift.
Poisson random measures and filtrations
Information drift in terms of compensators
Problem description
So far: every F-martingale was supposed to be continuous.
Questions:
When is integrability of semimartingales with jumps preserved under enlargements?
When do we have $(H'_{p,r})$ for jump semimartingales?
Does finite entropy guarantee integrable information drifts?
Does finite entropy imply that every jump semimartingale $M$ in $\mathcal{H}^2(F)$ belongs to $\mathcal{S}^1(G)$?
To simplify the analysis we make the following restrictions:
initial enlargements
pure jump martingales driven by a compensated Poisson random measure
Poisson random measures
jump space: a standard Borel space $(E, \mathcal{E})$
jump measure: $\nu$ (assumed to be $\sigma$-finite)
Let $\mu$ be a homogeneous Poisson random measure on $(\mathbb{R}_+ \times E, \lambda \otimes \nu)$. In particular (see the simulation sketch below),
a) if $\lambda \otimes \nu(A) < \infty$, then $\mu(\omega; A)$ is Poisson distributed with intensity $\lambda \otimes \nu(A)$,
b) if $A \cap B = \emptyset$, then $\mu(\omega; A)$ and $\mu(\omega; B)$ are independent.
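A minimal simulation sketch of this construction, on the toy jump space $E = [0,1]$ with $\nu$ = Lebesgue measure (my choice, not from the slides); property a) is checked empirically on one rectangle:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_poisson_measure(T, n_samples):
    """Sample atoms of a homogeneous Poisson random measure on the toy
    space [0, T] x [0, 1], with lambda = nu = Lebesgue measure."""
    samples = []
    for _ in range(n_samples):
        n = rng.poisson(T)                    # total mass (lambda x nu)([0,T]x[0,1]) = T
        times = rng.uniform(0.0, T, size=n)   # given the count, atoms are i.i.d. uniform
        marks = rng.uniform(0.0, 1.0, size=n)
        samples.append((times, marks))
    return samples

# Property a): mu(A) is Poisson with intensity (lambda x nu)(A).
# For A = [0, 2] x [0, 0.5] the intensity is 2 * 0.5 = 1, so the mean and
# the variance of the counts should both be close to 1.
counts = [np.sum((t <= 2.0) & (z <= 0.5))
          for t, z in sample_poisson_measure(T=4.0, n_samples=10_000)]
print(np.mean(counts), np.var(counts))  # ~1.0 and ~1.0
```

Property b) can be checked the same way, by verifying that counts on disjoint rectangles are uncorrelated.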
Poisson compensators depend on the filtration
Lemma and Definition
F = the filtration generated by $\mu$
H ⊃ F an enlargement
$\mathcal{P}(H)$ = the predictable $\sigma$-field on $\Omega \times \mathbb{R}_+$ associated with H

There exists a unique predictable random measure $\pi^H$, called the compensator of $\mu$ relative to H, such that
\[
E\Big( \int_0^\infty \int_E \phi(s,z)\, \mu(ds,dz) \Big)
= E\Big( \int_0^\infty \int_E \phi(s,z)\, \pi^H(ds,dz) \Big)
\]
for every nonnegative $\mathcal{P}(H) \otimes \mathcal{E}$-measurable function $\phi$.
Remark.
Compensators exist for all enlargements of F!
$\pi^F = \lambda \otimes \nu$.
Let $\mu^H := \mu - \pi^H$. If $\psi$ is square integrable, then
\[
(\psi * \mu^H)_t = \int_0^t \int_E \psi(s,z)\, \mu^H(ds,dz)
\]
is a square-integrable H-martingale.
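A small Monte Carlo check, on the same toy jump space as above, that $(\psi * \mu^F)_T$ is centred and satisfies the $L^2$ isometry $E[(\psi * \mu^F)_T^2] = \int \psi^2\, d\pi^F$, for the hypothetical choice $\psi(s,z) = z$:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5.0
N_SAMPLES = 100_000

# Monte Carlo samples of (psi * mu^F)_T for psi(s, z) = z on the toy space
# E = [0, 1] with nu = Lebesgue. The compensator contribution is
# int_0^T int_0^1 z dz ds = T / 2.
samples = np.empty(N_SAMPLES)
for i in range(N_SAMPLES):
    n = rng.poisson(T)                     # mu([0, T] x [0, 1]) ~ Poisson(T)
    marks = rng.uniform(0.0, 1.0, size=n)
    samples[i] = marks.sum() - T / 2.0     # sum of psi over atoms minus compensator

print(samples.mean())        # ~0: the compensated integral is centred
print(samples.var(), T / 3)  # ~T/3 = int psi^2 d(pi^F): the L^2 isometry
```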
Information drifts of pure jump martingales
F = $(\mathcal{F}_t)$ a filtration
G ⊃ F an enlargement

Definition
A G-predictable process $\alpha$ is called a G-information drift of an F-martingale $M$ if
\[
M_t - \int_0^t \alpha_s \, ds
\]
is a G-local martingale.
Linking information drifts with compensators
let $M_t = \int_0^t \int_E \psi(s,z)\, \mu^F(ds,dz)$ be an F-martingale
G ⊃ F an enlargement
$\alpha$ = G-information drift of $M$
Notice that
\[
M_t - \int_0^t \alpha_s\, ds
= (\psi * \mu^F)_t - \int_0^t \alpha_s\, ds
= (\psi * \mu^G)_t + \underbrace{(\psi * (\pi^G - \pi^F))_t - \int_0^t \alpha_s\, ds}_{G\text{-predictable}}
\]
\[
\implies \quad \alpha_s = \int_E \psi(s,z)\, (\pi^G - \pi^F)(ds,dz)
= \int_E \psi(s,z) \Big( \frac{d\pi^G}{d\pi^F} - 1 \Big)\, \pi^F(ds,dz).
\]
The information drift, provided it exists, satisfies
\[
\alpha_s = \int_E \psi(s,z) \Big( \frac{d\pi^G}{d\pi^F} - 1 \Big)\, \pi^F(ds,dz).
\]
Question: Is the Lebesgue integral defined P-a.s.?
a necessary condition: $\pi^G \times P \ll \pi^F \times P$ on $\mathcal{P}(G) \otimes \mathcal{E}$
a sufficient condition: $\psi \in L^2(\pi^F \times P) \cap L^1(\pi^F \times P)$
Entropy and mutual information
Poisson compensators of initial enlargements
Mutual information
Definition
let $\mathcal{A}, \mathcal{B}$ be two sub-$\sigma$-algebras of $\mathcal{F}$
$P(\cdot|\mathcal{B})$ = regular conditional probability of $P$ with respect to $\mathcal{B}$
The mutual information between $\mathcal{A}$ and $\mathcal{B}$ is defined as
\[
I(\mathcal{A} \| \mathcal{B}) = E\, H_{\mathcal{A}}(P(\cdot|\mathcal{B}) \| P),
\]
where $H_{\mathcal{A}}(P(\cdot|\mathcal{B}) \| P)$ is the relative entropy of $P(\cdot|\mathcal{B})$ with respect to $P$ on $\mathcal{A}$. This means: if $P(\cdot|\mathcal{B}) \ll P$ on $\mathcal{A}$, then
\[
I(\mathcal{A} \| \mathcal{B}) = \int \int_{\mathcal{A}} \log \frac{P(d\omega'|\mathcal{B})(\omega)}{P(d\omega')}\, P(d\omega'|\mathcal{B})(\omega)\, P(d\omega).
\]
Mutual information between random variables
Lemma
Let $G, H$ be two random variables, and denote by $P_G$ and $P_H$ their distributions. Moreover let $P_{G,H}$ be the joint distribution of $G$ and $H$. Setting $\mathcal{A} = \sigma(G)$ and $\mathcal{B} = \sigma(H)$, we have:
\[
I(\mathcal{A} \| \mathcal{B}) = H(P_{G,H} \| P_G \otimes P_H).
\]
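For discrete random variables the lemma reduces mutual information to a finite sum, $I = \sum_{g,h} p(g,h) \log \frac{p(g,h)}{p(g)\,p(h)}$. A minimal sketch (the joint distributions below are made-up examples):

```python
import numpy as np

def mutual_information(joint):
    """I(sigma(G) || sigma(H)) = H(P_{G,H} || P_G x P_H) for discrete G, H,
    where `joint` is the joint distribution matrix (rows: G, columns: H)."""
    pg = joint.sum(axis=1, keepdims=True)   # marginal of G
    ph = joint.sum(axis=0, keepdims=True)   # marginal of H
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (pg * ph)[mask])))

# Independent variables: mutual information 0.
print(mutual_information(np.outer([0.3, 0.7], [0.5, 0.5])))   # 0.0
# G = H on a three-atom partition: I equals the absolute entropy.
print(mutual_information(np.diag([0.2, 0.3, 0.5])))           # ~1.0297
print(-sum(p * np.log(p) for p in (0.2, 0.3, 0.5)))           # same value
```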
Properties of mutual information
Properties:
1) $I(\mathcal{A}, \mathcal{B}) = I(\mathcal{B}, \mathcal{A})$,
2) $\mathcal{A}, \mathcal{B}$ independent $\implies I(\mathcal{A}, \mathcal{B}) = 0$,
3) if $\mathcal{C} \subset \mathcal{B}$, then $I(\mathcal{A}, \mathcal{C}) \le I(\mathcal{A}, \mathcal{B})$,
4) if $X$ and $Y$ are jointly Gaussian, then
\[
I(\sigma(X), \sigma(Y)) = -\tfrac{1}{2} \log\big( 1 - \operatorname{corr}(X,Y)^2 \big)
\]
(checked numerically after this list),
5) if $\mathcal{A}$ is generated by a countable partition $\{A_1, A_2, \ldots\}$, then $I(\mathcal{A}, \mathcal{A})$ is the absolute entropy of $\mathcal{A}$, i.e.
\[
I(\mathcal{A}, \mathcal{A}) = -\sum_{i \ge 1} P(A_i) \log P(A_i).
\]
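Here is the numerical check promised in property 4. It uses the standard identity $I = h(X) + h(Y) - h(X,Y)$ for differential entropies of jointly Gaussian variables; the variances and the correlation are arbitrary test values:

```python
import numpy as np

# Property 4 via the differential-entropy identity I = h(X) + h(Y) - h(X, Y),
# with h(X) = 0.5*log(2*pi*e*Var X) and h(X, Y) = 0.5*log((2*pi*e)^2 det(Sigma)).
sx, sy, rho = 2.0, 0.5, -0.8
cov = np.array([[sx**2, rho * sx * sy],
                [rho * sx * sy, sy**2]])
h = lambda var: 0.5 * np.log(2 * np.pi * np.e * var)
h_joint = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(cov))
print(h(sx**2) + h(sy**2) - h_joint)   # 0.5108...
print(-0.5 * np.log(1 - rho**2))       # same: -0.5*log(1 - corr^2)
```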
Initial enlargement of Poisson filtrations
Back to Poisson random measures:
$F = (\mathcal{F}_t)$ = filtration generated by $\mu(dt,dz)$
$\mathcal{F}_\infty = \bigvee_t \mathcal{F}_t$
$G = (\mathcal{G}_t)$ = initial enlargement by a random variable $G$:
\[
\mathcal{G}_t = \bigcap_{s>t} \mathcal{F}_s \vee \sigma(G)
\]
Question: What does $I(\sigma(G) \| \mathcal{F}_\infty)$ tell us about $\pi^G(dt,dz)$, or about the information density
\[
\frac{d\pi^G}{d\pi^F} - 1\,?
\]
Mutual information and entropy of compensators
Theorem
(i) If $T \in \mathbb{R}_+$ and $\nu(E) < \infty$, then
\[
H_{\mathcal{P}_T(G) \times \mathcal{E}}(\pi^G \times P \,\|\, \pi^F \times P) = I(\sigma(G) \| \mathcal{F}_T).
\]
(ii) Let $T_n \in \mathbb{R}_+$ with $T_n \uparrow \infty$, and let $E_1, E_2, \ldots$ be an increasing sequence of sets in $\mathcal{E}$ with $E_n \uparrow E$ and $\nu(E_n) < \infty$. Then we have
\[
\sup_n H_{\mathcal{P}_{T_n}(G) \times (\mathcal{E} \cap E_n)}(\pi^G \times P \,\|\, \pi^F \times P) = I(\sigma(G) \| \mathcal{F}_\infty).
\]
Convex conjugate of the information density
The G-information drift of an F-martingale $M = (\psi * \mu^F)$:
\[
\alpha_s = \int_E \psi(s,z) \Big( \frac{d\pi^G}{d\pi^F} - 1 \Big)\, \pi^F(ds,dz)
\]
If $I(\sigma(G) \| \mathcal{F}_\infty)$ is finite, then we have convex integrability of $\frac{d\pi^G}{d\pi^F} - 1$ with respect to
\[
f(x) = (x+1)\log(x+1) - x.
\]
Namely,
\[
E \int f\Big( \frac{d\pi^G}{d\pi^F} - 1 \Big)\, \pi^F(ds,dz)
= E \int \frac{d\pi^G}{d\pi^F} \log \frac{d\pi^G}{d\pi^F}\, \pi^F(ds,dz)
= I(\sigma(G) \| \mathcal{F}_\infty)
\]
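To spell out the step behind the first equality (a sketch; localization issues are ignored): write $u = \frac{d\pi^G}{d\pi^F}$. Then
\[
f(u - 1) = u \log u - (u - 1),
\]
and integrating against $\pi^F$, the term $u \log u$ gives $E \int \log u \, d\pi^G$ by the definition of the density, while
\[
E \int (u - 1)\, d\pi^F = E\big[ \pi^G(A) - \pi^F(A) \big] = 0
\]
on every set $A$ of finite measure, since both compensators give $\mu$ the same expected mass.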
Convex conjugates
Recall the definition of the convex conjugate or Legendre transform of a convex function $f : \mathbb{R} \to \mathbb{R} \cup \{+\infty\}$:
\[
f^* : \mathbb{R} \to \mathbb{R} \cup \{+\infty\}, \qquad f^*(y) := \sup_{x \in \mathbb{R}} \big( xy - f(x) \big).
\]
Young's inequality:
\[
xy \le f(x) + f^*(y).
\]
The convex conjugate of
\[
f(x) = (x+1)\log(x+1) - x, \qquad x > -1,
\]
is given by
\[
f^*(y) = e^y - y - 1.
\]
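The conjugate can be verified from the first-order condition in the definition of $f^*$:
\[
\frac{d}{dx}\big( xy - f(x) \big) = y - \log(x+1) = 0 \iff x = e^y - 1,
\]
and substituting the maximizer back,
\[
f^*(y) = (e^y - 1)\,y - \big( y e^y - (e^y - 1) \big) = e^y - y - 1.
\]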
Young's inequality implies
\[
xy \le e^y - y - 1 + (x+1)\log(x+1) - x.
\]

Proposition
Let $M = (\psi * \mu^F)$ be an F-local martingale, and suppose
\[
\int f^*(|\psi|)\, d(\pi^F \times P) < \infty.
\]
If $I(\sigma(G) \| \mathcal{F}_\infty)$ is finite, then $M$ possesses an integrable G-information drift, given by
\[
\alpha_s = \int_E \psi(s,z) \Big( \frac{d\pi^G}{d\pi^F} - 1 \Big)\, \pi^F(ds,dz).
\]
Finite entropy implies integrable information drifts
Theorem
Let $G = (\mathcal{G}_t)$, where $\mathcal{G}_t = \bigcap_{s>t} \mathcal{F}_s \vee \sigma(G)$.
If $I(\sigma(G) \| \mathcal{F}_\infty) < \infty$, then
(i) every $X \in \mathcal{H}^2(F)$ possesses an integrable G-information drift,
(ii) $\mathcal{H}^2(F) \subset \mathcal{S}^1(G)$.
Example I
Let $\psi \in L^1(\pi^F)$ be a deterministic positive function and define
\[
G = \int_0^\infty \int_E \psi(s,z)\, \mu(ds,dz).
\]
Then $G$ is Poisson distributed with parameter $\lambda = \int_0^\infty \int_E \psi(s,z)\, \pi^F(ds,dz) \in \mathbb{R}_+$. The mutual information between $G$ and $\mathcal{F}_\infty$ is given by
\[
I(\sigma(G) \| \mathcal{F}_\infty) = -\sum_{i=0}^{\infty} e^{-\lambda} \frac{\lambda^i}{i!} \log\Big( e^{-\lambda} \frac{\lambda^i}{i!} \Big),
\]
which is easily shown to be finite. Consequently every square-integrable F-martingale has a G-information drift and belongs to $\mathcal{S}^1(G)$, where the filtration $G = (\mathcal{G}_t)$ is defined by $\mathcal{G}_t = \bigcap_{s>t} \mathcal{F}_s \vee \sigma(G)$.
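The series is just the Shannon entropy of a Poisson distribution, and its finiteness is easy to confirm numerically; a minimal sketch (the value $\lambda = 2.5$ is an arbitrary choice):

```python
import math

def poisson_entropy(lam, tol=1e-15):
    """-sum_i p_i log p_i for p_i = exp(-lam) lam^i / i!, truncated once
    the terms past the mode are negligible."""
    total, i, log_p = 0.0, 0, -lam            # log p_0 = -lam
    while True:
        p = math.exp(log_p)
        total -= p * log_p
        i += 1
        log_p += math.log(lam) - math.log(i)  # log p_i = log p_{i-1} + log(lam / i)
        if i > lam and p < tol:
            return total

print(poisson_entropy(2.5))   # finite (roughly 1.9 here), as claimed
```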
Example II
Let $\tau$ be the first jump time of a standard Poisson process $N = (N_t)_{t \ge 0}$. Notice that $I(\sigma(\tau), \mathcal{F}_\infty) = \infty$: $\tau$ is $\mathcal{F}_\infty$-measurable, and the mutual information of a random variable with itself is infinite whenever its distribution is absolutely continuous with respect to Lebesgue measure.

Add some noise: let $X$ be normally distributed and independent of $\mathcal{F}_\infty$. Let
\[
\hat\tau = \tau + X,
\]
and consider the enlargement
\[
\mathcal{G}_t = \bigcap_{s>t} \mathcal{F}_s \vee \sigma(\hat\tau).
\]
Example II cont’d
Since $X$ is normally distributed, the mutual information satisfies
\[
I(\sigma(\hat\tau) \| \mathcal{F}_\infty) \le \frac{1}{2} \log \frac{\operatorname{Var}(\tau) + \operatorname{Var}(X)}{\operatorname{Var}(X)} < \infty.
\]
Notice that the variance $\operatorname{Var}(\tau)$ is defined since $\tau$ is exponentially distributed.
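Numerically, $\operatorname{Var}(\tau) = 1$ since $\tau$ is Exp(1), so the bound reads $\frac12 \log \frac{1+\sigma^2}{\sigma^2}$ for a noise level $\sigma = \sqrt{\operatorname{Var}(X)}$ (the values below are arbitrary):

```python
import math

# Var(tau) = 1 because tau is Exp(1); sigma is a hypothetical standard
# deviation for the Gaussian noise X.
def mi_bound(sigma):
    return 0.5 * math.log((1.0 + sigma**2) / sigma**2)

for sigma in (0.1, 1.0, 10.0):
    print(sigma, mi_bound(sigma))
# The bound grows without limit as sigma -> 0, consistent with
# I(sigma(tau) || F_infty) = infinity in the noiseless case.
```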
References
S. Ankirchner and J. Zwierz: Initial enlargement of filtrations and entropy of Poisson compensators. Journal of Theoretical Probability, 2010. In press.
Thanks for your attention!