
Probability 2 (Wahrscheinlichkeitstheorie)
Prof. Jean Bertoin
HS 2013
Sheet 1
Throughout the exercises we work on a probability space (Ω, F , P). All random variables and
functions are assumed to be real-valued.
Exercise 1. The three questions are independent.
(i) Let X and Y be i.i.d. Bernoulli random variables: P(X = 1) = 1 − P(X = 0) = p
for some p ∈ [0, 1]. We set Z := 1{X+Y =0} . Compute E[X|Z] and E[Y |Z]. Are these random
variables independent?
(ii) Let G ⊂ F be a sub-σ-algebra and A ∈ F . We consider B := {E[1A |G ] = 0} ∈ G . Show that
B ⊂ Ω \ A almost surely.
(iii) Let X be a square-integrable random variable and G ⊂ F a sub-σ-algebra. We define the
conditional variance
Var(X|G ) := E[(X − E[X|G ])2 |G ].
Show the following identity:
Var(X) = E[Var(X|G )] + Var(E[X|G ]).
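Parts (i) and (iii) can be sanity-checked numerically. The sketch below is not part of the sheet: it takes G = σ(Z), an illustrative value p = 0.3, and all variable names are free choices. On {Z = 1} one necessarily has X = 0, and a short computation gives E[X | Z = 0] = 1/(2 − p).

```python
import random

# Monte Carlo sanity check for Exercise 1 (i) and (iii), with G = sigma(Z).
# The parameter p = 0.3 and the sample size are illustrative choices.
random.seed(0)
p, n = 0.3, 200_000
xs = [1 if random.random() < p else 0 for _ in range(n)]
ys = [1 if random.random() < p else 0 for _ in range(n)]
zs = [1 if x + y == 0 else 0 for x, y in zip(xs, ys)]

def cond_mean(vals, conds, c):
    """Empirical mean of vals on the event {Z = c}."""
    sel = [v for v, z in zip(vals, conds) if z == c]
    return sum(sel) / len(sel)

# E[X|Z] is constant on each of {Z = 0} and {Z = 1}:
# on {Z = 1} we have X = 0, while E[X | Z = 0] = p / (1 - (1-p)^2) = 1/(2-p).
print(cond_mean(xs, zs, 1))   # ~ 0.0
print(cond_mean(xs, zs, 0))   # ~ 1/(2-p) ~ 0.588

# Empirical check of Var(X) = E[Var(X|G)] + Var(E[X|G]): the law of total
# variance also holds exactly for the empirical distribution of the sample.
p_hat = sum(xs) / n
var_x = sum((x - p_hat) ** 2 for x in xs) / n
total = 0.0
for c in (0, 1):
    prob_c = sum(1 for z in zs if z == c) / n
    m_c = cond_mean(xs, zs, c)
    v_c = cond_mean([(x - m_c) ** 2 for x in xs], zs, c)
    total += prob_c * (v_c + (m_c - p_hat) ** 2)
print(abs(total - var_x))     # ~ 0.0 (exact up to floating-point error)
```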
Exercise 2. Let X and Y be two integrable random variables.
(i) Assume that X = Y almost surely. Prove that, almost surely, E[X|Y ] = Y and E[Y |X] = X.
(ii) Conversely, suppose that X and Y are such that E[X|Y ] = Y and E[Y |X] = X almost surely.
Prove that X = Y almost surely.
Hint: You can consider E[(X − Y )1{X>c,Y ≤c} ] + E[(X − Y )1{X≤c,Y ≤c} ].
(Recall that for any x, y ∈ R, x > y is equivalent to ∃c ∈ Q such that x > c ≥ y.)
Exercise 3.
(i) Let X, Y, Z be three random variables such that (X, Z) and (Y, Z) are identically distributed.
We denote by µ the distribution of X.
a) Prove that X and Y are identically distributed.
b) Prove that for any measurable function f that is non-negative or such that f (X) is integrable,
E[f (X)|Z] = E[f (Y )|Z] almost surely.
c) Let g be a measurable function that is non-negative or such that g(Z) is integrable. Set
h1 (X) := E[g(Z)|X] and h2 (Y ) := E[g(Z)|Y ].
Show that h1 = h2 µ-almost surely.
(ii) Let T1 , T2 , . . . , Tn be i.i.d. integrable random variables and set Sn = T1 + . . . + Tn .
a) Prove that for any j ∈ {1, . . . , n}, E[Tj |Sn ] = Sn /n almost surely.
b) Compute E[Sn |Tj ] for any j ∈ {1, . . . , n}.
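The identity in (ii) a) can be checked numerically in the Bernoulli case, where E[T1 | Sn = k] is explicitly k/n. The sketch below (not part of the sheet; all parameter values are illustrative) groups samples by the value of Sn and compares the empirical mean of T1 on each group with k/n.

```python
import random

# Numerical check of E[T1 | Sn] = Sn/n for n = 4 i.i.d. Bernoulli(0.4)
# variables; the parameters are illustrative choices.
random.seed(1)
n, p, trials = 4, 0.4, 200_000
by_sum = {}   # observed value k of Sn  ->  list of the corresponding T1 values
for _ in range(trials):
    ts = [1 if random.random() < p else 0 for _ in range(n)]
    by_sum.setdefault(sum(ts), []).append(ts[0])
for k in sorted(by_sum):
    emp = sum(by_sum[k]) / len(by_sum[k])
    print(k, emp, k / n)   # emp should be close to k/n
```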
Exercise 4. Let (X, Y ) be a Gaussian vector in R2 with mean (a, b) and covariance matrix D. Suppose
also that Var(Y ) > 0.
(i) Find α, β ∈ R (depending on a, b and D) such that the following holds:
E[X] = E[α + βY ] and E[XY ] = E[(α + βY )Y ].
Using these α and β, we define the function ℓ : R → R, s ↦ α + βs. We then set Z := X − ℓ(Y ).
(ii) Show that (Z, Y ) is a Gaussian vector and then that Z and Y are independent.
(iii) Show that E[X|Y ] = ℓ(Y ) almost surely.
(iv) Compute Var(Z) in terms of D.
(v) In what follows, we set σ := √Var(Z) and, for any a ∈ R, we denote by µa the Gaussian N (a, σ²)
distribution.
Let f : R → R be a measurable and bounded function. For any a ∈ R, we set
ψ(a) = ∫_R f dµa .
a) Show that ψ is continuous when σ > 0.
b) Show that E[f (X)|Y ] = ψ(ℓ(Y )) almost surely.
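Parts (i)–(iv) admit a quick numerical sanity check. The sketch below is not part of the sheet: the means and the covariance structure are illustrative assumptions, chosen as (X, Y ) = (a + G1 + 0.5·G2, b + 2·G1) with G1, G2 independent standard Gaussians, so that Var(Y ) = 4, Cov(X, Y ) = 2, Var(X) = 1.25. Matching the two moment equations in (i) then gives β = Cov(X, Y )/Var(Y ) and α = a − βb.

```python
import random

# Numerical check for Exercise 4 (i)-(iv) under illustrative parameters:
# Y = b + 2*G1, X = a + G1 + 0.5*G2 with G1, G2 independent N(0,1),
# so Var(Y) = 4, Cov(X, Y) = 2, Var(X) = 1.25.
random.seed(2)
a, b, trials = 1.0, -0.5, 200_000
beta = 2.0 / 4.0        # Cov(X, Y) / Var(Y)
alpha = a - beta * b    # matches E[X] = E[alpha + beta*Y]
zs, ys = [], []
for _ in range(trials):
    g1, g2 = random.gauss(0, 1), random.gauss(0, 1)
    y = b + 2.0 * g1
    x = a + g1 + 0.5 * g2
    ys.append(y)
    zs.append(x - (alpha + beta * y))   # Z = X - l(Y)

def mean(v):
    return sum(v) / len(v)

mz, my = mean(zs), mean(ys)
cov_zy = mean([(z - mz) * (y - my) for z, y in zip(zs, ys)])
var_z = mean([(z - mz) ** 2 for z in zs])
print(cov_zy)   # ~ 0: Z and Y uncorrelated, hence independent (jointly Gaussian)
print(var_z)    # ~ Var(X) - Cov(X, Y)^2 / Var(Y) = 0.25
```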
Please drop your solutions, with the name of your teaching assistant on them, into his or her letter box
in Y27J by Tuesday, 24 September.