
Homework # 2
1. Let {Xij , 1 ≤ i ≤ n, 1 ≤ j ≤ m(i)} be a collection of independent random variables on
a probability space (Ω, F, IP ). Let fi : Rm(i) → R be measurable maps for i = 1, · · · n. Let
Ui = fi (Xi,1 , · · · , Xi,m(i) ). Show that the random variables Ui , i = 1, · · · n are independent.
2. Recall that for a metric space S, B(S) denotes the Borel σ field. Show that B(Rn ) = B(R)⊗n .
3. Give examples of the following.
(i) Two random variables X and Y on a measurable space (Ω, F) and two probability measures
P and Q on this space such that under P X, Y are independent while under Q they are
not.
(ii) A probability measure on (R2 , B(R)⊗2 ) that is not a product measure.
(iii) A probability measure on (R2 , B(R)⊗2 ) that is a product measure.
4. Let Ω = {1, 2, 3, 4}, F the collection of all subsets of Ω and P ({i}) = 0.25, i = 1, 2, 3, 4.
Give an example of two classes A1 , A2 that are independent but whose generated σ fields are
not.
5. Show that X1 , · · · Xn are independent if σ{X1 , · · · , Xk−1 } is independent of σ{Xk } for all
k = 2, 3, · · · n.
6. Let {Xi } be a sequence of independent random variables on some probability space
(Ω, F, IP ). Show that for any fixed n, σ(X1 , · · · Xn ) is independent of σ(Xn+1 , Xn+2 , · · · ).
Homework # 3
1. Let {Xn } be a sequence of random variables given on some probability space and let Sn = X1 + · · · + Xn . Suppose that IE(Xn^2 ) → 0 as n → ∞. Show that (Sn − IE(Sn ))/n converges to 0 in probability as n → ∞.
2. Let {Xn } be a sequence of uncorrelated random variables with IE(Xi ) = µi and Var(Xi )/i → 0 as i → ∞. Let Sn = X1 + · · · + Xn and νn = IE(Sn )/n. Show that, as n → ∞, Sn /n − νn → 0 in L^2 and in probability.
3. Let r : IN0 → (−∞, ∞) be such that r(k) → 0 as k → ∞, where IN0 is the space of
nonnegative integers. Let {Xn } be a sequence of random variables such that IE(Xn ) = 0 and
IE(Xn Xm ) ≤ r(n − m) for m ≤ n. Show that (X1 + · · · Xn )/n → 0 in probability.
4. (Monte Carlo Integration.) Let f be a measurable function on [0, 1] with ∫_0^1 |f (x)| dx < ∞. Let U1 , U2 , · · · be independent and uniformly distributed on [0, 1]. Define In = n^{−1} (f (U1 ) + · · · + f (Un )). Show that In → I = ∫_0^1 f (x) dx in probability. Suppose next that ∫_0^1 |f (x)|^2 dx < ∞. Use Chebyshev's inequality to estimate IP (|In − I| > a/√n).
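The statement can be checked numerically. Below is a minimal Python sketch (not part of the assignment), taking f(x) = x^2 as an illustrative integrand, so I = 1/3 and Var(f(U1)) = 4/45; it exhibits both the convergence In → I and the n-independent Chebyshev bound IP(|In − I| > a/√n) ≤ Var(f(U1))/a^2.

```python
import random

def monte_carlo(f, n, rng):
    """Estimate I = integral of f over [0,1] by I_n = (f(U_1)+...+f(U_n))/n."""
    return sum(f(rng.random()) for _ in range(n)) / n

rng = random.Random(0)
f = lambda x: x * x          # illustrative integrand (an assumption); I = 1/3
I = 1.0 / 3.0

# I_n -> I in probability: the error shrinks as n grows.
errors = {n: abs(monte_carlo(f, n, rng) - I) for n in (100, 10_000)}

# With integral of f^2 finite, Chebyshev gives
#   IP(|I_n - I| > a / sqrt(n)) <= Var(f(U_1)) / a^2,
# a bound that does not depend on n; here Var(f(U_1)) = 1/5 - 1/9 = 4/45.
var_f = 1.0 / 5.0 - 1.0 / 9.0
chebyshev_bound = lambda a: var_f / a ** 2
```

The bound is crude but uniform in n, which is exactly what the √n scaling in the problem captures.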
5. Let {Xn } be an i.i.d. sequence with P (Xi > x) = e/(x log x) for x > e. Show that IE|Xi | = ∞ but there is a sequence of finite constants µn → ∞ so that Sn /n − µn → 0 in probability as n → ∞.
Homework # 4
Note: All random variables will be R-valued, unless explicitly stated otherwise.
1. Let Xn be any sequence of random variables. Show that the following are equivalent: (i) Xn → 0, a.s.; (ii) for every ε > 0, P (|Xn | > ε, i.o.) = 0. Also show that these equivalent properties are implied by the condition: (iii) for some sequence εn ↓ 0, P (|Xn | > εn , i.o.) = 0.
2. If Xn is any sequence of random variables, there are constants cn → ∞ so that Xn /cn → 0,
a.s.
3. (i) If P (An ) → 0 and Σ_{n=1}^∞ P (An ∩ A^c_{n+1} ) < ∞ then P (An , i.o.) = 0. (ii) Find an example of a sequence An to which the result in (i) can be applied but the Borel-Cantelli lemma cannot.
4. Let An be a sequence of independent events with P (An ) < 1 for all n. Show that P (∪_{n=1}^∞ An ) = 1 implies P (An i.o.) = 1.
5. Let X1 , X2 , · · · be independent. Show that sup_n Xn < ∞ a.s. if and only if Σ_n P (Xn > A) < ∞ for some A ∈ (0, ∞).
6. Let X1 , X2 , · · · be independent with P (Xn = 1) = pn and P (Xn = 0) = 1 − pn . Show that (i) Xn → 0 in probability if and only if pn → 0, and (ii) Xn → 0, a.s. if and only if Σ_n pn < ∞.
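A single-trajectory simulation (a sketch, not part of the assignment) makes the dichotomy visible: with pn = 1/n the events {Xn = 1} keep occurring (Borel-Cantelli II), while with pn = 1/n^2 only finitely many occur.

```python
import random

def count_ones(p, N, rng):
    """Count the indices n <= N with X_n = 1 along one trajectory,
    where X_n ~ Bernoulli(p(n)) independently."""
    return sum(1 for n in range(1, N + 1) if rng.random() < p(n))

rng = random.Random(1)
N = 100_000
ones_slow = count_ones(lambda n: 1.0 / n, N, rng)       # sum p_n = infinity
ones_fast = count_ones(lambda n: 1.0 / n ** 2, N, rng)  # sum p_n < infinity
# Expected counts: harmonic sum ~ log N + gamma ~ 12.1  versus  pi^2/6 ~ 1.64.
```

Of course no finite run proves an "infinitely often" statement; the simulation only illustrates the contrast in growth of the two counts.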
Homework # 5
1. Let Xn ≥ 0 be independent for n ≥ 1. Show that the following are equivalent.
(i) Σ_{n=1}^∞ Xn < ∞ a.s.
(ii) Σ_{n=1}^∞ (IP (Xn > 1) + IE(Xn 1_{Xn ≤ 1} )) < ∞.
(iii) Σ_{n=1}^∞ IE(Xn /(1 + Xn )) < ∞.
2. Let ψ(x) = x^2 when |x| ≤ 1 and ψ(x) = |x| when |x| ≥ 1. Show that if X1 , X2 , · · · are independent with IE(Xn ) = 0 and Σ_{n=1}^∞ IE(ψ(Xn )) < ∞ then Σ_{n=1}^∞ Xn converges a.s.
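As an illustration (an assumed example, not part of the problem), take Xn = εn /n with i.i.d. fair signs εn = ±1: then IE(Xn ) = 0 and ψ(Xn ) = 1/n^2 is summable, so the series converges a.s. A quick simulation shows the partial sums settling down:

```python
import random

def partial_sums(N, rng):
    """Partial sums of sum_n eps_n / n with i.i.d. fair signs eps_n = +/-1."""
    s, out = 0.0, []
    for n in range(1, N + 1):
        s += rng.choice((-1.0, 1.0)) / n
        out.append(s)
    return out

rng = random.Random(2)
S = partial_sums(200_000, rng)
# The tail increment S_N - S_{N/2} has variance sum_{N/2 < n <= N} 1/n^2 ~ 1/N,
# so late partial sums barely move: numerically the series has converged.
tail_move = abs(S[-1] - S[len(S) // 2])
```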
3. Let X1 , X2 , · · · be independent and let Smn = Xm+1 + · · · Xn for m < n and define Snn = 0.
Show that
IP ( max_{m<j≤n} |Smj | > 2a) · min_{m<k≤n} IP (|Skn | ≤ a) ≤ IP (|Smn | > a).    (1)
4. Use (1) to show that if X1 , X2 , · · · are independent and Sn = X1 + · · · + Xn , then if Sn converges in probability, Sn converges a.s.
5. Let X1 , X2 , · · · be i.i.d. and Sn = X1 + · · · Xn . Use (1) to show that if Sn /n → 0 in
probability then (max1≤m≤n Sm )/n → 0 in probability as well.
Homework # 8
1. Suppose that X and Y are independent. Let φ be a measurable function with IE|φ(X, Y )| <
∞. Let g(x) = IE(φ(x, Y )). Show that IE(φ(X, Y ) | X) = g(X).
2. Prove the "conditional" Cauchy-Schwarz inequality: Let X, Y be square integrable random variables. Show that
(IE(XY | G))^2 ≤ IE(X^2 | G) IE(Y^2 | G), a.s.
3. Prove the "conditional" Chebyshev inequality: Let X be a square integrable random variable. Show that for all a ∈ (0, ∞)
IP (|X| > a | G) ≤ (1/a^2 ) IE(X^2 | G), a.s.
4. By considering Ω = {a, b, c}, give an example showing that if G1 , G2 are two sub-σ
fields of F and X is an integrable random variable, then IE (IE(X | G1 ) | G2 ) need not equal
IE (IE(X | G2 ) | G1 ).
5. Suppose that X ≥ 0 and IE(X) = ∞. Show that there is a unique G measurable random variable Y , 0 ≤ Y ≤ ∞, satisfying
∫_A X dP = ∫_A Y dP ∀ A ∈ G.
Homework # 9
1. Let {Xn }n≥0 be a submartingale with sup Xn < ∞ a.s. Let ξn = Xn − Xn−1 and suppose
IE(supn ξn+ ) < ∞. Show that Xn converges a.s.
Hint. Follow the proof of the result on martingales with bounded increments that was covered
in class.
2. Let Xn and Yn be positive integrable and adapted to Fn . Suppose that IE(Xn+1 | Fn ) ≤ Xn + Yn with Σ_n Yn < ∞ a.s. Prove that Xn converges a.s. to a finite limit.
Hint. Introduce a supermartingale; consider the stopping time N = inf{k : Σ_{m=1}^k Ym > M } and stop the supermartingale at time N . Also recall the proof for martingales with bounded increments.
3. Let µ, ν, µ̂n and ν̂n be as in the proof of Kakutani’s theorem. Suppose that both µ̂n and ν̂n
are concentrated on {0, 1} for all n. Let αn = µ̂n {1} and βn = ν̂n {1}.
(i) Find a necessary and sufficient condition in terms of αn , βn for µ << ν.
(ii) Suppose that 0 < ε ≤ αn , βn ≤ 1 − ε < 1. Show that in this case the condition is simply Σ_n (αn − βn )^2 < ∞.
Hint. Recall that ∏_{m=1}^∞ (1 − xm ) converges to a non zero quantity iff Σ_{m=1}^∞ xm converges.
4. Let ξ_i^n , i, n ≥ 0, be i.i.d. nonnegative integer valued random variables. Define a sequence Zn , n ≥ 0 by Z0 = 1 and
Z_{n+1} = ξ_1^{n+1} + · · · + ξ_{Z_n}^{n+1} , if Zn > 0.
If Zn = 0, we set Z_{n+1} = 0. Let Fn = σ{ξ_i^m : i ≥ 1, 1 ≤ m ≤ n} and µ = IE(ξ_i^m ) ∈ (0, ∞).
(i) Show that Zn /µ^n is an Fn martingale.
(ii) Show that, if µ < 1 then Zn = 0 for all n sufficiently large and so Zn /µ^n → 0 as n → ∞.
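A simulation sketch of this branching process (illustrative only; the Binomial(2, p) offspring law, with µ = 2p, is an assumption made here): subcritical paths die out, and the empirical mean of Zn /µ^n stays near 1, consistent with the martingale property.

```python
import random

def offspring(rng, p):
    """Binomial(2, p) offspring count; mean mu = 2p (an illustrative law)."""
    return (rng.random() < p) + (rng.random() < p)

def branching_path(p, generations, rng):
    """One trajectory of Z_0 = 1, Z_{n+1} = xi_1 + ... + xi_{Z_n}; 0 is absorbing."""
    Z = [1]
    for _ in range(generations):
        Z.append(sum(offspring(rng, p) for _ in range(Z[-1])))
        if Z[-1] == 0:                      # extinct: pad with zeros and stop
            Z.extend([0] * (generations + 1 - len(Z)))
            break
    return Z

rng = random.Random(3)

# Subcritical case mu = 0.8 < 1: extinction, so Z_n = 0 for all large n.
subcritical = [branching_path(0.4, 100, rng)[-1] for _ in range(200)]

# Martingale check: E[Z_n / mu^n] = 1; here supercritical mu = 1.2, n = 10.
mu, n_gen = 1.2, 10
W = [branching_path(0.6, n_gen, rng)[-1] / mu ** n_gen for _ in range(2000)]
mean_W = sum(W) / len(W)
```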
Homework # 10
1. Let Xn be a submartingale and let M , N be stopping times such that M ≤ N , a.s. Further
suppose that IP (N ≤ k) = 1. Show that:
(i) IE(XM ) ≤ IE(XN ).
(ii) IE(XN | FM ) ≥ XM , a.s.
2. Let Xn , Yn be martingales with IE(Xn^2 ) < ∞, IE(Yn^2 ) < ∞ for all n. Show that:
(i) IE((Xn − Xm )^2 | Fm ) = IE(Xn^2 | Fm ) − Xm^2 for all m ≤ n.
(ii) IE(Xn Yn ) − IE(X0 Y0 ) = Σ_{m=1}^n IE((Xm − Xm−1 )(Ym − Ym−1 )).
3. Let Xn be a martingale and define ξn = Xn − Xn−1 , n ≥ 1. Use the Lp convergence theorem for martingales to show the following.
(i) If IE(X0^2 ) < ∞ and Σ_{m=1}^∞ IE(ξm^2 ) < ∞ then Xn → X∞ , a.s. and in L^2.
(ii) If bm ↑ ∞ and Σ_{m=1}^∞ IE(ξm^2 )/bm^2 < ∞ then Xn /bn → 0 a.s.
4. Let Z1 , Z2 , · · · be i.i.d. with IE(|Z1 |) < ∞. Let θ be a r.v. independent of {Zi } with finite mean and define Yi = Zi + θ. In Bayesian terms, the distribution of θ is called the prior distribution and the distribution IP (θ ∈ · | Fn^Y ), where Fn^Y = σ{Y1 , · · · Yn }, is called the posterior distribution. Show that the "posterior mean" IE(θ | Fn^Y ) converges to θ, a.s. as n → ∞. This property is called consistency.
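For a concrete (assumed) model one can watch consistency happen: with Zi ~ N(0, 1) and a N(0, 1) prior on θ, the posterior is conjugate and the posterior mean has the closed form IE(θ | Fn^Y ) = (Y1 + · · · + Yn )/(n + 1).

```python
import random

rng = random.Random(4)
theta = rng.gauss(0.0, 1.0)                  # theta drawn from the N(0,1) prior
Y = [theta + rng.gauss(0.0, 1.0) for _ in range(50_000)]   # Y_i = Z_i + theta

def posterior_mean(Y, n):
    """E(theta | Y_1..Y_n) for the conjugate model (N(0,1) prior, N(theta,1)
    noise): posterior precision 1 + n, posterior mean sum(Y_1..Y_n)/(n + 1)."""
    return sum(Y[:n]) / (n + 1)

errors = [abs(posterior_mean(Y, n) - theta) for n in (10, 1_000, 50_000)]
```

The general statement in the problem needs no conjugacy; this model is chosen only so the posterior mean is explicit.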
5. Show that if Fn ↑ F∞ and Yn → Y in L1 , then IE(Yn | Fn ) → IE(Y | F∞ ) in L1 .
Homework # 11
1. Let X, Xn , n ≥ 1, be integer valued. Show that Xn ⇒ X if and only if IP (Xn = m) →
IP (X = m) for all m.
2. Let Xn ⇒ X and Yn ⇒ c, where c is a constant. Show that:
(i) Yn → c in probability.
(ii) Xn + Yn ⇒ X + c. Consequently, if Zn − Xn ⇒ 0 then Zn ⇒ X.
(iii) Xn Yn ⇒ cX.
Hint. Use Skorohod’s rep. Th. and continuous mapping theorem.
3. Show that if Xn = (Xn1 , · · · , Xnn ) is uniformly distributed over the surface of the sphere of radius √n in R^n then Xn1 ⇒ a standard normal r.v.
Hint. Use the fact that if Y1 , Y2 , · · · are i.i.d. N (0, 1) then (Y1 , · · · , Yn )(n/Σ_{m=1}^n Ym^2 )^{1/2} has the same distribution as Xn .
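The hint translates directly into a simulation sketch (not part of the assignment): normalize an i.i.d. N(0, 1) vector to the sphere of radius √n and look at the first coordinate; its empirical moments approach those of N(0, 1).

```python
import math
import random

def sphere_first_coord(n, rng):
    """First coordinate of a uniform point on the radius-sqrt(n) sphere in R^n,
    built by normalizing an i.i.d. N(0,1) vector as in the hint."""
    Y = [rng.gauss(0.0, 1.0) for _ in range(n)]
    r = math.sqrt(sum(y * y for y in Y))
    return Y[0] * math.sqrt(n) / r

rng = random.Random(5)
sample = [sphere_first_coord(500, rng) for _ in range(4_000)]
m1 = sum(sample) / len(sample)                  # ~ E N(0,1)   = 0
m2 = sum(x * x for x in sample) / len(sample)   # ~ E N(0,1)^2 = 1
```

Note that each coordinate is bounded by √n, so the limit statement genuinely needs n → ∞.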
4. Suppose g and h are continuous functions from R to R. Suppose that g(x) > 0 for all x and |h(x)|/g(x) → 0 as |x| → ∞. Let Fn , F be distribution functions such that Fn ⇒ F and ∫ g(x) dFn (x) ≤ C < ∞. Show that ∫ h(x) dFn (x) → ∫ h(x) dF (x).
Hint. Use the Skorohod representation theorem and show that the family {h(X̃n )}n≥1 (where X̃n is the sequence in the cited theorem) is u.i.
5.
(i) Show that if F1 , · · · Fn are distribution functions with ch.f.s φ1 , · · · φn resp. and λi ≥ 0 are such that λ1 + · · · + λn = 1, then the distribution function Σ_{i=1}^n λi Fi has ch.f. Σ_{i=1}^n λi φi .
(ii) Show that if φ is a ch.f. then Re φ and |φ|^2 are also, where for a complex number z, Re z denotes its real part.
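Both parts can be sanity-checked on a tiny example (an illustrative choice, not from the problem): with F1 = δ1 , F2 = δ−1 and λ1 = λ2 = 1/2, the mixture is the fair ±1 coin, whose ch.f. is (φ1 + φ2 )/2 = cos t; this also equals Re φ1 , matching (ii).

```python
import cmath
import math

phi1 = lambda t: cmath.exp(1j * t)    # ch.f. of the point mass at +1
phi2 = lambda t: cmath.exp(-1j * t)   # ch.f. of the point mass at -1

# ch.f. of the (1/2, 1/2) mixture, i.e. of the fair +/-1 coin:
phi_mix = lambda t: 0.5 * phi1(t) + 0.5 * phi2(t)

t = 0.7
mixture_value = phi_mix(t)   # equals cos(t), which is also Re(phi1(t))
```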
Solutions of Homework # 6
1. Let Bk = {|Sk | ≥ 3α, |Sj | < 3α for j < k}. Let Λ = max_{1≤k≤n} P (|Sk | ≥ α). Then
P ( max_{1≤k≤n} |Sk | ≥ 3α) ≤ P (|Sn | ≥ α) + Σ_{k=1}^{n−1} P (Bk ∩ {|Sn | < α})
≤ Λ + Σ_{k=1}^{n−1} P (Bk ∩ {|Sn − Sk | > 2α})
= Λ + Σ_{k=1}^{n−1} P (Bk ) P (|Sn − Sk | > 2α)
≤ Λ + Σ_{k=1}^{n−1} P (Bk ) (P (|Sn | > α) + P (|Sk | > α))
≤ Λ + 2Λ Σ_{k=1}^{n−1} P (Bk )
≤ 3Λ.
2. Y is measurable with respect to the tail σ field of {Xn }. Thus by Kolmogorov's 0-1 law we have that for each n ≥ 1, there exists a unique integer an such that P (Y ∈ [an /2^n , (an + 1)/2^n )) = 1. Also, letting Bn = [an /2^n , (an + 1)/2^n ), one has
B1 ⊃ B2 ⊃ B3 ⊃ · · ·
Letting B = ∩_{n≥1} Bn , we see that
P (Y ∈ B) = 1.    (1)
Clearly B cannot contain more than 1 point. Combining this with (1) we get that B must contain exactly one point.
3. Necessity is a consequence of the three series theorem. For sufficiency note that
Σ_{n=1}^∞ Var(Xn^c ) ≤ Σ_{n=1}^∞ E(|Xn^c |^2 ) ≤ c Σ_{n=1}^∞ E(|Xn^c |).
Hence sufficiency also follows from the three series theorem. The final part of the problem again is an immediate consequence of the three series theorem.
4. Define wm = sup_{n≥m} |Zn − Z|. Then
Zn → Z a.s. as n → ∞
⇔ wm → 0 a.s. as m → ∞
⇔ wm → 0 in probability as m → ∞ (since the wm are nonincreasing)
⇔ ∀ ε > 0 ∃ n ≥ 1 s.t. P (wn > ε) ≤ ε
⇔ ∀ ε > 0 ∃ n ≥ 1 s.t. P ( sup_{n≤k<∞} |Zk − Z| > ε) ≤ ε
⇔ ∀ ε > 0 ∃ n ≥ 1 s.t. P ( sup_{n≤k<m} |Zk − Z| > ε) ≤ ε ∀ m ≥ n.
5. This is an immediate consequence of the weak law of large numbers for independent
random variables with uniformly bounded variances. (See class notes).
Solutions to HW7
1 Let Fn = σ{ξ1 , . . . , ξn } ⊂ F ; Fn ⊂ Fn+1 , n = 1, 2, . . ., so {Fn } is a filtration.
(a) It is easy to check that Sn^2 − sn^2 is Fn -measurable.
(b) E|Sn^2 − sn^2 | ≤ ESn^2 + sn^2 = 2sn^2.
(c)
E(Sn^2 − sn^2 | Fn−1 ) = E(S_{n−1}^2 + 2ξn Sn−1 + ξn^2 | Fn−1 ) − sn^2
= S_{n−1}^2 + σn^2 − sn^2
= S_{n−1}^2 − s_{n−1}^2
⇒ Sn^2 − sn^2 is a martingale.
2 Xn , Yn are submartingales wrt Fn
⇒ Xn ∨ Yn = (Xn + Yn + |Xn − Yn |)/2 is Fn -measurable, and
E|Xn ∨ Yn | ≤ E|Xn | + E|Yn | < ∞.
Note E(Xn ∨ Yn | Fn−1 ) ≥ E(Xn | Fn−1 ) ∨ E(Yn | Fn−1 ) ≥ Xn−1 ∨ Yn−1
⇒ Xn ∨ Yn is a submartingale.
3 (i)
Let Fn = σ{Y1 , . . . , Yn } ⊂ F ; {Fn } is a filtration
⇒ Xn = ∏_{m=1}^n Ym is Fn -measurable.
Y1 , Y2 , . . . are non-negative, iid, EYm = 1 ⇒ E|Xn | = ∏_{m=1}^n EYm = 1 < ∞.
E(Xn | Fn−1 ) = E(∏_{m=1}^n Ym | Fn−1 ) = (∏_{m=1}^{n−1} Ym ) E(Yn ) = Xn−1
⇒ Xn is a martingale.
(ii)
Define Zn = ∏_{m≤n} √Ym /E√Ym . The factors √Ym /E√Ym are > 0 and independent with E(√Ym /E√Ym ) = 1 ⇒ Zn is a non-negative martingale (by (i)).
Applying the martingale convergence theorem, we know that Xn converges a.s. to some finite r.v. X and Zn converges a.s. to a finite r.v. Z; X ≥ 0, Z ≥ 0, EX ≤ EX1 = 1, EZ ≤ EZ1 = 1.
P (Y1 = 1) < 1 ⇒ (E√Y1 )^2 < EY1 = 1 ⇒ (E√Y1 )^n → 0 as n → ∞
⇒ √Xn = Zn (E√Y1 )^n → 0 a.s., i.e. X = 0 a.s.
(iii)
Note that (1/n) log Xn = (1/n) Σ_{m=1}^n log Ym and E| log Y1 | = E(log Y1 )+ + E(log Y1 )− , with E(log Y1 )+ < ∞ (since log y ≤ y − 1 and EY1 = 1).
If E(log Y1 )− < ∞, then (1/n) log Xn → E log Y1 , where E log Y1 < log EY1 = 0 (since P (Y1 = 1) < 1).
If E(log Y1 )− = ∞, then (1/n) log Xn → −∞.
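A quick simulation sketch of this solution: take Ym ∈ {1/2, 3/2} with equal probability (an illustrative choice), so EYm = 1 but E log Ym = (1/2) log(3/4) < 0; paths of Xn collapse to 0 even though EXn = 1 for every n.

```python
import random

def product_path(N, rng):
    """One trajectory of X_n = Y_1 * ... * Y_n with Y_m uniform on {1/2, 3/2},
    so E(Y_m) = 1 while E(log Y_m) = 0.5 * log(3/4) < 0."""
    x, path = 1.0, []
    for _ in range(N):
        x *= rng.choice((0.5, 1.5))
        path.append(x)
    return path

rng = random.Random(6)
finals = [product_path(2_000, rng)[-1] for _ in range(100)]
# E(X_n) = 1 for every n, yet log X_n ~ n * E(log Y_1) ~ -288 at n = 2000,
# so every sampled path is numerically indistinguishable from 0.
```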
4 Define Z_{n+1} = X_{n+1} /∏_{k=1}^n (1 + Yk ).
(i) Z_{n+1} ∈ F_{n+1} .
(ii) EZ_{n+1} = E[X_{n+1} /∏_{k=1}^n (1 + Yk )] ≤ EX_{n+1} < ∞.
(iii)
E(Z_{n+1} | Fn ) = E(X_{n+1} /∏_{k=1}^n (1 + Yk ) | Fn )
= E(X_{n+1} | Fn )/∏_{k=1}^n (1 + Yk )
≤ (1 + Yn )Xn /∏_{k=1}^n (1 + Yk )
= Zn
⇒ Zn is a non-negative supermartingale.
Applying the martingale convergence theorem, ∃ Z ≥ 0 s.t. Zn → Z a.s. and EZ < ∞, thus Z < ∞ a.s.
Note 0 < lim ∏_{k=1}^n (1 + Yk ) = lim exp(Σ_{k=1}^n log(1 + Yk )) ≤ exp(lim Σ_{k=1}^n Yk ) < ∞, thus Xn = Zn ∏_{k=1}^{n−1} (1 + Yk ) converges a.s. to a finite limit.
5 (i)
Xn^1 , Xn^2 are supermartingales measurable wrt Fn , {N > n} ∈ Fn , and {N ≤ n} ∈ Fn .
(a) Yn ∈ Fn .
(b) E|Yn | ≤ E|Xn^1 | + E|Xn^2 | < ∞.
(c)
E(Yn | Fn−1 ) = E(Xn^1 1_{N >n} | Fn−1 ) + E(Xn^2 1_{N ≤n−1} | Fn−1 ) + E(Xn^2 1_{N =n} | Fn−1 )
≤ E(Xn^1 1_{N >n} | Fn−1 ) + E(Xn^2 1_{N ≤n−1} | Fn−1 ) + E(Xn^1 1_{N =n} | Fn−1 )
= E(Xn^1 1_{N ≥n} | Fn−1 ) + 1_{N ≤n−1} E(Xn^2 | Fn−1 )
≤ X_{n−1}^1 1_{N ≥n} + X_{n−1}^2 1_{N ≤n−1}
= Yn−1
⇒ Yn is a supermartingale.
(ii)
Similarly
(a) Zn ∈ Fn .
(b) E|Zn | ≤ E|Xn^1 | + E|Xn^2 | < ∞.
(c)
E(Zn | Fn−1 ) = E(Xn^1 1_{N ≥n} | Fn−1 ) + E(Xn^2 1_{N <n} | Fn−1 )
= 1_{N ≥n} E(Xn^1 | Fn−1 ) + 1_{N <n} E(Xn^2 | Fn−1 )
≤ 1_{N ≥n} X_{n−1}^1 + 1_{N <n} X_{n−1}^2
≤ 1_{N ≥n} X_{n−1}^1 + 1_{N =n−1} X_{n−1}^1 + 1_{N <n−1} X_{n−1}^2
= Zn−1
⇒ Zn is a supermartingale.
Solutions to HW8
1. It is sufficient to check:
(a) g(X) is σ(X)-measurable,
(b) g(X) is integrable (b/c E|g(X)| ≤ E|φ(X, Y )| < ∞),
(c) for A = {X ∈ B} ∈ σ(X), B ∈ B(R),
∫_A g(X) dP = ∫_B ∫ φ(x, y) dFY (y) dFX (x) = ∫∫ φ(x, y)1_B (x) dFY (y) dFX (x) = ∫_A φ(X, Y ) dP,
where the last equality uses independence (the joint law of (X, Y ) is FX ⊗ FY ) and Fubini's theorem.
2. Note that for each fixed t,
0 ≤ E((tX + Y )^2 | G) = E(X^2 | G)t^2 + 2E(XY | G)t + E(Y^2 | G), a.s.
Taking t over the rationals gives a single null set off which this holds for all t; since E(X^2 | G) ≥ 0, the discriminant must be ≤ 0:
[2E(XY | G)]^2 ≤ 4E(X^2 | G)E(Y^2 | G), a.s.
3. For any A ∈ G,
∫_A X^2 dP ≥ ∫_A X^2 1_{|X|≥a} dP ≥ a^2 ∫_A 1_{|X|≥a} dP
⇒ E(X^2 | G) ≥ a^2 E(1_{|X|≥a} | G) = a^2 P (|X| ≥ a | G), a.s.
4. Let A = {a}, B = {b}. Choose G1 = {A, A^c , ∅, Ω}, G2 = {B, B^c , ∅, Ω}, define X = 1_A and P ({a}) = P ({b}) = P ({c}) = 1/3. Then
E(E(X | G1 ) | G2 ) = E(X | G2 ) = P (A | B)1_B + P (A | B^c )1_{B^c} = (1/2)1_{B^c} ,
E(E(X | G2 ) | G1 ) = E((1/2)1_{B^c} | G1 ) = (1/2)(P (B^c | A)1_A + P (B^c | A^c )1_{A^c} ) = (1/2)1_A + (1/4)1_{A^c}
⇒ E(E(X | G1 ) | G2 ) ≠ E(E(X | G2 ) | G1 ).
5. Existence.
Define Xn = X ∧ n and Y = lim_n E(Xn | G). Each E(Xn | G) is G-measurable (and nondecreasing in n), so Y is G-measurable and nonnegative.
Next we show that ∫_A Y dP = ∫_A X dP ∀ A ∈ G:
∫_A Y dP = ∫_A lim_n E(Xn | G) dP
= lim_n ∫_A E(Xn | G) dP (MCT)
= lim_n ∫_A Xn dP
= ∫_A X dP (MCT).
Uniqueness.
Let Ỹ be another G-measurable r.v. s.t. ∫_A Ỹ dP = ∫_A X dP ∀ A ∈ G
⇒ ∫_A (Y − Ỹ ) dP = 0 for all A ∈ G.
Taking A = {Y > Ỹ } ∈ G gives P (Y > Ỹ ) = 0, and similarly P (Y < Ỹ ) = 0, so Y = Ỹ a.s.