STATISTICS 381: MEASURE-THEORETIC PROBABILITY I
HOMEWORK ASSIGNMENT 6
DUE FEBRUARY 20, 2017
In these problems all random variables are assumed to be defined on a fixed probability space $(\Omega, \mathcal{F}, P)$.
Problem 1. Quickies. Let $X_1, X_2, \dots$ be any sequence of real-valued random variables defined on a common probability space $(\Omega, \mathcal{F}, P)$.
(A) Show that there are scalars $a_n > 0$ such that the series $\sum_{n=1}^{\infty} a_n X_n$ converges almost surely to a finite limit $S$. HINT: Start from the observation made in class that if $(y_n)_{n \ge 1}$ is any sequence of real numbers such that $|y_n - y_{n+1}| < 2^{-n}$ eventually, then $\lim_{n \to \infty} y_n$ exists.
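REMARK (not part of the problem): the following Python sketch is only a numerical illustration of what part (A) asserts; the standard Cauchy distribution for the $X_n$ and the choice $a_n = 4^{-n}$ are arbitrary assumptions made for the simulation, not the general construction the problem asks for.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumption: X_n standard Cauchy (no finite moments) and a_n = 4^{-n}.
# For this choice the scaled terms a_n X_n are eventually smaller than 2^{-n},
# so the partial sums visibly stabilize.
N = 60
n = np.arange(1, N + 1)
a = 4.0 ** (-n)
X = rng.standard_cauchy(N)

partial_sums = np.cumsum(a * X)
print(partial_sums[-5:])   # the last few partial sums agree to many decimal places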
(B) Assume now that the random variables $X_1, X_2, \dots$ are independent. Show that if the series $\sum_{n=1}^{\infty} X_n$ converges almost surely to a finite constant then the random variables $X_n$ are themselves constants. HINT: First show that if two random variables $X, Y$ are independent then $X + Y$ is (almost surely) constant if and only if $X$ and $Y$ are themselves constant.
(C) Assume that the random variables $X_1, X_2, \dots$ are independent. Show that if there is a sequence of scalars $a_n \to 0$ such that $\lim_{m \to \infty} a_m \sum_{n=1}^{m} X_n$ exists and is finite with probability one, then the limit is a constant random variable.
Problem 2. Bounded step random walks. Assume that the random variables $X_1, X_2, \dots$ are independent (but not necessarily identically distributed) and satisfy $|X_i| \le 1$. Assume further that all have mean $E X_i = 0$, and that $\sigma_j^2 := E X_j^2 < \infty$. Let $S_n = \sum_{i=1}^{n} X_i$. Show that if $\sum_{n=1}^{\infty} \sigma_n^2 = \infty$ then with probability 1,
$$\sup_{n \ge 1} |S_n| = \infty.$$
HINT: First use Wald II to show that for any $0 < \alpha < \infty$, with probability one, $\sup_n |S_n| > \alpha$.
BONUS: Is it necessarily true that $\sup_{n \ge 1} S_n = \infty$?
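REMARK (not part of the problem): a quick Monte Carlo sanity check in Python; the specific step distribution $X_i = \pm 1/\sqrt{i}$ with probability $1/2$ each is an illustrative assumption that satisfies the hypotheses ($|X_i| \le 1$, $E X_i = 0$, $\sum_i \sigma_i^2 = \sum_i 1/i = \infty$).

import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumption: X_i = +/- 1/sqrt(i) with probability 1/2 each, so that
# |X_i| <= 1, E X_i = 0, and sum_i sigma_i^2 = sum_i 1/i diverges.
N = 10**6
i = np.arange(1, N + 1)
X = rng.choice([-1.0, 1.0], size=N) / np.sqrt(i)
S = np.cumsum(X)

running_max = np.maximum.accumulate(np.abs(S))
for n in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(n, running_max[n - 1])   # max_{k <= n} |S_k| keeps growing as n increases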
Problem 3. Maximal inequalities without 2nd moments. This problem outlines a proof of the following generalization of the $L^2$ maximal inequality proved in the notes.
Proposition 1. Let $X_1, X_2, \dots, X_n$ be independent, mean-0 random variables and let $S_k = \sum_{i=1}^{k} X_i$ be the $k$th partial sum. Let $\varphi : \mathbb{R} \to (0, \infty)$ be an increasing, convex, continuous function. Then for each $\alpha > 0$,
$$P\Big\{ \max_{1 \le k \le n} S_k > \alpha \Big\} \le \frac{E\, \varphi(S_n)}{\varphi(\alpha)}.$$
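REMARK (not part of the problem): the inequality can be checked numerically before proving it. In the Python sketch below, the step distribution Uniform$(-1,1)$, the sample size $n = 20$, the function $\varphi(x) = e^x$, and the level $\alpha = 5$ are all arbitrary illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)

# Illustrative assumptions: n = 20 mean-zero Uniform(-1, 1) steps,
# phi(x) = exp(x) (positive, increasing, convex, continuous), alpha = 5.
n, alpha, reps = 20, 5.0, 200_000
X = rng.uniform(-1.0, 1.0, size=(reps, n))
S = np.cumsum(X, axis=1)

lhs = np.mean(S.max(axis=1) > alpha)             # P{ max_{k <= n} S_k > alpha }
rhs = np.mean(np.exp(S[:, -1])) / np.exp(alpha)  # E phi(S_n) / phi(alpha)
print(lhs, rhs)   # lhs should not exceed rhs, up to Monte Carlo error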
(A) First, prove (brief sketch only) the following variant of the Fubini theorem. Let $U, V$ be independent random vectors taking values in $\mathbb{R}^k$ and $\mathbb{R}^m$, respectively. Let $\psi : \mathbb{R}^k \times \mathbb{R}^m \to [0, \infty)$ be a measurable function, and for each $u \in \mathbb{R}^k$ define
$$\bar\psi(u) := E\, \psi(u, V).$$
Then
$$E\, \psi(U, V) = E\, \bar\psi(U).$$
HINT: This is trivially true for functions of the form $\psi(u, v) = \mathbf{1}_A(u)\,\mathbf{1}_B(v)$, where $A, B$ are Borel subsets of $\mathbb{R}^k$ and $\mathbb{R}^m$, respectively.
(B) Fix $k \le n$, and for each real $u$ define $\psi_k(u) = E\, \varphi(u + S_n - S_k)$. Use the result of part (A) to show that for any event $B \in \sigma(X_1, X_2, \dots, X_k)$,
$$E\, \varphi(S_n)\mathbf{1}_B = E\, \psi_k(S_k)\mathbf{1}_B.$$
(C) Use Jensen's inequality to check that $\varphi(u) \le \psi_k(u)$.
(D) Now prove the proposition.
Problem 4. Symmetric random variables. A real random variable $X$ is said to be symmetric (or, more properly, it is said to have a symmetric distribution) if the random variables $X$ and $-X$ both have the same distribution, that is, if $E f(X) = E f(-X)$ for every bounded Borel function $f$.
(A) Let $X_1, X_2, \dots, X_n$ be independent (but not necessarily identically distributed) symmetric random variables, and let $S_k = \sum_{i=1}^{k} X_i$ be the $k$th partial sum. Show that for any $\alpha > 0$,
$$P\Big\{ \max_{1 \le k \le n} S_k \ge \alpha \Big\} \le 2\, P\{ S_n \ge \alpha \}.$$
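REMARK (not part of the problem): a Monte Carlo check of the inequality in part (A); the standard Cauchy steps (symmetric but not even in $L^1$), the sample size $n = 50$, and the level $\alpha = 100$ are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)

# Illustrative assumptions: n = 50 standard Cauchy steps (symmetric, not in L^1),
# threshold alpha = 100.
n, alpha, reps = 50, 100.0, 200_000
X = rng.standard_cauchy((reps, n))
S = np.cumsum(X, axis=1)

lhs = np.mean(S.max(axis=1) >= alpha)   # P{ max_{k <= n} S_k >= alpha }
rhs = 2 * np.mean(S[:, -1] >= alpha)    # 2 P{ S_n >= alpha }
print(lhs, rhs)   # lhs should not exceed rhs, up to Monte Carlo error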
(B) The maximal inequality in part (A) holds even if the random variables $X_i$ are not in $L^1$; the only hypothesis is that each $X_i$ is symmetric. Use the maximal inequality to prove the following. If $X_1, X_2, \dots$ are independent, symmetric random variables such that the partial sums $S_n := \sum_{i=1}^{n} X_i$ converge in probability to a finite random variable $S_\infty$, then
$$\lim_{n \to \infty} S_n = S_\infty \quad \text{almost surely.}$$
NOTE: Recall that a sequence $Y_n$ of random variables defined on a common probability space converges in probability to a limit random variable $Y_\infty$ if for every $\varepsilon > 0$ there exists $n_\varepsilon$ such that for all $n \ge n_\varepsilon$,
$$P\{ |Y_n - Y_\infty| > \varepsilon \} < \varepsilon.$$
REMARK: The result of Problem 4 (B) carries over to non-symmetric random variables. This
homework assignment is already long enough, so the following problem, which outlines a
proof of the extension, is optional, not to be turned in.
Problem 5. Symmetrization. The results of Problem 4 can sometimes be useful even in dealing with non-symmetric random variables. The trick to this is called symmetrization: if $X$ and $Y$ are independent random variables with the same (not necessarily symmetric) distribution, then $X - Y$ is symmetric.
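REMARK (not part of the problem): the symmetrization trick is easy to see numerically; the Exponential(1) distribution in the Python sketch below is an arbitrary non-symmetric choice used only for illustration.

import numpy as np

rng = np.random.default_rng(4)

# Illustrative assumption: X, Y independent Exponential(1), a non-symmetric
# distribution; the difference D = X - Y should nevertheless be symmetric.
reps = 500_000
X = rng.exponential(1.0, reps)
Y = rng.exponential(1.0, reps)
D = X - Y

for t in (0.5, 1.0, 2.0):
    print(t, np.mean(D > t), np.mean(D < -t))   # the two tail frequencies should agree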
Let $X_1, Y_1, X_2, Y_2, \dots$ be a sequence of independent random variables such that for each index $i$ the random variables $X_i$ and $Y_i$ have the same distribution. Denote by
$$S_n^X = \sum_{i=1}^{n} X_i \quad \text{and} \quad S_n^Y = \sum_{i=1}^{n} Y_i$$
the partial sums.
(A) Show that if there is a finite random variable $S_\infty^X$ such that $S_n^X \to S_\infty^X$ in probability, then there is a finite random variable $S_\infty^Y$ such that $S_n^Y \to S_\infty^Y$ in probability.
HINT: First show that if a sequence $W_n$ is “Cauchy in probability” (you should provide the appropriate definition) then there is a subsequence $W_{n_k}$ that converges almost surely.
(B) Show that if there is a finite random variable $S_\infty^X$ such that $S_n^X \to S_\infty^X$ in probability, then
$$\lim_{n \to \infty} \big( S_n^X - S_n^Y \big) = S_\infty^X - S_\infty^Y \quad \text{almost surely,}$$
where $S_\infty^Y$ is the random variable whose existence you established in part (A).
(C) Use the result of (B) together with the independence of the sequences $X_1, X_2, \dots$ and $Y_1, Y_2, \dots$ to show that if there is a finite random variable $S_\infty^X$ such that $S_n^X \to S_\infty^X$ in probability, then
$$\lim_{n \to \infty} S_n^X = S_\infty^X \quad \text{almost surely.}$$
Thus, the result of Problem 4 (B) carries over to non-symmetric random variables.