Problem Sheet 3
Probability Theory and Mathematical Statistics (2016)
Characteristic Functions and Convergence
Problem 1. Manipulation of characteristic functions
Let ϕX (t) be the characteristic function of a random variable X. Is each of the following the
characteristic function of some distribution? Explain why or why not.
(i) ϕ2X (t)
(ii) |ϕX (t)|2
(iii) 1/(1 − ϕX (t)/2)
(iv) 1/ϕX (t)
Problem 2. Characteristic function
Is the function
ψ(t) := (1 − t²)^{1/2} 1(|t| < 1)
a characteristic function? Explain why or why not.
Hint. Check differentiability properties.
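A quick numerical illustration of the issue (a sketch only, assuming SciPy is available and with arbitrarily chosen evaluation points): ψ is integrable, so if it were a characteristic function, the inversion integral (1/2π) ∫ ψ(t) e^{−itx} dt would have to be a nonnegative density.

# Numerical sanity check for Problem 2 (illustrative sketch, not a proof).
# If psi were the characteristic function of an absolutely continuous law,
# f(x) = (1/2pi) * int_{-1}^{1} sqrt(1 - t^2) * cos(t x) dt
# would have to be nonnegative for every x.
import numpy as np
from scipy.integrate import quad

def candidate_density(x):
    # the real part suffices since psi is even and real
    integrand = lambda t: np.sqrt(1.0 - t**2) * np.cos(t * x)
    value, _ = quad(integrand, -1.0, 1.0)
    return value / (2.0 * np.pi)

for x in [0.0, 2.0, 5.0, 8.0]:
    print(f"x = {x:4.1f}   inversion integral = {candidate_density(x):+.5f}")
# A negative value at some x (e.g. around x = 5) is incompatible with psi
# being a characteristic function.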
Problem 3. Sums and Products of the Normal Distribution
Let X1 , X2 , X3 and X4 be i.i.d. N (0, 1) random variables.
(i) Compute ϕX1 X2 (t).
(ii) Deduce ϕX1 X2 +X3 X4 (t) and the law of X1 X2 + X3 X4 .
Problem 4. Triangular characteristic function
(i) Let U1 and U2 be i.i.d. U(−1, 1) random variables. Compute the characteristic function
of U1 and of the sum U1 + U2 . Derive the density of U1 + U2 .
(ii) Show that ϕ(t) = (1 − |t|/2)1(|t| ≤ 2) for t ∈ R is a characteristic function, and specify
the respective distribution.
(iii) Using the properties of characteristic functions, determine whether the distribution corresponding to this characteristic function is symmetric.
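A simulation sketch related to Problem 4(i), assuming NumPy and using the standard fact that the characteristic function of U(−1, 1) is sin(t)/t (the computation part (i) asks for); the sample size, seed and evaluation points t are arbitrary choices.

# Sketch for Problem 4(i): empirical characteristic function of U1 + U2
# versus (sin t / t)^2, assuming phi_U(t) = sin(t)/t for U ~ U(-1, 1).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
s = rng.uniform(-1, 1, n) + rng.uniform(-1, 1, n)

for t in [0.5, 1.0, 2.0, 4.0]:
    empirical = np.mean(np.exp(1j * t * s)).real   # imaginary part ~ 0 by symmetry
    target = (np.sin(t) / t) ** 2
    print(f"t = {t:3.1f}   empirical = {empirical:+.4f}   (sin t / t)^2 = {target:+.4f}")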
Problem 5. Show that if, for a sequence of random variables {Xn} on a common probability
space, we have Xn → X ≡ c in distribution as n → ∞, where c is a constant, then also Xn → X in probability.
Problem 6. Let {Xk} be a sequence of random variables with
P(Xk = k²) = 1/k²,   P(Xk = −1) = 1 − 1/k²,   k ≥ 1.
(i) Show that the distribution of Xk converges as k → ∞. Identify the limiting distribution.
(ii) Does Xk converge in probability? in L1 ? in L2 ?
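Since Xk takes only the two values k² and −1, its moments can be tabulated exactly; the short script below (plain Python, values of k chosen arbitrarily) prints E Xk, E Xk² and E|Xk + 1|, which may help when thinking about the L¹ and L² questions in (ii).

# Sketch for Problem 6: exact moments of X_k computed directly from the
# two-point distribution P(X_k = k^2) = 1/k^2, P(X_k = -1) = 1 - 1/k^2.
for k in [2, 10, 100, 1000]:
    p = 1.0 / k**2
    mean = k**2 * p + (-1.0) * (1.0 - p)      # E X_k
    second = k**4 * p + 1.0 * (1.0 - p)       # E X_k^2
    l1_dist = (k**2 + 1) * p                  # E|X_k - (-1)|
    print(f"k = {k:5d}   E X_k = {mean:.6f}   E X_k^2 = {second:12.1f}   E|X_k + 1| = {l1_dist:.6f}")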
Problem 7. Convergence in distribution
Suppose that {Yn } is a sequence of geometric random variables, with pmf
P(Yn = k) = pn (1 − pn)^{k−1},   k = 1, 2, . . . ,
where pn = λ/n, and λ is a strictly positive constant. Prove that Xn := Yn /n converges in
distribution to an exponential random variable with parameter λ, as n → ∞.
Hint. It suffices to show convergence for the distribution tails.
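A numerical sketch of the hint, using only the exact geometric tail P(Yn > m) = (1 − pn)^m for integer m; the values λ = 2 and x = 1.3 are arbitrary illustrative choices.

# Sketch for Problem 7: tail of X_n = Y_n / n versus the exponential tail.
import math

lam = 2.0          # lambda, assumed value for illustration
x = 1.3            # a fixed tail point

for n in [10, 100, 1_000, 10_000]:
    p_n = lam / n
    tail_Xn = (1.0 - p_n) ** math.floor(n * x)   # P(X_n > x) = P(Y_n > n x)
    tail_exp = math.exp(-lam * x)                # P(Exp(lambda) > x)
    print(f"n = {n:6d}   P(X_n > x) = {tail_Xn:.6f}   exp(-lambda x) = {tail_exp:.6f}")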
Problem 8. Convergence in probability
Suppose that U1 , U2 , . . . is an i.i.d. sequence of U(0, 1) distributed random variables.
(i) Show that the sequence
Xn := (Σ_{j=1}^n Uj²) / (Σ_{j=1}^n Uj)
converges in probability as n → ∞, and find its limit.
(ii) Show that the sequence
Yn := (Σ_{j=1}^n f(Uj)) / (Σ_{j=1}^n g(Uj))
converges in probability as n → ∞, where f and g are continuous functions on [0, 1], and
find its limit.
Name any theorems you use in parts (i) and (ii).
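A simulation sketch for part (i), assuming NumPy; dividing numerator and denominator by n suggests comparing Xn with E[U²]/E[U] = (1/3)/(1/2), which the script below does for a few sample sizes (seed and sizes are arbitrary).

# Sketch for Problem 8(i): behaviour of X_n = sum(U_j^2) / sum(U_j)
# for U_j i.i.d. U(0,1).
import numpy as np

rng = np.random.default_rng(1)
for n in [100, 10_000, 1_000_000]:
    u = rng.uniform(0.0, 1.0, n)
    x_n = np.sum(u**2) / np.sum(u)
    print(f"n = {n:8d}   X_n = {x_n:.5f}   (1/3)/(1/2) = {(1/3)/(1/2):.5f}")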
Problem 9. Weierstrass Theorem
Let {Xn} be a sequence of i.i.d. RVs, where each Xj ∼ B(x), a Bernoulli distribution with parameter x ∈ [0, 1]. Put X̄ = (1/n) Σ_{i=1}^n Xi and let
f : [0, 1] → R be a continuous function.
(i) Show that for all ε > 0, there exists a δ > 0 such that
E|f(X̄) − f(x)| 1{|X̄−x|<δ} ≤ ε.
(ii) Using Tchebyshev's inequality, show that
E|f(X̄) − f(x)| 1{|X̄−x|≥δ} ≤ M/(2nδ²),
where M := sup_{x∈[0,1]} |f(x)|.
(iii) Deduce from (i) and (ii) that f is the uniform limit of a sequence of Bernstein polynomials,
defined as
Bn(x) = Σ_{k=0}^{n} f(k/n) (n choose k) x^k (1 − x)^{n−k},   x ∈ [0, 1].
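As a numerical companion to Problem 9, the sketch below (assuming SciPy for the binomial weights; the test function f(x) = |x − 1/2| and the grid are arbitrary choices) evaluates Bn and reports the sup-norm error on [0, 1].

# Sketch for Problem 9: uniform approximation of a continuous f on [0,1]
# by its Bernstein polynomial B_n(x) = sum_k f(k/n) C(n,k) x^k (1-x)^(n-k).
import numpy as np
from scipy.stats import binom

def bernstein(f, n, x):
    k = np.arange(n + 1)
    weights = binom.pmf(k, n, x)        # C(n,k) x^k (1-x)^(n-k)
    return np.sum(f(k / n) * weights)

f = lambda x: np.abs(x - 0.5)           # illustrative continuous test function
grid = np.linspace(0.0, 1.0, 201)

for n in [5, 20, 100, 500]:
    sup_err = max(abs(bernstein(f, n, x) - f(x)) for x in grid)
    print(f"n = {n:4d}   sup_x |B_n(x) - f(x)| = {sup_err:.5f}")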
Problem 10. Let {Yn } be an i.i.d. sequence of positive and absolutely continuous RVs with a
density f (x) which is continuous on [0, ∞) with f (0) = a > 0.
(i) Show that, for t > 0,
n ( ∫_0^{t/n} f(x) dx − at/n ) → 0   as n → ∞.
(ii) Deduce from (i) that Xn := n min{Y1 , . . . , Yn } converges in distribution to an exponential
distribution with parameter a as n → ∞.
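A simulation sketch of Problem 10(ii), assuming NumPy and taking Yj = |Zj| with Zj ∼ N(0, 1) as an illustrative choice (so f(0) = a = √(2/π)); n, the number of repetitions, the seed and the point t are arbitrary.

# Sketch for Problem 10: X_n = n * min(Y_1, ..., Y_n) with Y_j = |Z_j|,
# Z_j ~ N(0,1), so the density of Y_j at 0 is a = sqrt(2/pi).
import numpy as np

rng = np.random.default_rng(2)
a = np.sqrt(2.0 / np.pi)
t = 1.0
n, reps = 500, 20_000

samples = n * np.abs(rng.standard_normal((reps, n))).min(axis=1)
print(f"empirical P(X_n > t) = {np.mean(samples > t):.4f}")
print(f"exp(-a t)            = {np.exp(-a * t):.4f}")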
Problem 11. Let Y be an absolutely continuous RV with density f (y) which is continuous and
bounded for y ∈ [0, 1], and zero outside of this interval. For n ≥ 1, let
Xn = nY − ⌊nY⌋
be the fractional part of nY. Show that Xn converges in distribution as n → ∞ and
find the limiting distribution.
Hint. You may need the Mean Value Theorem, which states that for a continuous function
f : [a, b] → R, there exists c ∈ (a, b) such that
∫_a^b f(u) du = f(c)(b − a).
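A simulation sketch for Problem 11, assuming NumPy and taking Y ∼ Beta(2, 2) as an illustrative density that is continuous and bounded on [0, 1]; it prints the empirical cdf of Xn at a few points for increasing n (seed, sample size and evaluation points are arbitrary).

# Sketch for Problem 11: distribution of X_n = nY - floor(nY) for growing n,
# with Y ~ Beta(2,2) as an illustrative density on [0,1].
import numpy as np

rng = np.random.default_rng(3)
y = rng.beta(2.0, 2.0, 200_000)

for n in [2, 10, 100]:
    x_n = n * y - np.floor(n * y)
    ecdf_points = [(u, np.mean(x_n <= u)) for u in (0.25, 0.5, 0.75)]
    summary = "   ".join(f"F_n({u}) = {v:.3f}" for u, v in ecdf_points)
    print(f"n = {n:4d}   {summary}")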
Problem 12. Convergence in L1 and L2
Let {Xn } be a sequence of square integrable random variables. Show that if {Xn } converges in
L2 to some random variable X, then {Xn2 } converges in L1 to X 2 . Is the converse true?
Problem 13. Sequence of exponential random variables
Let {Xn} be a sequence of independent exponential RVs, where Xn has parameter λn. Using first the general
definition of weak convergence (Eg(Xn ) → Eg(X) for all continuous and bounded functions g),
and then using characteristic functions, state whether or not the sequence {Xn} converges in
distribution in each of the following three cases. When it does, specify the limit.
(i) limn→∞ λn = λ ∈ (0, ∞)
(ii) limn→∞ λn = +∞
(iii) limn→∞ λn = 0
Problem 14. Let U1 , U2 , . . . and V1 , V2 , . . . be uniformly distributed random variables on (0, 1).
That is, for all n ≥ 1,
Un ∼ U (0, 1) and Vn ∼ U (0, 1).
Suppose they are independent of each other. Define, for all n ≥ 1,
Xn = 1 if Un² + Vn² ≤ 1, and Xn = 0 otherwise,
and
Zn = 4 (X1 + . . . + Xn)/n.
(i) What is the distribution of Xn? Specify the value of the parameter(s).
(ii) Using the Law of Large Numbers (LLN), show that Zn converges to π in distribution.
(iii) Let α ∈ (0, 1), and ε > 0. Using Chebyshev's inequality, show that there exists a number n0
such that for all n greater than n0,
P(|Zn − π| > ε) ≤ α.
What is the value of n0 when α = 0.05 and ε = 0.0001?
(iv) Using the Central Limit Theorem (CLT), give an approximation to the probability
P(|Zn − π| > ε).
Express your final result in terms of the standard normal cdf Φ. Compute the value of this
approximation when ε = 0.0001.
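A Monte Carlo sketch of the setup in Problem 14, assuming NumPy: it simulates Zn for a few values of n and prints the distance to π (seed and sample sizes are arbitrary choices).

# Sketch for Problem 14: Monte Carlo behaviour of Z_n = 4 * mean(X_i),
# X_i = 1{U_i^2 + V_i^2 <= 1} with U_i, V_i i.i.d. U(0,1).
import numpy as np

rng = np.random.default_rng(4)

for n in [1_000, 100_000, 10_000_000]:
    u = rng.uniform(0.0, 1.0, n)
    v = rng.uniform(0.0, 1.0, n)
    z_n = 4.0 * (u**2 + v**2 <= 1.0).mean()
    print(f"n = {n:9d}   Z_n = {z_n:.5f}   |Z_n - pi| = {abs(z_n - np.pi):.5f}")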
Problem 15. Chi-square distribution
(i) Derive the characteristic function of Z 2 , where Z follows a standard normal distribution.
(ii) Consider k i.i.d. RVs Zj ∼ N(0, 1). Compute the characteristic function of the sum
Σ_{j=1}^k Zj² =: χ²_k.
(iii) Use the characteristic function of χ²_k derived in (ii) to show that
(χ²_n − n)/√(2n) → N(0, 1)   as n → ∞.
(iv) Use the CLT to recover the result of question (iii).
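A simulation sketch for parts (iii)-(iv), assuming NumPy and SciPy: it samples (χ²_n − n)/√(2n) for one large n and compares a few empirical quantiles with standard normal quantiles (n, the number of replications, the seed and the quantile levels are arbitrary).

# Sketch for Problem 15: empirical quantiles of (chi2_n - n)/sqrt(2n)
# versus standard normal quantiles.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n, reps = 2_000, 100_000

chi2 = rng.chisquare(n, reps)
standardised = (chi2 - n) / np.sqrt(2.0 * n)

for q in (0.05, 0.5, 0.95):
    emp = np.quantile(standardised, q)
    print(f"q = {q:.2f}   empirical = {emp:+.3f}   N(0,1) quantile = {norm.ppf(q):+.3f}")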
Problem 16. Central Limit Theorem
Suppose ξ1 , ξ2 , . . . is a sequence of i.i.d. random variables with mean µ and variance σ 2 ∈ (0, ∞).
Put
Xk := ξ_k − 3 ξ_{k+1} + ξ_{k+2},   k = 1, 2, . . . ,
and
Sn := X1 + . . . + Xn .
(i) Compute EXk and Var(Xk ).
(ii) For x ∈ R, find the limit
lim_{n→∞} P( (Sn + nµ) / √(n Var(X1)) ≤ x ).
Hint. For part (ii), even though the Xk's are not independent, note that the sum Sn can be
decomposed into two terms, the first term being a sum of i.i.d. random variables.
Problem 17. Sum of Cauchy random variables
Let X be a random variable with a Cauchy distribution, that is with pdf
fX(x) = 1/(π(1 + x²)),   x ∈ R.
Put Sn = X1 + . . . + Xn, where X1, . . . , Xn are independent copies of X.
(a) Is the mean of X defined? finite? infinite? undefined?
(b) Using characteristic functions, study the convergence in distribution of the sequence {Sn/√n}.
Does the sequence converge in probability?
(c) Study the convergence in probability and distribution of the sequence {Sn /n2 }.
(d) Consider the sequence {Sn /n}. Does this sequence converge in distribution? Then show
that the sequence does not converge in probability. To do so, you may consider the sequence
S2n/(2n) − Sn/n ,
derive its distribution, and deduce that it does not converge in probability to 0. Then show
that {Sn /n} cannot converge in probability.
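A simulation sketch for part (d), assuming NumPy: it prints the running averages Sn/n along a single sample path of i.i.d. standard Cauchy variables (seed and the grid of n values are arbitrary); the averages do not settle down the way an LLN limit would.

# Sketch for Problem 17: running averages S_n / n of i.i.d. standard Cauchy
# variables along a single sample path.
import numpy as np

rng = np.random.default_rng(6)
n_max = 10_000_000
partial_sums = np.cumsum(rng.standard_cauchy(n_max))

for n in [10**k for k in range(2, 8)]:
    print(f"n = {n:9d}   S_n / n = {partial_sums[n - 1] / n:+10.4f}")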