
MATH 411 – Mathematical Probability (Summer 2015)
Section 100
Homework Set 5
June 29, Monday
due July 3, Friday
1. Let $X_1, \ldots, X_{20}$ be independent Poisson random variables with mean 1.
(a) Use the Markov inequality to obtain a bound on $P\left( \sum_{i=1}^{20} X_i > 15 \right)$.
(b) Use the Central Limit Theorem to approximate the above probability.
(c) Compute the above probability.
2. (Chebyshev–Cantelli Inequality) Prove the following one-sided improvement of Chebyshev's inequality: for any real-valued random variable $X$ with mean $\mu$, variance $\sigma^2$, and any $t > 0$, we have
$$P(X - \mu \ge t) \le \frac{\sigma^2}{\sigma^2 + t^2}.$$
3. (Double exponential) Let $X$ be a random variable with density $f_X(x) = \frac{1}{2} e^{-|x|}$, $x \in \mathbb{R}$.
i. If $f$ is a differentiable function with $\lim_{x \to \pm\infty} f(x) e^{-|x|} = 0$, then:
$$E[f(X)] = f(0) + E[\mathrm{sgn}(X) f'(X)].$$
Hint: use integration by parts.
ii. If $f$ is as before, then:
$$\mathrm{Var}(f(X)) \le 4\, E[(f'(X))^2].$$
Hint: use the previous identity and the Cauchy–Schwarz inequality.
4. Let $U_1, \ldots, U_n$ be independent and uniformly distributed over $(0, 1)$. Prove the following small ball probability estimate:
$$P\left( \frac{1}{n} \sum_{i=1}^{n} U_i < \varepsilon \right) < (e\varepsilon)^n, \qquad \varepsilon > 0,$$
by following the next steps:
i. If $U$ is uniformly distributed over $(0, 1)$, then
$$M_U(t) = \begin{cases} \dfrac{e^t - 1}{t}, & t \neq 0, \\ 1, & t = 0. \end{cases}$$
ii. If $S_n := \sum_{i=1}^{n} U_i$, then by Markov's inequality (or Chernoff's bound) we obtain:
$$P(-S_n > -\varepsilon n) \le e^{\varepsilon t n} \left( \frac{1 - e^{-t}}{t} \right)^{n},$$
for all $t > 0$. Choose $t$ appropriately to conclude the result.
Hints - Solutions
Homework Set 5
1. Hint. Let $X = \sum_{i \le 20} X_i$. (i) We have: $P(X \ge 15) \le \frac{EX}{15} = \frac{20}{15} = \frac{4}{3}$. Note that this bound is useless, since we always have $P(X \ge 15) \le 1$.
(ii) We may write: $P(X \ge 15) = P(X > 14.5) = P\left( \frac{X - 20}{\sqrt{20}} \ge -\frac{5.5}{\sqrt{20}} \right) \approx \Phi\left( 5.5/\sqrt{20} \right) \approx \Phi(1.23) \approx 0.89$.
(iii) We know that $X$ is Poisson with parameter 20, thus: $P(X \ge 15) = \sum_{j=15}^{\infty} e^{-20} \frac{20^j}{j!}$.
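Remark (numerical sanity check, not part of the original assignment): the three answers can be compared with a short Python sketch, assuming SciPy is available.

```python
# Sketch: compare the Markov bound, the CLT approximation (with continuity
# correction) and the exact Poisson tail for problem 1.
from scipy.stats import norm, poisson

lam = 20.0                      # X = sum of 20 independent Poisson(1) variables ~ Poisson(20)

markov = lam / 15               # (a) Markov: P(X >= 15) <= EX/15 = 4/3 (useless, exceeds 1)
clt = norm.cdf(5.5 / lam**0.5)  # (b) CLT with continuity correction: Phi(5.5/sqrt(20)) ~ 0.89
exact = poisson.sf(14, lam)     # (c) exact tail: P(X >= 15) = 1 - P(X <= 14) ~ 0.895

print(f"Markov bound : {markov:.4f}")
print(f"CLT approx.  : {clt:.4f}")
print(f"Exact        : {exact:.4f}")
```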
2. Hint. For any $\alpha > 0$ we may write:
$$P(X - \mu \ge t) = P(X - \mu + \alpha \ge t + \alpha) \le \frac{E(X - \mu + \alpha)^2}{(t + \alpha)^2} = \frac{\sigma^2 + \alpha^2}{(t + \alpha)^2},$$
by Chebyshev's inequality. Minimizing the right-hand side with respect to $\alpha > 0$ (choose $\alpha = \sigma^2/t$) we get the result.
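Remark (numerical sanity check): a minimal sketch, assuming NumPy is available, confirming that $\alpha = \sigma^2/t$ minimizes the right-hand side and that the minimum equals $\sigma^2/(\sigma^2 + t^2)$; the grid search is purely illustrative.

```python
# Sketch: verify numerically that alpha = sigma^2/t minimizes
# (sigma^2 + alpha^2)/(t + alpha)^2 and that the minimum is sigma^2/(sigma^2 + t^2).
import numpy as np

for sigma, t in [(1.0, 0.5), (2.0, 1.0), (0.7, 3.0)]:
    alphas = np.linspace(1e-4, 50.0, 200_000)
    values = (sigma**2 + alphas**2) / (t + alphas) ** 2
    print(f"sigma={sigma}, t={t}: "
          f"argmin ~ {alphas[np.argmin(values)]:.4f} (predicted {sigma**2 / t:.4f}), "
          f"min ~ {values.min():.6f} (predicted {sigma**2 / (sigma**2 + t**2):.6f})")
```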
3. Hint. (i) We may write:
$$E f(X) = \int_{-\infty}^{\infty} f(x) f_X(x)\, dx = \frac{1}{2} \int_{-\infty}^{\infty} f(x) e^{-|x|}\, dx = \frac{1}{2} \left[ \int_{-\infty}^{0} f(x) e^{x}\, dx + \int_{0}^{\infty} f(x) e^{-x}\, dx \right].$$
Integration by parts yields:
$$\int_{-\infty}^{0} f(x) e^{x}\, dx = \left[ e^{x} f(x) \right]_{-\infty}^{0} - \int_{-\infty}^{0} f'(x) e^{x}\, dx = f(0) + \int_{-\infty}^{0} \mathrm{sgn}(x) f'(x) e^{-|x|}\, dx.$$
Similarly, we find $\int_{0}^{\infty} f(x) e^{-x}\, dx = f(0) + \int_{0}^{\infty} \mathrm{sgn}(x) f'(x) e^{-|x|}\, dx$; combining all together we conclude the desired formula.
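Remark (numerical sanity check): the identity can be tested by Monte Carlo. The test function $f(x) = \sin(x) + x^2$ below is an arbitrary illustrative choice satisfying the decay condition; it is not prescribed by the assignment.

```python
# Sketch: Monte Carlo check of E[f(X)] = f(0) + E[sgn(X) f'(X)] for
# X ~ double exponential (Laplace) with density (1/2) e^{-|x|}.
import numpy as np

def f(t):       # test function with f(t) * e^{-|t|} -> 0 as t -> +-infinity
    return np.sin(t) + t**2

def fprime(t):  # its derivative
    return np.cos(t) + 2 * t

rng = np.random.default_rng(0)
x = rng.laplace(loc=0.0, scale=1.0, size=2_000_000)

lhs = f(x).mean()                               # E[f(X)], should be ~2.0 here
rhs = f(0.0) + (np.sign(x) * fprime(x)).mean()  # f(0) + E[sgn(X) f'(X)]
print(f"E[f(X)]               ~ {lhs:.4f}")
print(f"f(0) + E[sgn(X)f'(X)] ~ {rhs:.4f}")
```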
(ii) Since $\mathrm{Var}(f(X)) = \mathrm{Var}(f(X) - f(0))$, we may assume without loss of generality that $f(0) = 0$. Then, applying the identity from (i) to $f^2$ together with the Cauchy–Schwarz inequality, we have:
$$E[(f(X))^2] = 2\, E[\mathrm{sgn}(X) f(X) f'(X)] \le 2 \left( E(f(X))^2 \right)^{1/2} \left( E(f'(X))^2 \right)^{1/2}.$$
It follows that $E(f(X))^2 \le 4\, E(f'(X))^2$. Taking into account that $\mathrm{Var}(f(X)) \le E[(f(X))^2]$, the result follows.
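Remark (numerical sanity check): the variance bound can be checked the same way, again with the illustrative choice $f(x) = \sin(x) + x^2$.

```python
# Sketch: Monte Carlo check of Var(f(X)) <= 4 E[(f'(X))^2] for
# X ~ Laplace with density (1/2) e^{-|x|} and f(x) = sin(x) + x^2.
import numpy as np

rng = np.random.default_rng(1)
x = rng.laplace(size=2_000_000)

var_f = (np.sin(x) + x**2).var()               # Var(f(X)), ~20.4 here
bound = 4 * ((np.cos(x) + 2 * x) ** 2).mean()  # 4 E[(f'(X))^2], ~34.4 here
print(f"Var(f(X))      ~ {var_f:.3f}")
print(f"4 E[(f'(X))^2] ~ {bound:.3f}")         # should dominate the variance
```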
4. Hint. (i) This is standard: for $t \neq 0$ we have
$$M_U(t) = E\, e^{tU} = \int_{0}^{1} e^{tx}\, dx = \left[ \frac{e^{tx}}{t} \right]_{0}^{1} = \frac{e^{t} - 1}{t}.$$
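Remark (numerical sanity check): the closed form can be compared with direct numerical integration of $E[e^{tU}]$, assuming SciPy is available; the values of $t$ are arbitrary.

```python
# Sketch: compare M_U(t) = (e^t - 1)/t with numerical integration of
# E[e^{tU}] = integral of e^{tx} over (0, 1).
import numpy as np
from scipy.integrate import quad

for t in [-2.0, -0.5, 0.5, 3.0]:
    numeric, _ = quad(lambda u: np.exp(t * u), 0.0, 1.0)
    closed_form = (np.exp(t) - 1.0) / t
    print(f"t={t:+.1f}: quad ~ {numeric:.6f}, (e^t - 1)/t = {closed_form:.6f}")
```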
(ii) Fix $\varepsilon > 0$. For any $t > 0$ we may write, using Markov's inequality (or Chernoff's bound):
$$P(-S_n > -\varepsilon n) \le e^{\varepsilon n t} M_{-S_n}(t) = e^{\varepsilon n t} M_{S_n}(-t).$$
Note that, since the $U_i$'s are independent, we get:
$$M_{S_n}(-t) = \prod_{i=1}^{n} M_{U_i}(-t) = \left[ M_U(-t) \right]^{n} = \left( \frac{1 - e^{-t}}{t} \right)^{n}.$$
Combining the last two we arrive at:
$$P(S_n < \varepsilon n) \le e^{\varepsilon n t} \left( \frac{1 - e^{-t}}{t} \right)^{n} < \left( \frac{e^{\varepsilon t}}{t} \right)^{n},$$
for all $t > 0$. The choice $t = 1/\varepsilon$ completes the proof.
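Remark (numerical sanity check): a Monte Carlo sketch comparing the small-ball probability with the bound $(e\varepsilon)^n$; the parameters $n = 10$, $\varepsilon = 0.2$ and the number of trials are arbitrary illustrative choices.

```python
# Sketch: estimate P( (1/n) * sum U_i < eps ) and compare with (e * eps)^n.
import numpy as np

rng = np.random.default_rng(0)
n, eps, trials = 10, 0.2, 1_000_000

means = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)
empirical = (means < eps).mean()  # true value is ~2.8e-4 for these parameters
bound = (np.e * eps) ** n         # (e * eps)^n ~ 2.3e-3
print(f"empirical P ~ {empirical:.2e}")
print(f"(e*eps)^n   = {bound:.2e}")
```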