Math 151. Rumbos
Fall 2014

Solutions to Assignment #16
1. Suppose that X ∼ Normal(µ, σ²) and define
\[
Z = \frac{X - \mu}{\sigma}.
\]
Prove that Z ∼ Normal(0, 1).
Solution: Since X ∼ Normal(µ, σ²), its pdf is given by
\[
f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2/2\sigma^2} \quad \text{for } -\infty < x < \infty.
\]
We compute the cdf of Z:
\[
\begin{aligned}
F_Z(z) &= \Pr(Z \le z)\\
 &= \Pr\!\left(\frac{X-\mu}{\sigma} \le z\right) \quad \text{for } -\infty < z < \infty,\\
 &= \Pr(X \le \sigma z + \mu)\\
 &= F_X(\sigma z + \mu).
\end{aligned}
\]
Differentiating with respect to z we then get that
\[
\begin{aligned}
f_Z(z) &= F_X'(\sigma z + \mu)\cdot\sigma \quad \text{for } -\infty < z < \infty,\\
 &= \sigma\, f_X(\sigma z + \mu)\\
 &= \sigma\cdot\frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(\sigma z+\mu-\mu)^2/2\sigma^2}\\
 &= \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2},
\end{aligned}
\]
for −∞ < z < ∞, which is the pdf of a Normal(0, 1) distribution.

2. (The Chi–Square Distribution) Let X ∼ Normal(0, 1) and define Y = X².
Compute the pdf, f_Y, of Y.
The distribution of Y is called the Chi–Square distribution with one degree of freedom; we write Y ∼ χ²(1).
Solution: The pdf of X is given by
\[
f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} \quad \text{for } -\infty < x < \infty.
\]
We compute the pdf for Y by first determining its cdf:
\[
\begin{aligned}
\Pr(Y \le y) &= \Pr(X^2 \le y) \quad \text{for } y > 0\\
 &= \Pr(-\sqrt{y} \le X \le \sqrt{y})\\
 &= \Pr(-\sqrt{y} < X \le \sqrt{y}), \quad \text{since } X \text{ is continuous.}
\end{aligned}
\]
Thus,
\[
\Pr(Y \le y) = \Pr(X \le \sqrt{y}) - \Pr(X \le -\sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y}) \quad \text{for } y > 0.
\]
We then have that the cdf of Y is
\[
F_Y(y) = F_X(\sqrt{y}) - F_X(-\sqrt{y}) \quad \text{for } y > 0,
\]
from which we get, after differentiation with respect to y,
\[
\begin{aligned}
f_Y(y) &= F_X'(\sqrt{y})\cdot\frac{1}{2\sqrt{y}} + F_X'(-\sqrt{y})\cdot\frac{1}{2\sqrt{y}}\\
 &= f_X(\sqrt{y})\,\frac{1}{2\sqrt{y}} + f_X(-\sqrt{y})\,\frac{1}{2\sqrt{y}}\\
 &= \frac{1}{2\sqrt{y}}\left(\frac{1}{\sqrt{2\pi}}\, e^{-y/2} + \frac{1}{\sqrt{2\pi}}\, e^{-y/2}\right)\\
 &= \frac{1}{\sqrt{2\pi}}\cdot\frac{1}{\sqrt{y}}\, e^{-y/2}
\end{aligned}
\]
for y > 0. We then have that the pdf of Y is
\[
f_Y(y) = \begin{cases}
\dfrac{1}{\sqrt{2\pi}}\cdot\dfrac{1}{\sqrt{y}}\, e^{-y/2}, & \text{if } y > 0;\\[1ex]
0, & \text{elsewhere.}
\end{cases}
\]
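The conclusions of Problems 1 and 2 can be checked numerically. The sketch below assumes NumPy is available; the parameter values µ = 3 and σ = 2 are arbitrary choices made only for the check:

```python
import numpy as np

# Standardizing X ~ Normal(mu, sigma^2) should give Z ~ Normal(0, 1), and
# Y = Z^2 should follow the pdf f_Y(y) = (1/sqrt(2*pi*y)) e^{-y/2} derived above.
rng = np.random.default_rng(0)
mu, sigma = 3.0, 2.0  # hypothetical parameter values for the sketch

x = rng.normal(mu, sigma, size=200_000)
z = (x - mu) / sigma
print(round(z.mean(), 2), round(z.var(), 2))       # should be near 0 and 1

y = z**2
edges = np.linspace(0.2, 4.0, 41)                  # avoid the singularity at y = 0
counts, _ = np.histogram(y, bins=edges)
dens = counts / (y.size * np.diff(edges))          # empirical density of Y
mid = 0.5 * (edges[:-1] + edges[1:])
f_y = np.exp(-mid / 2) / np.sqrt(2 * np.pi * mid)  # derived chi-square(1) pdf
print(np.max(np.abs(dens - f_y)) < 0.05)           # histogram matches the formula
```

The histogram is normalized by hand (counts over n times bin width) so that it estimates the true density rather than a density conditioned on the plotted range.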
3. (Moment Generating Function of the Chi–Square Distribution) Assume that Y ∼ χ²(1). Compute the mgf, ψ_Y, of Y by computing E(e^{tY}) = E(e^{tX²}), where X ∼ Normal(0, 1).
Use the mgf of Y to compute E(Y) and var(Y).
Solution: Compute
\[
\psi_Y(t) = E(e^{tY}) = E(e^{tX^2}) = \int_{-\infty}^{\infty} e^{tx^2} f_X(x)\, dx,
\]
where the pdf for X is
\[
f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} \quad \text{for } -\infty < x < \infty.
\]
We then have that
\[
\psi_Y(t) = \int_{-\infty}^{\infty} e^{tx^2}\,\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\, dx,
\]
or
\[
\psi_Y(t) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(1-2t)x^2/2}\, dx. \tag{1}
\]
The integral on the right-hand side in (1) converges provided that t < 1/2. In order to evaluate the integral in (1) make the change of variables z = \sqrt{1-2t}\, x, for t < 1/2, to get
\[
\psi_Y(t) = \frac{1}{\sqrt{1-2t}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz,
\]
so that
\[
\psi_Y(t) = \frac{1}{\sqrt{1-2t}} \quad \text{for } t < \frac{1}{2}. \tag{2}
\]
Next, differentiate on both sides of (2) with respect to t to get
\[
\psi_Y'(t) = \frac{1}{(1-2t)^{3/2}} \quad \text{for } t < \frac{1}{2}, \tag{3}
\]
and
\[
\psi_Y''(t) = \frac{3}{(1-2t)^{5/2}} \quad \text{for } t < \frac{1}{2}. \tag{4}
\]
It follows from (3) and (4) that
\[
E(Y) = \psi_Y'(0) = 1
\]
and
\[
E(Y^2) = \psi_Y''(0) = 3,
\]
respectively; so that
\[
\operatorname{var}(Y) = E(Y^2) - [E(Y)]^2 = 2.
\]
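Formulas (2), (3), and (4) lend themselves to a numerical sanity check. A minimal sketch, assuming NumPy: evaluate E(e^{tX²}) by integrating against the standard normal pdf on a grid, compare with 1/√(1 − 2t), and recover E(Y) and var(Y) by finite differences at t = 0:

```python
import numpy as np

# psi_Y(t) = E(e^{t X^2}) computed by numerical integration over a wide grid.
x = np.linspace(-40.0, 40.0, 400_001)
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal pdf

def psi(t):
    # Riemann sum for E(e^{t X^2}); the integral converges only for t < 1/2
    return float(np.sum(np.exp(t * x**2) * phi) * dx)

for t in (-1.0, 0.0, 0.25, 0.4):
    print(round(psi(t), 6), round(1 / np.sqrt(1 - 2 * t), 6))

h = 1e-3
m1 = (psi(h) - psi(-h)) / (2 * h)               # central difference ~ psi'(0)
m2 = (psi(h) - 2 * psi(0.0) + psi(-h)) / h**2   # second difference ~ psi''(0)
print(round(m1, 3), round(m2 - m1**2, 3))       # E(Y) = 1 and var(Y) = 2
```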
4. Let Y₁ and Y₂ denote two independent random variables such that Y₁ ∼ χ²(1) and Y₂ ∼ χ²(1). Define X = Y₁ + Y₂. Use the mgf of the χ²(1) distribution found in Problem 3 to compute the mgf of X. Give the distribution of X.
Solution: Compute the mgf of X to get
\[
\psi_X(t) = \psi_{Y_1}(t)\cdot\psi_{Y_2}(t),
\]
since Y₁ and Y₂ are independent. Thus, since Y₁ ∼ χ²(1) and Y₂ ∼ χ²(1), it follows from the result of Problem 3 in (2) that
\[
\psi_X(t) = \frac{1}{\sqrt{1-2t}}\cdot\frac{1}{\sqrt{1-2t}} \quad \text{for } t < \frac{1}{2},
\]
or
\[
\psi_X(t) = \frac{1}{1-2t} \quad \text{for } t < \frac{1}{2}. \tag{5}
\]
Observe that the mgf in (5) is the mgf for an Exponential(2) distribution. Hence, it follows from the Uniqueness Theorem for Moment Generating Functions that X ∼ Exponential(2). Thus, the pdf of X is given by
\[
f_X(x) = \begin{cases}
\dfrac{1}{2}\, e^{-x/2}, & \text{if } x > 0;\\[1ex]
0, & \text{if } x \le 0.
\end{cases}
\]
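As a sanity check, one can simulate Y₁ + Y₂ and compare against the Exponential(2) density; a minimal sketch assuming NumPy:

```python
import numpy as np

# The sum of two independent chi-square(1) variables should match the
# Exponential(2) pdf (1/2) e^{-x/2} for x > 0, whose mean is 2.
rng = np.random.default_rng(1)
n = 200_000
x = rng.standard_normal(n)**2 + rng.standard_normal(n)**2  # Y1 + Y2

print(round(x.mean(), 2))                      # should be near 2

edges = np.linspace(0.0, 6.0, 31)
counts, _ = np.histogram(x, bins=edges)
dens = counts / (n * np.diff(edges))           # empirical density of X
mid = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(dens - 0.5 * np.exp(-mid / 2))) < 0.02)
```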
5. Let X₁ and X₂ denote independent, Normal(0, σ²) random variables, where σ > 0. Define the random variables
\[
X = \frac{X_1 + X_2}{2} \quad \text{and} \quad Y = \frac{(X_1 - X_2)^2}{2\sigma^2}.
\]
Determine the distributions of X and Y.
Suggestion: To obtain the distribution for Y, first show that
\[
\frac{X_1 - X_2}{\sqrt{2}\,\sigma} \sim \text{Normal}(0, 1).
\]
Solution: Since X₁ and X₂ both have a Normal(0, σ²) distribution, they both have the mgf
\[
\psi(t) = e^{\sigma^2 t^2/2} \quad \text{for } -\infty < t < \infty.
\]
It then follows that the mgf of X₁/2 is
\[
\psi_{X_1/2}(t) = E(e^{tX_1/2}) = E(e^{(t/2)X_1}) = \psi(t/2) = e^{\sigma^2 t^2/8} \quad \text{for } -\infty < t < \infty.
\]
Similarly,
\[
\psi_{X_2/2}(t) = e^{\sigma^2 t^2/8} \quad \text{for } -\infty < t < \infty.
\]
Thus, since X₁ and X₂ are independent, it follows that the mgf of X is
\[
\begin{aligned}
\psi_X(t) &= \psi_{X_1/2 + X_2/2}(t)\\
 &= \psi_{X_1/2}(t)\cdot\psi_{X_2/2}(t)\\
 &= e^{\sigma^2 t^2/8}\cdot e^{\sigma^2 t^2/8}\\
 &= e^{\sigma^2 t^2/4} = e^{(\sigma^2/2)t^2/2}, \quad \text{for } -\infty < t < \infty,
\end{aligned}
\]
which is the mgf of a Normal(0, σ²/2) distribution. It then follows that X ∼ Normal(0, σ²/2) and, therefore, its pdf is
\[
f_X(x) = \frac{1}{\sqrt{2\pi}\,(\sigma/\sqrt{2})}\, e^{-x^2/\sigma^2} = \frac{1}{\sqrt{\pi}\,\sigma}\, e^{-x^2/\sigma^2} \quad \text{for } -\infty < x < \infty.
\]
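The claim X ∼ Normal(0, σ²/2) can be checked by simulation. In the sketch below (NumPy assumed), σ = 1.5 is an arbitrary choice for the check:

```python
import numpy as np

# The average of two independent Normal(0, sigma^2) draws should be
# Normal(0, sigma^2/2); check the first two moments and the kurtosis.
rng = np.random.default_rng(2)
sigma, n = 1.5, 200_000
x1 = rng.normal(0.0, sigma, n)
x2 = rng.normal(0.0, sigma, n)
xbar = (x1 + x2) / 2

print(round(xbar.mean(), 2), round(xbar.var(), 2), sigma**2 / 2)
kurt = np.mean(xbar**4) / xbar.var()**2  # equals 3 for any normal distribution
print(round(kurt, 1))
```

Matching mean, variance, and kurtosis does not by itself prove normality, of course; it is only a quick consistency check on the mgf argument above.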
Next, let
\[
Z = \frac{X_1 - X_2}{\sqrt{2}\,\sigma},
\]
so that Y = Z². We show that Z ∼ Normal(0, 1). To see why this is so, compute the mgf of Z:
\[
\begin{aligned}
\psi_Z(t) &= \psi_{X_1/(\sqrt{2}\sigma) - X_2/(\sqrt{2}\sigma)}(t)\\
 &= \psi_{X_1/(\sqrt{2}\sigma)}(t)\cdot\psi_{-X_2/(\sqrt{2}\sigma)}(t)\\
 &= \psi(t/\sqrt{2}\sigma)\cdot\psi(-t/\sqrt{2}\sigma)\\
 &= [\psi(t/\sqrt{2}\sigma)]^2, \quad \text{since } \psi \text{ is an even function,}\\
 &= \left[e^{\sigma^2(t^2/2\sigma^2)/2}\right]^2\\
 &= e^{t^2/2} \quad \text{for } -\infty < t < \infty,
\end{aligned}
\]
which is the mgf of a Normal(0, 1) distribution. Thus, Z has a Normal(0, 1) distribution.
It then follows that Y = Z² ∼ χ²(1). Thus, the pdf of Y is
\[
f_Y(y) = \begin{cases}
\dfrac{1}{\sqrt{2\pi}}\cdot\dfrac{1}{\sqrt{y}}\, e^{-y/2}, & \text{if } y > 0;\\[1ex]
0, & \text{otherwise.}
\end{cases}
\]
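The distribution of Y can likewise be checked by simulation; again σ = 1.5 is an arbitrary choice made only for the sketch (NumPy assumed):

```python
import numpy as np

# Z = (X1 - X2)/(sqrt(2)*sigma) should be standard normal, so Y = Z^2
# should be chi-square(1), with mean 1 and variance 2 (Problem 3).
rng = np.random.default_rng(3)
sigma, n = 1.5, 200_000
x1 = rng.normal(0.0, sigma, n)
x2 = rng.normal(0.0, sigma, n)
z = (x1 - x2) / (np.sqrt(2) * sigma)
y = z**2

print(round(z.mean(), 2), round(z.var(), 2))  # should be near 0 and 1
print(round(y.mean(), 2), round(y.var(), 2))  # should be near 1 and 2
```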