Moment-Generating Functions of Continuous Random Variables

1  Introduction.
Moment-generating functions of continuous random variables are completely analogous to moment-generating functions of discrete random variables. (Hybrid random variables have them as well, but they are not discussed here.) Recall that for a discrete random variable $X$ with distribution
$$P(X = x_i) = p_i \qquad (i = 1, 2, 3, \ldots),$$
the moment-generating function for $X$ is the function
$$m_X(t) = E\!\left[e^{tX}\right] = \sum_{i \ge 1} e^{t x_i} p_i.$$
Analogously, a continuous random variable $Y$ with probability density function $y \mapsto f(y)$ has moment-generating function
$$m_Y(t) = E\!\left[e^{tY}\right] = \int_{-\infty}^{\infty} e^{ty} f(y)\, dy,$$
if this integral converges.
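As a quick illustration of the definition, here is a minimal numerical sketch in Python: it approximates $m_Y(t)$ by integrating $e^{ty} f(y)$ directly. The density (an Exponential(1)) and the values of $t$ are illustrative choices, not part of the handout.

    import numpy as np
    from scipy.integrate import quad

    def mgf_numeric(f, t, lower=-np.inf, upper=np.inf):
        """Approximate m_Y(t) = E[e^{tY}] by numerically integrating e^{ty} f(y)."""
        value, _abs_err = quad(lambda y: np.exp(t * y) * f(y), lower, upper)
        return value

    # Illustrative example: Y ~ Exponential(1), f(y) = e^{-y} on [0, inf);
    # its MGF is 1/(1 - t) for t < 1, so the two printed columns should agree.
    f_exp = lambda y: np.exp(-y)
    for t in (0.0, 0.2, 0.5):
        print(t, mgf_numeric(f_exp, t, lower=0.0), 1.0 / (1.0 - t))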
2  Properties of $m_Y(t)$.
The properties of moment-generating functions are completely parallel in the discrete, continuous, and hybrid cases, because, in each case, they are based on the properties of expected value, which are also completely parallel in all three cases. Here are the proofs of the two main properties of $m_Y(t)$ in the continuous case. They have been ported over without change from handout #8, because no changes are necessary.
Theorem 1
[a]: If $X$ and $Y$ are independent, then
$$m_{X+Y}(t) = m_X(t)\, m_Y(t).$$
[b]:
$$m_Y(t) = \sum_{k=0}^{\infty} \frac{E\!\left[Y^k\right]}{k!}\, t^k. \tag{1}$$
Proof of [a]. Because $X$ and $Y$ are independent, so are $e^{tX}$ and $e^{tY}$. Therefore
$$\begin{aligned}
m_{X+Y}(t) &= E\!\left[e^{t(X+Y)}\right] \\
&= E\!\left[e^{tX + tY}\right] \\
&= E\!\left[e^{tX} e^{tY}\right] \\
&= E\!\left[e^{tX}\right] E\!\left[e^{tY}\right] \qquad \text{(by independence)} \\
&= m_X(t)\, m_Y(t).
\end{aligned}$$
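Here is a quick Monte-Carlo sanity check of part [a] (a numerical illustration, not a proof): for independent $X$ and $Y$, the sample average of $e^{t(X+Y)}$ should match the product of the sample averages of $e^{tX}$ and $e^{tY}$ up to simulation error. The distributions and the value of $t$ are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    X = rng.exponential(scale=1.0, size=n)  # independent samples of X
    Y = rng.uniform(0.0, 1.0, size=n)       # independent samples of Y

    t = 0.3
    lhs = np.exp(t * (X + Y)).mean()                    # estimates m_{X+Y}(t)
    rhs = np.exp(t * X).mean() * np.exp(t * Y).mean()   # estimates m_X(t) m_Y(t)
    print(lhs, rhs)  # the two estimates agree up to Monte-Carlo error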
Proof of [b].
$$\begin{aligned}
m_Y(t) &= E\!\left[e^{tY}\right] \\
&= E\!\left[\sum_{k=0}^{\infty} \frac{(tY)^k}{k!}\right] \\
&= \sum_{k=0}^{\infty} E\!\left[\frac{(tY)^k}{k!}\right] \qquad \text{(interchanging $E$ and the sum, valid when the series converges suitably)} \\
&= \sum_{k=0}^{\infty} E\!\left[\frac{t^k Y^k}{k!}\right] \\
&= \sum_{k=0}^{\infty} \frac{t^k}{k!}\, E\!\left[Y^k\right] \\
&= \sum_{k=0}^{\infty} \frac{E\!\left[Y^k\right]}{k!}\, t^k.
\end{aligned}$$
Exercise 1  Use equation (1) to show, for all $k \ge 1$, that
$$m_Y^{(k)}(0) = E\!\left[Y^k\right].$$
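A concrete numerical illustration of this fact (not the general proof the exercise asks for): for $Y \sim$ Exponential(1), $m_Y(t) = 1/(1-t)$ and $E[Y^k] = k!$, so repeated differentiation at $t = 0$ should produce the factorials. The distribution is an illustrative choice.

    import sympy as sp

    t = sp.symbols('t')
    m = 1 / (1 - t)  # MGF of Exponential(1), valid for t < 1
    for k in range(1, 5):
        deriv_at_0 = sp.diff(m, t, k).subs(t, 0)
        print(k, deriv_at_0, sp.factorial(k))  # k-th derivative at 0 equals E[Y^k] = k!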
3  Formula for $m_Y(t)$ in Specific Cases.
There is no closed formula for $m_Y(t)$ when $Y$ has a beta distribution. However, there are closed formulas for $m_Y(t)$ if $Y$ has a gamma distribution, a normal distribution, or a uniform distribution. I will derive the formulas in the gamma distribution and normal distribution cases; the uniform distribution case is left as an exercise.
3.1  The Gamma Distribution.
Theorem 2  Let $Y$ have a gamma distribution with parameters $\alpha > 0$ and $\beta > 0$. Then, for $t < 1/\beta$,
$$m_Y(t) = \frac{1}{(1 - \beta t)^{\alpha}}. \tag{2}$$
Proof.
$$\begin{aligned}
m_Y(t) &= E\!\left[e^{tY}\right] \\
&= \int_{-\infty}^{\infty} e^{ty} f(y)\, dy \\
&= \frac{1}{\Gamma(\alpha)\beta^{\alpha}} \int_0^{\infty} y^{\alpha-1} e^{-y/\beta}\, e^{ty}\, dy \\
&= \frac{1}{\Gamma(\alpha)\beta^{\alpha}} \int_0^{\infty} y^{\alpha-1} e^{-y/\beta + ty}\, dy \\
&= \frac{1}{\Gamma(\alpha)\beta^{\alpha}} \int_0^{\infty} y^{\alpha-1} \exp\!\left(\frac{-y}{\beta/(1-\beta t)}\right) dy \qquad \text{(arithmetic; ``$\exp(\cdot)$'' means $e^{(\cdot)}$)} \\
&= \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\, \Gamma(\alpha) \left(\frac{\beta}{1-\beta t}\right)^{\!\alpha} \qquad \text{(if $t < 1/\beta$, then $1 - \beta t > 0$, and the integral converges)} \\
&= \frac{1}{(1-\beta t)^{\alpha}}.
\end{aligned}$$
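A numerical sanity check of equation (2), using the same density parameterization as the proof, $f(y) = y^{\alpha-1} e^{-y/\beta} / (\Gamma(\alpha)\beta^{\alpha})$ on $(0, \infty)$. The parameter values and the values of $t$ (all below $1/\beta$) are illustrative choices.

    import numpy as np
    from scipy.integrate import quad
    from scipy.special import gamma as Gamma

    alpha, beta = 2.5, 1.5

    def f(y):
        # Gamma density with the shape/scale convention used in the proof above.
        return y**(alpha - 1) * np.exp(-y / beta) / (Gamma(alpha) * beta**alpha)

    for t in (0.0, 0.25, 0.5):  # all satisfy t < 1/beta
        numeric, _ = quad(lambda y: np.exp(t * y) * f(y), 0.0, np.inf)
        closed_form = 1.0 / (1.0 - beta * t)**alpha
        print(t, numeric, closed_form)  # the two columns should agree closely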
3.2  The Normal Distribution.
Theorem 3  Let $Z$ have a standard normal distribution. Then
$$m_Z(t) = e^{\frac{1}{2} t^2}. \tag{3}$$
Proof.
$$\begin{aligned}
m_Z(t) &= E\!\left[e^{tZ}\right] \\
&= \int_{-\infty}^{\infty} e^{ty} f(y)\, dy \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{ty}\, e^{-y^2/2}\, dy \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{ty - (y^2/2)}\, dy \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}(y^2 - 2ty)}\, dy \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}(y^2 - 2ty + t^2) + \frac{1}{2}t^2}\, dy \qquad \text{(complete the square)} \\
&= \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}(y^2 - 2ty + t^2)}\, e^{\frac{1}{2}t^2}\, dy \\
&= \frac{1}{\sqrt{2\pi}}\, e^{\frac{1}{2}t^2} \int_{-\infty}^{\infty} e^{-\frac{1}{2}(y - t)^2}\, dy \\
&= \frac{1}{\sqrt{2\pi}}\, e^{\frac{1}{2}t^2} \int_{-\infty}^{\infty} e^{-\frac{1}{2}u^2}\, du \qquad \text{(substitute $u = y - t$)} \\
&= e^{\frac{1}{2}t^2}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}u^2}\, du \\
&= e^{\frac{1}{2}t^2}. \qquad \text{(the integral $\tfrac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}u^2}\, du$ equals 1. Why?)}
\end{aligned}$$
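A numerical sanity check of equation (3): integrate $e^{tz}$ against the standard normal density and compare with $e^{t^2/2}$. The values of $t$ are illustrative choices.

    import numpy as np
    from scipy.integrate import quad

    phi = lambda z: np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

    for t in (-1.0, 0.5, 2.0):
        numeric, _ = quad(lambda z: np.exp(t * z) * phi(z), -np.inf, np.inf)
        print(t, numeric, np.exp(t**2 / 2))  # the two columns should agree closely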
Exercise 2
[a]: Show, for any random variable $Y$, that $m_{cY}(t) = m_Y(ct)$.
[b]: Show, for any random variable $Y$, that $m_{Y+d}(t) = e^{td}\, m_Y(t)$.
[c]: Let $Y$ be normal with mean $\mu$ and variance $\sigma^2$. As discussed in class, $Y = \sigma Z + \mu$, where $Z$ has a standard normal distribution. Use this fact, equation (3), and parts [a] and [b] of this exercise to show that
$$m_Y(t) = \exp\!\left(\mu t + \frac{\sigma^2 t^2}{2}\right).$$
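Not a solution to part [c], but a numerical check that the target formula is plausible: for illustrative values $\mu = 1$, $\sigma = 2$, integrate $e^{ty}$ against the $N(\mu, \sigma^2)$ density and compare with $\exp(\mu t + \sigma^2 t^2/2)$.

    import numpy as np
    from scipy.integrate import quad

    mu, sigma = 1.0, 2.0
    # N(mu, sigma^2) density
    f = lambda y: np.exp(-(y - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

    for t in (0.25, 0.5, 1.0):
        numeric, _ = quad(lambda y: np.exp(t * y) * f(y), -np.inf, np.inf)
        print(t, numeric, np.exp(mu * t + sigma**2 * t**2 / 2))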
Exercise 3  Let $Y$ be uniformly distributed on $[\theta_1, \theta_2]$. Show that, for $t \neq 0$,
$$m_Y(t) = \frac{e^{t\theta_2} - e^{t\theta_1}}{t(\theta_2 - \theta_1)}.$$
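Again not a solution, just a numerical sanity check of the claimed formula: for an illustrative choice $\theta_1 = 0$, $\theta_2 = 2$, integrate $e^{ty}/(\theta_2 - \theta_1)$ over $[\theta_1, \theta_2]$ and compare with the closed form.

    import numpy as np
    from scipy.integrate import quad

    theta1, theta2 = 0.0, 2.0

    for t in (0.3, 1.0, 2.0):
        numeric, _ = quad(lambda y: np.exp(t * y) / (theta2 - theta1), theta1, theta2)
        closed_form = (np.exp(t * theta2) - np.exp(t * theta1)) / (t * (theta2 - theta1))
        print(t, numeric, closed_form)  # the two columns should agree closely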