
Exercise 1
1. Let $X \sim \Gamma(\alpha, \beta)$, $\alpha, \beta > 0$. The density function of $X$ is, by definition, given by
\[
f_X(x) = \frac{\beta^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}\, \mathbf{1}_{\{x>0\}},
\]
where
\[
\Gamma(\alpha) = \int_0^{+\infty} x^{\alpha-1} e^{-x}\,dx,
\]
with the property that $\Gamma(\alpha+1) = \alpha\,\Gamma(\alpha)$ and $\Gamma(1) = 1$.
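As a quick numerical sanity check of these definitions (a sketch, assuming Python with only the standard library; the helper `gamma_pdf` is ours, not part of the exercise), one can verify the recursion $\Gamma(\alpha+1) = \alpha\Gamma(\alpha)$ and that the density integrates to 1:

```python
import math

def gamma_pdf(x, alpha, beta):
    """Density of Gamma(alpha, beta) in the rate parametrization, for x > 0."""
    if x <= 0:
        return 0.0
    return beta**alpha / math.gamma(alpha) * x**(alpha - 1) * math.exp(-beta * x)

alpha, beta = 2.5, 1.7

# Recursion Gamma(alpha + 1) = alpha * Gamma(alpha), and Gamma(1) = 1.
assert abs(math.gamma(alpha + 1) - alpha * math.gamma(alpha)) < 1e-9
assert math.gamma(1.0) == 1.0

# The density integrates to 1 (crude trapezoidal rule on [0, 50];
# the tail beyond 50 is negligible for these parameters).
n, hi = 100_000, 50.0
h = hi / n
total = sum(gamma_pdf(i * h, alpha, beta) for i in range(1, n)) * h
total += 0.5 * h * (gamma_pdf(0.0, alpha, beta) + gamma_pdf(hi, alpha, beta))
print(round(total, 6))  # ≈ 1.0
```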
\[
\mathbb{E}(X) = \int_{\mathbb{R}} x f_X(x)\,dx
= \int_0^{+\infty} x\,\frac{\beta^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}\,dx
= \frac{\beta^\alpha}{\Gamma(\alpha)} \int_0^{+\infty} x^{(\alpha+1)-1} e^{-\beta x}\,dx
\]
\[
= \frac{\beta^\alpha}{\Gamma(\alpha)} \int_0^{+\infty} \left(\frac{y}{\beta}\right)^{(\alpha+1)-1} e^{-y}\,\frac{1}{\beta}\,dy
= \frac{1}{\Gamma(\alpha)\,\beta} \int_0^{+\infty} y^{(\alpha+1)-1} e^{-y}\,dy
= \frac{\Gamma(\alpha+1)}{\Gamma(\alpha)\,\beta}
= \frac{\alpha\,\Gamma(\alpha)}{\Gamma(\alpha)\,\beta}
= \frac{\alpha}{\beta},
\]
with the change of variable $\beta x = y$.
\[
\mathbb{E}(X^2) = \int_{\mathbb{R}} x^2 f_X(x)\,dx
= \int_0^{+\infty} x^2\,\frac{\beta^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x}\,dx
= \frac{\beta^\alpha}{\Gamma(\alpha)} \int_0^{+\infty} x^{(\alpha+2)-1} e^{-\beta x}\,dx
\]
\[
= \frac{\beta^\alpha}{\Gamma(\alpha)} \int_0^{+\infty} \left(\frac{y}{\beta}\right)^{(\alpha+2)-1} e^{-y}\,\frac{1}{\beta}\,dy
= \frac{1}{\Gamma(\alpha)\,\beta^2} \int_0^{+\infty} y^{(\alpha+2)-1} e^{-y}\,dy
= \frac{\Gamma(\alpha+2)}{\Gamma(\alpha)\,\beta^2}
= \frac{(\alpha+1)\,\alpha\,\Gamma(\alpha)}{\Gamma(\alpha)\,\beta^2}
= \frac{(\alpha+1)\,\alpha}{\beta^2},
\]
again with the change of variable $\beta x = y$. Therefore
\[
\operatorname{Var}(X) = \mathbb{E}(X^2) - \mathbb{E}(X)^2
= \frac{(\alpha+1)\,\alpha}{\beta^2} - \frac{\alpha^2}{\beta^2}
= \frac{\alpha}{\beta^2}.
\]
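These two formulas are easy to check by simulation (a sketch, using only Python's standard library; note that `random.gammavariate` takes the *scale* $1/\beta$, while the exercise uses the rate $\beta$):

```python
import random

# Sanity check of E(X) = alpha/beta and Var(X) = alpha/beta^2 by simulation.
# NOTE: random.gammavariate(shape, scale) uses the SCALE parametrization,
# so Gamma(alpha, beta) in the rate convention is gammavariate(alpha, 1/beta).
alpha, beta = 3.0, 2.0
random.seed(0)
n = 200_000
samples = [random.gammavariate(alpha, 1.0 / beta) for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
print(mean, alpha / beta)       # both ≈ 1.5
print(var, alpha / beta ** 2)   # both ≈ 0.75
```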
2. Let $X \sim \Gamma(\alpha_1, \beta)$, $Y \sim \Gamma(\alpha_2, \beta)$, $\alpha_1, \alpha_2, \beta > 0$, independent. We want to prove that $X + Y \sim \Gamma(\alpha_1 + \alpha_2, \beta)$, i.e.
\[
f_{X+Y}(x) = \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1+\alpha_2)}\, x^{\alpha_1+\alpha_2-1} e^{-\beta x}\, \mathbf{1}_{\{x>0\}}.
\]
By independence the density of $X+Y$ is given by the convolution of the densities:
\[
f_{X+Y}(x) = \int_{\mathbb{R}} f_X(x-y)\, f_Y(y)\,dy
= \int_{\mathbb{R}} \frac{\beta^{\alpha_1}}{\Gamma(\alpha_1)} (x-y)^{\alpha_1-1} e^{-\beta(x-y)} \mathbf{1}_{\{x-y>0\}}\, \frac{\beta^{\alpha_2}}{\Gamma(\alpha_2)}\, y^{\alpha_2-1} e^{-\beta y} \mathbf{1}_{\{y>0\}}\,dy
\]
\[
= \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\,\Gamma(\alpha_2)}\, e^{-\beta x} \int_{\mathbb{R}} (x-y)^{\alpha_1-1}\, y^{\alpha_2-1}\, \mathbf{1}_{\{x-y>0\}} \mathbf{1}_{\{y>0\}}\,dy.
\]
Now:
• if $x \le 0$, then $\mathbf{1}_{\{x>y\}} \mathbf{1}_{\{y>0\}} = 0$, hence $f_{X+Y}(x) = 0$;
• if $x > 0$, then $\mathbf{1}_{\{x>y\}} \mathbf{1}_{\{y>0\}} = \mathbf{1}_{\{x>y>0\}} = \mathbf{1}_{(0,x)}(y)$, and therefore
\[
f_{X+Y}(x) = \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\,\Gamma(\alpha_2)}\, e^{-\beta x} \int_{\mathbb{R}} (x-y)^{\alpha_1-1}\, y^{\alpha_2-1}\, \mathbf{1}_{(0,x)}(y)\,dy
= \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\,\Gamma(\alpha_2)}\, e^{-\beta x} \int_0^x (x-y)^{\alpha_1-1}\, y^{\alpha_2-1}\,dy
\]
\[
= \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\,\Gamma(\alpha_2)}\, e^{-\beta x} \int_0^1 x^{\alpha_1-1}(1-t)^{\alpha_1-1}\, x^{\alpha_2-1} t^{\alpha_2-1}\, x\,dt
= \frac{\beta^{\alpha_1+\alpha_2}}{\Gamma(\alpha_1)\,\Gamma(\alpha_2)}\, e^{-\beta x}\, x^{\alpha_1+\alpha_2-1} \underbrace{\int_0^1 (1-t)^{\alpha_1-1}\, t^{\alpha_2-1}\,dt}_{K},
\]
with the change of variable $y = tx$.
Note that since $f_{X+Y}$ is a density it integrates to $1$, $\int_{\mathbb{R}} f_{X+Y}(x)\,dx = 1$, so necessarily
\[
K = \frac{\Gamma(\alpha_1)\,\Gamma(\alpha_2)}{\Gamma(\alpha_1+\alpha_2)}.
\]
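A Monte Carlo sanity check of the result (a sketch with Python's standard library; it only compares the first two moments of $X+Y$ against those of $\Gamma(\alpha_1+\alpha_2, \beta)$ from part 1, not the full distribution):

```python
import random

# Check that X + Y, with X ~ Gamma(a1, beta) and Y ~ Gamma(a2, beta)
# independent, has the mean and variance of Gamma(a1 + a2, beta).
# gammavariate takes (shape, scale), with scale = 1/beta in the rate convention.
a1, a2, beta = 1.5, 2.5, 3.0
random.seed(1)
n = 200_000
s = [random.gammavariate(a1, 1.0 / beta) + random.gammavariate(a2, 1.0 / beta)
     for _ in range(n)]
mean = sum(s) / n
var = sum((v - mean) ** 2 for v in s) / n
a = a1 + a2
print(mean, a / beta)       # both ≈ 1.333
print(var, a / beta ** 2)   # both ≈ 0.444
```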
Exercise 2
Let $\Lambda \sim \Gamma(\alpha, \beta)$, $\alpha, \beta > 0$, that is
\[
f_\Lambda(\lambda) = \frac{\beta^\alpha}{\Gamma(\alpha)}\, \lambda^{\alpha-1} e^{-\beta\lambda}\, \mathbf{1}_{\{\lambda>0\}},
\]
and $X \sim \mathrm{Poisson}(\Lambda)$, that is, a Poisson r.v. with random parameter (maybe it is clearer to say that $X \mid \Lambda = \lambda \sim \mathrm{Poisson}(\lambda)$). This means
\[
\mathbb{P}(X = k \mid \Lambda = \lambda) = e^{-\lambda}\, \frac{\lambda^k}{k!}, \qquad k = 0, 1, \dots
\]
We want to prove that $X \sim B^-\!\left(\alpha, \frac{\beta}{\beta+1}\right)$, a negative binomial distribution. Conditioning, we obtain for all $k \ge 0$:
\[
\mathbb{P}(X = k) = \mathbb{E}\big[\mathbf{1}_{\{X=k\}}\big]
\overset{\mathrm{TP}}{=} \mathbb{E}\big[\mathbb{E}\big[\mathbf{1}_{\{X=k\}} \mid \Lambda\big]\big]
= \mathbb{E}\big[\mathbb{P}(X = k \mid \Lambda)\big]
= \int_{\mathbb{R}} \mathbb{P}(X = k \mid \Lambda = \lambda)\, f_\Lambda(\lambda)\,d\lambda
\]
\[
= \int_{\mathbb{R}} e^{-\lambda}\, \frac{\lambda^k}{k!}\, \frac{\beta^\alpha}{\Gamma(\alpha)}\, \lambda^{\alpha-1} e^{-\beta\lambda}\, \mathbf{1}_{\{\lambda>0\}}\,d\lambda
= \frac{\beta^\alpha}{k!\,\Gamma(\alpha)} \int_0^{+\infty} e^{-(1+\beta)\lambda}\, \lambda^{\alpha+k-1}\,d\lambda
\]
(TP stands for the tower property).
\[
= \frac{\beta^\alpha}{k!\,\Gamma(\alpha)\,(\beta+1)^{\alpha+k}} \int_0^{+\infty} e^{-y}\, y^{\alpha+k-1}\,dy
= \underbrace{\frac{\Gamma(\alpha+k)}{k!\,\Gamma(\alpha)}}_{\frac{(\alpha+k-1)!}{k!\,(\alpha-1)!}} \left(\frac{\beta}{\beta+1}\right)^{\alpha} \left(\frac{1}{\beta+1}\right)^{k}
\]
\[
= \binom{\alpha+k-1}{k} \left(\frac{\beta}{\beta+1}\right)^{\alpha} \left(1 - \frac{\beta}{\beta+1}\right)^{k},
\]
with the change of variable $(1+\beta)\lambda = y$.
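The mixture identity can be checked by simulation (a sketch, standard library only; `poisson_sample` and `negbin_pmf` are our helpers, not part of the exercise, and the pmf is written with $\Gamma(\alpha+k)/(k!\,\Gamma(\alpha))$ so non-integer $\alpha$ works too):

```python
import math
import random

def poisson_sample(lam, rng):
    """Poisson sampler via Knuth's multiplication method (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def negbin_pmf(k, alpha, p):
    """P(X = k) for the Gamma(alpha)-mixed Poisson, with p = beta / (beta + 1)."""
    return (math.gamma(alpha + k) / (math.factorial(k) * math.gamma(alpha))
            * p**alpha * (1 - p)**k)

alpha, beta = 2.0, 1.5
rng = random.Random(0)
n = 200_000
counts = {}
for _ in range(n):
    lam = rng.gammavariate(alpha, 1.0 / beta)  # scale = 1/beta (rate convention)
    k = poisson_sample(lam, rng)
    counts[k] = counts.get(k, 0) + 1

p = beta / (beta + 1)
for k in range(4):
    # empirical frequency vs. closed-form negative binomial pmf
    print(k, counts.get(k, 0) / n, negbin_pmf(k, alpha, p))
```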
Exercise 4
Regarding the question asked in class we point out the following.
Given a real-valued r.v. X, we define the following generating functions:
• The moment generating function of $X$:
\[
\Psi_X(t) = \mathbb{E}\big[e^{tX}\big] = \int_{\mathbb{R}} e^{tx}\,dF_X(x),
\]
for $t \in M_X = \{t \in \mathbb{R} : \mathbb{E}\big[e^{tX}\big] < \infty\}$.
• The probability generating function of $X$:
\[
\varphi_X(t) = \mathbb{E}\big[t^X\big] = \int_{\mathbb{R}} t^x\,dF_X(x),
\]
for $t \in M^P_X = \{t \in \mathbb{R} : \mathbb{E}\big[t^X\big] < \infty\}$.
• The characteristic function of $X$:
\[
\chi_X(t) = \mathbb{E}\big[e^{itX}\big] = \int_{\mathbb{R}} e^{itx}\,dF_X(x),
\]
defined for all $t \in \mathbb{R}$.
Property 1
If $M_X^\circ \neq \emptyset$ (i.e. $M_X$ has nonempty interior), then the moment generating function of $X$, $\Psi_X(t)$, determines the distribution of $X$ uniquely.
The probability generating function of $X$, $\varphi_X(t)$, determines the distribution of $X$ uniquely only if $X$ is a non-negative integer-valued random variable.
The characteristic function of $X$, $\chi_X(t)$, always determines the distribution of $X$ uniquely.
Property 2
All these generating functions factor for a sum of independent random variables: let $S_n = \sum_{i=1}^n X_i$ be a sum of independent random variables; then
\[
\Psi_{S_n}(t) = \prod_{i=1}^n \Psi_{X_i}(t), \qquad \forall\, t \in \bigcap_{i=1}^n M_{X_i},
\]
\[
\varphi_{S_n}(t) = \prod_{i=1}^n \varphi_{X_i}(t), \qquad \forall\, t \in \bigcap_{i=1}^n M^P_{X_i},
\]
\[
\chi_{S_n}(t) = \prod_{i=1}^n \chi_{X_i}(t), \qquad \forall\, t \in \mathbb{R}.
\]
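Property 2 ties back to Exercise 1.2: the MGF of $\Gamma(\alpha,\beta)$ is $(\beta/(\beta-t))^\alpha$ for $t < \beta$ (a standard formula, not derived above), so the product of $n$ independent $\mathrm{Exp}(\beta)$ MGFs is the $\Gamma(n,\beta)$ MGF. A Monte Carlo sketch with Python's standard library:

```python
import math
import random

# For X ~ Gamma(alpha, beta) (rate beta), Psi_X(t) = (beta/(beta - t))**alpha
# for t < beta; Exp(beta) is the case alpha = 1. We estimate E[exp(t * S)]
# for S a sum of 3 i.i.d. Exp(beta) and compare with the product of MGFs.
beta, t, n_terms = 2.0, 0.5, 3
rng = random.Random(42)
n = 200_000
est = sum(math.exp(t * sum(rng.expovariate(beta) for _ in range(n_terms)))
          for _ in range(n)) / n
exact = (beta / (beta - t)) ** n_terms   # product of the three Exp(beta) MGFs
print(est, exact)  # both ≈ 2.37
```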