Joint Probability Distributions
Definition 1 The joint probability distribution function (pdf) of a couple of random
variables X and Y is defined by:

                      discrete                            continuous
pdf                   P(X = m, Y = n)                     f(x, y)
total probability     Σ_m Σ_n P(X = m, Y = n) = 1         ∫_R ∫_R f(x, y) dx dy = 1
P((X, Y) ∈ A)         Σ_{(m,n)∈A} P(X = m, Y = n)         ∫∫_A f(x, y) dx dy
Example 1 A coin is tossed 4 times. Let X be the number of heads in the first toss,
and Y be the number of heads in the 4 tosses. Find the joint pdf of (X, Y ).
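Since the sample space of Example 1 has only 16 equally likely outcomes, the joint pmf can be verified by brute-force enumeration. A minimal sketch in Python (the variable names are ours):

```python
from itertools import product
from fractions import Fraction

# Enumerate the 16 equally likely outcomes of 4 fair coin tosses (1 = head).
joint = {}  # joint[(m, n)] = P(X = m, Y = n)
for tosses in product([0, 1], repeat=4):
    m = tosses[0]        # X: number of heads in the first toss
    n = sum(tosses)      # Y: number of heads in the 4 tosses
    joint[(m, n)] = joint.get((m, n), Fraction(0)) + Fraction(1, 16)

total = sum(joint.values())      # should be 1, as Definition 1 requires
p_0_2 = joint[(0, 2)]            # first toss tails, 2 heads among the last 3: 3/16
```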
Example 2 Let (X, Y ) be a couple of random variables with pdf
f(x, y) = cxy ,  0 < x < y < 1
Find the value of the constant c, and find P (XY < 1/2).
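Both quantities in Example 2 can be checked symbolically. A sketch using sympy, reading the event as {XY < 1/2}; the inner integration bounds follow from the support 0 < x < y < 1:

```python
import sympy as sp

# Symbolic check for Example 2 (a sketch; sympy does the integration).
x, y, c = sp.symbols('x y c', positive=True)

# Total mass over the support 0 < x < y < 1 must equal 1.
total = sp.integrate(c * x * y, (x, 0, y), (y, 0, 1))   # c/8
c_val = sp.solve(sp.Eq(total, 1), c)[0]                 # c = 8

# P(XY >= 1/2) needs x >= 1/(2y) and x < y, which forces y > 1/sqrt(2).
p = 1 - sp.integrate(8 * x * y, (x, 1 / (2 * y), y), (y, 1 / sp.sqrt(2), 1))
```

With c = 8 this gives P(XY < 1/2) = 1/4 + ln(2)/2 ≈ 0.597.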
Definition 2 The marginal pdfs of X and Y are defined by
discrete case: P(X = m) = Σ_n P(X = m, Y = n) , and P(Y = n) = Σ_m P(X = m, Y = n)
continuous case: g(x) = ∫_R f(x, y) dy , and h(y) = ∫_R f(x, y) dx
Proposition 1 Two random variables X and Y are independent if and only if:
discrete case: P(X = m, Y = n) = P(X = m) × P(Y = n) for all m and n
continuous case: f(x, y) = g(x) × h(y) for all x and y
Example 3 Find the marginal pdfs in examples 1 and 2.
Example 4 (Trinomial distribution) An urn contains 3 red balls, 3 white balls, and 4
blue balls. 15 balls are selected at random and with replacement from the urn. Let X
be the number of red balls, and Y be the number of white balls among the 15 selected.
Find the joint pdf of (X, Y ), then find the marginal pdfs of X and Y .
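The joint pmf of Example 4 is trinomial with 15 draws and success probabilities 3/10, 3/10, 4/10. A quick numerical sketch (function name and defaults are ours) that also confirms the marginal of X is binomial:

```python
from math import comb

# Trinomial pmf for Example 4: 15 draws with replacement,
# P(red) = 3/10, P(white) = 3/10, P(blue) = 4/10 (function name is ours).
def trinomial_pmf(m, n, size=15, pr=0.3, pw=0.3):
    pb = 1 - pr - pw
    if m < 0 or n < 0 or m + n > size:
        return 0.0
    return comb(size, m) * comb(size - m, n) * pr**m * pw**n * pb**(size - m - n)

total = sum(trinomial_pmf(m, n) for m in range(16) for n in range(16))
# Marginal of X should be Binomial(15, 0.3):
marg_x = [sum(trinomial_pmf(m, n) for n in range(16)) for m in range(16)]
binom_x = [comb(15, m) * 0.3**m * 0.7**(15 - m) for m in range(16)]
```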
1
1
Transformation of Couples of Random Variables
Let (X, Y ) be a couple of random variables with joint pdf f (x, y), and let U = g(X, Y ),
and V = h(X, Y ).
Question: what is the joint pdf of the couple (U, V )?
Example 5 Let X and Y be a couple of random variables with joint pdf
f (x, y) = 2 , 0 < x < y < 1
Let U = X/Y and V = Y , find the joint pdf of the couple (U, V ). Are U and V
independent?
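A Monte Carlo sketch of Example 5 (sampling scheme, seed and sample size are our choices): under f(x, y) = 2 on 0 < x < y < 1 the marginal of Y is 2y, so Y can be sampled by inverting its cdf, and X given Y = y is uniform on (0, y).

```python
import random

# Monte Carlo sketch for Example 5 (seed and sample size are arbitrary).
random.seed(0)
N = 200_000
us, vs = [], []
for _ in range(N):
    y_s = random.random() ** 0.5   # cdf of Y is y², so Y = sqrt(uniform)
    x_s = random.uniform(0, y_s)   # X | Y = y is uniform on (0, y)
    us.append(x_s / y_s)           # U = X/Y
    vs.append(y_s)                 # V = Y

# If U and V are independent with joint pdf 2v on the unit square:
# E(U) = 1/2, E(V) = 2/3, and E(UV) = E(U) E(V).
mean_u = sum(us) / N
mean_v = sum(vs) / N
mean_uv = sum(u * v for u, v in zip(us, vs)) / N
```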
Proposition 2 If the system u = g(x, y), v = h(x, y) has the unique solution x = l(u, v),
y = m(u, v), then the joint pdf of (U, V ) is
k(u, v) = |J(u, v)| × f(l(u, v), m(u, v))
where J(u, v) is the Jacobian determinant of (x, y) with respect to (u, v).
Example 6 Let X and Y be independent χ²(2) random variables. Find the joint pdf of
U = X + Y and V = X − Y , then find the marginal pdf of V .
Solving the system of equations u = x + y, v = x − y yields
x = (u + v)/2 and y = (u − v)/2
the Jacobian is
J(u, v) = | 1/2    1/2 |
          | 1/2   −1/2 |  = −1/2
Since X and Y are independent χ²(2) random variables, f(x, y) = (1/4) e^{−(x+y)/2}, and hence
g(u, v) = (1/8) e^{−u/2} ,  0 ≤ u + v < ∞ , 0 ≤ u − v < ∞
(the support is the region of the (u, v)-plane between the lines v = u and v = −u)
h(u) = ∫_{−u}^{u} (1/8) e^{−u/2} dv = (1/4) u e^{−u/2} ,  0 < u < +∞
k(v) = { ∫_{−v}^{+∞} (1/8) e^{−u/2} du = (1/4) e^{v/2} ,   −∞ < v < 0
       { ∫_{v}^{+∞} (1/8) e^{−u/2} du = (1/4) e^{−v/2} ,   0 < v < +∞
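The moments implied by these marginals can be checked by simulation, using the fact that a χ²(2) distribution coincides with an exponential distribution of mean 2 (seed and sample size are arbitrary choices):

```python
import random

# Simulation check for Example 6.
random.seed(1)
N = 200_000
u_vals, v_vals = [], []
for _ in range(N):
    x = random.expovariate(0.5)   # rate 1/2, i.e. mean 2: a χ²(2) draw
    y = random.expovariate(0.5)
    u_vals.append(x + y)          # U with pdf (1/4) u e^{-u/2}, a χ²(4)
    v_vals.append(x - y)          # V with pdf (1/4) e^{-|v|/2}

mean_u = sum(u_vals) / N                 # E(U) = 4
var_v = sum(v * v for v in v_vals) / N   # E(V²) = Var(X) + Var(Y) = 8
```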
Question: Let (X, Y ) be a couple of random variables with joint pdf f(x, y). How do we
find the pdf of U = g(X, Y ) ?
Exercise 1 Let X and Y be a couple of random variables with pdf
f(x, y) = 1/(x²y²) ,  x ≥ 1 , y ≥ 1
Find the joint pdf of U = XY and V = X/Y . Find the marginal pdf of U and V .
Solving the system of equations u = xy, v = x/y yields
x = √(uv) and y = √(u/v)
the Jacobian is
J(u, v) = | √v/(2√u)      √u/(2√v)    |
          | 1/(2√(uv))   −√u/(2v√v)   |  = −1/(2v)
and hence
g(u, v) = 1/(2u²v) ,  uv ≥ 1 , u/v ≥ 1
(the support is the region of the (u, v)-plane bounded by the curves v = u and v = 1/u)
h(u) = ∫_{1/u}^{u} 1/(2u²v) dv = (1/(2u²)) [ln v]_{1/u}^{u} = (ln u)/u² ,  u ≥ 1
k(v) = { ∫_{1/v}^{+∞} 1/(2u²v) du = 1/2 ,       0 < v ≤ 1
       { ∫_{v}^{+∞} 1/(2u²v) du = 1/(2v²) ,    v ≥ 1
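A quick Monte Carlo sanity check of the marginal of V (seed and sample size are arbitrary); the sampling trick inverts the cdf F(x) = 1 − 1/x of the common marginal density 1/x², x ≥ 1:

```python
import random

# Monte Carlo check of k(v) from Exercise 1.
# X and Y are independent with density 1/x² on x >= 1 and cdf F(x) = 1 - 1/x,
# so X = 1/(1 - W) with W uniform on [0, 1).
random.seed(2)
N = 200_000
hits = 0
for _ in range(N):
    x = 1.0 / (1.0 - random.random())
    y = 1.0 / (1.0 - random.random())
    if x / y <= 1:      # event {V <= 1}
        hits += 1

# k(v) = 1/2 on 0 < v <= 1 gives P(V <= 1) = 1/2 (also clear by symmetry).
p_hat = hits / N
```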
Fisher distribution: Let X1 , X2 be two independent chi-square random variables
with r1 and r2 degrees of freedom, respectively. Let Y1 = (X1 /r1 )/(X2 /r2 ) and Y2 = X2 .
Find the joint pdf of Y1 and Y2 , then find the marginal pdf of Y1 .
f(x1, x2) = 1/(Γ(r1/2) Γ(r2/2) 2^{(r1+r2)/2}) · x1^{r1/2 − 1} x2^{r2/2 − 1} e^{−(x1+x2)/2} ,  0 < x1 < ∞ , 0 < x2 < ∞
Solving the system of equations yields x1 = (r1/r2) y1 y2 , and x2 = y2
the Jacobian is
J(y1, y2) = | (r1/r2) y2   (r1/r2) y1 |
            | 0            1          |  = (r1/r2) y2
and hence
g(y1, y2) = (r1/r2)^{r1/2} / (Γ(r1/2) Γ(r2/2) 2^{(r1+r2)/2}) · y1^{r1/2 − 1} y2^{(r1+r2)/2 − 1} e^{−((r1/r2) y1 + 1) y2 / 2} ,  0 < y1 < ∞ , 0 < y2 < ∞
is the pdf of the couple (Y1, Y2), and
h(y1) = (r1/r2)^{r1/2} / (Γ(r1/2) Γ(r2/2) 2^{(r1+r2)/2}) · y1^{r1/2 − 1} ∫_0^∞ y2^{(r1+r2)/2 − 1} e^{−((r1/r2) y1 + 1) y2 / 2} dy2
but ∫_0^∞ x^{α−1} e^{−x/θ} dx = Γ(α) θ^α (from the pdf of a G(α, θ) distribution)
by comparison, α = (r1+r2)/2 and θ = 2/((r1/r2) y1 + 1)
hence h(y1) = (r1/r2)^{r1/2} Γ((r1+r2)/2) / (Γ(r1/2) Γ(r2/2)) · y1^{r1/2 − 1} / ((r1/r2) y1 + 1)^{(r1+r2)/2} ,  0 < y1 < ∞
(the distribution of Y1 is called Fisher).
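The derivation can be spot-checked by simulation, using the fact that χ²(r) is a G(r/2, 2) distribution, available as `random.gammavariate(r/2, 2)`; the degrees of freedom below are our choice. For r2 > 2 the Fisher distribution has mean r2/(r2 − 2), here 10/8 = 1.25:

```python
import random

# Simulation spot-check of the Fisher distribution (our choice of r1, r2).
random.seed(3)
r1, r2 = 5, 10
N = 200_000
total = 0.0
for _ in range(N):
    x1 = random.gammavariate(r1 / 2, 2)   # χ²(r1)
    x2 = random.gammavariate(r2 / 2, 2)   # χ²(r2)
    total += (x1 / r1) / (x2 / r2)        # Y1

mean_y1 = total / N   # should be near r2/(r2 - 2) = 1.25
```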
The Beta distribution: Let X and Y be independent random variables with distributions
G(α, θ) and G(β, θ) respectively. Let U = X/(X + Y ) and V = X + Y . Find the
joint pdf of the couple (U, V ), then find the marginal pdfs of U and V .
f(x, y) = 1/(Γ(α)Γ(β) θ^{α+β}) · x^{α−1} y^{β−1} e^{−(x+y)/θ} ,  0 < x < ∞ , 0 < y < ∞
Solving the system of equations yields x = uv and y = v(1 − u)
the Jacobian is
J(u, v) = |  v     u     |
          | −v    1 − u  |  = v(1 − u) + uv = v
and hence
g(u, v) = 1/(Γ(α)Γ(β) θ^{α+β}) · u^{α−1} (1 − u)^{β−1} v^{α+β−1} e^{−v/θ} ,  0 < u < 1 , 0 < v < ∞
h(u) = ∫_0^∞ 1/(Γ(α)Γ(β) θ^{α+β}) · u^{α−1} (1 − u)^{β−1} v^{α+β−1} e^{−v/θ} dv
     = (u^{α−1} (1 − u)^{β−1} / (Γ(α)Γ(β))) · (1/θ^{α+β}) ∫_0^∞ v^{α+β−1} e^{−v/θ} dv
     = (Γ(α + β) / (Γ(α)Γ(β))) · u^{α−1} (1 − u)^{β−1} ,  0 < u < 1
U is called a Beta distribution with parameters α and β, denoted Beta(α, β)
k(v) = ∫_0^1 1/(Γ(α)Γ(β) θ^{α+β}) · u^{α−1} (1 − u)^{β−1} v^{α+β−1} e^{−v/θ} du
     = (v^{α+β−1} e^{−v/θ} / (Γ(α)Γ(β) θ^{α+β})) ∫_0^1 u^{α−1} (1 − u)^{β−1} du
     = (v^{α+β−1} e^{−v/θ} / (Γ(α)Γ(β) θ^{α+β})) × (Γ(α)Γ(β) / Γ(α + β))
     = v^{α+β−1} e^{−v/θ} / (Γ(α + β) θ^{α+β}) ,  0 < v < ∞
V is then a Gamma distribution with parameters α + β and θ.
Proposition 3 If X ∼ G(α, θ), and Y ∼ G(β, θ), and if X and Y are independent, then
a. X/(X + Y ) has a Beta distribution with parameters α and β.
b. Y /(X + Y ) has a Beta distribution with parameters β and α.
c. X + Y ∼ G(α + β, θ).
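A Monte Carlo sketch of Proposition 3, with α, β, θ and the seed chosen arbitrarily; parts a. and c. are checked through their means:

```python
import random

# Monte Carlo sketch of Proposition 3 with arbitrary α = 2, β = 3, θ = 1.5.
random.seed(4)
alpha, beta, theta = 2.0, 3.0, 1.5
N = 200_000
sum_u = sum_s = 0.0
for _ in range(N):
    x = random.gammavariate(alpha, theta)   # X ~ G(α, θ)
    y = random.gammavariate(beta, theta)    # Y ~ G(β, θ), independent of X
    sum_u += x / (x + y)   # Beta(α, β): mean α/(α+β) = 0.4
    sum_s += x + y         # G(α+β, θ): mean (α+β)θ = 7.5

mean_u = sum_u / N
mean_s = sum_s / N
```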
2 Several Independent Random Variables
Definition 3 If X1 , X2 , . . . , Xn are n independent random variables, then
P (X1 ∈ B1 ∩ X2 ∈ B2 ∩ . . . ∩ Xn ∈ Bn ) = P (X1 ∈ B1 ) × . . . × P (Xn ∈ Bn )
for all sets B1 , B2 , . . . , Bn .
Notation: If X1 , X2 , . . . , Xn are n independent random variables with the same pdf,
then X1 , X2 , . . . , Xn is called a random sample of size n.
Property 1 If X1 , X2 , . . . , Xn are n independent random variables, and if g1 , g2 , . . . , gn
are n functions, then
E(g1 (X1 ) × . . . × gn (Xn )) = E(g1 (X1 )) × . . . × E(gn (Xn ))
Proof:
E(g1 (X1 ) × . . . × gn (Xn )) = ∫_R . . . ∫_R g1 (x1 ) × . . . × gn (xn ) × f1 (x1 ) × . . . × fn (xn ) dx1 . . . dxn
= (∫_R g1 (x1 ) f1 (x1 ) dx1 ) × . . . × (∫_R gn (xn ) fn (xn ) dxn )
= E(g1 (X1 )) × . . . × E(gn (Xn ))
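For discrete random variables the same factorization can be verified exactly. A small sketch with two independent fair dice (our example, not the text's):

```python
from fractions import Fraction

# Exact check of Property 1 for two independent fair dice:
# E(X * Y²) should equal E(X) * E(Y²).
faces = range(1, 7)
p = Fraction(1, 6)

e_joint = sum(p * p * x * y**2 for x in faces for y in faces)
e_x = sum(p * x for x in faces)        # 7/2
e_y2 = sum(p * y**2 for y in faces)    # 91/6
```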
Example 7 Let X1 , X2 , X3 be a random sample of size 3 with exponential distribution
with parameter λ = 1. Find
a) P (0 < X1 < 1, 0 < X2 < 1/2, X3 > 1),
b) E(2X1² X3 )
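Both answers of Example 7 follow from independence: part a) factors into three exponential probabilities, and part b) is 2 E(X1²) E(X3) = 2 × 2 × 1 = 4. A sketch comparing these closed forms against a Monte Carlo estimate (seed and sample size arbitrary):

```python
import math
import random

# Example 7 with Exp(λ = 1): closed forms from independence, checked by simulation.
p_exact = (1 - math.exp(-1)) * (1 - math.exp(-0.5)) * math.exp(-1)  # part a)
e_exact = 4   # part b): 2 E(X1²) E(X3) = 2 × 2 × 1

random.seed(5)
N = 200_000
hits = 0
acc = 0.0
for _ in range(N):
    x1, x2, x3 = (random.expovariate(1) for _ in range(3))
    hits += (0 < x1 < 1) and (0 < x2 < 0.5) and (x3 > 1)
    acc += 2 * x1**2 * x3
p_hat, e_hat = hits / N, acc / N
```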
Proposition 4 (distribution of sum of independent random variables)
Let X1 , X2 , . . . , Xn be n independent random variables, and let Y = a1 X1 + . . . + an Xn
(ai are scalars), then
MY (t) = MX1 (a1 t) × . . . × MXn (an t)
Proof: MY (t) = E(e^{tY}) = E(e^{t(a1 X1 + ... + an Xn)}) = E(e^{t a1 X1} × . . . × e^{t an Xn}), hence by
property 1,
MY (t) = E(e^{t a1 X1}) × . . . × E(e^{t an Xn}) = MX1 (a1 t) × . . . × MXn (an t)
Proposition 5 Let X1 , X2 , . . . , Xn be a random sample of size n, and let Y = X1 +
. . . + Xn , then
MY (t) = (MX1 (t))^n
Proof: simple consequence of proposition 4, with all ai = 1 and all the MXi equal.
Property 2 If Xi ∼ G(αi ; θ) for i = 1, 2, . . . , n, and if the Xi ’s are independent, then
Y = X1 + . . . + Xn ∼ G(α1 + α2 + . . . + αn ; θ)
Proof: since all ai = 1,
MY (t) = MX1 (t) × . . . × MXn (t) = (1/(1 − θt))^{α1} × . . . × (1/(1 − θt))^{αn} = (1/(1 − θt))^{α1 + α2 + . . . + αn}
which is the mgf of a G(α1 + α2 + . . . + αn ; θ) distribution.
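A numerical sketch of Property 2 (shapes, scale, seed and sample size are our choices), checking the mean and variance of the sum against those of the claimed Gamma distribution:

```python
import random

# Numerical sketch of Property 2:
# G(1, 2) + G(2.5, 2) should be G(3.5, 2), with mean 3.5·2 = 7 and variance 3.5·2² = 14.
random.seed(6)
N = 200_000
total = total_sq = 0.0
for _ in range(N):
    y = random.gammavariate(1, 2) + random.gammavariate(2.5, 2)
    total += y
    total_sq += y * y

mean_y = total / N
var_y = total_sq / N - mean_y**2
```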
Exercise 2 Let X1 , X2 , . . . , Xn be a random sample of size n with pdf f (x). Find the
pdf of Y = max(X1 , X2 , . . . , Xn ).
Solution:
We first start by finding G, the cdf of Y :
G(y) = P (Y ≤ y) = P (max(X1 , X2 , . . . , Xn ) ≤ y) = P (X1 ≤ y ∩ . . . ∩ Xn ≤ y)
= P (X1 ≤ y) × . . . × P (Xn ≤ y) = (F (y))^n , where F is the cdf of the Xi
Now to find g(y), the pdf of Y :
- continuous case:
g(y) = G′(y) = ((F (y))^n )′ = n × (F (y))^{n−1} × f (y) , where f is the pdf of the Xi
- discrete case:
P (Y = y) = P (Y ≤ y) − P (Y ≤ y − 1) = (F (y))^n − (F (y − 1))^n
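The discrete-case formula can be verified exactly for the maximum of three fair dice (our example), by comparing it with brute-force enumeration:

```python
from fractions import Fraction
from itertools import product

# Exact check of the discrete-case formula for the max of n = 3 fair dice:
# P(Y = y) = (F(y))³ - (F(y-1))³ with F(y) = y/6.
n = 3
def cdf(t):
    return Fraction(max(0, min(t, 6)), 6)

dist = [cdf(y)**n - cdf(y - 1)**n for y in range(1, 7)]

# Brute-force enumeration over all 6³ outcomes must agree.
brute = [Fraction(sum(1 for t in product(range(1, 7), repeat=n) if max(t) == y),
                  6**n) for y in range(1, 7)]
```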
Exercise 3 Let X1 , X2 , . . . , Xn be a random sample of size n with pdf f (x). Find the
pdf of Z = min(X1 , X2 , . . . , Xn ).