Week 3: Review of Continuous Random Variables and Joint and Conditional Distributions
Continuous random variables
X is a continuous random variable if its distribution function F_X is such that

F_X(x) = \int_{-\infty}^{x} f_X(u)\,du

We say that f_X is the density. f_X satisfies

f_X(x) = \frac{dF_X(x)}{dx}, \quad \forall x \in \mathbb{R}
Properties of Density Functions
1. f_X(x) \ge 0 \ \forall x
2. \int_{-\infty}^{\infty} f_X(x)\,dx = 1
Expectations and Moments
Expectation
For continuous X with density f_X(x),

E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx
Exercise 3.1: Show that for a non-negative continuous random variable we have E[X] = \int_0^{\infty} P(X > x)\,dx.
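A quick numerical sanity check of this identity (an illustration, not part of the exercise): for X ~ Exp(1) both sides equal 1, since P(X > x) = e^{-x}. The test distribution is our choice.

```python
# Check E[X] = integral_0^infty P(X > x) dx for X ~ Exp(1):
# E[X] = 1 and the survival function is P(X > x) = exp(-x).
import numpy as np
from scipy import integrate

rhs, _ = integrate.quad(lambda x: np.exp(-x), 0, np.inf)
print(rhs)  # ~1.0 = E[X]
```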
For continuous X with density f_X(x),

E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx

provided the integral exists.
Exercise 3.2: Prove the above formula in the case g is strictly
increasing. Hint: find the distribution of Y = g(X) and use
integration by parts.
For continuous X with density f_X(x),

Var[X] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\,dx
As before, the mgf is given by

m(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx
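As a numerical illustration (the distribution and the value of t are our choice), the integral can be checked against the closed-form mgf of Exp(1), m(t) = 1/(1 - t) for t < 1:

```python
# Numerically evaluate m(t) = integral e^{tx} f_X(x) dx for X ~ Exp(1)
# and compare with the closed form 1/(1 - t), valid for t < 1.
import numpy as np
from scipy import integrate

t = 0.3
mgf, _ = integrate.quad(lambda x: np.exp(t * x) * np.exp(-x), 0, np.inf)
print(mgf, 1 / (1 - t))  # both ~1.42857
```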
Special Univariate Distributions
Uniform Distribution
f(x; a, b) = \begin{cases} \frac{1}{b-a} & a \le x \le b \\ 0 & \text{otherwise} \end{cases}

E[X] = \frac{a+b}{2}, \qquad Var[X] = \frac{(b-a)^2}{12}

[Figure: density f(x) of U(3, 8)]
Normal Distribution
f(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left( -\frac{1}{2\sigma^2} (x - \mu)^2 \right), \quad x \in \mathbb{R},\ \mu \in \mathbb{R},\ \sigma > 0

E[X] = \mu, \qquad Var[X] = \sigma^2

[Figure: densities f(x) of N(3, 2), N(6, 2), N(3, 3)]
Standard Normal Distribution
A standard normal RV X \sim N(0, 1) has distribution function

\Phi(x) = F(x; 0, 1)

For Z \sim N(\mu, \sigma^2),

P[a < Z < b] = \Phi\left( \frac{b - \mu}{\sigma} \right) - \Phi\left( \frac{a - \mu}{\sigma} \right)
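In practice \Phi is evaluated numerically; a minimal sketch using scipy's norm.cdf (the parameter values are our example, not from the slides):

```python
# P[a < Z < b] for Z ~ N(mu, sigma^2) via the standard normal CDF Phi.
from scipy.stats import norm

mu, sigma, a, b = 3.0, 2.0, 1.0, 5.0
p = norm.cdf((b - mu) / sigma) - norm.cdf((a - mu) / sigma)
print(p)  # ~0.6827 (the one-sigma band)
print(norm.cdf(b, mu, sigma) - norm.cdf(a, mu, sigma))  # same, computed directly
```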
Lognormal Distribution
If log(X) \sim N(\mu, \sigma^2), then

f(x; \mu, \sigma^2) = \frac{1}{(2\pi)^{1/2} \sigma x} \exp\left( -\frac{1}{2\sigma^2} (\log x - \mu)^2 \right), \quad x > 0,\ \mu \in \mathbb{R},\ \sigma > 0

[Figure: densities f(x) of LN(0, 1), LN(0, 2), LN(0, 8), LN(0, 1/2), LN(0, 1/8)]
Exercise 3.3
Let X \sim N(0, 1). Show that

\beta_2 = \frac{\mu_4}{\sigma^4} = 3

(i.e. the kurtosis of a standard normal is 3).
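A Monte Carlo check of the claimed value (a sketch, not a proof; the sample size is arbitrary):

```python
# Check mu_4 / sigma^4 ~ 3 for standard normal draws. Note that
# scipy.stats.kurtosis returns excess kurtosis unless fisher=False.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)
print(np.mean(z**4) / np.var(z)**2)  # ~3
print(kurtosis(z, fisher=False))     # ~3
```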
Exponential and Gamma Distribution
f(x; r, \lambda) = \begin{cases} \frac{\lambda}{\Gamma(r)} (\lambda x)^{r-1} e^{-\lambda x} & 0 \le x < \infty,\ r > 0,\ \lambda > 0 \\ 0 & \text{otherwise} \end{cases}

E[X] = \frac{r}{\lambda}, \qquad Var[X] = \frac{r}{\lambda^2}

[Figure: densities f(x) of Gamma(1, 1) = Exp(1), Gamma(2, 1), Gamma(3, 1)]
Beta Distribution
f(x; a, b) = \begin{cases} \frac{1}{B(a,b)} x^{a-1} (1 - x)^{b-1} & 0 \le x < 1,\ a > 0,\ b > 0 \\ 0 & \text{otherwise} \end{cases}

E[X] = \frac{a}{a+b}, \qquad Var[X] = \frac{ab}{(a+b+1)(a+b)^2}

[Figure: densities f(x) of Beta(5, 3), Beta(3, 3), Beta(2, 2), Beta(1, 1) = U(0, 1)]
Transformation Y = g(X)
Distribution of a Function of a Random Variable
Let X be a RV and Y = g(X) where g is injective. Then

f_Y(y) = f_X(g^{-1}(y)) \left| \frac{dg^{-1}(y)}{dy} \right|

given that (g^{-1}(y))' exists and either (g^{-1}(y))' > 0 \ \forall y or (g^{-1}(y))' < 0 \ \forall y.
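A simulation sketch of the formula for one concrete monotone g (our choice: X ~ Exp(1) and g(x) = \sqrt{x}, so g^{-1}(y) = y^2 and the formula predicts f_Y(y) = 2y e^{-y^2} for y > 0):

```python
# Compare a histogram of Y = sqrt(X), X ~ Exp(1), with the density
# predicted by the transformation formula: f_Y(y) = 2*y*exp(-y^2), y > 0.
import numpy as np

rng = np.random.default_rng(0)
y = np.sqrt(rng.exponential(size=1_000_000))

hist, edges = np.histogram(y, bins=50, range=(0, 3), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - 2 * mid * np.exp(-mid**2))))  # small (binning noise)
```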
Exercise 3.4
Let X be distributed exponentially with parameter \alpha, that is

f_X(x) = \begin{cases} \alpha e^{-\alpha x} & x \ge 0 \\ 0 & x < 0 \end{cases}

Find the density function of
1. Y = g(X) with g(x) = \begin{cases} 0 & x < 0 \\ 1 - e^{-\alpha x} & x \ge 0 \end{cases}
2. Y = X^{1/\beta}, \ \beta > 0
3. Y = g(X) with g(x) = \begin{cases} 0 & x < 0 \\ x & 0 \le x \le 1 \\ 1 & x > 1 \end{cases}
Probability Integral Transformation
If X is a RV with continuous F_X(x), then U = F_X(X) is uniformly distributed over the interval (0, 1).
Conversely, if U is uniform over (0, 1), then X = F_X^{-1}(U) has distribution function F_X.
Exercise 3.5
Suppose that you wish to generate two independent values from a distribution whose density function is

f(x) = x + \frac{1}{2} \quad \text{for } 0 < x < 1.

Show how such values can be obtained, given the following values generated by a uniform pseudo-random number generator over the range [0, 1]:

x_1 = 0.25, \quad x_2 = 0.46
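A sketch of the inverse-CDF approach for this density: here F(x) = (x^2 + x)/2 on (0, 1), and solving F(x) = u gives the positive root of x^2 + x - 2u = 0. The helper name f_inv is ours:

```python
# Inverse-CDF sampling for f(x) = x + 1/2 on (0, 1).
# F(x) = (x^2 + x)/2, so F^{-1}(u) = (-1 + sqrt(1 + 8u)) / 2.
import numpy as np

def f_inv(u):
    # positive root of x^2 + x - 2u = 0
    return (-1 + np.sqrt(1 + 8 * u)) / 2

for u in (0.25, 0.46):  # the given uniform draws
    print(u, f_inv(u))
```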
Exercise 3.6
Let X_i, i = 1, \ldots, n be independent Normal random variables with parameters \mu_i, \sigma_i^2.
Find the distribution of \sum_i X_i using the MGF technique.
Joint and Conditional Distributions
Joint Distributions
Joint Distribution Function
Joint Distribution Function: For random variables X_1, \ldots, X_k all defined on the same (\Omega, \mathcal{A}, P[\cdot]), the function F : \mathbb{R}^k \to [0, 1],

F_{X_1,\ldots,X_k}(x_1, \ldots, x_k) = P[X_1 \le x_1; \ldots; X_k \le x_k] \quad \forall (x_1, \ldots, x_k)

Marginal Distribution Function: For F_{X_1,\ldots,X_k} and X_{i_1}, \ldots, X_{i_n} a strict subset of X_1, \ldots, X_k, the function F_{X_{i_1},\ldots,X_{i_n}}. In other words, we set x_j = \infty for those j for which j \ne i_1, \ldots, i_n.

Example: If X, Y, Z have joint CDF F_{X,Y,Z}(x, y, z), then X, Z have marginal CDF F_{X,Z}(x, z) = F_{X,Y,Z}(x, \infty, z).
Joint Discrete Mass Function
Joint Discrete Mass Function: For a k-dimensional discrete random variable (X_1, \ldots, X_k), the function

p_{X_1,\ldots,X_k}(x_1, \ldots, x_k) = P[X_1 = x_1; \ldots; X_k = x_k] \quad \forall (x_1, \ldots, x_k)

Marginal Discrete Mass Function: For the joint discrete density function p_{X_1,\ldots,X_k} and X_{i_1}, \ldots, X_{i_n} a strict subset of X_1, \ldots, X_k, the function

p_{X_{i_1},\ldots,X_{i_n}} = \sum_{x_j : j \ne i_1, \ldots, i_n} p_{X_1,\ldots,X_k}(x_1, \ldots, x_k).

Example: If X, Y, Z have joint mass function p_{X,Y,Z}(x, y, z), then X, Z have marginal mass function p_{X,Z}(x, z) = \sum_{y_j} p_{X,Y,Z}(x, y_j, z).
Exercise 3.7
Consider the tossing of 3 different coins. Let
X1: number of heads on the first and second coin
X2: number of heads on the second and third coin

- What is the joint density function of X1 and X2? Present it in a table (a quick enumeration check follows below).
- Find
  1. F(0.4, 1.3)
  2. F(0, 0)
  3. F(1.4, 2.1)
  4. F(−1, 2)
  5. P(X1 = 1, X2 ≥ 1)
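The table can be cross-checked by brute-force enumeration, assuming fair coins (the exercise does not fix the head probability, so that assumption is ours):

```python
# Enumerate the 8 equally likely outcomes of three fair coins and
# tabulate X1 = heads on coins 1,2 and X2 = heads on coins 2,3.
from collections import Counter
from itertools import product

joint = Counter()
for c1, c2, c3 in product((0, 1), repeat=3):  # 1 = heads
    joint[(c1 + c2, c2 + c3)] += 1 / 8

for (x1, x2), p in sorted(joint.items()):
    print(f"P(X1={x1}, X2={x2}) = {p}")
```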
Joint Continuous Density Function
Joint Continuous Density Function: For a k-dimensional random variable (X_1, \ldots, X_k), the function f_{X_1,\ldots,X_k}(x_1, \ldots, x_k) \ge 0 such that

F_{X_1,\ldots,X_k}(x_1, \ldots, x_k) = \int_{-\infty}^{x_k} \cdots \int_{-\infty}^{x_1} f_{X_1,\ldots,X_k}(u_1, \ldots, u_k)\, du_1 \ldots du_k

for all (x_1, \ldots, x_k).

Marginal Density Function: For the joint continuous density function f_{X_1,\ldots,X_k} and X_{i_1}, \ldots, X_{i_n} a strict subset of X_1, \ldots, X_k, the function f_{X_{i_1},\ldots,X_{i_n}}.

Example: If X, Y, Z have joint density function f_{X,Y,Z}(x, y, z), then X, Z have marginal density function f_{X,Z}(x, z) = \int_{-\infty}^{\infty} f_{X,Y,Z}(x, y, z)\,dy.
Exercise 3.8
A random variable (X_1, X_2) has joint density

f(x_1, x_2) = \begin{cases} \frac{1}{8}(6 - x_1 - x_2) & 0 \le x_1 \le 2,\ 2 \le x_2 \le 4 \\ 0 & \text{otherwise} \end{cases}

- Show that f is a density function (a numerical check follows below)
- Find
  1. F(1, 3)
  2. F(0, 1)
  3. F(3, 5)
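A numerical check that f integrates to 1 over its support, as a sanity check for the first bullet (scipy's dblquad integrates over the first argument innermost):

```python
# Verify that (1/8)(6 - x1 - x2) integrates to 1 over
# 0 <= x1 <= 2, 2 <= x2 <= 4.
from scipy import integrate

f = lambda x2, x1: (6 - x1 - x2) / 8  # inner variable first for dblquad
total, _ = integrate.dblquad(f, 0, 2, 2, 4)  # x1 outer in [0,2], x2 inner in [2,4]
print(total)  # ~1.0
```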
Independence
X and Y are independent random variables if and only if

F_{X,Y}(x, y) = F_X(x) F_Y(y) \quad \forall x, y.

For discrete random variables the conditions
- F_{X,Y}(x, y) = F_X(x) F_Y(y) \ \forall x, y
- p_{X,Y}(x, y) = p_X(x) p_Y(y) \ \forall x, y
are equivalent.

For continuous random variables the conditions
- F_{X,Y}(x, y) = F_X(x) F_Y(y) \ \forall x, y
- f_{X,Y}(x, y) = f_X(x) f_Y(y) \ \forall x, y
are likewise equivalent.

Independence means multiply.
It can be shown that for independent random variables

E[XY] = E[X] E[Y]

Exercise 3.9: Prove this for non-negative discrete independent random variables.

Sums of Independent RVs: For Y = \sum_i X_i, where the X_i are independent RVs whose MGFs exist for -h < t < h, h > 0,

m_Y(t) = E[e^{t \sum_i X_i}] = \prod_i m_{X_i}(t) \quad \text{for } -h < t < h

Thus \prod_i m_{X_i}(t) may be used to identify the distribution of Y, as above.
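A Monte Carlo illustration of the product rule (our example: two independent Exp(1) variables, each with mgf 1/(1 - t) for t < 1, so the sum has mgf 1/(1 - t)^2, the Gamma(2, 1) mgf):

```python
# Check E[e^{t(X1 + X2)}] ~ m_{X1}(t) * m_{X2}(t) for independent Exp(1)s.
import numpy as np

rng = np.random.default_rng(0)
t = 0.3
x1 = rng.exponential(size=1_000_000)
x2 = rng.exponential(size=1_000_000)
print(np.mean(np.exp(t * (x1 + x2))))  # ~ 1/(1 - t)^2 ≈ 2.0408
print(1 / (1 - t) ** 2)
```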
Special Multivariate Distributions
Discrete
Multinomial Distribution
- Generalises the binomial distribution to trials with k + 1 distinct possible outcomes

f_{X_1,\ldots,X_k}(x_1, \ldots, x_k) = \frac{n!}{\prod_{i=1}^{k+1} x_i!} \prod_{i=1}^{k+1} p_i^{x_i}

where x_i = 0, \ldots, n and \sum_{i=1}^{k+1} x_i = n. (n is fixed, so the value of X_{k+1} is determined by the values of X_1, \ldots, X_k.)
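scipy exposes this pmf directly; a minimal sketch (the values n = 10 and p = (0.2, 0.3, 0.5) are our example):

```python
# Multinomial pmf with n = 10 trials and k + 1 = 3 outcomes; scipy is
# parameterised by the full probability vector (p_1, ..., p_{k+1}).
from scipy.stats import multinomial

print(multinomial.pmf([2, 3, 5], n=10, p=[0.2, 0.3, 0.5]))  # P(X1=2, X2=3, X3=5)
```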
Bivariate Normal Distribution

f(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left( \frac{x_1 - \mu_1}{\sigma_1} \right)^2 - 2\rho \left( \frac{x_1 - \mu_1}{\sigma_1} \right) \left( \frac{x_2 - \mu_2}{\sigma_2} \right) + \left( \frac{x_2 - \mu_2}{\sigma_2} \right)^2 \right] \right\}

for -\infty < x_1, x_2, \mu_1, \mu_2 < \infty, \ \sigma_1, \sigma_2 > 0, \ -1 < \rho < 1.

- \rho is the correlation coefficient
- for \rho = 0 the bivariate normal is the product of two univariate normals (a check appears below)
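A quick check of the factorisation claim at a test point (the parameter values are our example):

```python
# For rho = 0 (diagonal covariance) the bivariate normal pdf equals the
# product of the two univariate normal pdfs.
import numpy as np
from scipy.stats import multivariate_normal, norm

mu1, mu2, s1, s2 = 1.0, -2.0, 1.5, 0.5
x = np.array([0.7, -1.8])

joint = multivariate_normal(mean=[mu1, mu2], cov=np.diag([s1**2, s2**2])).pdf(x)
product = norm.pdf(x[0], mu1, s1) * norm.pdf(x[1], mu2, s2)
print(joint, product)  # equal up to floating-point error
```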
Multivariate Normal Distribution
For X \sim N(\mu, \Sigma),

f(x) = \frac{1}{(2\pi)^{r/2}} |\Sigma|^{-1/2} \exp\left( -\frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu) \right)

where

X = (X_1, \ldots, X_r)^T, \quad \mu = (\mu_1, \ldots, \mu_r)^T, \quad \Sigma = E\left[ (X - \mu)(X - \mu)^T \right]
Conditional Distributions and Densities
Conditional Discrete Distributions
Conditional Discrete Mass Function: For discrete RVs X and Y with probability mass points x_1, x_2, \ldots, x_n and y_1, y_2, \ldots, y_n,

p_{Y|X}(y_j | x_i) = \frac{P[X = x_i; Y = y_j]}{P[X = x_i]} = P[Y = y_j | X = x_i]

Conditional Discrete Distribution: For jointly discrete random variables X and Y,

F_{Y|X}(y | x) = P[Y \le y | X = x] = \sum_{j : y_j \le y} p_{Y|X}(y_j | x)
Exercise 3.10
Let Y1 and Y2 be two RVs with joint density

  Y1 \ Y2 |   0      1      2
  --------+---------------------
     0    |  q^3    pq^2    0
     1    |  pq^2   pq     p^2 q
     2    |   0     p^2 q   p^3

- Find the marginal densities of Y1 and Y2
- Find the conditional density function of Y2 given Y1
- Find
  1. E[Y1 − Y2]
  2. E[Y1 + Y2]
  3. E[Y1]

(A symbolic sanity check that the table is a valid joint density follows below.)
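A symbolic check that the table entries sum to 1 when q = 1 - p, using sympy:

```python
# The nine entries of the table above should sum to 1 with q = 1 - p.
import sympy as sp

p = sp.symbols('p')
q = 1 - p
entries = [q**3,   p*q**2, 0,
           p*q**2, p*q,    p**2*q,
           0,      p**2*q, p**3]
print(sp.simplify(sum(entries)))  # 1
```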
Conditional Continuous Distributions
Conditional Probability Density Function: For continuous RVs X and Y with joint probability density function f_{X,Y}(x, y),

f_{Y|X}(y | x) = \frac{f_{X,Y}(x, y)}{f_X(x)}, \quad \text{if } f_X(x) > 0

where f_X(x) is the marginal density of X.

Conditional Distribution: For jointly continuous random variables X and Y,

F_{Y|X}(y | x) = \int_{-\infty}^{y} f_{Y|X}(z | x)\,dz \quad \forall x : f_X(x) > 0
Conditional Expectation
Discrete:

E[Y | X = x] = \sum_{\text{all } y} y\, p_{Y|X}(y | x)

Continuous:

E[Y | X = x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y | x)\,dy
Exercise 3.11
A random variable (X_1, X_2) has joint density

f(x_1, x_2) = \begin{cases} \frac{1}{8}(6 - x_1 - x_2) & 0 \le x_1 \le 2,\ 2 \le x_2 \le 4 \\ 0 & \text{otherwise} \end{cases}

- Find f_{X_1|X_2} and f_{X_2|X_1}
- Determine F_{X_1|X_2} and F_{X_2|X_1}
- Find E[X_1 | X_2 = x_2] (a symbolic check follows below)
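For checking hand computations, the conditional expectation can be obtained symbolically; a sketch with sympy following the definitions above (marginalise, divide, integrate):

```python
# E[X1 | X2 = x2] for f(x1, x2) = (6 - x1 - x2)/8 on [0,2] x [2,4].
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = (6 - x1 - x2) / 8
f_x2 = sp.integrate(f, (x1, 0, 2))  # marginal density of X2
cond = f / f_x2                     # f_{X1|X2}(x1 | x2)
print(sp.simplify(sp.integrate(x1 * cond, (x1, 0, 2))))
```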
Independence of Random Variables
Stochastic Independence
Definition 1: Random variables X_1, X_2, \ldots, X_n are stochastically independent iff

F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = \prod_{i=1}^{n} F_{X_i}(x_i)

Definition 2: Discrete random variables X_1, X_2, \ldots, X_n are stochastically independent iff

p_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = \prod_{i=1}^{n} p_{X_i}(x_i)

Definition 3: Continuous random variables X_1, X_2, \ldots, X_n are stochastically independent iff

f_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i)
Exercise 3.12
- Show that for the bivariate normal distribution
  \rho = 0 \Rightarrow f(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2)
- Consider the two-dimensional exponential distribution with distribution function
  F(x_1, x_2) = 1 - e^{-x_1} - e^{-x_2} + e^{-x_1 - x_2 - \rho x_1 x_2}, \quad x_1, x_2 > 0
  Under what condition are X_1 and X_2 independent? What are the marginal distributions F_{X_1} and F_{X_2} under independence?
Covariance and Correlation
Covariance: For RVs X and Y defined on the same probability space,

Cov[X, Y] = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y

Correlation: For RVs X and Y defined on the same probability space,

\rho[X, Y] = \frac{Cov[X, Y]}{\sigma_X \sigma_Y}

provided that \sigma_X > 0 and \sigma_Y > 0.
Exercise 3.13
Let

f(x_1, x_2) = \begin{cases} x_1 + x_2 & 0 < x_1 < 1,\ 0 < x_2 < 1 \\ 0 & \text{otherwise} \end{cases}

- Show that X_1 and X_2 are dependent
- Find Cov[X_1, X_2] and \rho[X_1, X_2] (a numerical cross-check follows below)
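The covariance can be cross-checked numerically once computed by hand; a sketch using scipy's double integration (the helper E is ours):

```python
# Cov[X1, X2] = E[X1 X2] - E[X1] E[X2] for f(x1, x2) = x1 + x2 on (0,1)^2.
from scipy import integrate

f = lambda x2, x1: x1 + x2  # dblquad wants the inner variable first

def E(g):
    # E[g(X1, X2)] as a double integral of g * f over the unit square
    val, _ = integrate.dblquad(lambda x2, x1: g(x1, x2) * f(x2, x1), 0, 1, 0, 1)
    return val

print(E(lambda a, b: a * b) - E(lambda a, b: a) * E(lambda a, b: b))
```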
Cauchy-Schwarz Inequality
Let X and Y have finite second moments. Then
(E[XY])^2 = |E[XY]|^2 \le E[X^2] E[Y^2]
with equality if and only if P [Y = cX] = 1 for some constant c.
Exercise 3.14
Show that |ρ(X, Y )| ≤ 1, with equality if and only if Y is a linear
function of X with probability 1.
Sums of Random Variables
For RVs X_1, \ldots, X_n, Y_1, \ldots, Y_m and constants a_1, \ldots, a_n, b_1, \ldots, b_m:

E\left[ \sum_{i=1}^{n} X_i \right] = \sum_{i=1}^{n} E[X_i]

Var\left[ \sum_{i=1}^{n} X_i \right] = \sum_{i=1}^{n} Var[X_i] + 2 \sum_{i} \sum_{j < i} Cov[X_i, X_j]

Cov\left[ \sum_{i=1}^{n} a_i X_i, \sum_{j=1}^{m} b_j Y_j \right] = \sum_{i=1}^{n} \sum_{j=1}^{m} a_i b_j Cov[X_i, Y_j]
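A Monte Carlo check of the variance formula for a correlated pair (the covariance matrix is our example):

```python
# Var[X1 + X2] = Var[X1] + Var[X2] + 2 Cov[X1, X2] for a Gaussian pair.
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[2.0, 0.6],
                [0.6, 1.0]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)
print(np.var(x[:, 0] + x[:, 1]))  # ~ 2 + 1 + 2*0.6 = 4.2
```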
Exercise 3.15
Let X_1, \ldots, X_n be independent and identically distributed random variables with mean \mu and variance \sigma^2. Let

\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i.

Derive E[\bar{X}_n] and Var[\bar{X}_n].
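A simulation sketch of what the derivation should give (our test distribution is Exp(1), so \mu = \sigma^2 = 1; n and the number of replications are arbitrary):

```python
# Sample means of n = 50 Exp(1) draws: E[X_bar] ~ mu = 1,
# Var[X_bar] ~ sigma^2 / n = 1/50 = 0.02.
import numpy as np

rng = np.random.default_rng(0)
xbar = rng.exponential(size=(200_000, 50)).mean(axis=1)
print(xbar.mean(), xbar.var())  # ~1.0, ~0.02
```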