
Engineering Probability & Statistics
Sharif University of Technology
Hamid R. Rabiee & S. Abbas Hosseini
November 30, 2014
CE 181
Date Due: Azar 22nd , 1393
Homework 6
Problems
1. A univariate Gaussian (Normal) distribution with mean µ and variance σ 2 is defined as
N(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)
Suppose we have N i.i.d. random variables sampled from a Gaussian distribution N(x|µ, σ²).
Derive the joint distribution over this set of random variables. This joint distribution also
defines an N-dimensional random variable x whose dimensions are statistically independent.
Explain why this equivalence holds.
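The factorization behind this equivalence can be checked numerically: the product of N univariate Gaussian densities equals the single N-dimensional Gaussian density with covariance σ²I. A minimal Python sketch (the sample values and parameters are arbitrary illustrations):

```python
import math

def normal_pdf(x, mu, sigma2):
    """Univariate Gaussian density N(x | mu, sigma^2)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

mu, sigma2 = 1.5, 0.8
xs = [0.3, 1.1, 2.0, -0.4]  # N = 4 hypothetical i.i.d. samples

# Joint density as a product of the marginals (by independence).
product_of_marginals = math.prod(normal_pdf(x, mu, sigma2) for x in xs)

# Same number as one N-dimensional Gaussian with covariance sigma2 * I:
# determinant is sigma2^N, quadratic form is sum((x_i - mu)^2) / sigma2.
N = len(xs)
quad = sum((x - mu) ** 2 for x in xs) / sigma2
joint = math.exp(-0.5 * quad) / ((2 * math.pi) ** (N / 2) * sigma2 ** (N / 2))

assert abs(product_of_marginals - joint) < 1e-12
```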
2. The random variables X and Y are said to have a bivariate Gaussian distribution if their joint
density function is given by
p(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}} \exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \left(\frac{x-\mu_x}{\sigma_x}\right)^2 - \frac{2\rho(x-\mu_x)(y-\mu_y)}{\sigma_x\sigma_y} + \left(\frac{y-\mu_y}{\sigma_y}\right)^2 \right] \right\}
where the quantity ρ is called the correlation between X and Y and is defined as
\rho = \frac{\operatorname{cov}(X, Y)}{\sigma_x\sigma_y} = \frac{E[(X-\mu_x)(Y-\mu_y)]}{\sigma_x\sigma_y}
(a) Show that X is normally distributed with mean µ_x and variance σ_x², and Y is normally
distributed with mean µ_y and variance σ_y².
(b) Find the conditional density of X given that Y = y, and of Y given that X = x.
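The marginalization in part (a) can be sanity-checked numerically: integrating the bivariate density over y should recover the univariate N(µ_x, σ_x²) density at any fixed x. A sketch with arbitrary parameter values, using a simple Riemann sum:

```python
import math

def bivariate_pdf(x, y, mux, muy, sx, sy, rho):
    """Bivariate Gaussian density as defined in the problem."""
    z = (((x - mux) / sx) ** 2
         - 2 * rho * (x - mux) * (y - muy) / (sx * sy)
         + ((y - muy) / sy) ** 2)
    norm = 2 * math.pi * sx * sy * math.sqrt(1 - rho ** 2)
    return math.exp(-z / (2 * (1 - rho ** 2))) / norm

mux, muy, sx, sy, rho = 0.5, -1.0, 1.2, 0.7, 0.6
x = 0.9

# Marginalize over y on a wide grid (roughly +/- 11 standard deviations).
dy = 0.001
marginal = sum(bivariate_pdf(x, muy + k * dy, mux, muy, sx, sy, rho)
               for k in range(-8000, 8000)) * dy

expected = math.exp(-(x - mux) ** 2 / (2 * sx ** 2)) / math.sqrt(2 * math.pi * sx ** 2)
assert abs(marginal - expected) < 1e-5
```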
3. Let X ∼ N(0, 1). Show that P(X > x + ε | X > x) ≈ e^{−xε} for large x and small ε > 0.
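The claimed tail approximation can be checked numerically with the standard normal survival function, which the Python standard library exposes through the complementary error function. A minimal sketch (the specific x and ε are arbitrary):

```python
import math

def normal_sf(x):
    """P(X > x) for X ~ N(0, 1), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

x, eps = 10.0, 0.001
lhs = normal_sf(x + eps) / normal_sf(x)   # P(X > x + eps | X > x)
rhs = math.exp(-x * eps)
assert abs(lhs - rhs) < 1e-3
```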
4. The Exponential Family of distributions over x, given parameter vector η, is defined to be the
set of distributions of the form
p(x|η) = h(x)g(η) exp {η T u(x)}
Express each of the following distributions as a member of the exponential family, and derive
expressions for η, u(x), h(x), and g(η).
(a) Beta distribution
\text{Beta}(\mu \mid a, b) = \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \mu^{a-1} (1-\mu)^{b-1}
where Γ(a) is the gamma function.
(b) Gamma distribution
\text{Gam}(\lambda \mid a, b) = \frac{1}{\Gamma(a)} b^a \lambda^{a-1} \exp(-b\lambda)
(c) Multi-variate Gaussian distribution
N(\mathbf{x} \mid \boldsymbol{\mu}, \Sigma) = \frac{1}{(2\pi)^{N/2} |\Sigma|^{1/2}} \exp\left\{ -\frac{1}{2} (\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu}) \right\}
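As a concrete illustration of the general form p(x|η) = h(x)g(η) exp{ηᵀu(x)}, one standard decomposition of the Gamma density can be verified pointwise in Python. The choice of η, u, h, g below is one valid option, shown as a sanity check rather than the unique answer:

```python
import math

# Illustrative exponential-family decomposition of Gam(lambda | a, b):
#   eta = (a - 1, -b),  u(lam) = (ln lam, lam),  h(lam) = 1,
#   g(eta) = b^a / Gamma(a).
a, b, lam = 2.5, 1.3, 0.7  # arbitrary test values

gamma_pdf = b ** a * lam ** (a - 1) * math.exp(-b * lam) / math.gamma(a)

eta = (a - 1, -b)
u = (math.log(lam), lam)
g = b ** a / math.gamma(a)
exp_family = 1.0 * g * math.exp(eta[0] * u[0] + eta[1] * u[1])  # h(lam) = 1

assert abs(gamma_pdf - exp_family) < 1e-12
```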
5. (a) Let x_1, x_2, · · · , x_n be random variables with expected values x̄_1, x̄_2, · · · , x̄_n. Show that
E[x_1 + · · · + x_n] = x̄_1 + · · · + x̄_n. Assume that the random variables have a joint density
function, but do not assume that they are independent.
(b) Now assume that x_1, x_2, · · · , x_n are statistically independent and show that the expected
value of their product is equal to the product of their expected values.
(c) Again assuming that x_1, x_2, · · · , x_n are statistically independent, show that the variance
of their sum is equal to the sum of their variances.
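The distinction between parts (a) and (b) can be illustrated with a tiny discrete example: linearity of expectation holds even for dependent variables, while the product rule needs independence. A sketch with a hand-picked non-product joint pmf:

```python
# Joint pmf over (x1, x2), deliberately chosen so x1 and x2 are dependent.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

E_sum = sum(p * (x1 + x2) for (x1, x2), p in joint.items())
E_x1 = sum(p * x1 for (x1, x2), p in joint.items())
E_x2 = sum(p * x2 for (x1, x2), p in joint.items())

# Linearity holds regardless of dependence.
assert abs(E_sum - (E_x1 + E_x2)) < 1e-12

# The product rule fails here, because x1 and x2 are dependent.
E_prod = sum(p * x1 * x2 for (x1, x2), p in joint.items())
assert abs(E_prod - E_x1 * E_x2) > 0.1
```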
6. For a given probability distribution p(x|η), we can seek a prior distribution p(η) that is conjugate
to the likelihood function p(x|η), so that the posterior distribution p(η|x) has the same
functional form as the prior.
For any member of the Exponential Family of the form
p(x|η) = h(x)g(η) exp {η T u(x)}
there exists a conjugate prior that can be written in the form
p(η|χ, ν) = f (χ, ν)g(η)ν exp{νη T χ}
where f (χ, ν) is a normalization coefficient.
Show that this distribution is indeed a conjugate for the exponential family of distributions.
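The heart of the argument is that prior × likelihood has the same functional form in η with updated (χ, ν). This can be checked numerically for one member of the family; below is a sketch using the Bernoulli distribution, where h(x) = 1, u(x) = x, and g(η) = 1/(1 + e^η) (the parameter values are arbitrary):

```python
import math

def g(eta):
    """Bernoulli normalizer g(eta) = 1 / (1 + e^eta)."""
    return 1.0 / (1.0 + math.exp(eta))

def prior(eta, chi, nu):
    """Unnormalized conjugate prior: g(eta)^nu * exp(nu * eta * chi)."""
    return g(eta) ** nu * math.exp(nu * eta * chi)

def likelihood(x, eta):
    """h(x) g(eta) exp(eta * u(x)) with h(x) = 1, u(x) = x."""
    return g(eta) * math.exp(eta * x)

chi, nu, x = 0.3, 4.0, 1
# Posterior should match the prior form with nu' = nu + 1 and
# nu' * chi' = nu * chi + u(x).
nu2 = nu + 1
chi2 = (nu * chi + x) / nu2

ratios = [prior(e, chi, nu) * likelihood(x, e) / prior(e, chi2, nu2)
          for e in (-1.0, -0.2, 0.4, 1.3)]
# The ratio is constant in eta, confirming the same functional form.
assert max(ratios) - min(ratios) < 1e-12
```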
7. Prove the following identity.
E[X] = E[E[X \mid Y]]
Suppose that N is a counting random variable taking values in {0, 1, · · · , n}, and that, given N = k
for k ≥ 1, there are defined random variables X_1, · · · , X_k such that
E(X_j | N = k) = µ,   (1 ≤ j ≤ k).
Define a random variable S_N by
S_N = X_1 + X_2 + · · · + X_k   if N = k, 1 ≤ k ≤ n,
S_N = 0                        if N = 0.
Show that E(S_N) = µ E(N).
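The identity E(S_N) = µE(N) can be illustrated by Monte Carlo simulation of a random sum; the distributions chosen below (N uniform on {0, 1, 2, 3}, terms uniform with mean µ) are arbitrary illustrations:

```python
import random

random.seed(0)

mu = 2.0           # common conditional mean of the terms
trials = 200_000
total = 0.0
total_n = 0
for _ in range(trials):
    n = random.randrange(4)                # N uniform on {0, 1, 2, 3}
    total_n += n
    # Given N = n, sum n i.i.d. terms, each uniform on (mu - 1, mu + 1).
    total += sum(random.uniform(mu - 1, mu + 1) for _ in range(n))

E_SN = total / trials
E_N = total_n / trials
assert abs(E_SN - mu * E_N) < 0.05
```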
8. Let Y be a gamma random variable with parameters (a, b). That is, its density is
f_Y(y) = \frac{1}{\Gamma(a)} b^a y^{a-1} \exp(-by)
Suppose also that the conditional distribution of X given that Y = y is Poisson with mean y.
That is,
P\{X = i \mid Y = y\} = e^{-y} y^i / i!
Find the conditional distribution of Y given that X = i. Which distribution is it? What are its
parameters?
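Whatever closed form one derives can be tested numerically by applying Bayes' rule on a grid: normalize f_Y(y) P{X = i | Y = y} over y and compare it pointwise with the candidate density. In the sketch below the candidate Gamma parameters are an assumption to be verified, not given by the text:

```python
import math

def gamma_pdf(y, a, b):
    """Gamma density with parameters (a, b), as defined in the problem."""
    return b ** a * y ** (a - 1) * math.exp(-b * y) / math.gamma(a)

a, b, i = 3.0, 2.0, 4  # arbitrary test values

# Unnormalized posterior on a midpoint grid over (0, 20).
dy = 0.001
ys = [dy * (k + 0.5) for k in range(20000)]
unnorm = [gamma_pdf(y, a, b) * math.exp(-y) * y ** i / math.factorial(i)
          for y in ys]
Z = sum(unnorm) * dy
posterior = [u / Z for u in unnorm]

# Candidate closed form to test (an assumption, not stated by the problem):
candidate = [gamma_pdf(y, a + i, b + 1) for y in ys]
assert max(abs(p - c) for p, c in zip(posterior, candidate)) < 1e-4
```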