ENGG2430A-Homework 2
Due on Feb 19th, 2014
1. Independence vs correlation
(a) For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether
the random variables X and Y are independent, and whether they are uncorrelated.
i. (X, Y) is uniformly distributed over {(1, 1), (1, −1), (−1, −1), (−1, 1)}.
Solution: (X, Y) is uniformly distributed over {(1, 1), (1, −1), (−1, −1), (−1, 1)}. The joint pmf of X and Y is
\[
P_{XY}(x, y) = \frac{1}{4} \quad \text{for } (x, y) \in \{-1, 1\}^2.
\]
We can obtain the pmfs of X and Y by marginalization as follows.
\[
P_X(x) = \sum_{y \in \{-1, 1\}} P_{XY}(x, y) = \frac{1}{2} \quad \text{for all } x \in \{-1, 1\},
\]
\[
P_Y(y) = \sum_{x \in \{-1, 1\}} P_{XY}(x, y) = \frac{1}{2} \quad \text{for all } y \in \{-1, 1\}.
\]
It follows that $P_{XY}(x, y) = P_X(x) P_Y(y)$ for all $(x, y)$, and so X and Y are independent. Independence also implies
uncorrelatedness, as proved in the lecture.
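Remark: as an optional sanity check (not part of the graded solution), the short Python sketch below enumerates the four equally likely outcomes, recomputes the marginals, and confirms both the factorization $P_{XY} = P_X P_Y$ and the zero covariance; only the standard library is assumed.

from itertools import product
from fractions import Fraction

# Case (a)i: uniform over {(1,1), (1,-1), (-1,-1), (-1,1)}
support = [(1, 1), (1, -1), (-1, -1), (-1, 1)]
p = {xy: Fraction(1, 4) for xy in support}          # joint pmf

# Marginal pmfs by summing the joint pmf over the other coordinate
px = {x: sum(p.get((x, y), 0) for y in (-1, 1)) for x in (-1, 1)}
py = {y: sum(p.get((x, y), 0) for x in (-1, 1)) for y in (-1, 1)}

# Independence: the joint pmf factors into the product of the marginals
assert all(p.get((x, y), 0) == px[x] * py[y] for x, y in product((-1, 1), repeat=2))

# Uncorrelatedness: Cov(X, Y) = E[XY] - E[X]E[Y] = 0
exy = sum(x * y * q for (x, y), q in p.items())
ex = sum(x * q for (x, _), q in p.items())
ey = sum(y * q for (_, y), q in p.items())
assert exy - ex * ey == 0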
ii. (X, Y) is uniformly distributed over {(1, 0), (−1, 0), (0, 1), (0, −1)}.
Solution: (X, Y) is uniformly distributed over {(1, 0), (−1, 0), (0, 1), (0, −1)}. The marginal
pmfs of X and Y are
\[
P_X(\alpha) = P_Y(\alpha) =
\begin{cases}
\frac{1}{4} & \alpha \in \{-1, 1\} \\[2pt]
\frac{1}{2} & \alpha = 0.
\end{cases}
\]
X and Y are not independent because $P_Y(0) = \frac{1}{2}$ but $P_{Y|X}(0 \mid 1) = 1 \neq P_Y(0)$. However,
they are uncorrelated because
\[
E[XY] = \frac{1 \cdot 0 + (-1) \cdot 0 + 0 \cdot 1 + 0 \cdot (-1)}{4} = 0,
\]
\[
E[X] = \frac{1 + (-1) + 0 + 0}{4} = 0,
\]
\[
E[Y] = \frac{0 + 0 + 1 + (-1)}{4} = 0,
\]
and so Cov(X, Y) = E[XY] − E[X]E[Y] = 0.
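Remark: the analogous check for this case (again an optional sketch, not part of the graded solution, using only the Python standard library) confirms that the covariance vanishes even though the joint pmf does not factor into the product of the marginals.

from fractions import Fraction

# Case (a)ii: uniform over {(1,0), (-1,0), (0,1), (0,-1)}
support = [(1, 0), (-1, 0), (0, 1), (0, -1)]
p = {xy: Fraction(1, 4) for xy in support}

vals = (-1, 0, 1)
px = {x: sum(p.get((x, y), 0) for y in vals) for x in vals}
py = {y: sum(p.get((x, y), 0) for x in vals) for y in vals}

# Uncorrelated: E[XY] = 0 and E[X] = E[Y] = 0, so Cov(X, Y) = 0
exy = sum(x * y * q for (x, y), q in p.items())
ex = sum(x * q for (x, _), q in p.items())
ey = sum(y * q for (_, y), q in p.items())
assert exy - ex * ey == 0

# Not independent: P_XY(1, 0) = 1/4 but P_X(1) * P_Y(0) = 1/4 * 1/2 = 1/8
assert p[(1, 0)] != px[1] * py[0]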
(b) For each of the following cases, compute the marginal pdfs from the joint pdfs. Explain whether X
and Y are independent, and whether they are uncorrelated.
i. (X, Y) is uniformly distributed over the unit disk $\{(x, y) : x^2 + y^2 \le 1\}$.
Solution: (X, Y) is uniformly distributed over the unit disk $C := \{(x, y) : x^2 + y^2 \le 1\}$.
X and Y are not independent. The joint pdf is
\[
f_{XY}(x, y) = \frac{1}{\pi} \quad \text{for } (x, y) \in C.
\]
The marginal pdfs are
\[
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy = \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi}\, dy = \frac{2\sqrt{1-x^2}}{\pi} \quad \text{for } x \in [-1, 1],
\]
and similarly
\[
f_Y(y) = \frac{2\sqrt{1-y^2}}{\pi} \quad \text{for } y \in [-1, 1].
\]
The product $f_X(x) f_Y(y)$ is not constant over C (and it is strictly positive on the whole square $(-1, 1)^2$, including points outside C), so it cannot equal the joint pdf, which is constant over C and zero outside it. Hence X and Y are not independent.
However, X and Y are uncorrelated:
\[
E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{XY}(x, y)\, dx\, dy = \int_C \frac{xy}{\pi}\, dx\, dy = 0,
\]
which follows by symmetry (the disk is invariant under $x \mapsto -x$, which flips the sign of the integrand). Similarly, E[X] = E[Y] = 0, and so Cov(X, Y) = 0
as desired.
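Remark: a quick Monte Carlo sketch (optional, assuming NumPy is available) illustrates the same conclusion numerically: the estimated covariance is near zero, yet conditioning on |X| visibly changes the spread of Y, so the variables are not independent.

import numpy as np

rng = np.random.default_rng(0)

# Sample uniformly from the unit disk by rejection from the square [-1, 1]^2.
pts = rng.uniform(-1, 1, size=(2_000_000, 2))
pts = pts[(pts ** 2).sum(axis=1) <= 1]
x, y = pts[:, 0], pts[:, 1]

# Uncorrelated: E[XY] - E[X]E[Y] should be close to 0.
print("Cov(X, Y) approx:", np.mean(x * y) - np.mean(x) * np.mean(y))

# Not independent: the conditional spread of Y is larger when |X| is small
# than when |X| is large.
print("Std of Y given |X| < 0.1:", y[np.abs(x) < 0.1].std())
print("Std of Y given |X| > 0.9:", y[np.abs(x) > 0.9].std())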
ii. The joint pdf is $f_{X,Y}(x, y) = \frac{1}{2\pi} \exp\!\left(-\frac{x^2 + y^2}{2}\right)$ (jointly Gaussian).
Solution: Notice that the joint pdf can be factored as $f(x) f(y)$ with $f(\alpha) := \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{\alpha^2}{2}\right)$,
which is the standard Gaussian pdf. Therefore, X and Y are independent, with marginal distributions $f_X(\alpha) = f_Y(\alpha) = f(\alpha)$.
Alternatively, suppose we don't know that f is a valid pdf that integrates to 1. We can
still derive the same result by marginalization:
\[
f_X(x) = \int_{-\infty}^{\infty} \frac{1}{2\pi} \exp\!\left(-\frac{x^2 + y^2}{2}\right) dy = f(x) \underbrace{\int_{-\infty}^{\infty} f(y)\, dy}_{=: \, c} = c\, f(x),
\]
\[
f_Y(y) = f(y) \int_{-\infty}^{\infty} f(x)\, dx = c\, f(y),
\]
\[
f_X(x) f_Y(y) = c^2 f(x) f(y) = c^2 f_{X,Y}(x, y).
\]
Moreover, $c^2 = 1$ because
\[
1 = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = \left( \int_{-\infty}^{\infty} f(x)\, dx \right) \left( \int_{-\infty}^{\infty} f(y)\, dy \right) = c^2.
\]
Since $c > 0$, this gives $c = 1$, so $f_X = f_Y = f$ and $f_X(x) f_Y(y) = f_{X,Y}(x, y)$, i.e. X and Y are independent.
The integration for the first equality can be performed by changing the coordinate system
to polar coordinates, i.e.
\[
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \frac{1}{2\pi} \exp\!\left(-\frac{x^2 + y^2}{2}\right) dx\, dy
= \int_{0}^{\infty} \int_{0}^{2\pi} \frac{1}{2\pi} \exp\!\left(-\frac{r^2}{2}\right) r\, d\theta\, dr
= \int_{0}^{\infty} \exp\!\left(-\frac{r^2}{2}\right) r\, dr
= \int_{0}^{\infty} \exp\!\left(-\frac{r^2}{2}\right) d\!\left(\frac{r^2}{2}\right)
= 1.
\]
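Remark: the normalization can also be checked numerically (an optional sketch, assuming NumPy is available) by approximating the double integral with a simple Riemann sum over a large box.

import numpy as np

# Approximate the integral of (1/2pi) exp(-(x^2 + y^2)/2) over [-8, 8]^2;
# the tails outside the box are negligible, so the result should be very close to 1.
h = 0.01
grid = np.arange(-8, 8, h)
xx, yy = np.meshgrid(grid, grid)
f = np.exp(-(xx ** 2 + yy ** 2) / 2) / (2 * np.pi)
print("Integral approx:", f.sum() * h * h)   # ~ 1.0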
(c) Explain why uncorrelatedness does not imply independence.
Solution: From a) ii) and b) i), we see that uncorrelatedness does not imply independence.
Intuitively, the requirement for independence is more stringent because it involves many equations. For example, in the discrete case, independence requires
\[
P_{XY}(x, y) = P_X(x) P_Y(y)
\]
for all possible $x \in \mathcal{X}$ and $y \in \mathcal{Y}$. However, the requirement for uncorrelatedness consists of only
one equation, E[XY] = E[X]E[Y], and so does not cover the entire set of equations demanded
for independence.
2. Sequence of random variables
(a) Prove that
i. Var(X − E[X]) = Var(X)
Solution: Let Z = X − E[X]. It follows that E[Z] = E[X] − E[X] = 0 by the linearity of
expectation. Furthermore,
\[
\mathrm{Var}(Z) = E[(Z - E[Z])^2] = E[Z^2] = E[(X - E[X])^2] = \mathrm{Var}(X),
\]
which is the desired result.
ii. Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y)
Solution: Assume for simplicity that X and Y have zero mean, since the mean does not
affect the variance by i). By definition,
\[
\mathrm{Var}(X + Y) = E[(X + Y)^2] = E[X^2] + 2E[XY] + E[Y^2] = \mathrm{Var}(X) + 2\,\mathrm{Cov}(X, Y) + \mathrm{Var}(Y),
\]
where the second equality is again by the linearity of expectation.
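Remark: the identity can be checked on simulated data (an optional sketch, assuming NumPy is available); with sample moments computed consistently (ddof = 0), the two sides agree up to floating-point rounding.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)           # deliberately correlated with x

cov = np.mean(x * y) - np.mean(x) * np.mean(y)   # sample covariance (ddof = 0)
lhs = np.var(x + y)
rhs = np.var(x) + 2 * cov + np.var(y)
print(lhs, rhs)                                  # the two values agree (up to rounding)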
(b) Let $X_i$ for $i \in \{1, \ldots, n\}$ be continuous random variables that are identically distributed. Suppose
every pair of distinct random variables $X_i$ and $X_j$ has the same correlation. Compute
i. the correlation of (X2 − X1 ) and (X3 − X2 )
Solution: Let $\sigma^2$ be the variance of each $X_i$ and $\rho$ the correlation of $X_i$ and $X_j$ for
$i \neq j$. For simplicity, we can assume the variables have zero mean, since the value of the
mean does not affect the correlation, similar to a) i). Then,
\[
\rho = \frac{E[X_i X_j]}{\sigma^2}, \quad \text{for } i \neq j.
\]
\[
E[(X_2 - X_1)(X_3 - X_2)] = E[X_2 X_3] - E[X_2 X_2] - E[X_1 X_3] + E[X_1 X_2] = \rho\sigma^2 - \sigma^2 - \rho\sigma^2 + \rho\sigma^2 = (\rho - 1)\sigma^2.
\]
Similarly,
\[
E[(X_2 - X_1)^2] = E[X_2^2] - 2E[X_2 X_1] + E[X_1^2] = \sigma^2 - 2\rho\sigma^2 + \sigma^2 = 2(1 - \rho)\sigma^2.
\]
By symmetry, $E[(X_3 - X_2)^2] = 2(1 - \rho)\sigma^2$, and so the desired correlation is
\[
\frac{E[(X_2 - X_1)(X_3 - X_2)]}{\sqrt{E[(X_2 - X_1)^2]\, E[(X_3 - X_2)^2]}} = \frac{(\rho - 1)\sigma^2}{2(1 - \rho)\sigma^2} = -\frac{1}{2}.
\]
ii. the correlation of (X2 − X1) and (X4 − X3)
Solution: Similar to b) i), we assume the $X_i$'s have zero mean.
\[
E[(X_2 - X_1)(X_4 - X_3)] = E[X_2 X_4] - E[X_2 X_3] - E[X_1 X_4] + E[X_1 X_3] = \rho\sigma^2 - \rho\sigma^2 - \rho\sigma^2 + \rho\sigma^2 = 0.
\]
Thus, the desired correlation is 0.
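Remark: an optional simulation (assuming NumPy is available) illustrates both answers. One convenient way to generate identically distributed, equicorrelated variables, valid for $\rho \ge 0$, is $X_i = \sqrt{\rho}\, Z + \sqrt{1 - \rho}\, W_i$ with a shared standard normal Z and independent standard normals $W_i$; this construction is used here only for the illustration.

import numpy as np

rho, n_samples = 0.3, 500_000
rng = np.random.default_rng(2)
z = rng.normal(size=n_samples)
w = rng.normal(size=(4, n_samples))
x = np.sqrt(rho) * z + np.sqrt(1 - rho) * w     # rows are X_1, ..., X_4

def corr(a, b):
    # Sample correlation coefficient of two equal-length arrays.
    return np.corrcoef(a, b)[0, 1]

print(corr(x[1] - x[0], x[2] - x[1]))   # close to -1/2
print(corr(x[1] - x[0], x[3] - x[2]))   # close to 0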
(c) Explain why the answers to b) i) and ii) are the same/different.
Solution: The answers to b) i) and ii) differ because, in b) i), the differences (X2 − X1) and
(X3 − X2) share the same random variable X2, whereas the two differences in b) ii) involve no common
variable. The correlation in b) i) is negative because, when X2 is large, the first difference (X2 − X1) tends to be large
while the second difference (X3 − X2) tends to be small.