CMPE 107 - Homework 9
Exercise 1
Given a random variable x that has an exponential pdf with parameter λ, find the 90%
confidence interval, i.e., the interval [0,T] such that P(0 < x ≤ T) = 0.9.
P(0 < x ≤ T) = 1 − e^(−λT) = 0.9, hence T = −ln(0.1)/λ
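A quick numerical sanity check of this result (a sketch, not part of the assigned solution; the value λ = 2 is an arbitrary choice):

```python
import numpy as np

# Monte Carlo check of Exercise 1: the interval [0, T] with T = -ln(0.1)/lam
# should contain about 90% of exponential samples. lam = 2.0 is arbitrary.
rng = np.random.default_rng(0)
lam = 2.0
T = -np.log(0.1) / lam                       # T from the solution above
samples = rng.exponential(scale=1.0 / lam, size=1_000_000)
coverage = np.mean(samples <= T)
print(round(coverage, 3))                    # close to 0.9
```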
Exercise 2
Given two random variables x and y, prove that ρx,y = E[x̃ỹ], where x̃ and ỹ are the
standardized versions of x and y.
ρx,y = E[(x − µx)(y − µy)] / (σxσy) = E[((x − µx)/σx)·((y − µy)/σy)] = E[x̃ỹ]
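The identity can be checked numerically; the sketch below (with an arbitrarily chosen correlated Gaussian pair, not part of the exercise) compares the sample correlation coefficient with E[x̃ỹ] computed from standardized samples:

```python
import numpy as np

# Check that the correlation coefficient equals E[x~ y~] for the
# standardized variables. The joint distribution is an arbitrary example.
rng = np.random.default_rng(1)
n = 500_000
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)   # correlated with x, true rho = 0.6

rho = np.corrcoef(x, y)[0, 1]            # sample correlation coefficient
x_std = (x - x.mean()) / x.std()         # standardized x
y_std = (y - y.mean()) / y.std()         # standardized y
print(round(rho, 3), round(np.mean(x_std * y_std), 3))   # the two agree
```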
Exercise 3
Given two random variables, x and y, and z=ax+by where a and b are constant. Suppose
that E[x]=5, E[y]=10, σx2=1, σy2=3, Cov[x,y]=2. Compute E[z] and Var[z].
E [ z] = E [ ax + by ] = aE [ x ] + bE [ y ] = 5a + 10b
Var[z] = E[(z − µz)²] = E[(a(x − µx) + b(y − µy))²] =
= a²σx² + b²σy² + 2ab·E[(x − µx)(y − µy)] = a²σx² + b²σy² + 2ab·Cov[x,y] = a² + 3b² + 4ab
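As an aside, the stated moments cannot all hold simultaneously (any joint distribution must satisfy |Cov[x,y]| ≤ σxσy = √3 < 2), so the sketch below verifies the general formula Var[z] = a²σx² + b²σy² + 2ab·Cov[x,y] with assumed values a = 2, b = 3 and Cov[x,y] = 1 instead:

```python
import numpy as np

# Monte Carlo check of the Var[z] formula. Cov[x,y] = 1 is an assumption
# (the stated Cov = 2 exceeds sigma_x * sigma_y = sqrt(3), so it is not
# realizable); a = 2 and b = 3 are arbitrary illustrative constants.
rng = np.random.default_rng(3)
a, b, c = 2.0, 3.0, 1.0
mean = [5.0, 10.0]
cov = [[1.0, c], [c, 3.0]]               # valid covariance matrix: det = 2 > 0
xy = rng.multivariate_normal(mean, cov, size=1_000_000)
z = a * xy[:, 0] + b * xy[:, 1]
print(round(z.mean(), 1))                # ~ 5a + 10b = 40
print(round(z.var(), 1))                 # ~ a^2*1 + b^2*3 + 2ab*c = 43
```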
Exercise 4
Consider two random variables, x and y, with joint pdf defined as follows:
fx,y(x,y) = 2y for 0 ≤ y ≤ 1, 0 ≤ x ≤ 1, and 0 otherwise
(see figure, where lighter color represents smaller values and darker color represents larger values.)
[figure: the unit square 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 in the (x,y) plane, shaded darker as y increases]
a. Prove that x and y are statistically independent.
b. What are the mean and variance of x and of y?
c. What are the covariance and the correlation of x and of y?
d. What is P(x > 0.5 and y > 0.5)?
e. What is P(x + y > 1)?
a. To prove that x and y are independent, we need to compute the marginal pdf's fx(x) and fy(y), and verify that fx,y(x,y) = fx(x)·fy(y).
fx(x) = ∫ fx,y(x,y) dy = ∫₀¹ 2y dy = 1 for 0 ≤ x ≤ 1, and 0 otherwise
fy(y) = ∫ fx,y(x,y) dx = ∫₀¹ 2y dx = 2y for 0 ≤ y ≤ 1, and 0 otherwise
from which one sees that fx,y(x,y) = fx(x)·fy(y).
b. The mean and variance of x are 0.5 and 1/12 respectively (note that x is uniform between 0 and 1). The mean of y is E[y] = ∫₀¹ 2y² dy = (2/3)·[y³]₀¹ = 2/3. To compute the variance of y, we first compute E[y²] = ∫₀¹ 2y³ dy = (1/2)·[y⁴]₀¹ = 1/2. The variance of y is thus σy² = E[y²] − E[y]² = 1/2 − 4/9 = 1/18.
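The moments of y can be spot-checked by simulation; since Fy(y) = y², inverse-transform sampling gives y = √U for U uniform on (0,1) (an implementation device, not part of the solution):

```python
import numpy as np

# Numeric check of E[y] = 2/3 and Var[y] = 1/18 for the pdf f_y(y) = 2y.
# Sampling uses the inverse CDF: F_y(y) = y^2, so y = sqrt(U).
rng = np.random.default_rng(4)
y = np.sqrt(rng.random(1_000_000))
print(round(y.mean(), 3))   # ~ 2/3
print(round(y.var(), 3))    # ~ 1/18 ~ 0.056
```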
c. Since x and y are independent, they are also uncorrelated, that is, their covariance is 0 and their correlation coefficient ρ is 0.
d. Since the variables are independent, P(x > 0.5, y > 0.5) = P(x > 0.5)·P(y > 0.5) =
= (∫ from 0.5 to 1 of fx(x) dx)·(∫ from 0.5 to 1 of fy(y) dy) = (1/2)·[y²] from 0.5 to 1 = (1/2)·(3/4) = 3/8
e. The equation x + y = 1 describes the line joining the points (0,1) and (1,0). The answer is thus the probability mass of the joint pdf in the half plane beyond this line. For each y in [0,1], the condition x > 1 − y leaves an interval of length y, so P(x + y > 1) = ∫₀¹ 2y·y dy = 2/3.
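Parts d and e can likewise be checked by simulation (a sketch; the sampler y = √U for the pdf 2y is an implementation device):

```python
import numpy as np

# Monte Carlo check of P(x > 0.5, y > 0.5) = 3/8 and P(x + y > 1) = 2/3,
# with x uniform on (0,1) and y with pdf 2y, sampled independently.
rng = np.random.default_rng(5)
n = 1_000_000
x = rng.random(n)
y = np.sqrt(rng.random(n))               # pdf 2y via inverse CDF
p_d = np.mean((x > 0.5) & (y > 0.5))     # part d, ~ 3/8
p_e = np.mean(x + y > 1)                 # part e, ~ 2/3
print(round(p_d, 3), round(p_e, 3))
```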
Exercise 5
The random variables x and y have joint pdf fx,y(x,y) = ab·e^(−ax−by)·U(x)U(y).
a. Prove that the variables x and y are statistically independent.
b. Compute E[xy].
c. Compute P((x,y) ∈ D), where D is the square shown in the figure (note: the color in this figure does not represent the values of the joint pdf as in the previous exercise!)
[figure: the unit square 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 in the (x,y) plane]
a. First, let's compute the marginal densities:
fy(y) = ∫ fx,y(x,y) dx = ∫ ab·e^(−ax−by)·U(x)U(y) dx = b·e^(−by)·U(y)·∫₀^∞ a·e^(−ax) dx = b·e^(−by)·U(y)·[−e^(−ax)]₀^∞ = b·e^(−by)·U(y)
Similarly, fx(x) = a·e^(−ax)·U(x). Thus, the two variables are exponential. The product of the marginals is fx(x)·fy(y) = ab·e^(−ax−by)·U(x)U(y) = fx,y(x,y), therefore the variables are independent.
b. Given that the two variables are independent, E[xy] = E[x]·E[y] = 1/(ab) (remember that the expectation of an exponential random variable with parameter c is 1/c).
c. Method 1:
P((x,y) ∈ D) = ∫₀¹ ∫₀¹ fx,y(x,y) dx dy = (∫₀¹ fx(x) dx)·(∫₀¹ fy(y) dy) = [−e^(−ax)]₀¹ · [−e^(−by)]₀¹ = (1 − e^(−a))·(1 − e^(−b))
Method 2: first compute the cdf's of x and y:
Fx(x) = (1 − e^(−ax))·U(x), Fy(y) = (1 − e^(−by))·U(y)
Fx,y(x,y) = Fx(x)·Fy(y) = (1 − e^(−ax))·(1 − e^(−by))·U(x)U(y)
Then, using the formula of Exercise 7:
P((x,y) ∈ D) = Fx,y(1,1) + Fx,y(0,0) − Fx,y(0,1) − Fx,y(1,0) = (1 − e^(−a))·(1 − e^(−b))
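Both the probability over D and E[xy] = 1/(ab) can be verified by simulation (a sketch; the parameter values a = 1.5 and b = 0.7 are arbitrary):

```python
import numpy as np

# Monte Carlo check of Exercise 5: P((x,y) in the unit square) and E[xy]
# for independent exponentials with rates a and b (arbitrary values).
rng = np.random.default_rng(6)
a, b = 1.5, 0.7
n = 1_000_000
x = rng.exponential(scale=1 / a, size=n)
y = rng.exponential(scale=1 / b, size=n)
est = np.mean((x <= 1) & (y <= 1))
exact = (1 - np.exp(-a)) * (1 - np.exp(-b))
print(round(est, 3), round(exact, 3))                    # Monte Carlo vs closed form
print(round(np.mean(x * y), 3), round(1 / (a * b), 3))   # E[xy] vs 1/(ab)
```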
Exercise 6
Consider two random variables x, y with joint pdf defined as follows:
fx,y(x,y) = a for |x| + |y| ≤ 1, and 0 elsewhere
a. Find the value of the constant a
b. Prove that E[xy] = 0 (i.e., x and y are uncorrelated)
c. Prove that x and y are not independent.
a. The area of the diamond-shaped support of the pdf is 2. Therefore, a = 0.5
b. E[xy] = ∫∫ xy·fx,y(x,y) dx dy = 0.5·∫∫ over |x|+|y|≤1 of xy dx dy = 0.5·∫₋₁¹ x·(∫ from |x|−1 to 1−|x| of y dy) dx = 0
The inner integral is null for every x, because its argument y is an odd function and the limits |x| − 1 and 1 − |x| are symmetric about 0. Therefore the variables are orthogonal (E[xy] = 0); since E[x] = 0 by symmetry of the support, Cov[x,y] = 0 and x and y are uncorrelated.
c. To prove that x and y are not independent, we compute the marginal pdf's fx(x) and fy(y) and check whether fx(x)·fy(y) = fx,y(x,y).
fx(x) = ∫ fx,y(x,y) dy = 0.5·∫ from |x|−1 to 1−|x| of dy = 1 − |x| for −1 ≤ x ≤ 1, and 0 otherwise
Similarly, fy(y) = 1 − |y| for −1 ≤ y ≤ 1. Then
fx(x)·fy(y) = (1 − |x|)(1 − |y|) ≠ fx,y(x,y)
hence x and y are not independent.
The product of the marginal pdfs is shown below. Can you prove it is a pdf in
itself (i.e., it satisfies the requirements for a pdf)?
[figure: the product fx(x)·fy(y) = (1 − |x|)(1 − |y|) over the square −1 ≤ x, y ≤ 1]
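Both claims of this exercise can be illustrated numerically; the sketch below samples the diamond uniformly by rejection from the enclosing square (an implementation device, not part of the solution):

```python
import numpy as np

# Uniform samples on |x| + |y| <= 1 by rejection sampling from [-1,1]^2.
rng = np.random.default_rng(7)
pts = rng.uniform(-1, 1, size=(4_000_000, 2))
pts = pts[np.abs(pts[:, 0]) + np.abs(pts[:, 1]) <= 1]
x, y = pts[:, 0], pts[:, 1]
exy = np.mean(x * y)                     # ~ 0: orthogonal / uncorrelated
# Dependence: the event {x > 0.5, y > 0.5} is impossible on the diamond,
# yet the product of the marginal probabilities is positive.
p_joint = np.mean((x > 0.5) & (y > 0.5))
p_prod = np.mean(x > 0.5) * np.mean(y > 0.5)
print(round(exy, 3), p_joint, round(p_prod, 4))
```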
Exercise 7
Let x,y be two random variables with joint cdf Fx,y(x,y). Show that
P(x1 < x ≤ x2, y1 < y ≤ y2) = Fx,y(x2,y2) − Fx,y(x1,y2) − Fx,y(x2,y1) + Fx,y(x1,y1).
We begin by noting that Fx,y(x2,y2) = P(x ≤ x2, y ≤ y2).
Now, {x ≤ x2} = {x ≤ x1} ∪ {x1 < x ≤ x2} and {y ≤ y2} = {y ≤ y1} ∪ {y1 < y ≤ y2} (assuming x2 > x1, y2 > y1). Using the distributive property of intersection with respect to union,
{x ≤ x2} ∩ {y ≤ y2} = ({x ≤ x1} ∩ {y ≤ y1}) ∪ ({x ≤ x1} ∩ {y1 < y ≤ y2}) ∪ ({x1 < x ≤ x2} ∩ {y ≤ y1}) ∪ ({x1 < x ≤ x2} ∩ {y1 < y ≤ y2})
We have seen in class that P({x ≤ x1} ∩ {y1 < y ≤ y2}) = Fx,y(x1,y2) − Fx,y(x1,y1).
It is easy to see that all events within brackets are disjoint. Therefore,
Fx,y(x2,y2) = Fx,y(x1,y1) + (Fx,y(x1,y2) − Fx,y(x1,y1)) + (Fx,y(x2,y1) − Fx,y(x1,y1)) + P({x1 < x ≤ x2} ∩ {y1 < y ≤ y2}) =
= Fx,y(x1,y2) + Fx,y(x2,y1) − Fx,y(x1,y1) + P({x1 < x ≤ x2} ∩ {y1 < y ≤ y2})
from which P(x1< x ≤ x2, y1< y ≤ y2) = Fx,y(x2,y2) – Fx,y(x1,y2) – Fx,y(x2,y1) + Fx,y(x1,y1).
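The rectangle formula can be sanity-checked numerically; the sketch below (an illustration with arbitrary parameters) uses a pair of independent exponentials, whose joint cdf factorizes in closed form:

```python
import numpy as np

# Check P(x1 < x <= x2, y1 < y <= y2) against the four-term cdf formula,
# using independent exponentials with arbitrary rates a and b.
rng = np.random.default_rng(8)
a, b = 1.0, 2.0
x = rng.exponential(scale=1 / a, size=1_000_000)
y = rng.exponential(scale=1 / b, size=1_000_000)

def F(u, v):
    # joint cdf of the independent pair: F_x(u) * F_y(v)
    return (1 - np.exp(-a * u)) * (1 - np.exp(-b * v))

x1, x2, y1, y2 = 0.2, 1.0, 0.1, 0.8               # arbitrary rectangle
exact = F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)
est = np.mean((x > x1) & (x <= x2) & (y > y1) & (y <= y2))
print(round(exact, 3), round(est, 3))
```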
Exercise 8
Let x, y be two independent discrete random variables with pmf's px(n) and py(n), and let z = x + y. Prove that the pmf pz(n) of z can be expressed as pz(n) = Σ_{k=−∞}^{+∞} px(n − k)·py(k).
pz(n) = P(z = n) = (total probability) = Σ_{k=−∞}^{+∞} P(z = n | y = k)·P(y = k) =
= Σ_k P(x + y = n | y = k)·P(y = k) = Σ_k P(x = n − k | y = k)·P(y = k) =
= (since x and y are independent) = Σ_k P(x = n − k)·P(y = k) = Σ_{k=−∞}^{+∞} px(n − k)·py(k)
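The result is the discrete convolution of the two pmfs; a minimal numeric check with two fair dice (an illustrative choice of pmfs, not part of the exercise):

```python
import numpy as np

# pmf of z = x + y for two independent fair dice: the convolution of the pmfs.
p_x = np.full(6, 1 / 6)        # pmf of a die on values 1..6
p_y = np.full(6, 1 / 6)
p_z = np.convolve(p_x, p_y)    # index 0 corresponds to z = 2, index 10 to z = 12
print(np.round(p_z, 4))
print(round(p_z[5], 4))        # P(z = 7) = 6/36 ~ 0.1667
```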