
Math 562
Homework 2
September 26, 2003
Dr. Ron Sahoo
I hear and I forget. I see and I remember. I do and I understand.—Confucius
Direction: This homework is worth 80 points and is due on October 24, 2003.
In order to receive full credit, answer each problem completely and show all
work. Graduate students should do all problems; undergraduate students are
required to do any 8 problems from Group A and any 8 problems from Group B.
GROUP A
1. Given θ, the random variable X has a binomial distribution with n = 2 and
probability of success θ. If the prior density of θ is
h(θ) = k if 1/2 < θ < 1, and 0 otherwise,
what is the Bayes’ estimate of θ under squared error loss if the sample consists
of x1 = 1 and x2 = 2?
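(Optional numerical check, not required for the solution: under squared error loss the Bayes estimate is the posterior mean, and an analytic answer can be sanity-checked along the following lines, assuming NumPy and SciPy are available; the grid size is arbitrary.)

    import numpy as np
    from scipy.stats import binom

    # Problem 1 setup: X ~ Binomial(2, theta) given theta, data x1 = 1 and x2 = 2,
    # and a flat prior h(theta) = k on (1/2, 1).
    theta = np.linspace(0.5, 1.0, 200_001)[1:-1]      # grid over the prior's support
    likelihood = binom.pmf(1, 2, theta) * binom.pmf(2, 2, theta)
    weights = likelihood / likelihood.sum()           # flat prior, so the constant k cancels
    posterior_mean = (theta * weights).sum()          # posterior mean = Bayes estimate under squared error loss
    print(posterior_mean)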
2. Suppose two observations were taken of a random variable X which yielded
the values 2 and 3. The density function for X is
f(x/θ) = 1/θ if 0 < x < θ, and 0 otherwise,
and the prior distribution of the parameter θ is
h(θ) = 3θ^(−4) if θ > 1, and 0 otherwise.
If the loss function is quadratic, then what is the Bayes’ estimate for θ?
3. Suppose one observation was taken of a random variable X which yielded
the value 2. The density function for X is
f(x/µ) = (1/√(2π)) e^(−(1/2)(x−µ)²), −∞ < x < ∞,
and the prior distribution of µ is
h(µ) = (1/√(2π)) e^(−(1/2)µ²), −∞ < µ < ∞.
If the loss function is quadratic, then what is the Bayes’ estimate for µ?
4. Let X1 , X2 , ..., X5 be a random sample of size 5 from a distribution with
probability density
f(x) = 1/θ if 2θ ≤ x ≤ 3θ, and 0 otherwise,
where θ > 0. What is the maximum likelihood estimator of θ?
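(Optional numerical check, not required for the solution: the sample values and the search grid below are made up for illustration, assuming NumPy is available. For this density the likelihood is (1/θ)^5 whenever every observation lies in [2θ, 3θ] and 0 otherwise, so a grid search shows where it peaks.)

    import numpy as np

    # Made-up sample of size 5, for illustration only.
    x = np.array([4.1, 4.6, 5.2, 5.5, 5.9])

    def likelihood(theta, data):
        # L(theta) = (1/theta)^n when 2*theta <= x_i <= 3*theta for every i, else 0
        inside = np.all((2 * theta <= data) & (data <= 3 * theta))
        return (1.0 / theta) ** len(data) if inside else 0.0

    grid = np.linspace(1.0, 3.0, 20_001)
    values = np.array([likelihood(t, x) for t in grid])
    print(grid[np.argmax(values)])   # numerical maximizer; compare with the analytic answer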
5. Suppose X and Y are independent random variables each with density
function
f(x) = 2xθ² for 0 < x < 1/θ, and 0 otherwise.
If k(X + 2Y) is an unbiased estimator of θ⁻¹, then what is the value of k?
6. Let X1 , X2 , ..., X5 be a random sample of size 5 from a distribution with
probability density
f(x) = 1 − θ² if 0 ≤ x ≤ 1/(1 − θ²), and 0 otherwise,
where θ > 0. What is the maximum likelihood estimator of θ?
7. What is the maximum likelihood estimate of β if the 5 values 5/4, 2/3, 1, 3/2,
and 4/5 were drawn from the population for which f(x; β) = (1/2)(1 + β) x^β?
8. Given θ, the random variable X has a binomial distribution with n = 3 and
probability of success θ. If the prior density of θ is
h(θ) = k if 1/2 < θ < 1, and 0 otherwise,
what is the Bayes’ estimate of θ for an absolute difference error loss if the sample
consists of one observation x = 1?
9. Eight independent trials are conducted of a given system with the following
results: S, F, S, F, S, S, S, S, where S denotes success and F denotes failure.
What is the maximum likelihood estimate of the probability of successful
operation p?
10. Suppose X1, X2, ... are independent random variables, each with probability
of success p and probability of failure 1 − p, where 0 ≤ p ≤ 1. Let N be the
number of observations needed to obtain the first success. What is the maximum
likelihood estimator of p in terms of N?
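(Optional numerical check, not required for the solution: the observed value of N below is made up, assuming NumPy is available. The likelihood of observing the first success on trial N is p(1 − p)^(N−1), and a grid search locates its maximizer.)

    import numpy as np

    N = 4                                          # made-up trial on which the first success occurred

    p_grid = np.linspace(0.001, 0.999, 999)        # grid of candidate values of p
    like = p_grid * (1.0 - p_grid) ** (N - 1)      # likelihood of N under each candidate p
    print(p_grid[np.argmax(like)])                 # compare with the analytic MLE in terms of N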
GROUP B
11. Let T1 and T2 be estimators of a population parameter θ based upon the
same random sample. If Ti ∼ N(θ, σi²) for i = 1, 2, and if T = bT1 + (1 − b)T2,
then for what value of b is T a minimum variance unbiased estimator of θ?
12. Let X1 , X2 , ..., Xn be a random sample from a distribution with density
function
f(x; θ) = (1/(2θ)) e^(−|x|/θ), −∞ < x < ∞,
where θ > 0 is a parameter. What is the expected value of the maximum
likelihood estimator of θ? Is this estimator unbiased?
13. A random sample X1, X2, ..., Xn of size n is selected from a normal distribution with variance σ². Let S² be the unbiased estimator of σ², and T be
the maximum likelihood estimator of σ². If 20T − 19S² = 0, then what is the
sample size?
14. Suppose X and Y are independent random variables each with density
function
f(x) = 2xθ² for 0 < x < 1/θ, and 0 otherwise.
If k(X + 2Y) is an unbiased estimator of θ⁻¹, then what is the value of k?
15. Let X1 , X2 , ..., Xn be a random sample from a population with probability
density function
f(x; θ) = 1/θ if 0 < x < θ, and 0 otherwise,
where θ > 0 is an unknown parameter. If X̄ denotes the sample mean, then
what should be the value of the constant k such that kX̄ is an unbiased estimator
of θ?
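(Optional numerical check, not required for the solution: θ, the sample size, and the candidate constant c below are made up, assuming NumPy is available. Simulating c·X̄ and comparing its average with θ is one way to test a candidate value of k.)

    import numpy as np

    rng = np.random.default_rng(0)

    theta, n, reps = 3.0, 10, 100_000                  # made-up parameter, sample size, replications
    c = 1.5                                            # candidate constant; try several values

    samples = rng.uniform(0.0, theta, size=(reps, n))  # X_i ~ Uniform(0, theta)
    estimates = c * samples.mean(axis=1)               # c times the sample mean
    print(estimates.mean(), "vs", theta)               # an unbiased choice should match theta on average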
16. Let X1 , X2 , ..., Xn be a random sample from a population with probability
density function
f(x; θ) = 1/θ if 0 < x < θ, and 0 otherwise,
where θ > 0 is an unknown parameter. If Xmed denotes the sample median,
then what should be the value of the constant k such that kXmed is an unbiased
estimator of θ?
17. Let X1 , X2 , ..., Xn be a random sample from a population X with density
function
f(x; θ) = θ/(1 + x)^(θ+1) for 0 ≤ x < ∞, and 0 otherwise,
where θ > 0 is an unknown parameter. What is a sufficient statistic for the
parameter θ?
18. Let X1 , X2 , ..., Xn be a random sample from a population X with density
function
f(x; θ) = (x/θ²) e^(−x²/(2θ²)) for 0 ≤ x < ∞, and 0 otherwise,
where θ is an unknown parameter. What is a sufficient statistic for the parameter θ?
19. Let X1 , X2 , ..., Xn be a random sample from a distribution with density
function
f(x; θ) = e^(−(x−θ)) for θ < x < ∞, and 0 otherwise,
where −∞ < θ < ∞ is a parameter. What is the maximum likelihood estimator
of θ? Find a sufficient statistic for the parameter θ.
20. Let X1 , X2 , ..., Xn be a random sample from a distribution with density
function
f(x; θ) = e^(−(x−θ)) for θ < x < ∞, and 0 otherwise,
where −∞ < θ < ∞ is a parameter. Are the estimators X(1) and X̄ − 1
unbiased estimators of θ? Which one is more efficient than the other?
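(Optional numerical check, not required for the solution: θ, n, and the replication count below are made up, assuming NumPy is available. Simulating both estimators and looking at their means and variances is one way to check conclusions about bias and relative efficiency.)

    import numpy as np

    rng = np.random.default_rng(1)

    theta, n, reps = 2.0, 10, 200_000            # made-up parameter, sample size, replications

    # f(x; theta) = e^(-(x - theta)) for x > theta, so X = theta + a standard exponential
    samples = theta + rng.exponential(1.0, size=(reps, n))

    t1 = samples.min(axis=1)                     # X_(1), the sample minimum
    t2 = samples.mean(axis=1) - 1.0              # sample mean minus 1

    for name, est in [("X_(1)", t1), ("Xbar - 1", t2)]:
        print(name, "mean:", round(est.mean(), 4), "variance:", round(est.var(), 4))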