Practice Test for Test II
1. Let $X_1, \ldots, X_n \sim \text{Uniform}(-\theta, \theta)$ where $\theta > 0$.
(a) Find the maximum likelihood estimator $\hat\theta_n$.
(b) Find a minimal sufficient statistic.
(c) Show that $\hat\theta_n \xrightarrow{P} \theta$.
(d) Find the limiting distribution of $n(\theta - \hat\theta_n)$.
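A quick numerical sketch (assuming the answer to (a) is $\hat\theta_n = \max_i |X_i|$, which is for you to verify) illustrating parts (c) and (d):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0

# Check that max|X_i| concentrates at theta and that n*(theta - theta_hat)
# settles down to a nondegenerate limit (its mean stabilizes near theta).
for n in [10, 100, 1000, 10000]:
    x = rng.uniform(-theta, theta, size=(5000, n))
    theta_hat = np.abs(x).max(axis=1)                 # candidate MLE
    print(n, theta_hat.mean(), (n * (theta - theta_hat)).mean())
```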
2. Let X ∼ Bernoulli(θ) be a single coin flip.
Suppose that θ ∈ Θ = {1/3, 2/3}. Hence θ can only take 2 possible values.
(a) Find the maximum likelihood estimator.
(b) Let the loss function be
$$L(\theta, \hat\theta) = \begin{cases} 1 & \text{if } \theta \neq \hat\theta \\ 0 & \text{if } \theta = \hat\theta. \end{cases}$$
Find the risk function of the maximum likelihood estimator. Since $\theta$ only takes the values $1/3$ and $2/3$, you only need to find $R(1/3, \hat\theta)$ and $R(2/3, \hat\theta)$.
(c) Show that the maximum likelihood estimator is minimax.
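A Monte Carlo sketch of the risk (assuming the MLE from (a) picks $2/3$ when $X = 1$ and $1/3$ when $X = 0$; verify this before relying on it):

```python
import numpy as np

rng = np.random.default_rng(1)

# Under 0-1 loss, the risk is R(theta, theta_hat) = P_theta(theta_hat != theta).
for theta in (1/3, 2/3):
    x = rng.binomial(1, theta, size=100_000)
    theta_hat = np.where(x == 1, 2/3, 1/3)        # candidate MLE over {1/3, 2/3}
    print(theta, np.mean(theta_hat != theta))     # both risks come out near 1/3
```

That the two risks are equal is the key fact for the minimax argument in (c).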
3. Let $X_1, \ldots, X_n$ be iid from a $\text{Binomial}(k, \theta)$ distribution.
(a) Find a minimal sufficient statistic S for θ.
(b) For each of the following, say whether it is sufficient, minimal sufficient, or not sufficient:
$$T = X_1, \qquad T = \sum_i X_i, \qquad T = \Big(X_1, \sum_i X_i\Big), \qquad T = \Big(X_1, \sum_{i=2}^n X_i\Big).$$
(c) Let $\tau(\theta) = P(X = 1) = k\theta(1 - \theta)^{k-1}$. Define $U = 1$ if $X_1 = 1$ and $U = 0$ otherwise. Show that $U$ is an unbiased estimator of $\tau$.
(d) Find the mle $\hat\tau$ of $\tau$.
(e) Show that $\hat\tau \xrightarrow{P} \tau$.
(f) Find the limiting distribution of $\hat\tau$.
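A simulation sketch (assuming $\hat\theta = \bar X_n/k$ and the plug-in $\hat\tau = k\hat\theta(1 - \hat\theta)^{k-1}$, which parts (c) and (d) ask you to justify):

```python
import numpy as np

rng = np.random.default_rng(2)
k, theta, n = 5, 0.3, 200
tau = k * theta * (1 - theta) ** (k - 1)          # P(X = 1)

x = rng.binomial(k, theta, size=(20_000, n))
U = (x[:, 0] == 1).astype(float)                  # unbiased indicator from (c)
theta_hat = x.mean(axis=1) / k                    # MLE of theta
tau_hat = k * theta_hat * (1 - theta_hat) ** (k - 1)
print(tau, U.mean(), tau_hat.mean())              # both averages sit near tau
```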
4. Construct an example where $X_n$ converges in distribution to $X$ and $Y_n$ converges in distribution to $Y$, but $X_n + Y_n$ does not converge in distribution to $X + Y$.
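One standard construction, sketched numerically (a hint, not the only valid answer): take $X_n = Z$ and $Y_n = -Z$ with $Z \sim N(0,1)$, and let $X$ and $Y$ be independent standard normals.

```python
import numpy as np

rng = np.random.default_rng(3)
z = rng.standard_normal(100_000)

x_n, y_n = z, -z                      # each is exactly N(0,1) for every n
print(np.std(x_n), np.std(y_n))       # both near 1
print(np.std(x_n + y_n))              # identically 0, not sqrt(2) as X + Y would give
```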
5. Let $X_i \sim N(\theta_i, 1)$ for $i = 1, \ldots, n$. Let $\gamma = \sum_{i=1}^n \theta_i^2$. Find the mle $\hat\gamma$. Let $L(\gamma, \hat\gamma) = (\gamma - \hat\gamma)^2$. Find the risk of the mle.
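A simulation sketch of the risk at one parameter value (assuming the MLE is the plug-in $\hat\gamma = \sum_i X_i^2$; deriving this, and a closed form for the risk, is the exercise):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
theta = rng.uniform(-1, 1, size=n)        # an arbitrary fixed parameter vector
gamma = np.sum(theta ** 2)

x = theta + rng.standard_normal((100_000, n))
gamma_hat = np.sum(x ** 2, axis=1)        # plug-in MLE
print(np.mean((gamma_hat - gamma) ** 2))  # empirical risk at this theta
```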
6. Let $X_1, \ldots, X_n \sim \text{Uniform}(0, 2)$. Let
$$W_n = \frac{1}{n}\sum_{i=1}^n X_i^2.$$
(a) Show that there is a number $\mu$ such that $W_n$ converges in quadratic mean to $\mu$.
(b) Show that $W_n$ converges in probability to $\mu$.
(c) What is the limiting distribution of $\sqrt{n}\,(W_n^2 - \mu^2)$?
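A numerical sketch (assuming $\mu = E[X_i^2] = 4/3$ for Uniform(0, 2), which is the number part (a) asks you to identify):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, n = 4 / 3, 2000                        # E[X^2] for X ~ Uniform(0, 2)

x = rng.uniform(0, 2, size=(50_000, n))
w = np.mean(x ** 2, axis=1)
t = np.sqrt(n) * (w ** 2 - mu ** 2)        # the quantity in part (c)
print(np.mean((w - mu) ** 2))              # quadratic-mean error shrinks to 0
print(t.mean(), t.var())                   # roughly Gaussian; compare t.var()
                                           # with the delta-method variance
```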
7. Let X1 , . . . , Xn ∼ Bernoulli(p).
(a) Let T = X3 . Show that T is not sufficient.
(b) Show that $U = \sum_{i=1}^n X_i^2$ is minimal sufficient.
8. Let
$$X_1, \ldots, X_n \sim \tfrac{1}{2}N(0, 1) + \tfrac{1}{2}N(\theta, 1).$$
In other words, with probability 1/2, $X_i$ is drawn from $N(0, 1)$, and with probability 1/2, $X_i$ is drawn from $N(\theta, 1)$.
(a) Find the method of moments estimator $\hat\theta$ of $\theta$.
(b) Find the mean squared error of $\hat\theta$.
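A simulation sketch (assuming the method-of-moments answer is $\hat\theta = 2\bar X_n$, from $E[X_i] = \theta/2$; confirm this in (a)):

```python
import numpy as np

rng = np.random.default_rng(6)
theta, n = 1.5, 100

# Draw from the equal-weight mixture, then form theta_hat = 2 * sample mean.
comp = rng.integers(0, 2, size=(50_000, n))           # 0 -> N(0,1), 1 -> N(theta,1)
x = rng.standard_normal((50_000, n)) + theta * comp
theta_hat = 2 * x.mean(axis=1)
print(theta_hat.mean())                               # bias is essentially 0
print(np.mean((theta_hat - theta) ** 2))              # compare with your MSE formula
```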
9. Let $X_1, \ldots, X_n \sim N(\theta, 1)$. Let $\tau = e^\theta + 1$.
(b) Consider some loss function $L(\tau, \hat\tau)$. Define what it means for an estimator to be a minimax estimator for $\tau$.
(c) Let $\pi$ be a prior for $\theta$. Find the Bayes estimator for $\tau$ under the loss $L(\tau, \hat\tau) = (\hat\tau - \tau)^2/\tau$.
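A numerical sketch for one concrete case (assuming, for illustration only, the prior $\pi = N(0,1)$, and using the fact that the minimizer of the posterior expected loss $(\hat\tau - \tau)^2/\tau$ is $\hat\tau = 1/E[1/\tau \mid \text{data}]$, which is what (c) asks you to derive):

```python
import numpy as np

rng = np.random.default_rng(7)
theta_true, n = 0.5, 30
x = rng.standard_normal(n) + theta_true

# With a N(0,1) prior, the posterior of theta is N(n*xbar/(n+1), 1/(n+1)).
post_mean, post_var = n * x.mean() / (n + 1), 1 / (n + 1)
theta_draws = post_mean + np.sqrt(post_var) * rng.standard_normal(200_000)
tau_draws = np.exp(theta_draws) + 1

tau_bayes = 1 / np.mean(1 / tau_draws)   # Bayes rule under (tau_hat - tau)^2 / tau
print(tau_bayes, np.exp(theta_true) + 1)
```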
10. Let X1 , . . . , Xn ∼ Normal(θ, 1). Suppose that θ ∈ {−1, 1}. In other words, θ can only
take two possible values.
(a) Find a minimal sufficient statistic.
(b) Find the maximum likelihood estimator. Is the maximum likelihood estimator a
sufficient statistic?
(c) Find the risk function using the loss function
$$L(\theta, \hat\theta) = \begin{cases} 1 & \text{if } \theta \neq \hat\theta \\ 0 & \text{if } \theta = \hat\theta. \end{cases}$$
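A Monte Carlo sketch of the risk (assuming the MLE from (b) is $\hat\theta = 1$ if $\bar X_n > 0$ and $\hat\theta = -1$ otherwise; check this against your derivation):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
n = 4

for theta in (-1, 1):
    x = theta + rng.standard_normal((200_000, n))
    theta_hat = np.where(x.mean(axis=1) > 0, 1, -1)
    # Under 0-1 loss the risk is the misclassification probability,
    # which for this rule should match Phi(-sqrt(n)).
    print(theta, np.mean(theta_hat != theta), norm.cdf(-np.sqrt(n)))
```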
11. Let Xi ∼ Bernoulli(pi ) for i = 1, . . . , n. The observations are independent but each
observation has a different mean. The unknown parameter is p = (p1 , . . . , pn ).
(a) Let $\psi = \sum_{i=1}^n p_i$. Find the maximum likelihood estimator of $\psi$.
(b) Find the mean squared error (MSE) of the maximum likelihood estimator of ψ.
(c) Suppose we use the following prior distribution:
$$\pi(p_1, \ldots, p_n) = 1.$$
Find the Bayes estimator of $\psi$.
Hint: Recall that the $\text{Beta}(\alpha, \beta)$ density is
$$\frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\, p^{\alpha-1}(1-p)^{\beta-1}.$$
If $W \sim \text{Beta}(\alpha, \beta)$ then $E[W] = \alpha/(\alpha+\beta)$ and $\text{Var}(W) = \alpha\beta/((\alpha+\beta)^2(\alpha+\beta+1))$.
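A simulation sketch comparing the two estimators (assuming the MLE is $\hat\psi = \sum_i X_i$ and that, via the hint, the flat prior gives posterior mean $(X_i + 1)/3$ for each $p_i$; both claims are for you to derive):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 20
p = rng.uniform(0, 1, size=n)             # an arbitrary fixed truth
psi = p.sum()

x = rng.binomial(1, p, size=(100_000, n))
mle = x.sum(axis=1)                       # candidate MLE of psi
bayes = ((x + 1) / 3).sum(axis=1)         # sum of flat-prior posterior means
print(np.mean((mle - psi) ** 2))          # MSE of the MLE; compare with (b)
print(np.mean((bayes - psi) ** 2))        # MSE of the Bayes estimator
```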
12. Let $X_1, \ldots, X_n \sim N(\mu, 1)$.
(a) Let $T = \max\{X_1, \ldots, X_n\}$. Show that $T$ is not sufficient.
(b) Let us use the improper prior $\pi(\mu) \propto 1$. Find the Bayes estimator of $\psi = \mu^2$.
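A numerical sketch (assuming squared-error loss, so the Bayes estimator is the posterior mean; with $\pi(\mu) \propto 1$ the posterior is $N(\bar X_n, 1/n)$, and the derivation of $E[\mu^2 \mid \text{data}]$ is the exercise):

```python
import numpy as np

rng = np.random.default_rng(10)
mu, n = 0.7, 25
x = mu + rng.standard_normal(n)

# Sample from the flat-prior posterior N(xbar, 1/n) and average mu^2.
xbar = x.mean()
draws = xbar + rng.standard_normal(500_000) / np.sqrt(n)
print(np.mean(draws ** 2), xbar ** 2 + 1 / n)   # the two should agree
```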