Economics 203A Homework #7
Anton Cheremukhin
December 5, 2005
Exercise 1 Prove that the sample mean of a random sample of size n from N (µ, σ 2 ) with σ 2 known
is an efficient estimator of µ.
Proof. $X_i \overset{iid}{\sim} N(\mu,\sigma^2)$, so $\mathrm{Var}\,\bar X = \mathrm{Var}[\frac{1}{n}\sum_i X_i] = \frac{1}{n^2}\sum_i \mathrm{Var}\,X_i = \frac{\sigma^2}{n}$.

$$I_n(\mu) = n\,E[s(X_i,\mu)^2] = n\,E\!\left[\left(\frac{\partial \log f(X_i,\mu)}{\partial\mu}\right)^{\!2}\right] = n\,E\!\left[\left(\frac{\partial}{\partial\mu}\left(-\frac{(X_i-\mu)^2}{2\sigma^2}-\log\sqrt{2\pi}\,\sigma\right)\right)^{\!2}\right] = n\,E\!\left[\frac{(X_i-\mu)^2}{\sigma^4}\right] = \frac{n}{\sigma^2}.$$

Hence $\mathrm{Var}\,\bar X = \frac{\sigma^2}{n} = \frac{1}{I_n(\mu)}$, which means $\bar X$ is an efficient estimator of $\mu$.
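The equality can be checked numerically. The sketch below (parameter values are illustrative choices, not part of the exercise) simulates many samples and compares the Monte Carlo variance of $\bar X$ with the Cramér-Rao bound $\sigma^2/n = 1/I_n(\mu)$:

```python
import numpy as np

# Numerical sanity check, not part of the proof: for N(mu, sigma^2) with
# sigma^2 known, the bound 1/I_n(mu) = sigma^2/n should match the Monte
# Carlo variance of the sample mean. mu, sigma, n, reps are illustrative.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 1.5, 50, 200_000

xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
var_xbar = xbar.var()
cramer_rao = sigma**2 / n          # 1/I_n(mu), with I_n(mu) = n/sigma^2

print(var_xbar, cramer_rao)        # the two numbers should be close
```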
Exercise 2 Prove that the sample mean of a random sample of size n from b (1, θ) is an efficient
estimator of θ.
Proof. $X_i \overset{iid}{\sim} b(1,\theta)$, $p \equiv \theta$, so $\mathrm{Var}\,\bar X = \mathrm{Var}[\frac{1}{n}\sum_i X_i] = \frac{1}{n^2}\sum_i \mathrm{Var}\,X_i = \frac{p(1-p)}{n}$.

$$I_n(p) = -n\,E\!\left[\frac{\partial^2 \log f(X_i,p)}{\partial p^2}\right] = -n\,E\!\left[\frac{\partial^2 \log\!\left[p^{X_i}(1-p)^{1-X_i}\right]}{\partial p^2}\right] = -n\,E\!\left[\frac{\partial}{\partial p}\left(\frac{X_i}{p} - \frac{1-X_i}{1-p}\right)\right]$$

$$= -n\,E\!\left[-\frac{X_i}{p^2} - \frac{1-X_i}{(1-p)^2}\right] = n\left(\frac{1}{p} + \frac{1}{1-p}\right) = \frac{n}{p(1-p)}.$$

Hence $\mathrm{Var}\,\bar X = \frac{p(1-p)}{n} = \frac{1}{I_n(p)}$, which means $\bar X$ is an efficient estimator of $\theta$.
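As in Exercise 1, a quick simulation (illustrative values of $p$ and $n$) confirms that the variance of the sample mean attains the bound $p(1-p)/n = 1/I_n(p)$:

```python
import numpy as np

# Sanity check for the Bernoulli case: Var(xbar) should equal
# p(1-p)/n = 1/I_n(p). The values of p, n, reps are illustrative.
rng = np.random.default_rng(1)
p, n, reps = 0.3, 40, 200_000

xbar = rng.binomial(1, p, size=(reps, n)).mean(axis=1)
var_xbar = xbar.var()
cramer_rao = p * (1 - p) / n       # 1/I_n(p), with I_n(p) = n/(p(1-p))

print(var_xbar, cramer_rao)        # the two numbers should be close
```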
Exercise 3 Let X have a gamma distribution with α = 4 and β = θ.
(a) Find the Fisher information I (θ).
(b) If X1 , . . . , Xn is a random sample from this distribution, show that the MLE is an efficient
estimator of θ.
Proof. $X_i \overset{iid}{\sim} \Gamma(4,\theta)$, so $\mathrm{Var}\,\bar X = \mathrm{Var}[\frac{1}{n}\sum_i X_i] = \frac{1}{n^2}\sum_i \mathrm{Var}\,X_i = \frac{4\theta^2}{n}$.

$$I_n(\theta) = -n\,E\!\left[\frac{\partial^2 \log f(X_i,\theta)}{\partial\theta^2}\right] = -n\,E\!\left[\frac{\partial^2}{\partial\theta^2}\log\!\left(\frac{1}{6\theta^4}X_i^3 e^{-X_i/\theta}\right)\right] = -n\,E\!\left[\frac{\partial}{\partial\theta}\left(-\frac{4}{\theta} + \frac{X_i}{\theta^2}\right)\right]$$

$$= -n\,E\!\left[\frac{4}{\theta^2} - \frac{2X_i}{\theta^3}\right] = -n\left(\frac{4}{\theta^2} - \frac{2}{\theta^3}\cdot 4\theta\right) = \frac{4n}{\theta^2}.$$

$$\hat\theta_{MLE} = \arg\max_\theta \sum_i \log\!\left(\frac{1}{6\theta^4}X_i^3 e^{-X_i/\theta}\right) = \left\{\theta \;\Big|\; \sum_i\left(-\frac{4}{\theta} + \frac{X_i}{\theta^2}\right) = 0\right\} = \frac{\bar X}{4}.$$

Hence $\mathrm{Var}\,\frac{\bar X}{4} = \frac{1}{16}\cdot\frac{4\theta^2}{n} = \frac{\theta^2}{4n} = \frac{1}{I_n(\theta)}$, which means $\hat\theta_{MLE}$ is an efficient estimator of $\theta$.
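The efficiency of the MLE $\bar X/4$ can also be seen in simulation; the sketch below (illustrative $\theta$, $n$) compares its Monte Carlo variance with $\theta^2/(4n) = 1/I_n(\theta)$:

```python
import numpy as np

# Sanity check for the Gamma(4, theta) case: the MLE xbar/4 should attain
# the bound 1/I_n(theta) = theta^2/(4n). theta, n, reps are illustrative.
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 30, 200_000

mle = rng.gamma(4.0, theta, size=(reps, n)).mean(axis=1) / 4.0
var_mle = mle.var()
cramer_rao = theta**2 / (4 * n)    # 1/I_n(theta), with I_n(theta) = 4n/theta^2

print(var_mle, cramer_rao)         # the two numbers should be close
```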
Economics 203A Homework #8
Anton Cheremukhin
December 5, 2005
Exercise 1 Let X1 , . . . , X10 be a random sample from N (0, σ 2 ). Find a best critical region of size
α = 5% for testing H0 : σ 2 = 1 against H1 : σ 2 = 2. Is this a best critical region for testing against
H1 : σ² = 4? Against H1 : σ² = σ₁² > 1?
Proof.
$$\frac{L(\sigma_1;x_1,\dots,x_n)}{L(\sigma_0;x_1,\dots,x_n)} = \frac{\left(1/\sqrt{2\pi\sigma_1^2}\right)^n \exp\!\left[-\left(\sum_i x_i^2\right)/2\sigma_1^2\right]}{\left(1/\sqrt{2\pi\sigma_0^2}\right)^n \exp\!\left[-\left(\sum_i x_i^2\right)/2\sigma_0^2\right]} = \left(\frac{\sigma_0}{\sigma_1}\right)^{\!n} \exp\!\left(\frac{\sigma_1^2-\sigma_0^2}{2\sigma_1^2\sigma_0^2}\sum_i x_i^2\right)$$

If $\sigma_1 > \sigma_0$ then $\frac{L(\sigma_1;x_1,\dots,x_n)}{L(\sigma_0;x_1,\dots,x_n)} \ge k \Leftrightarrow \sum_i x_i^2 \ge c$, and $\frac{\sum_{i=1}^n x_i^2}{\sigma_0^2} \sim \chi^2(n)$.

Critical region $C = \{x \mid \sum_i x_i^2 > \sigma_0^2 \cdot \chi^2_{0.05}(10)\} = \{x \mid \sum_i x_i^2 > 18.307\}$ is independent of $\sigma_1$. Hence this is the best critical region for any $\sigma_1 > \sigma_0$.
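A short simulation confirms that the region $\{\sum_i x_i^2 > 18.307\}$ has size approximately 5% under $H_0$ (the rep count below is an arbitrary choice):

```python
import numpy as np

# Monte Carlo check that {sum x_i^2 > 18.307} has size ~5% under
# H0: sigma^2 = 1 with n = 10; 18.307 is the upper 5% point of chi^2(10).
rng = np.random.default_rng(3)
n, reps = 10, 400_000

stat = (rng.standard_normal((reps, n)) ** 2).sum(axis=1)
size = (stat > 18.307).mean()

print(size)   # should be close to 0.05
```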
³
1/
Exercise 2 Let X1 , . . . , Xn be a random sample from a distribution with pdf f (x) = θxθ−1 , 0 <
x < 1, zero elsewhere. Show that the best critical region for testing H0 : θ = 1 against H1 : θ = 2
takes the form $\{(x_1,\dots,x_n) : \prod_{i=1}^n x_i \ge c\}$.
Proof.
$$\frac{L(\theta_1;x_1,\dots,x_n)}{L(\theta_0;x_1,\dots,x_n)} = \frac{\prod_{i=1}^n \theta_1 x_i^{\theta_1-1}}{\prod_{i=1}^n \theta_0 x_i^{\theta_0-1}} = \left(\frac{\theta_1}{\theta_0}\right)^{\!n} \left(\prod_{i=1}^n x_i\right)^{\!\theta_1-\theta_0}$$

If $\theta_1 > \theta_0$ then $\frac{L(\theta_1;x_1,\dots,x_n)}{L(\theta_0;x_1,\dots,x_n)} \ge k \Leftrightarrow \prod_{i=1}^n x_i \ge c$.
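For the specific hypotheses $\theta_0 = 1$, $\theta_1 = 2$ the ratio reduces to $2^n \prod_i x_i$, so ordering samples by the likelihood ratio is the same as ordering them by $\prod_i x_i$. A quick check on one simulated sample (sample size is an arbitrary choice):

```python
import numpy as np

# Check that, for theta_0 = 1 vs theta_1 = 2, the likelihood ratio equals
# 2^n * prod(x_i), so the best critical region is {prod x_i >= c}.
rng = np.random.default_rng(4)
n = 8
x = rng.uniform(size=n)            # a sample under H0: theta = 1 (pdf is uniform)

ratio = np.prod(2 * x ** (2 - 1)) / np.prod(1 * x ** (1 - 1))
print(ratio, 2 ** n * np.prod(x))  # the two values should agree
```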
Exercise 3 Let X1 , . . . , Xn be a random sample from N (θ, 100). Find a best critical region of size
α = 5% for testing H0 : θ = 75 against H1 : θ = 78.
Proof.
$$\frac{L(\theta_1;x_1,\dots,x_n)}{L(\theta_0;x_1,\dots,x_n)} = \exp\!\left[\frac{1}{2\sigma^2}\sum_i\left((x_i-\theta_0)^2 - (x_i-\theta_1)^2\right)\right] = \exp\!\left[\frac{\theta_1-\theta_0}{\sigma^2}\sum_i\left(x_i - \frac{\theta_0+\theta_1}{2}\right)\right]$$

If $\theta_1 > \theta_0$ then $\frac{L(\theta_1;x_1,\dots,x_n)}{L(\theta_0;x_1,\dots,x_n)} \ge k \Leftrightarrow \bar x = \frac{1}{n}\sum_i x_i \ge c$, and $\sqrt{n}\,\frac{\bar x - \theta}{10} \sim N(0,1)$.

Critical region $C = \{x \mid \bar x > 75 + \frac{10}{\sqrt n}\,u_{0.05}\} = \{x \mid \bar x > 75 + \frac{16.45}{\sqrt n}\}$ is independent of $\theta_1$. Hence this is the best critical region for any $\theta_1 > \theta_0 = 75$.
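The size of the region $\{\bar x > 75 + 16.45/\sqrt n\}$ can be verified by simulation; $n$ below is an illustrative choice:

```python
import numpy as np

# Monte Carlo check that {xbar > 75 + 16.45/sqrt(n)} has size ~5% under
# H0: theta = 75 when X_i ~ N(75, 100). n and reps are illustrative.
rng = np.random.default_rng(5)
n, reps = 25, 400_000

xbar = rng.normal(75.0, 10.0, size=(reps, n)).mean(axis=1)
size = (xbar > 75.0 + 16.45 / np.sqrt(n)).mean()

print(size)   # should be close to 0.05
```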
Economics 203A Homework #9
Anton Cheremukhin
December 5, 2005
Exercise 1 Let $X_1,\dots,X_n$ be i.i.d. $N(0,\theta)$. Show that $\sum_{i=1}^n X_i^2$ is a sufficient statistic for $\theta$.

Proof.
$$f(x,\theta) = \left(\frac{1}{\sqrt{2\pi}\,\theta}\right)^{\!n} \exp\!\left[-\frac{\sum_i X_i^2}{2\theta^2}\right] = k_1(u(x),\theta)\,k_2(x), \qquad k_2(x) \equiv 1.$$

By the factorization theorem $u(x) = \sum_{i=1}^n X_i^2$ is a sufficient statistic.
Exercise 2 Prove that the sum of the observations of a random sample of size n from a Poisson
distribution with mean θ is a sufficient statistic for θ.
Proof. $X_i \sim Poisson(\theta) \Rightarrow u(x) = \sum_{i=1}^n x_i \sim \sum_{i=1}^n Poisson(\theta) = Poisson(n\theta)$.

$$\frac{f(x,\theta)}{g(u(x),\theta)} = \frac{\prod_{i=1}^n \dfrac{\theta^{x_i} e^{-\theta}}{x_i!}}{\dfrac{(n\theta)^{\sum_{i=1}^n x_i}\, e^{-n\theta}}{\left(\sum_{i=1}^n x_i\right)!}} = \frac{\left(\sum_{i=1}^n x_i\right)!}{n^{\sum_{i=1}^n x_i}\prod_{i=1}^n x_i!} = h(x)$$

does not depend on $\theta$, so $\sum_{i=1}^n x_i$ is sufficient by definition.
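The fact that $f(x,\theta)/g(u(x),\theta)$ is free of $\theta$ can be verified directly on a small vector; the sample below is an arbitrary illustration:

```python
import numpy as np
from math import factorial, exp

# Check on one vector x that f(x, theta) / g(u(x), theta) does not depend
# on theta, as the sufficiency argument requires. x is an arbitrary sample.
x = [2, 0, 3, 1]
n, u = len(x), sum(x)

def ratio(theta):
    f = np.prod([theta ** xi * exp(-theta) / factorial(xi) for xi in x])
    g = (n * theta) ** u * exp(-n * theta) / factorial(u)
    return f / g

print(ratio(0.5), ratio(3.0))      # equal: the ratio is free of theta
```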
Exercise 3 Let $X_1,\dots,X_n$ be i.i.d. from a distribution with pdf $f(x) = (1-\theta)^x\theta$, $x = 0,1,2,\dots$, zero elsewhere. Show that $\sum_{i=1}^n X_i$ is a sufficient statistic for $\theta$.

Proof.
$$f(x,\theta) = \prod_{i=1}^n (1-\theta)^{X_i}\theta = (1-\theta)^{\sum_{i=1}^n X_i}\,\theta^n = k_1(u(x),\theta)\,k_2(x), \qquad k_2(x) \equiv 1.$$

By the factorization theorem $u(x) = \sum_{i=1}^n X_i$ is a sufficient statistic.
Exercise 4 Let $X_1,\dots,X_n$ be i.i.d. from a beta distribution with parameters $\alpha = \theta$ and $\beta = 2$. Show that the product $\prod_{i=1}^n X_i$ is a sufficient statistic for $\theta$.

Proof.
$$f(x,\theta) = \prod_{i=1}^n \frac{\Gamma(\theta+2)}{\Gamma(\theta)\Gamma(2)}\,x_i^{\theta-1}(1-x_i)^{2-1} = \left[\theta(\theta+1)\right]^n \left(\prod_{i=1}^n x_i\right)^{\!\theta-1} \prod_{i=1}^n (1-x_i) = k_1(u(x),\theta)\,k_2(x)$$

with $k_2(x) = \prod_{i=1}^n (1-x_i)$. By the factorization theorem $u(x) = \prod_{i=1}^n X_i$ is a sufficient statistic.
Exercise 5 In Exercises 1, 2, 3, and 4, show that the MLE of θ is a function of the sufficient
statistic for θ.
Proof.
1) $\hat\theta_{ML} = \arg\max_\theta\left[-\frac{\sum_{i=1}^n X_i^2}{2\theta^2} - n\log\theta\right] = \left\{\theta \,\Big|\, \frac{\sum_{i=1}^n X_i^2}{\theta^3} = \frac{n}{\theta}\right\} = \sqrt{\frac{1}{n}\sum_{i=1}^n X_i^2} = \sqrt{\frac{u(x)}{n}}$

2) $\hat\theta_{ML} = \arg\max_\theta\left[\sum_{i=1}^n X_i\log\theta - n\theta\right] = \left\{\theta \,\Big|\, \frac{\sum_{i=1}^n X_i}{\theta} = n\right\} = \frac{1}{n}\sum_{i=1}^n X_i = \frac{1}{n}u(x)$

3) $\hat\theta_{ML} = \arg\max_\theta\left[\sum_{i=1}^n X_i\log(1-\theta) + n\log\theta\right] = \left\{\theta \,\Big|\, \frac{\sum_{i=1}^n X_i}{1-\theta} = \frac{n}{\theta}\right\} = \frac{n}{n + \sum_{i=1}^n X_i} = \frac{n}{n + u(x)}$

4) $\hat\theta_{ML} = \arg\max_\theta\left[\log\theta + \log(\theta+1) + \theta\,\frac{\log\prod_{i=1}^n x_i}{n}\right] = \left\{\theta \,\Big|\, \frac{1}{\theta} + \frac{1}{\theta+1} = -\frac{\log\prod_{i=1}^n x_i}{n}\right\}$

Hence $\frac{1}{\hat\theta_{ML}} + \frac{1}{\hat\theta_{ML}+1} + \frac{\log u(x)}{n} = 0$, and in each case $\hat\theta_{ML}$ is a function of the sufficient statistic.
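In part 4 the MLE has no closed form, but the first-order condition can be solved numerically and checked against a grid maximizer of the Beta$(\theta,2)$ log-likelihood; both depend on the data only through $u(x) = \prod_i x_i$. A sketch (the sample values are an arbitrary illustration):

```python
import numpy as np

# Numerical check of part 4: the theta solving 1/theta + 1/(theta+1)
# = -(1/n) log(prod x_i) should maximize the Beta(theta, 2) log-likelihood,
# and it uses the data only through u(x) = prod x_i.
x = np.array([0.3, 0.7, 0.5, 0.8, 0.4])
n, log_u = len(x), np.log(x).sum()

# Solve the first-order condition by bisection; the lhs is decreasing in theta.
lo, hi = 1e-6, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if 1 / mid + 1 / (mid + 1) + log_u / n > 0:
        lo = mid            # root lies to the right
    else:
        hi = mid
theta_hat = 0.5 * (lo + hi)

# Compare with a grid maximizer of the log-likelihood (constants dropped).
grid = np.linspace(0.05, 20.0, 200_000)
loglik = n * np.log(grid) + n * np.log(grid + 1) + (grid - 1) * log_u
theta_grid = grid[np.argmax(loglik)]

print(theta_hat, theta_grid)       # should agree up to grid resolution
```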
Exercise 6 Let $X_1,\dots,X_n$ be i.i.d. from a distribution with pdf $f(x) = \theta\exp(-\theta x)$, $x > 0$, zero elsewhere. Show that $Y = \sum_{i=1}^n X_i$ is a sufficient statistic for $\theta$. Prove that $\frac{n-1}{Y}$ is the unbiased minimum variance estimator of $\theta$.
Proof. $f(x,\theta) = \prod_{i=1}^n \theta\exp(-\theta x_i) = \exp\!\left(-\theta\sum_{i=1}^n x_i + n\log\theta\right) = \exp[p(\theta)K(x) + q(\theta)]$, a regular exponential-family pdf. Hence $Y = K(x) = \sum_{i=1}^n X_i$ is a complete sufficient statistic.

$E[X] = \int_0^\infty x\,\theta e^{-\theta x}\,dx = \frac{1}{\theta}$
$E[X^2] = \theta\int_0^\infty x^2 e^{-\theta x}\,dx = 2\int_0^\infty x e^{-\theta x}\,dx = \frac{2}{\theta^2}$
$\mathrm{Var}[X] = E[X^2] - E[X]^2 = \frac{1}{\theta^2}$
$E[Y] = \frac{n}{\theta}$, $\mathrm{Var}[Y] = \frac{n}{\theta^2}$

$Y$ (as a sum of i.i.d. exponentials) has a gamma distribution with $\{\alpha\beta = \frac{n}{\theta},\ \alpha\beta^2 = \frac{n}{\theta^2}\} \Leftrightarrow \{\alpha = n,\ \beta = \frac{1}{\theta}\}$.

$$E\!\left[\frac{1}{Y}\right] = \int_0^\infty \frac{1}{\Gamma(\alpha)\beta^\alpha}\,x^{\alpha-2}e^{-x/\beta}\,dx = \frac{\Gamma(\alpha-1)\beta^{\alpha-1}}{\Gamma(\alpha)\beta^\alpha} = \frac{1}{\beta(\alpha-1)} = \frac{\theta}{n-1}$$

Hence $\frac{n-1}{Y}$ is an unbiased estimator of $\theta$. By the Lehmann-Scheffé theorem it is the unique minimum variance unbiased estimator of $\theta$.
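The unbiasedness of $(n-1)/Y$ can be confirmed by simulation; $\theta$ and $n$ below are illustrative choices:

```python
import numpy as np

# Monte Carlo check that (n-1)/Y with Y = sum X_i is unbiased for theta
# when X_i ~ Exponential with rate theta (numpy parameterizes by the
# scale 1/theta). theta, n, reps are illustrative choices.
rng = np.random.default_rng(6)
theta, n, reps = 1.5, 10, 200_000

y = rng.exponential(1 / theta, size=(reps, n)).sum(axis=1)
est_mean = ((n - 1) / y).mean()

print(est_mean, theta)             # the two values should be close
```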