AN EXAMPLE OF SINGULAR STATISTICAL EXPERIMENTS
ADMITTING LOCAL EXPONENTIAL APPROXIMATION

R.Z. Hasminskii
Institute of Problems of Information Transmission
Acad. Sci. USSR, Moscow

I.A. Ibragimov*
Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 907
February, 1974

*The paper was written while the second author was on leave from Leningrad
State University, Leningrad.
SUMMARY.

In this paper the authors study the asymptotic minimax properties of
statistical estimates constructed from independent observations with a
density having jumps to zero.
1. Introduction. Statement of problem. Conditions.

Consider a sequence $X_1, X_2, \ldots$ of identically distributed independent
random observations with common distribution $P_\theta$ depending on an
unknown parameter $\theta$. We shall suppose that the $X_j$ are real random
variables and that the distributions $P_\theta$ are continuous with respect
to Lebesgue measure. We denote by $f(x;\theta)$ the density of $P_\theta$
relative to Lebesgue measure. In the "regular case" of smooth dependence of
$f(x;\theta)$ on $\theta$, the joint density $\prod_{1}^{n} f(X_j;\theta)$,
after proper normalization, may be approximated when $n \to \infty$ by a
normal family (see [1], Appendix).(1)

(1) The paper was written when the second author was at the Department of
Statistics, University of North Carolina at Chapel Hill.
This idea of local asymptotic normality was developed in the important papers
[2], [3] of L. LeCam and was brilliantly explored in J. Hájek's paper [1].
In this paper we wish to investigate a singular case of a discontinuous
density $f$ in which the joint density $\prod_{1}^{n} f(X_j;\theta)$
nevertheless admits approximation by a nonnormal but still simple exponential
family. Such a more general concept of local asymptotic exponentiality was
also introduced by LeCam; see, in particular, Example 3 of part 6 of [4],
which is very close to the theme of this paper.
We now give precise statements of the restrictions which will be imposed on
the density function $f(x;\theta)$.

I. The function $f(x;\theta)$ is defined and measurable on $R^1 \times
\bar\Theta$, where the parametric set $\Theta$ is an open interval on the
real line. For any points $\theta, \theta' \in \Theta$, $\theta \neq \theta'$,
\[
\int_{-\infty}^{\infty} |f(x;\theta) - f(x;\theta')| \, dx > 0 .
\]
II. The function $f(x;\theta)$ is absolutely continuous in $\theta$ for fixed
$x$ in each of the regions $x < x_1(\theta)$, $x_1(\theta) < x < x_2(\theta)$,
$\ldots$, $x > x_r(\theta)$, where $x_1(\theta) < x_2(\theta) < \cdots <
x_r(\theta)$ are monotone differentiable pairwise nonintersecting curves
defined for $\theta \in \bar\Theta$ with
\[
0 < |x_j'(\theta)| < \infty , \qquad
\operatorname{sign} x_j'(\theta) = \operatorname{sign} x_k'(\theta) .
\]

III. The following limits exist uniformly in $\theta$ for $\theta$ in any
compact subset of $\Theta$:
\[
\lim_{x \downarrow x_k(\theta)} f(x;\theta) = p_k(\theta) , \qquad
\lim_{x \uparrow x_k(\theta)} f(x;\theta) = q_k(\theta) , \qquad
k = 1, \ldots, r ,
\]
where the functions $p_k(\theta)$, $q_k(\theta)$ are continuous on $\Theta$.
IV. Let $f'(x;\theta)$ denote the derivative in $\theta$ of the absolutely
continuous component of $f(x;\theta)$. The integrals
$\int_{-\infty}^{\infty} f'(x;\theta)\,dx$ and
$\int_{-\infty}^{\infty} |f'(x;\theta)|\,dx$ are continuous functions of
$\theta$.

The asymptotic behavior of the estimates of the parameter $\theta$ under the
conditions stated was investigated in our paper [5]. Here we add one more
condition.

V. Either all $q_k(\theta) = 0$ or all $p_k(\theta) = 0$.

The exponential distribution with density $\exp\{-(x-\theta)\}$, $x > \theta$,
gives an example of a density which satisfies all conditions I-V. The uniform
distribution on the interval $[\theta - \frac{1}{2}, \theta + \frac{1}{2}]$
satisfies the conditions I-IV but not V.
We prove in part 2 that under the conditions I-V the normalized likelihood
ratio
\[
Z_n(u) = \prod_{1}^{n} f(X_j;\theta + u/n) \Big/
\prod_{1}^{n} f(X_j;\theta)
\]
admits a good approximation by an exponential family (see Th. 2.1). In part 3
we investigate general statistical experiments which admit approximation by
one-sided exponential families. Our results are analogous to those which
Hájek discovered for locally asymptotically normal families (see [1], [8]).
In part 4 we return to the case of independent identically distributed
observations and establish some asymptotic minimax properties of Bayes
estimates among all sequential estimates $[\{t_n\}, \sigma]$ with
$E_\theta \sigma \leq n$.

Here and below $P_\theta\{\cdot\}$ and $E_\theta\{\cdot\}$ denote the
probability and expectation generated by the sequence of observations when
$\theta$ is the true value of the parameter.
We also use below the following notation. If $A$ is an event, then $\chi(A)$
always denotes the indicator of $A$. For example, if $\xi$ is a random
variable,
\[
\chi\{\xi > x\} =
\begin{cases} 1, & \xi > x , \\ 0, & \xi \leq x . \end{cases}
\]

2. Asymptotic exponentiality of the normalized likelihood ratio.
Define the random function $Z_n(u)$ by
\[
Z_n(u) = \prod_{1}^{n} \frac{f(X_j;\theta + u/n)}{f(X_j;\theta)} ,
\]
where $\theta$ is the "true" value of the parameter. We investigate here the
asymptotic behavior of $Z_n(u)$ when $n \to \infty$. We will consider below
only the case $\operatorname{sign} x_k'(\theta) > 0$, $q_k(\theta) = 0$. The
three other cases reduce to this one by the mappings $(x,\theta) \to
(x,-\theta)$, $(x,\theta) \to (-x,-\theta)$, $(x,\theta) \to (-x,\theta)$.

Define the random moments $\tau_n$ as
\[
\tau_n = \inf\{u > 0 : x_k(\theta + u/n) > X_j > x_k(\theta)
\ \text{for some}\ k = 1, \ldots, r ,\ j = 1, \ldots, n\} .
\]
Let further
\[
p = p(\theta) = \sum_{k=1}^{r} x_k'(\theta)\,p_k(\theta) ,
\]
and let the random function $\tilde Z_n(u)$ be
\[
\tilde Z_n(u) =
\begin{cases} e^{pu} , & u \leq \tau_n , \\ 0 , & u > \tau_n . \end{cases}
\]
Theorem 2.1. Assume that conditions I to V are satisfied (and
$\operatorname{sign} x_k'(\theta) > 0$, $q_k(\theta) = 0$). Then the
following relations hold:
\[
(2.1) \qquad \lim_{n \to \infty} P_\theta\{\tau_n > u\} =
\begin{cases} e^{-pu} , & u \geq 0 , \\ 1 , & u \leq 0 ; \end{cases}
\]
\[
(2.2) \qquad Z_n(u) - \tilde Z_n(u) \to 0 , \quad n \to \infty ,
\]
in probability; and moreover
\[
(2.3) \qquad \sup_{|u| \leq H} E_\theta |Z_n(u) - \tilde Z_n(u)| \to 0 ,
\quad n \to \infty , \qquad 0 < H < \infty .
\]
Proof. The relation (2.1) follows from the Poisson approximation of the
binomial distribution. The proof of (2.3) is broken down into a few lemmas.

Lemma 2.1. If $\{\zeta_n\}$ is a sequence of non-negative random variables
converging in distribution to a random variable $\zeta$, then
(i) $\liminf_n E|\zeta_n| \geq E|\zeta|$;
(ii) if $E|\zeta_n| \to E|\zeta| < \infty$, then the $\zeta_n$ are uniformly
integrable.
For the proof see [6], p. 183.

Define now the random processes $\eta_j(u)$ and the random variables $\xi_j$,
$T_j$ by
\[
\eta_j(u) = \frac{1}{u}
\left[\frac{f(X_j;\theta + u)}{f(X_j;\theta)} - 1\right] , \qquad
\xi_j = \frac{f'(X_j;\theta)}{f(X_j;\theta)} ,
\]
\[
T_j = \inf\{v > 0 : x_k(\theta + v) > X_j > x_k(\theta)
\ \text{for some}\ k = 1, \ldots, r\} .
\]
Evidently, $\tau_n = n \min(T_1, \ldots, T_n)$.
Lemma 2.2. Under conditions I-V,
\[
(2.4) \qquad \lim_{u \to 0} E_\theta |\chi(T_j > u)\,\eta_j(u) - \xi_j| = 0 .
\]

Proof. Evidently, $\lim_{u \to 0} \chi(T_j > u)\,\eta_j(u) = \xi_j$ with
$P_\theta$-probability 1. Further,
\[
E_\theta\{\chi(T_j > u)\,|\eta_j(u)|\} \leq
\frac{1}{u} \int_\theta^{\theta+u} dz \int_{-\infty}^{\infty}
|f'(x;z)|\,dx \to \int_{-\infty}^{\infty} |f'(x;\theta)|\,dx =
E_\theta|\xi_j| , \quad u \to 0 .
\]
Because of Lemma 2.1 (i), it follows that
\[
\lim_{u \to 0} E_\theta\{\chi(T_j > u)\,|\eta_j(u)|\} =
\int_{-\infty}^{\infty} |f'(x;\theta)|\,dx = E_\theta|\xi_j| .
\]
According to (ii) of Lemma 2.1, the $\chi(T_j > u)\,|\eta_j(u)|$ are
uniformly integrable, and therefore
\[
E_\theta|\chi(T_j > u)\,\eta_j(u) - \xi_j| \to 0 .
\]
Lemma 2.3. If the conditions I-V are satisfied, then for every $\delta > 0$
\[
P_\theta\{T_j > u ,\ |\eta_j(u)| > \delta/|u|\} = o(|u|) , \quad u \to 0 ;
\]
\[
(2.5) \qquad P_{\theta+u}\{T_j > u ,\ |\eta_j(u)| > \delta/|u| ,\
f(X_j;\theta) \neq 0\} = o(|u|) , \quad u \to 0 .
\]

Proof. Because of (2.4), it follows that
\[
P_\theta\{T_j > u ,\ |\eta_j(u)| > \delta/|u|\} \leq
P_\theta\{T_j > u ,\ |\eta_j(u) - \xi_j| > \delta/2|u|\} +
P_\theta\{|\xi_j| > \delta/2|u|\}
\]
\[
\leq \frac{2|u|}{\delta}\,E_\theta\{\chi(T_j > u)\,|\eta_j(u) - \xi_j|\} +
\frac{2|u|}{\delta}\,E_\theta\{\chi(|\xi_j| > \delta/2|u|)\,|\xi_j|\} =
o(|u|) .
\]
As for the second assertion, if $|f(x;\theta+u)/f(x;\theta) - 1| > \delta$,
then
\[
f(x;\theta+u) \leq \frac{1}{\delta_1}\,|f(x;\theta+u) - f(x;\theta)| ,
\qquad \delta_1 = \min(\delta/2, 1/2) .
\]
Hence
\[
P_{\theta+u}\{T_j > u ,\ |\eta_j(u)| > \delta/|u| ,\ f(X_j;\theta) \neq 0\}
\leq \frac{1}{\delta_1}\,E_\theta\{\chi(T_j > u ,\
|\eta_j(u)| > \delta/|u|)\,|u\,\eta_j(u)|\}
\]
\[
\leq \frac{|u|}{\delta_1}\,E_\theta\{\chi(T_j > u)\,|\eta_j(u) - \xi_j|\} +
\frac{|u|}{\delta_1}\,E_\theta\{\chi(|\eta_j(u)| > \delta/|u|)\,|\xi_j|\} =
o(|u|) .
\]
Lemma 2.4. Let the conditions I-V be fulfilled. Then
\[
E_\theta\,\xi_j = \int_{-\infty}^{\infty} f'(x;\theta)\,dx =
\sum_{k=1}^{r} x_k'(\theta)\,p_k(\theta) = p(\theta) .
\]

Proof. A direct calculation.
The next two lemmas give the necessary estimates of the deviation of
$Z_n(u)$ from $\tilde Z_n(u)$.

Lemma 2.5. Under conditions I-V, for every $0 < H < \infty$,
\[
(2.6) \qquad \sup_{|u| \leq H} E_\theta\{|Z_n(u) - \tilde Z_n(u)|\,
\chi(\tau_n > u)\} \to 0 , \quad n \to \infty .
\]

Proof. Fix a small positive number $\delta$. If all the differences
\[
\Big|\frac{f(X_j;\theta + u/n)}{f(X_j;\theta)} - 1\Big| < \delta ,
\qquad j = 1, 2, \ldots, n ,
\]
then
\[
|\ln Z_n(u) - pu| \leq
\Big|\frac{u}{n} \sum_{1}^{n} (\xi_j - p)\Big| +
\frac{|u|}{n} \sum_{1}^{n} |\eta_j(u/n) - \xi_j| +
\frac{2\delta|u|}{n} \sum_{1}^{n} |\eta_j(u/n)| .
\]
Hence, uniformly in $|u| \leq H$,
\[
(2.7) \qquad E_\theta\Big\{|Z_n(u) - e^{pu}|\,
\chi\Big(\tau_n > u ,\ \max_j
\Big|\frac{f(X_j;\theta+u/n)}{f(X_j;\theta)} - 1\Big| < \delta\Big)\Big\}
\leq 8\delta + o(1) .
\]
We used here the law of large numbers to estimate the first right-hand term,
and Lemma 2.2 to estimate the second and third ones.

Further, by (2.5),
\[
(2.8) \qquad E_\theta\Big\{|Z_n(u) - e^{pu}|\,
\chi\Big(\tau_n > u ,\ \max_j
\Big|\frac{f(X_j;\theta+u/n)}{f(X_j;\theta)} - 1\Big| > \delta\Big)\Big\}
\]
\[
\leq n\,P_{\theta+u/n}\{|\eta_1(u/n)| > \delta n/|u| ,\ T_1 > u/n ,\
f(X_1;\theta) \neq 0\} +
e^{pu}\,n\,P_\theta\{|\eta_1(u/n)| > \delta n/|u| ,\ T_1 > u/n\} = o(1) ,
\quad n \to \infty ,
\]
uniformly in $|u| \leq H$. The inequalities (2.7), (2.8) give us the desired
result (2.6).
Lemma 2.6. Under the conditions I-V,
\[
\sup_{|u| \leq H} E_\theta\{|Z_n(u) - \tilde Z_n(u)|\,\chi(\tau_n < u)\} =
o(1) , \quad n \to \infty .
\]

Proof. We have
\[
E_\theta\{|Z_n(u) - \tilde Z_n(u)|\,\chi(\tau_n < u)\} =
E_\theta\{Z_n(u)\,\chi(\tau_n < u)\} \leq
P_{\theta+u/n}\{\tau_n < u\} .
\]
By the definition of $\tau_n$ the last probability is
\[
P_{\theta+u/n}\Big\{\bigcup_{j=1}^{n} \bigcup_{k=1}^{r}
\{x_k(\theta + u/n) > X_j > x_k(\theta)\}\Big\} \leq
n \sum_{k=1}^{r} \int_{x_k(\theta)}^{x_k(\theta+u/n)}
f(x;\theta+u/n)\,dx .
\]
Then since $\lim f(x;\theta+u/n) = q_k(\theta) = 0$ as
$x \uparrow x_k(\theta+u/n)$, uniformly in $u$ (conditions III and IV),
\[
\int_{x_k(\theta)}^{x_k(\theta+u/n)} f(x;\theta+u/n)\,dx \leq
o(1)\,|x_k(\theta + u/n) - x_k(\theta)| = o(1/n) .
\]
This last inequality proves the lemma.

The assertions (2.2) and (2.3) are evident consequences of Lemmas 2.5, 2.6,
and Theorem 2.1 is proved.
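For illustration, the convergence (2.1) can be checked numerically. The
shifted half-normal density below is our illustration, not taken from the
paper: it has a single jump curve $x_1(\theta) = \theta$ with
$x_1'(\theta) = 1$, $q_1(\theta) = 0$ and jump size $p_1(\theta) =
\sqrt{2/\pi}$, so $p(\theta) = \sqrt{2/\pi}$, and here $\tau_n$ reduces to
$n(\min_j X_j - \theta)$. A Monte Carlo sketch (sample sizes are arbitrary):

```python
import math
import random

random.seed(1)

# Illustration of ours (not from the paper): the shifted half-normal density
#   f(x; theta) = sqrt(2/pi) * exp(-(x - theta)^2 / 2),  x > theta,
# has a single jump curve x_1(theta) = theta with x_1'(theta) = 1,
# q_1(theta) = 0 and jump size p_1(theta) = sqrt(2/pi), so that
# p(theta) = x_1'(theta) * p_1(theta) = sqrt(2/pi).
P = math.sqrt(2.0 / math.pi)

def tau_n(n, theta=0.0):
    # tau_n = n * (min_j X_j - theta): the first u > 0 at which some
    # observation falls between x_1(theta) and x_1(theta + u/n)
    m = min(theta + abs(random.gauss(0.0, 1.0)) for _ in range(n))
    return n * (m - theta)

def empirical_tail(u, n=300, reps=2000):
    # Monte Carlo estimate of P_theta{tau_n > u}
    return sum(tau_n(n) > u for _ in range(reps)) / reps

for u in (0.5, 1.0, 2.0):
    # empirical tail should be close to the limit e^{-p u} of (2.1)
    print(u, empirical_tail(u), math.exp(-P * u))
```

With $n$ in the hundreds the empirical tail of $\tau_n$ already agrees with
the exponential limit $e^{-pu}$ to within Monte Carlo error.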
3. Locally asymptotically one-sided exponential families of distributions.
Limiting distributions of regular estimates; local asymptotic minimax
property of Bayesian estimates.

Let us consider (following Hájek [1]) a sequence of statistical experiments
$\{X^n, A^n, P_\theta^n\}$, where $\theta$ runs through an open set
$\Theta \subset R^1$. Let $a(n) \uparrow \infty$ be a sequence of normalizing
factors. Take a point $\theta \in \Theta$ and assume it to be the true value
of the parameter. Denote by $Z_n(u)$ the Radon-Nikodym derivative of the
absolutely continuous part of $P^n_{\theta+u/a(n)}$ with respect to
$P^n_\theta$. For the sake of brevity we shall write below $P_\theta$ instead
of $P^n_\theta$ and $P_{\theta,u}$ instead of $P^n_{\theta+u/a(n)}$. Note
that the definition of $Z_n(u)$ is just the same as in Section 2.
Assumption 3.1. Assume that either
\[
(3.1) \qquad Z_n(u) =
\begin{cases} e^{pu} + o_p(1) , & u < \tau_n , \\
o_p(1) , & u > \tau_n , \end{cases}
\]
or
\[
(3.2) \qquad Z_n(u) =
\begin{cases} e^{-pu} + o_p(1) , & u > -\tau_n , \\
o_p(1) , & u < -\tau_n , \end{cases}
\]
where $p$ is a positive number, the positive random variables $\tau_n$ are
$A^n$-measurable and satisfy $\lim_n P_\theta\{\tau_n > u\} = e^{-pu}$, and
$o_p(1)$ denotes random variables which go to zero in probability when
$n \to \infty$.

Theorem 2.1 of Section 2 gives us an example of statistical experiments
satisfying Assumption 3.1. We shall consider below (without special warning)
the case (3.1) of Assumption 3.1 only. Denote
\[
\tilde Z_n(u) =
\begin{cases} e^{pu} , & u < \tau_n , \\ 0 , & u > \tau_n . \end{cases}
\]
Define now the measure $\tilde P_{\theta,u}$ on $A^n$ by the formula
\[
\tilde P_{\theta,u}(A) = \int_A \tilde Z_n(u)\,dP_\theta , \qquad A \in A^n .
\]
Theorem 3.1. Under Assumption 3.1, for every $u > 0$,
\[
(3.3) \qquad \int_{X^n} |dP_{\theta,u} - d\tilde P_{\theta,u}| \to 0 ,
\quad n \to \infty .
\]

Proof. For every fixed $u$ both sequences $\{Z_n(u)\}$, $\{\tilde Z_n(u)\}$
tend in distribution to the random variable
\[
Z(u) =
\begin{cases} e^{pu} , & u \leq \tau , \\ 0 , & u > \tau , \end{cases}
\]
where $P\{\tau > y\} = e^{-py}$. Further, $E_\theta \tilde Z_n(u) =
e^{pu}\,P_\theta\{\tau_n > u\} \to 1$ for $u \geq 0$, and by Lemma 2.1 the
random variables $Z_n(u)$, $\tilde Z_n(u)$ are uniformly integrable with
respect to $P_\theta$, so that
\[
(3.4) \qquad E_\theta|Z_n(u) - \tilde Z_n(u)| \to 0 , \quad n \to \infty .
\]
Next, let $P^{(r)}_{\theta,u}$, $P^{(s)}_{\theta,u}$ denote the regular and
the singular parts of $P_{\theta,u}$ with respect to $P_\theta$. Then for
$u > 0$,
\[
P^{(s)}_{\theta,u}\{X^n\} \leq 1 - P^{(r)}_{\theta,u}\{\tau_n > u\} =
1 - E_\theta\{Z_n(u)\,\chi(\tau_n > u)\} \to 0 , \quad n \to \infty .
\]
Therefore,
\[
\int_{X^n} |dP_{\theta,u} - d\tilde P_{\theta,u}| \leq
E_\theta|Z_n(u) - \tilde Z_n(u)| + P^{(s)}_{\theta,u}\{X^n\} \to 0 .
\]
Theorem 3.2. Let us assume that (3.1) holds. Consider a sequence $\{t_n\}$
of estimates of the parameter $\theta$ satisfying
\[
(3.5) \qquad P_{\theta,u}\{a(n)(t_n - \theta - u/a(n)) < y\} \to F(y)
\quad \text{for every } u \in R^1
\]
in continuity points of some distribution function $F(y)$, $y \in R^1$. Then
we have
\[
F = H_p * G ,
\]
where $H_p$ is the exponential distribution on $(0,\infty)$ with parameter
$p$, i.e. $1 - H_p(y) = e^{-py}$, $y > 0$, and $G$ is a certain distribution
function on $R^1$.

Proof. Let $f(s)$ denote the characteristic function of $F$, write
$\Delta_n = a(n)(t_n - \theta)$, and rewrite (3.5), using (3.3) and the
definition of $\tilde P_{\theta,u}$, in the form
\[
f(s)\,e^{ius - pu} = \int_{X^n} e^{is\Delta_n}\,\chi(\tau_n > u)\,dP_\theta
+ o(1) , \quad n \to \infty , \qquad u > 0 .
\]
Multiplying both parts of the last equality by $\lambda e^{-\lambda u}$,
$\lambda = \mu + is$, $\mu > 0$, and integrating with respect to $u$ over
$(0,\infty)$, we find (by (3.5) with $u = 0$,
$\int e^{is\Delta_n}\,dP_\theta \to f(s)$)
\[
f(s)\,\frac{\lambda}{p + \lambda - is} =
\int_{X^n} e^{is\Delta_n}\,(1 - e^{-\lambda\tau_n})\,dP_\theta + o(1) =
f(s) - \int_{X^n} e^{is\Delta_n - \lambda\tau_n}\,dP_\theta + o(1) .
\]
Now we can let $\mu \to 0$, and then we have
\[
f(s) = \frac{p}{p - is}\,\lim_n \int_{X^n}
e^{is(\Delta_n - \tau_n)}\,dP_\theta =
\frac{p}{p - is} \int_{-\infty}^{\infty} e^{isy}\,dG(y) =
\frac{p}{p - is}\,g(s) ,
\]
where $g(s)$ is the characteristic function of $G$. Because $p/(p - is)$ is
the characteristic function of $H_p$, the theorem is proved.
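The convolution structure $F = H_p * G$ can be seen directly in the
shifted-exponential example, where $p = 1$ and $\tau_n = n(\min_j X_j -
\theta)$ is distributed exactly as $H_1$. The estimate $t_n = \min_j X_j -
c/n$ below is a hypothetical choice of ours used only to produce a
nondegenerate shift $G = \delta_{-c}$; the check is a Monte Carlo sketch:

```python
import math
import random

random.seed(3)

# Illustration of ours (not from the paper): in the shifted-exponential model
# f(x; theta) = exp(-(x - theta)), x > theta, we have p = 1 and
# tau_n = n(min_j X_j - theta) is distributed exactly as Exp(1).  For the
# (hypothetical) estimate t_n = min_j X_j - c/n,
#   Delta_n = n(t_n - theta) = tau_n - c,
# so the limit law F must be H_1 * delta_{-c}, i.e.
#   F(y) = 1 - e^{-(y + c)}  for y > -c,  and 0 otherwise.
C = 0.4

def delta_n(n=300, theta=0.0):
    m = theta + min(random.expovariate(1.0) for _ in range(n))
    return n * (m - theta) - C

def empirical_cdf(y, reps=3000):
    return sum(delta_n() < y for _ in range(reps)) / reps

def limit_cdf(y):
    return 1.0 - math.exp(-(y + C)) if y > -C else 0.0

for y in (0.0, 0.5, 1.0):
    print(y, empirical_cdf(y), limit_cdf(y))
```

The empirical distribution of $\Delta_n$ matches the convolution
$H_1 * \delta_{-c}$ to within Monte Carlo error.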
Suppose that the loss resulting from replacement of the true value of the
parameter $\theta$ by its estimate $t_n$ is $w(a(n)(t_n - \theta))$, where
$w(\cdot)$ is a given function. The mean loss, the risk function, is then
\[
(3.6) \qquad E_\theta^{(n)}\,w(a(n)(t_n - \theta)) .
\]
We shall assume below that the function $w$ in (3.6) satisfies the following
conditions:
\[
(3.7) \qquad w(y) = w(|y|) ; \quad w(y) \uparrow ,\ y \geq 0 ; \quad
w(y) \geq 0 ,\ w \not\equiv \text{const} ;
\]
\[
(3.8) \qquad \int_0^\infty w(y)\,e^{-\varepsilon y}\,dy < \infty , \qquad
\varepsilon > 0 .
\]
Theorem 3.3. Under Assumption 3.1, any sequence of estimates $\{t_n\}$
satisfies (for the $w$ of (3.7), (3.8))
\[
(3.9) \qquad \lim_{\delta \to 0}\ \liminf_{n \to \infty}\
\sup_{|\theta - u| < \delta} E_u^{(n)}\{w(a(n)(t_n - u))\} \geq
\min_y\ p \int_{-\infty}^0 w(y - u)\,e^{pu}\,du .
\]

Proof. Denote by $w_a(y)$ a truncated version of $w$:
\[
w_a(y) = \min\{w(y), a\} .
\]
We have for any $b < \delta\,a(n)$
\[
\sup_{|u - \theta| < \delta} E_u^{(n)}\{w(a(n)(t_n - u))\} \geq
\frac{a(n)}{b} \int_\theta^{\theta + b/a(n)}
E_u^{(n)}\{w(a(n)(t_n - u))\}\,du =
\frac{1}{b} \int_0^b E_{\theta,u}\{w(\Delta_n - u)\}\,du ,
\]
where we denote $a(n)(t_n - \theta) = \Delta_n$. Further, by Theorem 3.1,
for $n \to \infty$,
\[
\frac{1}{b} \int_0^b E_{\theta,u}\{w(\Delta_n - u)\}\,du \geq
\frac{1}{b} \int_0^b E_{\theta,u}\{w_a(\Delta_n - u)\}\,du =
\frac{1}{b}\,E_\theta \int_0^b w_a(\Delta_n - u)\,\tilde Z_n(u)\,du + o(1)
\]
\[
= \frac{1}{b}\,E_\theta \int_0^{\min(b,\tau_n)}
w_a(\Delta_n - u)\,e^{pu}\,du + o(1) \geq
\frac{1}{b}\,E_\theta\Big\{\chi(\tau_n < b)
\int_0^{\tau_n} w_a(\Delta_n - u)\,e^{pu}\,du\Big\} + o(1)
\]
\[
(3.10) \qquad \geq \min_y \int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du \cdot
\frac{1}{b}\,E_\theta\{\chi(\tau_n < b)\,e^{p\tau_n}\} - a/bp + o(1) .
\]
Since $\tau_n$ converges in distribution to the exponential law with
parameter $p$,
\[
\frac{1}{b}\,E_\theta\{\chi(\tau_n < b)\,e^{p\tau_n}\} \to
\frac{1}{b} \int_0^b e^{py}\,p\,e^{-py}\,dy = p , \quad n \to \infty ,
\]
and we have
\[
\liminf_{n \to \infty}\ \sup_{|\theta - u| < \delta}
E_u^{(n)}\{w(a(n)(t_n - u))\} \geq
\min_y\ p \int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du - a/bp .
\]
Hence, letting here $b \to \infty$ and then $a \to \infty$, and noting that
under conditions (3.7), (3.8)
\[
\lim_{a \to \infty}\ \min_y\ p \int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du =
\min_y\ p \int_{-\infty}^0 w(y - u)\,e^{pu}\,du ,
\]
we obtain (3.9), and Theorem 3.3 is proved.
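For the absolute loss $w(y) = |y|$ the lower bound in (3.9) can be evaluated
in closed form (our computation, not the paper's): writing $y = -t$,
$t \geq 0$, the integral $r(y) = p\int_{-\infty}^0 |y-u|\,e^{pu}\,du$ equals
$t - 1/p + (2/p)e^{-pt}$, which is minimized at $t = (\ln 2)/p$ with minimal
value $(\ln 2)/p$. A quick numeric cross-check:

```python
import math

# Numeric check of the claim: for w(y) = |y| and p = 1 the function
#   r(y) = p * integral_{-inf}^{0} |y - u| e^{p u} du
# is minimized at y = -ln 2 with minimal value ln 2.

def risk(y, p=1.0, lo=-40.0, steps=4000):
    # midpoint rule for p * integral_{lo}^{0} |y - u| e^{p u} du; the tail
    # below u = lo contributes a negligible amount of order e^{p * lo}
    h = -lo / steps
    s = 0.0
    for i in range(steps):
        u = lo + (i + 0.5) * h
        s += abs(y - u) * math.exp(p * u)
    return p * s * h

ys = [-2.0 + 0.01 * k for k in range(401)]   # grid on [-2, 2]
y_star = min(ys, key=risk)
print(y_star, risk(y_star))                  # close to -ln 2 and ln 2
```

The grid minimizer agrees with $-(\ln 2)/p$ and the minimal risk with
$(\ln 2)/p$ up to the grid and quadrature resolution.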
Return to the case of independent identically distributed observations
satisfying conditions I-V. We have noted already that under conditions I-V
Assumption 3.1 is always fulfilled with $a(n) = n$. Define now the sequence
of estimates $\hat t_n = \hat t_n(w)$ in such a way that
\[
\int_\Theta w(\hat t_n - u) \prod_{1}^{n} f(X_j;u)\,du =
\min_y \int_\Theta w(y - u) \prod_{1}^{n} f(X_j;u)\,du .
\]
It is easy to deduce from [5] that under wide conditions upon $f$ and $w$
(a little more restrictive than I-V and (3.7), (3.8)) the estimates
$\hat t_n$ have the following properties:

(i) the limit distribution of $n(\hat t_n - \theta)$ coincides with the
distribution of $\tau + y_w$, where $y_w$ is the point of minimum of
$p \int_{-\infty}^0 w(y - u)\,e^{pu}\,du$ and $P\{\tau > y\} = e^{-py}$,
$y > 0$;

(ii) $\lim_n E_\theta\,w(n(\hat t_n - \theta)) =
p \int_{-\infty}^0 w(y_w - u)\,e^{pu}\,du$.

So from the point of view of Theorems 3.2, 3.3 the estimates $\hat t_n(w)$
are asymptotically "good" estimates with respect to the loss function $w$.
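In the shifted-exponential example these properties can be seen explicitly.
With a flat prior and $w(y) = |y|$ the Bayes estimate is the posterior
median: the posterior is proportional to $e^{nu}$ on $(-\infty, \min_j X_j]$,
so $\hat t_n = \min_j X_j - (\ln 2)/n$, hence $n(\hat t_n - \theta) = \tau_n
- \ln 2$, in accordance with (i) ($y_w = -\ln 2$ for $p = 1$), and the risk
tends to $\ln 2$, in accordance with (ii). A Monte Carlo sketch of ours, with
arbitrary sample sizes:

```python
import math
import random

random.seed(7)

# Shifted-exponential model f(x; theta) = exp(-(x - theta)), x > theta,
# where p = 1.  With a flat prior and loss w(y) = |y| the Bayes estimate
# is the posterior median  min_j X_j - (ln 2)/n.

def bayes_estimate(xs):
    return min(xs) - math.log(2.0) / len(xs)

def mc_risk(n=300, reps=6000, theta=1.3):
    # Monte Carlo estimate of E_theta |n (t_hat_n - theta)|
    tot = 0.0
    for _ in range(reps):
        xs = [theta + random.expovariate(1.0) for _ in range(n)]
        tot += abs(n * (bayes_estimate(xs) - theta))
    return tot / reps

r = mc_risk()
print(r, math.log(2.0))   # the risk should be close to ln 2
```

The simulated risk agrees with the limiting value $(\ln 2)/p = \ln 2$ to
within Monte Carlo error, matching the lower bound of Theorem 3.3.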
4. The case of sequential estimation.

We continue to consider the sequence $\{X_j\}$ of independent identically
distributed observations satisfying the conditions I-V. Assume we are given:
1) a stopping time $\sigma$; 2) a sequence of statistics $\{t_n\}$, $t_n =
t_n(X_1, \ldots, X_n)$. As an estimate of the parameter $\theta$ we use the
random variable $t_\sigma(X_1, \ldots, X_\sigma)$. We call a pair $d =
[\{t_n\}, \sigma]$ a sequential estimation plan. We want to prove here that
from the point of view of Theorem 3.3 sequential schemes are not better than
fixed sample schemes.
Theorem 4.1. Denote by $V_n$ the collection of all sequential plans $d =
[\{t_n\}, \sigma]$ with $E_\theta\,\sigma \leq n$, $\theta \in \Theta$. Under
the conditions I-V,
\[
(4.1) \qquad \lim_{\delta \to 0}\ \liminf_{n \to \infty}\ \inf_{V_n}\
\sup_{|\theta - u| \leq \delta} n^\alpha\,E_u|t_\sigma - u|^\alpha \geq
\min_y\ p \int_{-\infty}^0 |y - u|^\alpha\,e^{pu}\,du , \qquad \alpha > 0 .
\]

Proof. We will at first show that we need consider only such plans $d \in
V_n$ for which
\[
(4.2) \qquad P_u\{\sigma \geq \varepsilon n\} = 1 , \qquad u \in \Theta ,\
\varepsilon > 0 .
\]

Lemma 4.1. For any $a > 0$ there exist a positive number $\varepsilon =
\varepsilon(a) > 0$ and a sequence of plans $d_n = [\{t_k^{(n)}\},
\sigma^{(n)}]$ such that $P_u\{\sigma^{(n)} \geq \varepsilon n\} = 1$,
$u \in \Theta$, and
\[
\lim_{\delta \to 0}\ \limsup_{n \to \infty}\
\sup_{|\theta - u| \leq \delta} n^\alpha\,
E_u|t^{(n)}_{\sigma^{(n)}} - u|^\alpha \leq q_\alpha + a ,
\]
where $q_\alpha$ denotes the left-hand side of (4.1).

The proof of Lemma 4.1 coincides exactly with the proof of Lemma 2.5 of [7],
and we omit it.
Lemma 4.2. The following relation holds:
\[
E_\theta\{\tilde Z_n(u) \mid X_1, \ldots, X_k\} =
\tilde Z_k(uk/n)\,(1 + \rho(k,n,u)) ,
\]
where
\[
\sup_{0 < u < H}\ \max_{1 \leq k \leq n} |\rho(k,n,u)| \to 0 , \quad
n \to \infty .
\]

Proof. By the definition of $\tilde Z_n(u)$,
\[
E_\theta\{\tilde Z_n(u) \mid X_1, \ldots, X_k\} =
e^{pu}\,E_\theta\{\chi(\tau_n > u) \mid X_1, \ldots, X_k\} =
e^{pu} \prod_{j=1}^{k} \chi(T_j > u/n)
\prod_{j=k+1}^{n} E_\theta\,\chi(T_j > u/n)
\]
\[
= e^{pu}\,\chi(\tau_k > uk/n)\,
\Big(1 - P_\theta\Big\{\bigcup_{m=1}^{r}
\{x_m(\theta + u/n) > X_j > x_m(\theta)\}\Big\}\Big)^{n-k} =
\tilde Z_k(uk/n)\,(1 + \rho(k,n,u)) .
\]
The lemma is proved.
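The error factor in Lemma 4.2 can be evaluated in closed form for the shifted
half-normal density $f(x;\theta) = \sqrt{2/\pi}\,e^{-(x-\theta)^2/2}$,
$x > \theta$ (an illustration of ours with a single jump at $x_1(\theta) =
\theta$ and $p(\theta) = \sqrt{2/\pi}$): there
$E_\theta\,\chi(T_1 > u/n) = 1 - \operatorname{erf}((u/n)/\sqrt{2})$ exactly,
so $\rho(k,n,u)$ can be computed directly. A quick numeric check that
$\max_k |\rho(k,n,u)|$ vanishes, roughly like $1/n$:

```python
import math

# Half-normal example of ours: one jump at x_1(theta) = theta, with
# p(theta) = sqrt(2/pi); E_theta chi(T_1 > u/n) = 1 - erf((u/n)/sqrt(2)).
P = math.sqrt(2.0 / math.pi)

def rho(k, n, u):
    surv = 1.0 - math.erf((u / n) / math.sqrt(2.0))  # P{X_1 > theta + u/n}
    # e^{pu} = e^{puk/n} * e^{pu(n-k)/n}; the factor multiplying
    # Z~_k(uk/n) is therefore e^{pu(n-k)/n} * surv^(n-k) = 1 + rho(k,n,u)
    return math.exp(P * u * (n - k) / n) * surv ** (n - k) - 1.0

for n in (100, 1000, 10000):
    worst = max(abs(rho(k, n, 1.0)) for k in range(1, n + 1))
    print(n, worst)    # decreases roughly like 1/n
```

The worst case is attained at small $k$, where the largest number of factors
$E_\theta\,\chi(T_j > u/n)$ is replaced by the exponential approximation.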
Let now $d \in V_n$. We may (and will) suppose by Lemma 4.1 that $d$
satisfies (4.2). Let us fix also two positive numbers $N$ and $b$, $b > N$;
we specify the choice of the numbers $\varepsilon$, $N$, $b$ later. Using
Lemma 4.2, we obtain after a slight modification of the first half of (3.10)
\[
(4.3) \qquad \sup_{|\theta - u| \leq \delta} n^\alpha\,
E_u|t_\sigma - u|^\alpha \geq
\Big(\min_y\ p \int_{-\infty}^0 |y - v|^\alpha\,e^{pv}\,dv\Big)
\sum_{k = \varepsilon n}^{Nn} g(k/n)\,\psi_k + o(1) ,
\]
where $g(u) = u^{-\alpha}$, $\beta = b - \sqrt{b}$, and
\[
\psi_k = \frac{Nn}{\beta p k}\,
E_\theta\{\chi(\sigma = k)\,\chi(\tau_k \leq bk/Nn)\,(e^{p\tau_k} - 1)\} .
\]
Hence, to prove the theorem it is sufficient to show that for every $a > 0$
it is possible to choose $N$, $b$, $\varepsilon$ in such a way that
\[
(4.4) \qquad \liminf_{n \to \infty}
\sum_{k = \varepsilon n}^{Nn} g(k/n)\,\psi_k \geq
(1 - a)\Big(\frac{1-a}{1+a}\Big)^{\alpha} .
\]
To estimate the left side of (4.4) we use the following results.

Lemma 4.3. For any $a > 0$ there exists such a choice of the numbers
$0 < \varepsilon < N < b$ in (4.3) that
\[
(4.5) \qquad \sum_k \psi_k \geq 1 - a , \qquad
\sum_k \frac{k}{n}\,\psi_k \leq 1 + a .
\]

Lemma 4.4. Let us suppose we are given numbers $\psi_k \geq 0$ which satisfy
the conditions (4.5), and let $g(u)$ be a convex decreasing function of
$u > 0$. Then
\[
(4.6) \qquad \sum_k g(k/n)\,\psi_k \geq
(1 - a)\,g\Big(\frac{1+a}{1-a}\Big) ,
\]
and hence ($g(u) = u^{-\alpha}$)
\[
(4.7) \qquad \sum_k (k/n)^{-\alpha}\,\psi_k \geq
(1 - a)\Big(\frac{1-a}{1+a}\Big)^{\alpha} .
\]

The inequality (4.4), and hence the assertion of the theorem, is a simple
consequence of the relations (4.5) and (4.6). The last of them follows with
ease from Jensen's inequality. Namely,
\[
\sum_k g(k/n)\,\psi_k \geq \Big(\sum_k \psi_k\Big)\,
g\Big(\frac{\sum_k (k/n)\psi_k}{\sum_j \psi_j}\Big) \geq
(1 - a)\,g\Big(\frac{1+a}{1-a}\Big) ,
\]
and (4.6) is proved.
To prove the first inequality (4.5), note that $\{\tau_\sigma > b\sigma/Nn\}
\subseteq \bigcap_{j \leq \varepsilon n}\{T_j > b/Nn\}$ because of (4.2), so
that for all sufficiently large $n$
\[
\sup_{0 \leq u \leq \beta} P_{\theta+u/Nn}\{\tau_\sigma > b\sigma/Nn\} \leq
\sup_{0 \leq u \leq \beta}
\big(P_{\theta+u/Nn}\{T_1 > b/Nn\}\big)^{[\varepsilon n]} \leq
\exp\{-p\varepsilon\sqrt{b}/2N\} ,
\]
since for $u \leq \beta = b - \sqrt{b}$ the jump points corresponding to the
parameter value $\theta + u/Nn$ are at distance of order at least
$\sqrt{b}/Nn$ from those corresponding to $\theta + b/Nn$. Using this last
inequality and Tchebishev's inequality ($P_u\{\sigma > Nn\} \leq
E_u\sigma/Nn \leq 1/N$), we obtain
\[
\inf_{0 \leq u \leq \beta} P_{\theta+u/Nn}\{\sigma \leq Nn ,\
\tau_\sigma \leq b\sigma/Nn\} \geq
1 - \exp\{-p\varepsilon\sqrt{b}/2N\} - 1/N .
\]
Applying once more the arguments we used to derive (4.3), we find
\[
\inf_{0 \leq u \leq \beta} P_{\theta+u/Nn}\{\sigma \leq Nn ,\
\tau_\sigma \leq b\sigma/Nn\} \leq
\frac{1}{\beta} \int_0^\beta P_{\theta+u/Nn}\{\sigma \leq Nn ,\
\tau_\sigma \leq b\sigma/Nn\}\,du + o(1)
\]
\[
\leq \frac{1}{\beta} \sum_{k=\varepsilon n}^{Nn}
E_\theta\Big\{\chi(\sigma = k)\,\chi(\tau_k \leq bk/Nn)
\int_0^\beta \tilde Z_k(uk/Nn)\,du\Big\} + o(1) \leq
\Big(1 - \frac{1}{\sqrt{b}}\Big)^{-1} \sum_k \psi_k + o(1) .
\]
Hence,
\[
(4.8) \qquad \sum_k \psi_k \geq
\big(1 - \exp\{-p\varepsilon\sqrt{b}/2N\} - 1/N\big)
\big(1 - 1/\sqrt{b}\big) + o(1) .
\]
The estimate of $\sum_k k\psi_k$ may be obtained in the same way. At first,
since $E_{\theta+u/Nn}\,\sigma \leq n$, we have
\[
\sup_u E_{\theta+u/Nn}\{\sigma\,\chi(\tau_\sigma \leq b\sigma/Nn)\} \leq n .
\]
Therefore,
\[
n \geq \frac{1}{\beta} \int_0^\beta
E_{\theta+u/Nn}\{\sigma\,\chi(\tau_\sigma \leq b\sigma/Nn)\}\,du \geq
\frac{1}{\beta} \sum_{k=\varepsilon n}^{Nn} k\,
E_\theta\Big\{\chi(\sigma = k)\,\chi(\tau_k \leq bk/Nn)
\int_0^\beta \tilde Z_k(uk/Nn)\,du\Big\} + o(1)
\geq \sum_k k\psi_k - nN/bp + o(1) ,
\]
and
\[
(4.9) \qquad \sum_k k\psi_k \leq n(1 + N/bp) + o(1) .
\]
The inequalities (4.8) and (4.9) prove (4.5), because we can simultaneously
make $\exp\{-p\varepsilon\sqrt{b}/2N\}$, $N/b$, $1/N$ and $1/\sqrt{b}$ as
small as we want.
Remark. The theorem 4.1 is an analogue of the theorem 3.3 for the case
$w(x) = |x|^\alpha$, $\alpha > 0$. In fact, we proved a little more. Indeed,
for a function $w$ satisfying (3.7), (3.8) define
\[
g_w(\lambda) = \min_y\ p \int_{-\infty}^0
w\big(\lambda^{-1}(y - v)\big)\,e^{pv}\,dv .
\]
Then, like (4.3),
\[
\sup_{|\theta - u| \leq \delta} E_u\,w(n(t_\sigma - u)) \geq
\sum_k \psi_k\,g_w(k/n) + o(1) .
\]
If we suppose that $g_w(\lambda)$ is convex and continuous at $\lambda = 1$,
then by Lemma 4.4 we find for every $a > 0$
\[
\sum_k \psi_k\,g_w(k/n) \geq (1 - a)\,g_w\Big(\frac{1+a}{1-a}\Big) .
\]
Hence, Theorem 4.1 holds not only for $w(x) = |x|^\alpha$ but also for all
$w$ for which $g_w(\lambda)$ is convex and continuous. It will be so, for
example, if (i) $w(x)$ is convex and satisfies (3.7), (3.8); (ii)
\[
w(x) = \begin{cases} 1 , & |x| > 1 , \\ 0 , & |x| \leq 1 . \end{cases}
\]
REFERENCES

[1] Hájek, J. (1972), Local asymptotic minimax and admissibility in
    estimation. Proc. 6th Berkeley Symp., University of California Press.
    Vol. 1, pp. 175-194.

[2] LeCam, L. (1956), On the asymptotic theory of estimating and testing
    hypotheses. Proc. 3rd Berkeley Symp., University of California Press.
    Vol. 1, pp. 129-156.

[3] LeCam, L. (1960), Locally asymptotically normal families of
    distributions. Univ. California Publ. Statist., Vol. 3, pp. 27-98.

[4] LeCam, L. (1972), Limits of experiments. Proc. 6th Berkeley Symp.,
    University of California Press.

[5] Ibragimov, I.A., Has'minskii, R.Z. (1972), Asymptotic behavior of
    statistical estimates for samples with a discontinuous density.
    Mat. Sbornik, tom 87, No. 4 (English translation: Math. USSR Sbornik,
    Vol. 16, No. 4, pp. 573-606).

[6] Loève, M. (1963), Probability Theory. Van Nostrand, Princeton.

[7] Ibragimov, I.A., Has'minskii, R.Z. (1974), On a sequential estimation.
    Theor. Prob. and Appl., to appear (in Russian).

[8] Hájek, J. (1970), A characterization of limiting distributions of
    regular estimates. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete,
    Vol. 14, pp. 323-330.