*The paper was written while the second author was on leave from Leningrad State University, Leningrad.

AN EXAMPLE OF SINGULAR STATISTICAL EXPERIMENTS ADMITTING LOCAL EXPONENTIAL APPROXIMATION

by

R.Z. Hasminskii
Institute of Problems of Information Transmission
Acad. Sci. USSR, Moscow

I.A. Ibragimov*
Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 907
February, 1974
SUMMARY.
In this paper the authors study the asymptotic minimax properties
of statistical estimates constructed from independent observations with a
density having jumps to zero.
1. Introduction. Statement of problem. Conditions.

Consider a sequence of identically distributed independent random observations $X_1, X_2, \ldots, X_n$ with common distribution $F_\theta$ depending on an unknown parameter $\theta$. We shall suppose that the $X_j$ are real random variables and that the distributions $F_\theta$ are continuous with respect to Lebesgue measure. We denote by $f(x;\theta)$ the density of $F_\theta$ relative to Lebesgue measure.

In the "regular case" of smooth dependence of $f(x;\theta)$ on $\theta$, the joint density $\prod_{j=1}^n f(X_j;\theta)$, after proper normalization, may be approximated when $n \to \infty$ by a normal family (see [1], Appendix).
This idea of local asymptotic normality was developed in the important papers [2], [3] of L. LeCam. It was brilliantly explored in J. Hajek's paper [1].

In this paper we wish to investigate a singular case of a discontinuous density $f$, when nevertheless the joint density $\prod_{j=1}^n f(X_j;\theta)$ admits an approximation not by a normal but still by a simple exponential family. Such a more general concept of local asymptotic exponentiality was also introduced by LeCam; see, in particular, Example 3 of part 6 of [4], which is very close to the theme of this paper.
We now give precise statements of the restrictions which will be imposed on the density function $f(x;\theta)$.

I. The function $f(x;\theta)$ is defined and measurable in $(x,\theta)$ for $x \in R^1$, $\theta \in \Theta$, where the parametric set $\Theta$ is an open interval on the real line. For any points $\theta, \theta' \in \Theta$, $\theta \neq \theta'$,

$\int_{-\infty}^{\infty} |f(x;\theta) - f(x;\theta')|\,dx > 0$.
II. The function $f(x;\theta)$, defined for $\theta \in \Theta$, is absolutely continuous in $\theta$ for fixed $x$ in each of the regions $x < x_1(\theta)$, $x_1(\theta) < x < x_2(\theta)$, $\ldots$, $x > x_r(\theta)$, where $x_1(\theta) < x_2(\theta) < \cdots < x_r(\theta)$ are monotone differentiable pairwise nonintersecting curves and

$0 < |\dot x_j(\theta)| < \infty, \qquad \operatorname{sign} \dot x_j(\theta) = \operatorname{sign} \dot x_k(\theta)$.

III. The following limits exist uniformly in $\theta$ for $\theta$ in any compact subset of $\Theta$:

$p_k(\theta) = \lim_{x \downarrow x_k(\theta)} f(x;\theta), \qquad q_k(\theta) = \lim_{x \uparrow x_k(\theta)} f(x;\theta), \qquad k = 1, \ldots, r$,

where the functions $p_k(\theta)$, $q_k(\theta)$ are continuous on $\Theta$ and

$\sum_{k=1}^{r} |p_k(\theta) - q_k(\theta)| > 0$.
IV. Let $f'(x;\theta)$ denote the derivative of the absolutely continuous component of $f(x;\theta)$. The integrals

$\int_{-\infty}^{\infty} f'(x;\theta)\,dx, \qquad \int_{-\infty}^{\infty} |f'(x;\theta)|\,dx$

are continuous functions of $\theta$.

The asymptotic behavior of the estimates of the parameter $\theta$ under the conditions stated was investigated in our paper [5]. Here we add one more condition:

V. Either all $q_k(\theta) \equiv 0$ or all $p_k(\theta) \equiv 0$.
The exponential distribution with density $\exp\{-(x-\theta)\}$, $x > \theta$, gives an example of a density which satisfies all the conditions I-V. The uniform distribution on the interval $[\theta - \tfrac{1}{2},\, \theta + \tfrac{1}{2}]$ satisfies the conditions I-IV but not V.
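As a quick numerical illustration of condition V (ours, not part of the paper; the helper-function names are hypothetical), one can evaluate the one-sided limits of these two densities at their jump points: the shifted exponential jumps to zero on the same side at its single jump, while the uniform density jumps to zero on opposite sides at its two endpoints, so condition V fails.

```python
import math

def exp_density(x, theta):
    """Shifted exponential density exp{-(x - theta)}, x > theta."""
    return math.exp(-(x - theta)) if x > theta else 0.0

def unif_density(x, theta):
    """Uniform density on [theta - 1/2, theta + 1/2]."""
    return 1.0 if theta - 0.5 <= x <= theta + 0.5 else 0.0

def one_sided_limits(f, theta, x_jump, h=1e-9):
    """Approximate q = lim_{x -> x_jump-} f and p = lim_{x -> x_jump+} f."""
    return f(x_jump - h, theta), f(x_jump + h, theta)

theta = 0.0
# Exponential: single jump at x_1(theta) = theta with q_1 = 0  -> V holds.
q1, p1 = one_sided_limits(exp_density, theta, theta)
# Uniform: the two jumps vanish on opposite sides               -> V fails.
qa, pa = one_sided_limits(unif_density, theta, theta - 0.5)
qb, pb = one_sided_limits(unif_density, theta, theta + 0.5)
print(q1, round(p1, 6))
print((qa, pa), (qb, pb))
```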
We prove in part 2 that under the conditions I-V the normalized likelihood ratio

$Z_n(u) = \prod_{j=1}^{n} f(X_j;\theta + u/n) \Big/ \prod_{j=1}^{n} f(X_j;\theta)$

admits a good approximation by an exponential family (see Th. 2.1).
In part 3 we investigate general statistical experiments which admit an approximation by one-sided exponential families. Our results are analogous to those which Hajek discovered for locally asymptotically normal families (see [1], [8]).

In part 4 we return to the case of independent identically distributed observations and establish some asymptotic minimax properties of Bayes estimates among all sequential estimates $[\{t_n\}, \sigma]$ with $E_\theta \sigma \le n$. Here and below $P_\theta\{\cdot\}$ and $E_\theta\{\cdot\}$ denote the probability and expectation generated by the sequence of observations when $\theta$ is the true value of the parameter.
We also use below the following notation. If $A$ is an event, then $\chi(A)$ always denotes the indicator of $A$. For example, if $\xi$ is a random variable, $\chi\{\xi > x\} = 1$ if $\xi > x$ and $= 0$ if $\xi \le x$.

2. Asymptotic exponentiality of the normalized likelihood ratio.
Define the random function $Z_n(u)$ by

$Z_n(u) = \prod_{j=1}^{n} \frac{f(X_j;\theta + u/n)}{f(X_j;\theta)}$,

where $\theta$ is the "true" value of the parameter. We investigate here the asymptotic behavior of $Z_n(u)$ when $n \to \infty$. We will consider below only the case $\operatorname{sign} \dot x_k(\theta) > 0$, $q_k(\theta) \equiv 0$. The three other cases reduce to this one by the mappings $(x,\theta) \to (x,-\theta)$, $(x,\theta) \to (-x,\theta)$, $(x,\theta) \to (-x,-\theta)$.

Define the random moments $\tau_n$ as

$\tau_n = \inf\{u : x_k(\theta + u/n) > X_j > x_k(\theta) \text{ for some } k = 1, \ldots, r,\ j = 1, \ldots, n\}$.
Let further $p(\theta) = p = \sum_k \dot x_k(\theta)\,p_k(\theta)$, and let the random function $\bar Z_n(u)$ be defined by

$\bar Z_n(u) = e^{pu}$ for $u < \tau_n$, and $\bar Z_n(u) = 0$ for $u > \tau_n$.
Theorem 2.1. Assume that the conditions I to V are satisfied (and $\operatorname{sign} \dot x_k > 0$, $q_k \equiv 0$). Then the following relations hold:

(2.1) $\lim_{n\to\infty} P_\theta\{\tau_n > u\} = e^{-pu}$ for $u \ge 0$, and $= 1$ for $u \le 0$;

(2.2) $Z_n(u) - \bar Z_n(u) \to 0$ in probability as $n \to \infty$;

and moreover

(2.3) $\sup_{|u| \le H} E_\theta |Z_n(u) - \bar Z_n(u)| \to 0$, $n \to \infty$, for every $0 < H < \infty$.

Proof. The relation (2.1) follows from the Poisson approximation of the binomial distribution. The proof of (2.3) is broken down into a few lemmas.
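Relation (2.1) can be illustrated by simulation for the shifted exponential density, where $p = 1$ and $\tau_n = n(\min_j X_j - \theta)$. The sketch below (ours, not part of the paper) compares the empirical tail of $\tau_n$ with $e^{-pu}$.

```python
import math
import random

# Monte Carlo check of (2.1) for f(x;theta) = exp{-(x-theta)}, x > theta,
# where x_1(theta) = theta, p = 1, and tau_n = n * (min_j X_j - theta).
random.seed(3)
theta, n, reps = 2.0, 100, 10000

def tau_n():
    xs = [theta + random.expovariate(1.0) for _ in range(n)]
    return n * (min(xs) - theta)

taus = [tau_n() for _ in range(reps)]
for u in (0.5, 1.0, 2.0):
    emp = sum(t > u for t in taus) / reps      # empirical P{tau_n > u}
    print(u, round(emp, 3), round(math.exp(-u), 3))
```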
Lemma 2.1. If $\{\xi_n\}$ is a sequence of non-negative random variables converging in distribution to a random variable $\xi$, then

(i) $\liminf_n E|\xi_n| \ge E|\xi|$;

(ii) if $E|\xi_n| \to E|\xi| < \infty$, then the $|\xi_n|$ are uniformly integrable.

For the proof see [6], p. 183.

Define now the random processes $\eta_j(u)$ and the random variables $l_j$, $T_j$ by

$\eta_j(u) = \frac{1}{u}\left[\frac{f(X_j;\theta+u)}{f(X_j;\theta)} - 1\right], \qquad l_j = \frac{f'(X_j;\theta)}{f(X_j;\theta)}$,

$T_j = \inf\{v : x_k(\theta+v) > X_j > x_k(\theta) \text{ for some } k = 1, \ldots, r\}$.

Evidently, $\tau_n = n \min(T_1, \ldots, T_n)$.
Lemma 2.2. Under the conditions I-V

(2.4) $\lim_{u\to 0} E_\theta |\chi(T_j > u)\,\eta_j(u) - l_j| = 0$.

Proof. Evidently, $\lim_{u\to 0} \chi(T_j > u)\,\eta_j(u) = l_j$ with $P_\theta$-probability 1.
Further,

$E_\theta\{\chi(T_j > u)\,|\eta_j(u)|\} \le \frac{1}{u}\int_\theta^{\theta+u} dz \int_{-\infty}^{\infty} |f'(x;z)|\,dx \to \int_{-\infty}^{\infty} |f'(x;\theta)|\,dx = E_\theta|l_j|$.

Because of Lemma 2.1 (i), it follows that

$\lim_{u\to 0} E_\theta\{\chi(T_j > u)\,|\eta_j(u)|\} = \int_{-\infty}^{\infty} |f'(x;\theta)|\,dx = E_\theta|l_j|$.

According to (ii) of Lemma 2.1, the $\chi(T_j > u)\,|\eta_j(u)|$ are uniformly integrable, and therefore (2.4) holds.
Lemma 2.3. If the conditions I-V are satisfied, then for every $\delta > 0$

(2.5) $P_\theta\{T_j > u,\ |\eta_j(u)| > \delta/|u|\} = o(|u|)$, $u \to 0$,

$P_{\theta+u}\{T_j > u,\ |\eta_j(u)| > \delta/|u|,\ f(X_j;\theta) \neq 0\} = o(|u|)$, $u \to 0$.
Proof. From (2.4) it follows that

$P_\theta\{T_j > u,\ |\eta_j(u)| > \delta/|u|\} \le P_\theta\{T_j > u,\ |\eta_j(u) - l_j| > \delta/2|u|\} + P_\theta\{|l_j| > \delta/2|u|\} \le$
$\le \frac{2|u|}{\delta}\,E_\theta\{\chi(T_j > u)\,|\eta_j(u) - l_j|\} + \frac{2|u|}{\delta}\,E_\theta\{\chi(|l_j| > \delta/2|u|)\,|l_j|\} = o(|u|)$,

since both expectations tend to zero as $u \to 0$: the first by (2.4), the second because $E_\theta|l_j| < \infty$. As for the second assertion, if $f(X_j;\theta) \neq 0$ then $f(X_j;\theta+u)/f(X_j;\theta) = 1 + u\,\eta_j(u)$, so that

$P_{\theta+u}\{T_j > u,\ |\eta_j(u)| > \delta/|u|,\ f(X_j;\theta) \neq 0\} \le$
$\le P_\theta\{T_j > u,\ |\eta_j(u)| > \delta/|u|\} + |u|\,E_\theta\{\chi(T_j > u,\ |\eta_j(u)| > \delta/|u|)\,|\eta_j(u)|\} = o(|u|)$.
Lemma 2.4. Let the conditions I-V be fulfilled. Then

$E_\theta l_j = \int_{-\infty}^{\infty} f'(x;\theta)\,dx = \sum_k \dot x_k(\theta)\,p_k(\theta) = p$.

Proof. The direct calculation.
The next two lemmas give the necessary estimates of the deviation of $Z_n(u)$ from $\bar Z_n(u)$.

Lemma 2.5. Under the conditions I-V, for every $0 < H < \infty$,

(2.6) $\sup_{|u|\le H} E_\theta\{|Z_n(u) - e^{pu}|\,\chi(\tau_n > u)\} \to 0$, $n \to \infty$.

Proof. Fix a small positive number $\delta < \tfrac{1}{2}$. If all the differences

$\Big|\frac{f(X_j;\theta+u/n)}{f(X_j;\theta)} - 1\Big| < \delta, \quad j = 1, 2, \ldots, n$,

then

$|\ln Z_n(u) - pu| \le \Big|\frac{u}{n}\sum_{1}^{n}(l_j - E_\theta l_j)\Big| + \frac{|u|}{n}\sum_{1}^{n}|\eta_j(u/n) - l_j| + 2\delta\,\frac{|u|}{n}\sum_{1}^{n}|\eta_j(u/n)|$.

Hence, uniformly in $|u| \le H$,

(2.7) $E_\theta\Big\{|Z_n(u) - e^{pu}|\,\chi\Big(\tau_n > u,\ \bigcap_{1}^{n}\Big\{\Big|\frac{f(X_j;\theta+u/n)}{f(X_j;\theta)} - 1\Big| < \delta\Big\}\Big)\Big\} \le$
$\le |u|\,e^{pu}\,E_\theta\Big\{\frac{1}{n}\Big|\sum_{1}^{n}(l_j - E_\theta l_j)\Big|\Big\} + |u|\,E_\theta\{|\eta_1(u/n) - l_1|\,\chi(T_1 > u/n)\} + 2\delta|u|\,E_\theta|\eta_1(u/n)| \le B\delta + o(1)$,

where $B$ is a constant. We used here the law of large numbers to estimate the first right-hand term, and Lemma 2.2 to estimate the second and third ones.
Further, by (2.5),

(2.8) $E_\theta\Big\{|Z_n(u) - e^{pu}|\,\chi\Big(\tau_n > u,\ \bigcup_{1}^{n}\Big\{\Big|\frac{f(X_j;\theta+u/n)}{f(X_j;\theta)} - 1\Big| > \delta\Big\}\Big)\Big\} \le$
$\le n\,P_{\theta+u/n}\{|\eta_1(u/n)| > \delta n/|u|,\ T_1 > u/n,\ f(X_1;\theta) \neq 0\} + e^{pu}\,n\,P_\theta\{|\eta_1(u/n)| > \delta n/|u|,\ T_1 > u/n\} = o(1)$, $n \to \infty$,

uniformly in $|u| \le H$. The inequalities (2.7), (2.8) give us the desired result (2.6).
Lemma 2.6. Under the conditions I-V

$\sup_{|u|\le H} E_\theta\{|Z_n(u) - \bar Z_n(u)|\,\chi(\tau_n < u)\} = o(1)$, $n \to \infty$.

Proof. We have

$E_\theta\{|Z_n(u) - \bar Z_n(u)|\,\chi(\tau_n < u)\} = E_\theta\{Z_n(u)\,\chi(\tau_n < u)\} \le P_{\theta+u/n}\{\tau_n < u\}$.

By the definition of $\tau_n$ the last probability is

$P_{\theta+u/n}\Big\{\bigcup_{j=1}^{n}\bigcup_{k=1}^{r}\{x_k(\theta+u/n) > X_j > x_k(\theta)\}\Big\} \le n\sum_{k=1}^{r}\int_{x_k(\theta)}^{x_k(\theta+u/n)} f(x;\theta+u/n)\,dx$.

Then, since $\lim_{x \uparrow x_k(\theta+u/n)} f(x;\theta+u/n) = 0$ uniformly in $u$ (conditions III and IV),

$\int_{x_k(\theta)}^{x_k(\theta+u/n)} f(x;\theta+u/n)\,dx \le o(1)\,|x_k(\theta+u/n) - x_k(\theta)| = o(1/n)$.

This last inequality proves the lemma. The assertions (2.2) and (2.3) are evident consequences of Lemmas 2.5, 2.6, and Theorem 2.1 is proved.
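For the shifted exponential density the statement of Theorem 2.1 is in fact exact rather than asymptotic: each factor of $Z_n(u)$ equals $e^{u/n}$ as long as no observation falls in $(\theta, \theta + u/n]$, and the product vanishes as soon as one does, so $Z_n(u) = \bar Z_n(u)$ with $p = 1$. The check below is our sketch, not part of the paper.

```python
import math
import random

# Exact identity Z_n(u) = e^{u} * chi(u < tau_n) for f(x;theta) = exp{-(x-theta)}.
random.seed(7)
theta, n = 0.0, 50

def f(x, th):
    return math.exp(-(x - th)) if x > th else 0.0

xs = [theta + random.expovariate(1.0) for _ in range(n)]
tau = n * (min(xs) - theta)                    # tau_n for this sample

def Z(u):
    num = den = 1.0
    for x in xs:
        num *= f(x, theta + u / n)
        den *= f(x, theta)
    return num / den

u_small, u_big = 0.5 * tau, 2.0 * tau          # one point on each side of tau_n
print(Z(u_small), math.exp(u_small))           # agree up to rounding
print(Z(u_big))                                # the product has a zero factor
```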
3. Locally asymptotically one-sided exponential families of distributions. Limiting distributions of regular estimates; local asymptotic minimax property of Bayesian estimates.

Let us consider (following Hajek [1]) a sequence of statistical experiments $\{X^n, A^n, P_\theta^n\}$, $n \ge 1$, where $\theta$ runs through an open set $\Theta \subseteq R^1$. Let $a(n) \uparrow \infty$ be a sequence of normalizing factors. Take a point $\theta \in \Theta$ and assume it to be the true value of the parameter. Denote by $Z_n(u)$ the Radon-Nikodym derivative of the absolutely continuous part of $P^n_{\theta+u/a(n)}$ with respect to $P^n_\theta$. For the sake of brevity we shall write below $P_\theta$ instead of $P^n_\theta$ and $P_{\theta,u}$ instead of $P^n_{\theta+u/a(n)}$. Note that the definition of $Z_n(u)$ is just the same as in Section 2.
Assumption 3.1. Assume that either

(3.1) $Z_n(u) = e^{pu} + o_p(1)$ for $u < \tau_n$, and $Z_n(u) = o_p(1)$ for $u > \tau_n$,

or

(3.2) $Z_n(u) = e^{-pu} + o_p(1)$ for $u > -\tau_n$, and $Z_n(u) = o_p(1)$ for $u < -\tau_n$,

where $p$ is a positive number, the positive random variables $\tau_n$ are $A^n$-measurable and satisfy $\lim_n P_\theta\{\tau_n > u\} = e^{-pu}$, and $o_p(1)$ denotes random variables which go to zero in probability when $n \to \infty$.

Theorem 2.1 of Section 2 gives us an example of statistical experiments satisfying Assumption 3.1. We shall consider below (without special warning) the case (3.1) of Assumption 3.1 only.
Denote

$\bar Z_n(u) = e^{pu}$ for $u < \tau_n$, and $\bar Z_n(u) = 0$ for $u > \tau_n$.

Define now the measure $\bar P_{\theta,u}$ on $A^n$ by the formula

$\bar P_{\theta,u}(A) = \int_A \bar Z_n(u)\,dP_\theta$.

Theorem 3.1. Under Assumption 3.1, for every $u > 0$,

(3.3) $\int_{X^n} |dP_{\theta,u} - d\bar P_{\theta,u}| \to 0$, $n \to \infty$.
Proof. For every fixed $u$ both sequences $\{Z_n(u)\}$, $\{\bar Z_n(u)\}$ tend in distribution to the random variable

$Z(u) = e^{pu}$ for $\tau > u$, and $Z(u) = 0$ for $\tau < u$,

where $P\{\tau > y\} = e^{-py}$. Further, $E_\theta \bar Z_n(u) = e^{pu}\,P_\theta\{\tau_n > u\} \to 1 = E\,Z(u)$, and by Lemma 2.1 the random variables $Z_n(u)$, $\bar Z_n(u)$ are uniformly integrable with respect to $P_\theta$, so that

(3.4) $E_\theta|Z_n(u) - \bar Z_n(u)| \to 0$, $n \to \infty$.

Next, let $P^{(r)}_{\theta,u}$, $P^{(s)}_{\theta,u}$ denote the regular and the singular parts of $P_{\theta,u}$ with respect to $P_\theta$. Then for $u > 0$

$P^{(s)}_{\theta,u}\{X^n\} \le 1 - P^{(r)}_{\theta,u}\{\tau_n > u\} = 1 - E_\theta\{Z_n(u)\,\chi(\tau_n > u)\} \to 0$, $n \to \infty$.

Therefore,

$\int_{X^n} |dP_{\theta,u} - d\bar P_{\theta,u}| \le E_\theta|Z_n(u) - \bar Z_n(u)| + P^{(s)}_{\theta,u}\{X^n\} \to 0$, $n \to \infty$.
Theorem 3.2. Let us assume that (3.1) holds. Consider a sequence $\{t_n\}$ of estimates of the parameter $\theta$ satisfying

(3.5) $P_{\theta,u}\{a(n)(t_n - \theta - u/a(n)) < y\} \to F(y)$ for every $u \in R^1$

in continuity points of some distribution function $F(y)$, $y \in R^1$. Then we have

$F = H_p * G$,

where $H_p$ is the exponential distribution on $(0,\infty)$ with parameter $p$, i.e. $1 - H_p(y) = e^{-py}$, $y > 0$, and $G$ is a certain distribution function in $R^1$.
Proof. Let $f(s)$ denote the characteristic function of $F$, and rewrite (3.5) in the form

$E_{\theta,u}\{\exp\{is\xi_n - isu\}\} = f(s) + o(1)$, where $\xi_n = a(n)(t_n - \theta)$.

We have from this and (3.3)

$f(s)\,e^{ius - pu} = \int_{X^n} e^{is\xi_n}\,\chi(\tau_n > u)\,dP_\theta + o(1)$, $n \to \infty$.

Multiplying both parts of the last equality by $e^{-\lambda u}$, $\lambda = \mu + iv$, $\mu > 0$, and integrating with respect to $u$ over $(0,\infty)$, we find

$\frac{f(s)}{p + \lambda - is} = \frac{1}{\lambda}\int_{X^n} e^{is\xi_n}\,(1 - e^{-\lambda\tau_n})\,dP_\theta + o(1) = \frac{f(s)}{\lambda} - \frac{1}{\lambda}\int_{X^n} e^{is\xi_n - \lambda\tau_n}\,dP_\theta + o(1)$.

Now we can choose $v = s$, $\mu \to 0$, and then we have

$f(s) = \frac{p}{p - is}\,\lim_n \int_{X^n} e^{is(\xi_n - \tau_n)}\,dP_\theta = \frac{p}{p - is}\int_{-\infty}^{\infty} e^{isy}\,dG(y) = \frac{p}{p - is}\,g(s)$,

where $g(s)$ is the characteristic function of $G$. Because $p/(p - is)$ is the characteristic function of $H_p$, the theorem is proved.
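The factor $p/(p - is)$ can be checked numerically as the characteristic function of $H_p$. The short sketch below (ours, not part of the paper) integrates $e^{isy}\,p\,e^{-py}$ over $(0,\infty)$ by the midpoint rule and compares the result with $p/(p - is)$.

```python
import cmath

# Midpoint-rule approximation of the characteristic function of H_p,
# the exponential distribution with density p * e^{-p y} on (0, infinity).
def char_Hp(s, p, upper=60.0, steps=200000):
    h = upper / steps
    total = 0j
    for k in range(steps):
        y = (k + 0.5) * h
        total += p * cmath.exp((1j * s - p) * y) * h
    return total

p, s = 1.0, 1.0
approx = char_Hp(s, p)
exact = p / (p - 1j * s)        # = 0.5 + 0.5j for p = s = 1
print(approx, exact)
```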
Suppose that the loss resulting from replacement of the true value of the parameter $\theta$ by its estimate $t_n$ is $w(a(n)(t_n - \theta))$, where $w(\cdot)$ is a given function. The mean loss, the risk function, is then

(3.6) $E^{(n)}_\theta\, w(a(n)(t_n - \theta))$.

We shall assume below that the function $w$ in (3.6) satisfies the following conditions:

(3.7) $w(y) = w(|y|)$; $w(y) \uparrow$, $y \ge 0$; $w(y) \ge 0$, $w \not\equiv \text{const}$;

(3.8) $\int_0^\infty w(y)\,e^{-\varepsilon y}\,dy < \infty$ for every $\varepsilon > 0$.
Theorem 3.3. Under Assumption 3.1, any sequence of estimates $\{t_n\}$ satisfies (for the $w$ of (3.7), (3.8))

(3.9) $\lim_{\delta\to 0}\liminf_{n\to\infty}\ \sup_{|\theta-u|<\delta} E^{(n)}_u\{w(a(n)(t_n - u))\} \ge \min_y\ p\int_{-\infty}^0 w(y-u)\,e^{pu}\,du$.

Proof. Denote by $w_a(y)$ a truncated version of $w$: $w_a(y) = \min\{w(y),\, a\}$.
We have for any $b < \delta\,a(n)$

$\sup_{|u-\theta|<\delta} E^{(n)}_u\{w(a(n)(t_n - u))\} \ge \frac{a(n)}{b}\int_\theta^{\theta+b/a(n)} E^{(n)}_u\{w(a(n)(t_n - u))\}\,du = \frac{1}{b}\int_0^b E_{\theta,u}\{w(\xi_n - u)\}\,du$,

where we denote $a(n)(t_n - \theta) = \xi_n$. Further, by Theorem 3.1, for fixed $b$

$\frac{1}{b}\int_0^b E_{\theta,u}\{w_a(\xi_n - u)\}\,du = \frac{1}{b}\,E_\theta\int_0^b w_a(\xi_n - u)\,\bar Z_n(u)\,du + o(1) = \frac{1}{b}\,E_\theta\int_0^{\min(b,\,\tau_n)} w_a(\xi_n - u)\,e^{pu}\,du + o(1)$, $n \to \infty$.
Further,

$\frac{1}{b}\,E_\theta\Big\{\int_0^{\tau_n} w_a(\xi_n - u)\,e^{pu}\,du\Big\} - a/bp + o(1) \ge \min_y \int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du \cdot \frac{1}{b}\,E_\theta\{\chi(\tau_n < b)\,e^{p\tau_n}\} - a/bp + o(1) =$
$= \min_y\ p\int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du - a/bp + o(1)$.

Hence, putting here $b = a^2$, we have

$\liminf_{n\to\infty}\ \sup_{|\theta-u|<\delta} E^{(n)}_u\{w(a(n)(t_n - u))\} \ge \min_y\ p\int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du - 1/ap$.

Finally, under the conditions (3.7), (3.8),

$\lim_{a\to\infty}\ \min_y \int_{-\infty}^0 w_a(y - u)\,e^{pu}\,du = \min_y \int_{-\infty}^0 w(y - u)\,e^{pu}\,du$,

and Theorem 3.3 is proved.
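As a worked instance of the bound (3.9) (our computation, not the paper's): for $p = 1$ and $w(y) = |y|$ one finds $\min_y \int_{-\infty}^0 |y-u|\,e^{u}\,du = \ln 2$, attained at $y = -\ln 2$. The sketch below recovers this numerically.

```python
import math

# h(y) = integral_{-inf}^0 |y - u| e^u du; analytically h(y) = 2e^y - y - 1 for
# y <= 0, minimized at y = -ln 2 with h(-ln 2) = ln 2.  We recover this with
# midpoint-rule integration plus ternary search on the convex function h.
def h(y, lower=-40.0, steps=50000):
    step = -lower / steps
    total = 0.0
    for k in range(steps):
        u = lower + (k + 0.5) * step
        total += abs(y - u) * math.exp(u) * step
    return total

lo, hi = -3.0, 1.0
for _ in range(60):                      # ternary search for the minimizer
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if h(m1) < h(m2):
        hi = m2
    else:
        lo = m1
best_y = (lo + hi) / 2
print(round(best_y, 3), round(h(best_y), 3), round(math.log(2), 3))
```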
Return to the case of independent identically distributed observations satisfying the conditions I-V. We have noted already that under the conditions I-V Assumption 3.1 is always fulfilled with $a(n) = n$. Define now the sequence of estimates $\hat t_n(w) = \hat t_n$ in such a way that

$\int w(\hat t_n - u)\prod_{1}^{n} f(X_j;u)\,du = \min_y \int w(y - u)\prod_{1}^{n} f(X_j;u)\,du$.

It is easy to deduce from [5] that under wide conditions upon $f$ and $w$ (but a little more restrictive than I-V and (3.7), (3.8)) the estimates $\hat t_n$ have the following properties:

(i) the limit distribution of $n(\hat t_n - \theta)$ coincides with the distribution of $\tau + y_w$, where $P\{\tau > y\} = e^{-py}$, $y > 0$;

(ii) $y_w$ is the point of minimum of $p\int_{-\infty}^0 w(y - u)\,e^{pu}\,du$.
So from the point of view of Theorems 3.2, 3.3 the estimates $\hat t_n(w)$ are asymptotically "good" estimates with respect to the loss function $w$.
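For the shifted exponential example and $w(y) = |y|$, the estimate $\hat t_n(w)$ can be written in closed form: $\prod_j f(X_j;u)$ is proportional to $e^{nu}$ on $u < \min_j X_j$, and the minimizer of the average absolute loss is the median of that weight, $\hat t_n = \min_j X_j - \ln 2/n$. Then $n(\hat t_n - \theta) = \tau_n - \ln 2$, in agreement with (i)-(ii), since $y_w = -\ln 2$ and the limiting risk is $E|\tau - \ln 2| = \ln 2$. The simulation below is our sketch, not part of the paper.

```python
import math
import random

# Bayes estimate under absolute loss for f(x;theta) = exp{-(x-theta)}, x > theta:
# t_n = min_j X_j - ln(2)/n, so n(t_n - theta) = tau_n - ln 2 with tau_n ~ Exp(1).
random.seed(11)
theta, n, reps = 1.0, 100, 10000

losses = []
for _ in range(reps):
    x_min = theta + min(random.expovariate(1.0) for _ in range(n))
    t_n = x_min - math.log(2) / n              # the estimate t_n(w) for w = |.|
    losses.append(abs(n * (t_n - theta)))

risk = sum(losses) / reps                      # Monte Carlo risk, approx ln 2
print(round(risk, 3), round(math.log(2), 3))
```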
4. The case of sequential estimation.

We continue to consider the sequence $\{X_j\}$ of independent identically distributed observations satisfying the conditions I-V. Assume we are given: 1) a stopping time $\sigma$; 2) a sequence of statistics $\{t_n\}$, $t_n = t_n(X_1, \ldots, X_n)$. As an estimate of the parameter $\theta$ we use the random variable $t_\sigma(X_1, \ldots, X_\sigma)$. We call a pair $d = [\{t_n\}, \sigma]$ a sequential estimation plan. We want to prove here that from the point of view of Theorem 3.3 sequential schemes are not better than fixed-sample-size schemes.
Theorem 4.1. Denote by $V_n$ the collection of all sequential plans $d = [\{t_n\}, \sigma]$ with $E_\theta \sigma \le n$, $\theta \in \Theta$. Under the conditions I-V

(4.1) $q_a = \lim_{\delta\to 0}\liminf_{n\to\infty}\ \inf_{V_n}\ \sup_{|\theta-u|\le\delta} n^a\,E_u|t_\sigma - u|^a \ge \min_y\ p\int_{-\infty}^0 |y - u|^a\,e^{pu}\,du$, $a > 0$.
Proof. We will at first show that we need consider only such plans $d \in V_n$ for which

(4.2) $P\{\sigma \ge \varepsilon n\} = 1$, $\varepsilon > 0$.

Lemma 4.1. For any $\alpha > 0$ there exist a positive number $\varepsilon = \varepsilon(\alpha) > 0$ and a sequence of plans $d_n = [\{t_k^{(n)}\}, \sigma^{(n)}]$ such that $P_u\{\sigma^{(n)} \ge \varepsilon n\} = 1$, $u \in \Theta$, and

$\lim_{\delta\to 0}\limsup_{n\to\infty}\ \sup_{|\theta-u|\le\delta} n^a\,E_u|t_{\sigma^{(n)}} - u|^a \le q_a + \alpha$.

The proof of Lemma 4.1 coincides exactly with the proof of Lemma 2.5 of [7], and we omit it.
Lemma 4.2. The following relation holds:

$E_\theta|Z_k(uk/Nn) - \bar Z_k(uk/Nn)| = \rho(k,n,u)$, where $\sup_{0<u<H}\ \max_{\varepsilon n \le k \le Nn} |\rho(k,n,u)| \to 0$, $n \to \infty$.

Proof. By the definition of $Z_n(u)$, this is Theorem 2.1 applied to the first $k$ observations with the local parameter $uk/Nn$; the uniformity in $k$ follows from the uniformity in $\theta$ of the conditions III, IV. The lemma is proved.

Let now $d \in V_n$. We may (and will) suppose by Lemma 4.1 that $d$ satisfies (4.2). Let us fix also two positive numbers $N$ and $b$, $\varepsilon < N < b$; we specify the choice of the numbers $\varepsilon$, $N$, $b$ later.
Using Lemma 4.2, we obtain, after a slight modification of the first half of the proof of Theorem 3.3,

(4.3) $\sup_{|\theta-u|\le\delta} n^a\,E_u|t_\sigma - u|^a \ge \frac{1}{b}\int_0^b E_\theta\Big\{\sum_{k=\varepsilon n}^{Nn} \chi(\sigma = k)\,|n(t_\sigma - \theta) - u/N|^a\,Z_k(uk/Nn)\Big\}\,du + o(1) \ge$
$\ge \min_y\ p\int_{-\infty}^0 |y - u|^a\,e^{pu}\,du \cdot \sum_{k=\varepsilon n}^{Nn} \Big(\frac{n}{k}\Big)^a \psi_k + o(1) + O(1/b)$,

where

$\psi_k = \frac{Nn}{pbk}\,E_\theta\{\chi(\sigma = k)\,\chi(\tau_k < kb/Nn)\,e^{p\tau_k}\}$.

Hence, to prove the theorem it is sufficient to show that for every $\alpha > 0$ it is possible to choose $N$, $b$, $\varepsilon$ in such a way that

(4.4) $\sum_{k=\varepsilon n}^{Nn} \Big(\frac{n}{k}\Big)^a \psi_k \ge 1 - \alpha$.

To estimate the left side of (4.4) we use the following results.

Lemma 4.3. For any $\alpha > 0$ there exists such a choice of the numbers $0 < \varepsilon < N < b$ in (4.3) that

(4.5) $\sum_{k=\varepsilon n}^{Nn} \psi_k \ge 1 - \alpha + o(1)$, $\qquad \sum_{k=\varepsilon n}^{Nn} \frac{k}{n}\,\psi_k \le 1 + \alpha + o(1)$.
Lemma 4.5. Let us suppose we are given numbers $\psi_k \ge 0$ which satisfy the conditions (4.5), and let $g(u)$ be a convex decreasing function of $u > 0$. Then

(4.6) $\sum_k g(k/n)\,\psi_k \ge (1 - \alpha)\,g\Big(\frac{1+\alpha}{1-\alpha}\Big)$,

and hence ($g(u) = u^{-a}$)

(4.7) $\sum_k \Big(\frac{n}{k}\Big)^a \psi_k \ge (1 - \alpha)^{1+a}(1 + \alpha)^{-a}$.

The inequality (4.4), and hence the assertion of the theorem, is a simple consequence of the relations (4.5) and (4.6). The last of them follows with ease from Jensen's inequality. Namely,

$\sum_k g(k/n)\,\psi_k \ge \Big(\sum_j \psi_j\Big)\,g\Big(\sum_k \frac{k}{n}\,\psi_k \Big/ \sum_j \psi_j\Big) \ge (1 - \alpha)\,g\Big(\frac{1+\alpha}{1-\alpha}\Big)$,

and (4.6) is proved.
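A small numerical sanity check of (4.6)-(4.7) (ours; the particular weights are an arbitrary admissible choice, not from the paper): any nonnegative $\psi_k$ with $\sum \psi_k \ge 1-\alpha$ and $\sum (k/n)\psi_k \le 1+\alpha$ must satisfy $\sum (n/k)^a \psi_k \ge (1-\alpha)^{1+a}(1+\alpha)^{-a}$.

```python
# Uniform weights of total mass 1 - alpha on k = 500..1100 satisfy both
# constraints in (4.5); we then verify the conclusion (4.7) directly.
n, a, alpha = 1000, 0.5, 0.1
ks = list(range(500, 1101))
psi = {k: (1 - alpha) / len(ks) for k in ks}   # sum psi_k = 1 - alpha exactly

s0 = sum(psi[k] for k in ks)                   # total mass
s1 = sum((k / n) * psi[k] for k in ks)         # first-moment constraint
lhs = sum((n / k) ** a * psi[k] for k in ks)   # left side of (4.7)
rhs = (1 - alpha) ** (1 + a) * (1 + alpha) ** (-a)
print(round(s0, 3), round(s1, 3), round(lhs, 3), round(rhs, 3))
print(lhs >= rhs)                              # True, as (4.7) predicts
```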
To prove the first inequality of (4.5) we define the number $\beta = b - \sqrt{b}$ and note that, because of (4.2), for all sufficiently large $n$

$\sup_{0\le u\le\beta} P_{\theta+u/Nn}\{\tau_\sigma > b\sigma/Nn\} \le \sup_{0\le u\le\beta} P_{\theta+u/Nn}\Big\{\bigcap_{j=1}^{[\varepsilon n]}\{T_j > b/Nn\}\Big\} = \sup_{0\le u\le\beta}\big(P_{\theta+u/Nn}\{T_1 > b/Nn\}\big)^{[\varepsilon n]} \le \exp\{-p\varepsilon b/2N\}$.

Using this last inequality and Chebyshev's inequality, we obtain

$\inf_{0\le u\le\beta} P_{\theta+u/Nn}\{\sigma \le Nn,\ \tau_\sigma \le b\sigma/Nn\} \ge 1 - \exp\{-p\varepsilon b/2N\} - 1/N$.
Applying once more the arguments we used to derive (4.3), we find

$\inf_{0\le u\le\beta} P_{\theta+u/Nn}\{\sigma \le Nn,\ \tau_\sigma \le b\sigma/Nn\} \le \frac{1}{\beta}\int_0^\beta P_{\theta+u/Nn}\{\sigma \le Nn,\ \tau_\sigma \le b\sigma/Nn\}\,du =$
$= \frac{1}{\beta}\,E_\theta\Big\{\sum_{k=\varepsilon n}^{Nn} \chi(\sigma = k)\,\chi(\tau_k \le bk/Nn)\int_0^\beta Z_k(uk/Nn)\,du\Big\} + o(1) \le \frac{1}{\beta}\sum_{k=\varepsilon n}^{Nn} \frac{Nn}{pk}\,E_\theta\{\chi(\sigma = k)\,\chi(\tau_k \le bk/Nn)\,e^{p\tau_k}\} + o(1)$.

Hence,

(4.8) $\sum_{k=\varepsilon n}^{Nn} \psi_k \ge (1 - \exp\{-p\varepsilon b/2N\} - 1/N)(1 - 1/\sqrt{b}) + o(1)$.

The estimate of $\sum (k/n)\psi_k$ may be obtained in the same way: since $E_\theta\{\chi(\tau_k \le bk/Nn)\,e^{p\tau_k}\} \le 1 + pbk/Nn + o(1)$ and $E_\theta \sigma \le n$, we obtain

(4.9) $\sum_{k=\varepsilon n}^{Nn} \frac{k}{n}\,\psi_k \le 1 + N/bp + o(1)$.

The inequalities (4.8) and (4.9) prove (4.5), because we can simultaneously make $\exp\{-p\varepsilon b/2N\}$, $N/b$, $1/N$ as small as we want.
Remark. Theorem 4.1 is an analogue of Theorem 3.3 for the case $w(x) = |x|^a$, $a > 0$. In fact, we proved a little more. Indeed, for a function $w$ satisfying (3.7), (3.8) define

$g_w(\lambda) = g(\lambda) = \min_x \int_{-\infty}^0 w(\lambda^{-1}(x - v))\,e^{pv}\,dv$.

Then, like (4.3),

$\sup_{|\theta-u|\le\delta} E_u\,w(n(t_\sigma - u)) \ge p\sum_k g(k/n)\,\psi_k + o(1)$.

If we suppose that $g(\lambda)$ is convex and continuous, then by Lemma 4.5 we find for every $\alpha$

$\sum_k g(k/n)\,\psi_k \ge (1 - \alpha)\,g\Big(\frac{1+\alpha}{1-\alpha}\Big) + o(1)$.

Hence, Theorem 4.1 holds not only for $w(x) = |x|^a$ but also for all $w$ for which $g(\lambda)$ is convex and continuous at $\lambda = 1$. It will be so, for example, if (i) $w(x)$ is convex and satisfies (3.7), (3.8); (ii) $w(x) = 0$ for $|x| \le 1$ and $w(x) = 1$ for $|x| > 1$.
REFERENCES

[1] Hajek, J. (1972), Local asymptotic minimax and admissibility in estimation. Proc. 6th Berkeley Symp., University of California Press, Vol. 1, pp. 175-194.

[2] LeCam, L. (1956), On the asymptotic theory of estimating and testing hypotheses. Proc. 3rd Berkeley Symp., University of California Press, Vol. 1, pp. 129-156.

[3] LeCam, L. (1960), Locally asymptotically normal families of distributions. Univ. California Publ. Statist., Vol. 3, pp. 27-98.

[4] LeCam, L. (1972), Limits of experiments. Proc. 6th Berkeley Symp., University of California Press.

[5] Ibragimov, I.A., Has'minskii, R.Z. (1972), Asymptotic behavior of statistical estimates for samples with a discontinuous density. Mat. Sbornik, tom 87, N. 4 (English translation: Math. USSR Sbornik, Vol. 16, N. 4, pp. 573-606).

[6] Loeve, M. (1963), Probability Theory. Van Nostrand, Princeton.

[7] Ibragimov, I.A., Has'minskii, R.Z. (1974), On a sequential estimation. Theor. Prob. and Appl., to appear (in Russian).

[8] Hajek, J. (1970), A characterization of limiting distributions of regular estimates. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete, Vol. 14, pp. 323-330.