Park, Byeong U. (1987). Local Asymptotic Normality for Independent Not Identically Distributed Observations in Semiparametric Models.

LOCAL ASYMPTOTIC NORMALITY FOR INDEPENDENT NOT IDENTICALLY
DISTRIBUTED OBSERVATIONS IN SEMIPARAMETRIC MODELS

by

BYEONG U. PARK
UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL

October 1987
ABSTRACT
A set of conditions ensuring local asymptotic normality for independent but not necessarily identically distributed observations in semiparametric models is presented here. The conditions turn out to be more direct and easier to verify in semiparametric models than those of Oosterhoff and van Zwet (1979). Examples considered include the simple linear regression model and Cox's proportional hazards model without censoring, where the covariates are not random.
AMS 1980 Subject Classification: Primary 62G05.

Key words and phrases: Local asymptotic normality, semiparametric models, uniform Hellinger differentiability.
1. Introduction
In the theory of asymptotic estimation and hypothesis testing, local asymptotic normality (LAN for short) of a family of distributions has played an important role since it was introduced by Le Cam (1960). In particular, the Hájek-Le Cam convolution theorem (see Hájek (1970) and Le Cam (1972)) and the asymptotic minimax theorem (see Hájek (1972) and Le Cam (1972), (1979)) were derived from the LAN property of a family of distributions, although Le Cam's results covered families which may not be locally asymptotically normal. More recent results of this sort were established by Begun et al. (1983) in semiparametric models.
In view of this importance of the notion of LAN, many authors have presented sets of conditions for LAN. Very broad conditions ensuring LAN for independent identically distributed observations in parametric models were given in Hájek (1972). Begun et al. (1983) gave a set of conditions for LAN of independent identically distributed observations in semiparametric models, and using LAN they established the representation theorem and the asymptotic minimax lower bounds for regular estimates.
However, in many statistical models the observations are not homogeneous even though they are independent. Examples include various regression models where the covariates are constant. In this paper we present a set of conditions ensuring LAN for independent but not necessarily identically distributed observations in semiparametric models. As Begun et al. (1983) did, we will be able to use LAN for establishing the appropriate representation theorem and the asymptotic minimax theorem in this inhomogeneous semiparametric situation. We shall return to this point in a subsequent paper.
For independent but non-identically distributed observations in parametric models, conditions ensuring LAN were given by Kushnir (1968), Philippou and Roussas (1973) and Ibragimov and Khas'minskii (1975). Le Cam (1960) and Oosterhoff and van Zwet (1979) also considered the independent not identically distributed case in general situations, but their conditions are not so transparent to verify in semiparametric models.

Section 2 of the paper is devoted to recalling the necessary definitions and notation. The main results, including some examples, will be provided in section 3, and the proof of Theorem 3.1 will be given in section 4.
2. Definitions, Notation and Assumptions
For notational ease, first we will deal with the simplest but most important type of semiparametric model, with a parametric component $\theta \in \Theta$, where $\Theta$ is an open set in $R^1$, and a single nonparametric component $g \in \mathcal{G}$, where $\mathcal{G}$ is a specified set of density functions. Our conditions and results for this one-dimensional parametric component can be extended to a multi-dimensional parametric component in the obvious manner, which will be sketched in the remarks at the end of section 3.
Let $\Theta$ be an open subset of $R^1$ and $\mathcal{G}$ be a specified set of density functions with respect to a $\sigma$-finite measure $\nu$ on some measurable space $(\mathcal{X}, \mathcal{A})$. Let $P_{j,\theta,g}$, $j = 1, \dots, n$ $(n = 1, 2, \dots)$, be probability measures on the measurable space $(\mathcal{X}_j, \mathcal{A}_j)$, $\theta \in \Theta$, $g \in \mathcal{G}$, $j \ge 1$. We assume that there is a $\sigma$-finite measure $\mu_j$ such that $\mu_j$ dominates $P_{j,\theta,g}$ for all $\theta \in \Theta$, $g \in \mathcal{G}$ involved, and let $f_j(\cdot,\theta,g) = dP_{j,\theta,g}/d\mu_j$ for a specified version of the Radon-Nikodym derivative.

Define $(\mathcal{X},\mathcal{A}) = \prod_{j \ge 1} (\mathcal{X}_j,\mathcal{A}_j)$ and let $P_{\theta,g}$ be the product measure of the $P_{j,\theta,g}$, $j \ge 1$, induced on $\mathcal{X}$. Let $X_1, X_2, \dots$, with $X_j \in \mathcal{X}_j$, be independent observations, the $j$-th of which has density $f_j(\cdot,\theta,g)$ with respect to $\mu_j$. $P^{(n)}_{\theta,g}$ is the projection of $P_{\theta,g}$ onto $\prod_{j=1}^n (\mathcal{X}_j,\mathcal{A}_j)$ and $E^{(n)}_{\theta,g}$ is the corresponding expectation. The norm $\|\cdot\|_j = \|\cdot\|_{\mu_j}$ denotes the usual $L^2(\mu_j)$-norm on $\mathcal{X}_j$.
In the sequel, for brevity of notation, we write $f_j(\theta,g)$ for the random variable $f_j(X_j,\theta,g)$. For $\theta, \theta^* \in \Theta$ and $g, g^* \in \mathcal{G}$, let

(2.1)    $r_j(\theta,\theta^*;g,g^*) = r_j(\theta,\theta^*;g,g^*;X_j) = 2\Big[\Big(\dfrac{f_j(\theta^*,g^*)}{f_j(\theta,g)}\Big)^{1/2} - 1\Big].$

These $r_j$'s are by definition functions from $\mathcal{X}_j$ to $[-2,\infty]$. The value $\infty$ is taken on the singular part of $P_{j,\theta^*,g^*}$ with respect to $P_{j,\theta,g}$. Obviously, $r_j(\theta,\theta^*;g,g^*) \in L^2(P_{j,\theta,g})$. These terms $r_j$ will be useful in proving the main theorem in section 4.
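For later use, we record a minimal check of a standard Hellinger-type identity behind (4.9) in section 4: since $f_j(\theta^*,g^*)$ and $f_j(\theta,g)$ both integrate to one, and $P_{j,\theta,g}$ puts no mass on $\{f_j(\theta,g) = 0\}$, (2.1) gives

$E^{(n)}_{\theta,g}\, r_j(\theta,\theta^*;g,g^*) = 2\Big[\int f_j^{1/2}(\theta^*,g^*)\, f_j^{1/2}(\theta,g)\, d\mu_j - 1\Big] = -\big\|f_j^{1/2}(\theta^*,g^*) - f_j^{1/2}(\theta,g)\big\|_{\mu_j}^2,$

so the expectation of $r_j$ is the negative of a squared Hellinger-type distance between the two root densities.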
Now define

(2.2)    $\Lambda_n(\theta,\theta^*;g,g^*) = \log \dfrac{dP^{(n)}_{\theta^*,g^*}}{dP^{(n)}_{\theta,g}} = \sum_{j=1}^n \log \dfrac{f_j(\theta^*,g^*)}{f_j(\theta,g)}.$

In section 3 we will establish the asymptotic normality of $\Lambda_n(\theta,\theta_n;g,g_n)$ (so LAN) for sequences $(\theta_n,g_n)$ such that

$|n^{1/2}(\theta_n - \theta) - h| \to 0$ and $\|n^{1/2}(g_n^{1/2} - g^{1/2}) - \beta\|_\nu \to 0$ as $n \to \infty$

for some $h \in R^1$ and $\beta \in L^2(\nu)$ respectively. Let $\dot{\mathcal{G}}$ be the collection of all such $\beta \in L^2(\nu)$, i.e.

(2.3)    $\dot{\mathcal{G}} = \big\{\beta \in L^2(\nu) : \|n^{1/2}(g_n^{1/2} - g^{1/2}) - \beta\|_\nu \to 0$ as $n \to \infty$ for some sequence $g_n$ with all $g_n \in \mathcal{G}\big\}.$
Throughout the paper we will rely on the following assumption about $\dot{\mathcal{G}}$, as it appears in Begun et al. (1983).

ASSUMPTION S. The set $\dot{\mathcal{G}}$ defined in (2.3) is a subspace of $L^2(\nu)$.
In addition to the comments on this assumption in Begun et al. (1983), we can say that it is equivalent to the condition that the tangent space is equal to the tangent set in Theorem 1, section 4, Chapter 3 of Bickel et al. (1986) (see Pfanzagl (1982) or Bickel et al. for the definitions of tangent space and tangent set). Assumption S ensures that we can approach $g$ along two opposite directions, and this fact will be used in the proof of Lemma 3.1, which concerns the rate of convergence for the singular parts of $f_j(\theta_n,g_n)$ with respect to $f_j(\theta,g)$.
We conclude this section with the following definition of the uniform Hellinger differentiability of $\{f_j^{1/2}(\theta,g)\}$.

DEFINITION 2.1. (Uniform Hellinger differentiability of $\{f_j^{1/2}(\theta,g)\}$.) The sequence of root densities $\{f_j^{1/2}(\theta,g)\}$ is said to be uniformly Hellinger-differentiable at $(\theta,g) \in \Theta \times \mathcal{G}$ if there exist random functions $\rho_{j,\theta} \in L^2(\mu_j)$ and bounded linear operators $B_j : L^2(\nu) \to L^2(\mu_j)$ such that

(2.4)    $\sup_{1 \le j \le n} \dfrac{\big\|f_j^{1/2}(\theta_n,g_n) - f_j^{1/2}(\theta,g) - \{\rho_{j,\theta}(\theta_n - \theta) + B_j(g_n^{1/2} - g^{1/2})\}\big\|_{\mu_j}}{|\theta_n - \theta| + \|g_n^{1/2} - g^{1/2}\|_\nu} \to 0$ as $n \to \infty$

for all sequences $(\theta_n,g_n)$ such that $\theta_n \to \theta$ and $g_n^{1/2} \to g^{1/2}$ in $L^2(\nu)$, where $g_n \in \mathcal{G}$ for all $n \ge 1$.

Note that when $f_j = f$, (2.4) reduces to the Hellinger differentiability of $f^{1/2}$ defined in Begun et al. (1983).
3. Main Results

Let $(\theta_n, g_n)$ be any sequence such that

(3.1)    $|n^{1/2}(\theta_n - \theta) - h| \to 0$ and $\|n^{1/2}(g_n^{1/2} - g^{1/2}) - \beta\|_\nu \to 0$ as $n \to \infty$

for some $h \in R^1$ and $\beta \in L^2(\nu)$ respectively. Then the following proposition is an immediate consequence of the uniform Hellinger differentiability of $\{f_j^{1/2}(\theta,g)\}$.
PROPOSITION 3.1. Suppose $\{f_j^{1/2}(\theta,g)\}$ is uniformly Hellinger-differentiable at $(\theta,g) \in \Theta \times \mathcal{G}$. Then

(3.2)    $\sum_{j=1}^n \big\|f_j^{1/2}(\theta_n,g_n) - f_j^{1/2}(\theta,g) - n^{-1/2} a_j\big\|_{\mu_j}^2 \to 0$ as $n \to \infty$,

where $a_j \in L^2(\mu_j)$ is given by

(3.3)    $a_j = h\, \rho_{j,\theta} + B_j \beta.$

Note that if $f_j = f$, then this proposition reduces to Proposition 2.1 in Begun et al. (1983). In addition to (3.2), we also have

(3.4)    $\langle a_j,\, f_j^{1/2} \rangle_{\mu_j} = 0$ for any $j \ge 1$, with $f_j^{1/2} = f_j^{1/2}(\theta,g)$,

under the uniform Hellinger differentiability condition. Our conditions for LAN will be based on (3.2) and (3.4), because they are weaker than the uniform Hellinger differentiability condition.
Here are our conditions ensuring LAN for the family $P^{(n)}_{\theta,g}$.

(C1) (3.2) and (3.4) hold with $a_j$ given by (3.3).

(C2) The random functions $a_j f_j^{-1/2}$, where $a_j$ is given by (3.3), satisfy Lindeberg's condition

(3.5)    $\lim_{n \to \infty} \dfrac{1}{n} \sum_{j=1}^n E^{(n)}_{\theta,g}\big[a_j^2 f_j^{-1}\, 1\{a_j^2 f_j^{-1} > \varepsilon n\}\big] = 0$ for any $\varepsilon > 0$,

and

(3.6)    $\lim_{n \to \infty} \dfrac{1}{n} \sum_{j=1}^n \|a_j\|_{\mu_j}^2 = \sigma^2.$
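To see how (3.5) and (3.6) fit together, note (our elaboration, not in the original text) that on the set $\{f_j > 0\}$, which carries all of the $P_{j,\theta,g}$-mass, the $\mu_j$-norms in (3.6) are second moments of the random functions appearing in (C2):

$E^{(n)}_{\theta,g}\big[(a_j f_j^{-1/2})^2(X_j)\big] = \int_{\{f_j > 0\}} a_j^2\, f_j^{-1}\, f_j\, d\mu_j = \int_{\{f_j > 0\}} a_j^2\, d\mu_j \le \|a_j\|_{\mu_j}^2.$

Thus (3.5) is the classical Lindeberg condition for the triangular array $\{n^{-1/2} a_j f_j^{-1/2}(X_j)\}$, and (3.6) fixes its limiting variance; the gap between $\int_{\{f_j > 0\}} a_j^2\, d\mu_j$ and $\|a_j\|_{\mu_j}^2$ is controlled by (4.6) in section 4.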
Remark 3.1. In the independent and identically distributed case, (C2) is automatically satisfied.
Remark 3.2. The condition (C2) is closely related to condition (3.7) in Oosterhoff and van Zwet (1979), which amounts to Lindeberg's condition applied to the $r_j(\theta,\theta_n;g,g_n)$ defined by (2.1). However, (C2) is easier and more direct to verify than their (3.7) when we know our derivatives $a_j$. In fact, $\rho_{j,\theta}$ is typically just the usual parametric score function for $\theta$, i.e.

$\rho_{j,\theta} = \dfrac{1}{2}\Big[\dfrac{\partial}{\partial\theta} \log f_j(\theta,g)\Big] f_j^{1/2}(\theta,g),$

and $B_j \beta$ can be found heuristically by calculating the first derivative of $\log f_j(\theta,g_\eta)$ with respect to $\eta$ at $\eta = 0$ times $f_j^{1/2}(\theta,g)$, where $g_\eta = g + \eta \beta g^{1/2}$.
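As a quick illustration of this heuristic (anticipating Example 3.1 below), take the fixed-design location model $f_j(y,\theta,g) = g(y - \theta x_j)$. The parametric part gives

$\rho_{j,\theta}(y) = \dfrac{1}{2}\Big[\dfrac{\partial}{\partial\theta} \log g(y - \theta x_j)\Big] g^{1/2}(y - \theta x_j) = -\dfrac{1}{2}\, x_j\, \dfrac{g'}{g^{1/2}}(y - \theta x_j),$

while, with $g_\eta = g + \eta \beta g^{1/2}$,

$(B_j \beta)(y) = \Big[\dfrac{\partial}{\partial\eta} \log f_j(y,\theta,g_\eta)\Big]_{\eta=0} f_j^{1/2}(y,\theta,g) = \dfrac{\beta}{g^{1/2}}(y - \theta x_j) \cdot g^{1/2}(y - \theta x_j) = \beta(y - \theta x_j),$

in agreement with the expressions used in Example 3.1.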
Remark 3.3. (A3) and (A4) in Philippou and Roussas (1973), even though their results are for parametric models, are much stronger than (C2).
LEMMA 3.1. Under Assumption S and (C1), we have

(3.7)    $\lim_{n \to \infty} \sum_{j=1}^n \int_{\{x :\, f_j(x,\theta,g) = 0\}} f_j(x,\theta_n,g_n)\, \mu_j(dx) = 0.$

Proof. From Assumption S there exist $g_n^* \in \mathcal{G}$ for all $n \ge 1$ such that $\|n^{1/2}(g_n^{*1/2} - g^{1/2}) + c\beta\|_\nu \to 0$ for some $c > 0$. Also, since $\Theta$ is an open subset of $R^1$, $\theta_n^{**} = \theta + c(\theta - \theta_n) \in \Theta$ for sufficiently large $n$. Hence we can find $\theta_n^* \in \Theta$ for all $n \ge 1$ such that $|n^{1/2}(\theta_n^* - \theta) + ch| \to 0$. Hence from (C1) we get

(3.8)    $\sum_{j=1}^n \big\|f_j^{1/2}(\theta_n^*, g_n^*) - f_j^{1/2}(\theta,g) + n^{-1/2} c\, a_j\big\|_{\mu_j}^2 \to 0$ as $n \to \infty$.

This together with (3.2) implies

$\sum_{j=1}^n \big\|f_j^{1/2}(\theta_n^*, g_n^*) - f_j^{1/2}(\theta,g) + c\, \{f_j^{1/2}(\theta_n,g_n) - f_j^{1/2}(\theta,g)\}\big\|_{\mu_j}^2 \to 0$ as $n \to \infty$.

Now, in view of Lemma 5 in Chapter 17, section 2 of Le Cam (1986), (3.7) follows immediately from (3.8). □

In the proof of Lemma 3.1 we can notice that Assumption S amounts to the richness of the contingent at $g$ in the spirit of Theorem 3.1 in Le Cam (1986). Also note that (3.7) in Oosterhoff and van Zwet (1979) implies (3.7) in the present paper.
Now we state our main theorem as follows.

THEOREM 3.1. Under Assumption S and conditions (C1), (C2), we have, for any $\varepsilon > 0$,

(3.9)    $P^{(n)}_{\theta,g}\Big\{\Big|\Lambda_n - 2 n^{-1/2} \sum_{j=1}^n a_j(X_j)\, f_j^{-1/2}(X_j,\theta,g) + 2\sigma^2\Big| > \varepsilon\Big\} \to 0$ as $n \to \infty$.

Thus, under $P^{(n)}_{\theta,g}$,

$\Lambda_n \to_d N(-2\sigma^2,\, 4\sigma^2)$ as $n \to \infty$,

and the sequences $\{\prod_{j=1}^n f_j(x_j,\theta_n,g_n)\}$ and $\{\prod_{j=1}^n f_j(x_j,\theta,g)\}$ are contiguous.

Thus Theorem 3.1 asserts that the conditions (C1) and (C2) are sufficient for the LAN of the family $P^{(n)}_{\theta,g}$ at $(\theta,g) \in \Theta \times \mathcal{G}$ under Assumption S. The proof will be given in section 4 and owes much to Ibragimov and Khas'minskii (1975).
Remark 3.4. When we have a triangular array of independent observations $X_{n1}, \dots, X_{nn}$, $n = 1, 2, \dots$, Theorem 3.1 remains valid under the corresponding conditions on this triangular array. Suppose $X_{nj}$ has a density $f_{nj}(\cdot,\theta,g)$ with respect to $\mu_{nj}$. We formulate this as follows: define all the necessary terms using $f_{nj}$ instead of $f_j$, as we did in section 2. Then we have the following theorem without any difficulty.
THEOREM 3.2. Under Assumption S and the corresponding conditions with $f_{nj}$, we have, for any $\varepsilon > 0$,

(3.10)    $P^{(n)}_{\theta,g}\Big\{\Big|\Lambda_n - 2 n^{-1/2} \sum_{j=1}^n a_{nj}(X_{nj})\, f_{nj}^{-1/2}(X_{nj},\theta,g) + 2\sigma^2\Big| > \varepsilon\Big\} \to 0$ as $n \to \infty$.

Thus, under $P^{(n)}_{\theta,g}$, $\Lambda_n \to_d N(-2\sigma^2,\, 4\sigma^2)$ as $n \to \infty$, and the sequences $\{\prod_{j=1}^n f_{nj}(x_{nj},\theta_n,g_n)\}$ and $\{\prod_{j=1}^n f_{nj}(x_{nj},\theta,g)\}$ are contiguous.
Remark 3.5. When we have a parametric component $\theta \in \Theta$, where $\Theta$ is an open set in $R^k$, Definition 2.1, Proposition 3.1, Lemma 3.1 and Theorem 3.1 must be altered as follows. Let $|\cdot|$ denote the usual Euclidean norm. Instead of (2.4) we require that

(2.4)'    $\sup_{1 \le j \le n} \dfrac{\big\|f_j^{1/2}(\theta_n,g_n) - f_j^{1/2}(\theta,g) - \{\rho_{j,\theta}^T(\theta_n - \theta) + B_j(g_n^{1/2} - g^{1/2})\}\big\|_{\mu_j}}{|\theta_n - \theta| + \|g_n^{1/2} - g^{1/2}\|_\nu} \to 0$ as $n \to \infty$,

for some $k$-vector $\rho_{j,\theta}$ of functions in $L^2(\mu_j)$ and $B_j$ a bounded linear operator as in Definition 2.1. The required change in Proposition 3.1 is that (3.3) must be replaced by

(3.3)'    $a_j = h^T \rho_{j,\theta} + B_j \beta,$

where $|n^{1/2}(\theta_n - \theta) - h| \to 0$ and $\|n^{1/2}(g_n^{1/2} - g^{1/2}) - \beta\|_\nu \to 0$ as $n \to \infty$, for some $h \in R^k$ and $\beta \in L^2(\nu)$. Lemma 3.1 and Theorem 3.1 also must be understood with these changed $\theta_n$, $h$ and $a_j$.
Example 3.1. In this example we consider the simplest regression model:

$Y_j = \theta x_j + \varepsilon_j,$

where the $\varepsilon_j$'s are i.i.d. and have an unknown common density $g$ with respect to Lebesgue measure $\nu$ on $R^1$, $\theta \in R^1$, and the $x_j$'s are not random. Then $Y_j$ has density $f_j(\cdot,\theta,g) = g(\cdot - \theta x_j)$. If the Fisher information $I_g$ is finite, $\sup_j |x_j| \le C$ for some $C > 0$ and $n^{-1} \sum_{j=1}^n x_j^2 \to a > 0$, then (C1) and (C2) are satisfied with

$\rho_{j,\theta}(\cdot) = -\dfrac{1}{2}\, x_j\, \dfrac{g'}{g^{1/2}}(\cdot - \theta x_j), \qquad (B_j \beta)(\cdot) = \beta(\cdot - \theta x_j),$

and $\dot{\mathcal{G}} = \{\beta \in L^2(\nu) : \langle \beta, g^{1/2} \rangle = 0\}$. Hence Assumption S also holds, and $\Lambda_n$ is asymptotically normal with mean $-2\sigma^2$ and variance $4\sigma^2$ as in Theorem 3.1, where $\sigma^2$ involves $a$ and $I_g$ through (3.3) and (3.6).
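The LAN conclusion in this example is easy to check numerically. The following sketch (in Python) simulates $\Lambda_n$ under the Gaussian choice $g = N(0,1)$, for which $I_g = 1$, and takes $g_n = g$, i.e. $\beta = 0$; the design $x_j$, the values of $\theta$ and $h$, and the replication count are illustrative choices of ours, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 1000, 2000
    theta, h = 0.5, 1.0                             # true parameter and local direction
    x = np.where(np.arange(n) % 2 == 0, 1.0, -1.0)  # bounded design; n^{-1} sum x_j^2 -> a = 1
    a = np.mean(x**2)
    sigma2 = h**2 * a / 4.0                         # sigma^2 from (3.3) and (3.6) with beta = 0

    theta_n = theta + h / np.sqrt(n)                # local alternative; g_n = g
    eps = rng.standard_normal((reps, n))
    y = theta * x + eps                             # data generated under (theta, g)

    # Lambda_n = sum_j log[ g(Y_j - theta_n x_j) / g(Y_j - theta x_j) ] for g = N(0,1)
    lam = np.sum(-0.5 * (y - theta_n * x)**2 + 0.5 * (y - theta * x)**2, axis=1)

    print("mean of Lambda_n: %.3f (theory -2*sigma^2 = %.3f)" % (lam.mean(), -2 * sigma2))
    print("var  of Lambda_n: %.3f (theory  4*sigma^2 = %.3f)" % (lam.var(), 4 * sigma2))

With $\beta = 0$ we have $a_j = h \rho_{j,\theta}$ and $\|a_j\|_{\mu_j}^2 = h^2 x_j^2 I_g / 4$, so (3.6) gives $\sigma^2 = h^2 a I_g / 4$; the printed empirical moments should be close to $-2\sigma^2 = -0.5$ and $4\sigma^2 = 1$.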
Example 3.2. In this example we consider Cox's regression model with constant covariates. For the purpose of simplicity, we treat the case without censoring. We observe $T_j$ having hazard function given by

$\lambda_j(t) = \lambda(t) \exp(\theta Z_j), \qquad \theta \in R^1,$

where $\lambda(\cdot) = g/\bar G(\cdot)$, $\bar G(\cdot) = 1 - G(\cdot) = \int_\cdot^\infty g\, du$, $\nu$ is Lebesgue measure on $R^+$, and the $Z_j$'s $\in R$ are covariates and constant, in contrast to their treatment as i.i.d. random variables with known density in Efron (1977), Tsiatis (1981), Begun et al. (1983), Bickel et al. (1986) and elsewhere. Let $\mathcal{G} = \{$all densities with respect to Lebesgue measure $\nu$ on $R^+\}$. Then

$P[T_j \ge t] = \bar G^{r_j}(t)$, where $r_j = \exp(\theta Z_j)$.

Hence the density of $T_j$ with respect to $\nu$ is

(3.11)    $f_j(t,\theta,g) = r_j\, g(t)\, \bar G^{r_j - 1}(t).$

As Begun et al. (1983) pointed out for random $Z$, (3.2) fails with $B_j$ and $\rho_{j,\theta}$ defined by (3.11) if we approach $g$ having bounded support along $g_n$'s which have support outside that of $g$. To see this, let

$g_n = \big[(1 - n^{-1/2}) g^{1/2} + n^{-1/2} \tilde g^{1/2}\big]^2 / c_n$, where $c_n = 1 - 2n^{-1/2} + 2n^{-1}$,

and $\tilde g$, $g$ have supports $\tilde S$ and $S$ respectively, with $\tilde S$ contained in the complement of $S$. Then, for a certain fixed $\theta$,

$\sum_{j=1}^n \big\|f_j^{1/2}(\theta,g_n) - f_j^{1/2}(\theta,g) - n^{-1/2} B_j \beta\big\|_{\mu_j}^2 \ge \sum_{j=1}^n \int_{\tilde S} f_j(t,\theta,g_n)\, dt,$

which stays bounded away from $0$ if $e^{\theta Z_j} \ge 1$ for all $j$, so (3.2) fails. This can be avoided if we restrict ourselves to

$\dot{\mathcal{G}} = \{\beta \in L^2(\nu) : \langle \beta, g^{1/2} \rangle = 0,\ \mathrm{support}(\beta) \subset \mathrm{support}(g)\}$

as in Begun et al. (1983). Now it is not so hard to see that (C1) and (C2) hold with $\rho_{j,\theta}$ and $B_j$ given by (3.11) if $\sup_j |Z_j| \le C$ and $n^{-1} \sum_{j=1}^n Z_j^2 e^{\theta Z_j} \to a$ for some $C$ and $a > 0$.
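Following the heuristic of Remark 3.2, the derivatives implied by (3.11) can be sketched as follows; this is our reading of the computation the text describes as "not so hard to see". Since $\log f_j(t,\theta,g) = \theta Z_j + \log g(t) + (r_j - 1)\log \bar G(t)$ and $\partial r_j/\partial\theta = Z_j r_j$,

$\rho_{j,\theta}(t) = \dfrac{1}{2}\, Z_j \big(1 + r_j \log \bar G(t)\big)\, f_j^{1/2}(t,\theta,g),$

and, with $g_\eta = g + \eta \beta g^{1/2}$ so that $\bar G_\eta(t) = \bar G(t) + \eta \int_t^\infty \beta g^{1/2}\, du$,

$(B_j \beta)(t) = \Big[\dfrac{\beta}{g^{1/2}}(t) + (r_j - 1)\, \dfrac{\int_t^\infty \beta g^{1/2}\, du}{\bar G(t)}\Big] f_j^{1/2}(t,\theta,g).$

The restriction $\mathrm{support}(\beta) \subset \mathrm{support}(g)$ in $\dot{\mathcal{G}}$ is what keeps the first term in $B_j \beta$ meaningful.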
4. Proof of Theorem 3.1

By expanding $\Lambda_n$ by Taylor's formula, for $\max_{1 \le j \le n} |r_{nj}| < \varepsilon_0$, where $r_{nj} = r_j(\theta,\theta_n;g,g_n)$, we obtain

$\Lambda_n = \sum_{j=1}^n r_{nj} - \dfrac{1}{4} \sum_{j=1}^n r_{nj}^2 + A \sum_{j=1}^n w_{jn} |r_{nj}|^3,$

where $A$ is an absolute constant and $|w_{jn}| \le 1$.
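This expansion can be verified directly from (2.1); we record the standard computation for completeness. On the event $\max_{1 \le j \le n} |r_{nj}| < \varepsilon_0 < 2$, writing $f_j(\theta_n,g_n)/f_j(\theta,g) = (1 + \frac{1}{2} r_{nj})^2$, we get

$\log \dfrac{f_j(\theta_n,g_n)}{f_j(\theta,g)} = 2 \log\Big(1 + \dfrac{1}{2} r_{nj}\Big) = r_{nj} - \dfrac{1}{4} r_{nj}^2 + O(|r_{nj}|^3),$

and summing over $j \le n$ gives the displayed form of $\Lambda_n$.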
By the fact that $\langle a_j, f_j^{1/2} \rangle_{\mu_j} = 0$ for any $j \ge 1$, $2 n^{-1/2} \sum_{j=1}^n a_j(X_j) f_j^{-1/2}(X_j)$ has the asymptotic normal distribution with parameters $(0, 4\sigma^2)$ under (C2). From this fact it suffices to show the equalities
(4.1)    $\lim_{n \to \infty} P^{(n)}_{\theta,g}\Big\{\max_{1 \le j \le n} |r_{nj}| > \varepsilon\Big\} = 0,$

(4.2)    $\lim_{n \to \infty} P^{(n)}_{\theta,g}\Big\{\Big|\sum_{j=1}^n r_{nj}^2 - 4\sigma^2\Big| > \varepsilon\Big\} = 0,$

(4.3)    $\lim_{n \to \infty} P^{(n)}_{\theta,g}\Big\{\Big|\sum_{j=1}^n r_{nj} - 2 n^{-1/2} \sum_{j=1}^n a_j f_j^{-1/2} + \sigma^2\Big| > \varepsilon\Big\} = 0,$

(4.4)    $\lim_{n \to \infty} P^{(n)}_{\theta,g}\Big\{\sum_{j=1}^n |r_{nj}|^3 > \varepsilon\Big\} = 0.$

Note that (4.4) is a direct consequence of (4.1) and (4.2), since $\sum_{j=1}^n |r_{nj}|^3 \le \max_{1 \le j \le n} |r_{nj}| \sum_{j=1}^n r_{nj}^2$, so that we need to prove only (4.1), (4.2) and (4.3).
Proof of (4.1). The following inequalities are obvious:

$P^{(n)}_{\theta,g}\Big\{\max_{1 \le j \le n} |r_{nj}| > \varepsilon\Big\} \le \sum_{j=1}^n P^{(n)}_{\theta,g}\{|r_{nj}| > \varepsilon\} \le \sum_{j=1}^n P^{(n)}_{\theta,g}\Big\{|r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2}| > \dfrac{\varepsilon}{2}\Big\} + \sum_{j=1}^n P^{(n)}_{\theta,g}\Big\{|a_j f_j^{-1/2}| > \dfrac{n^{1/2} \varepsilon}{2}\Big\}.$

By Chebyshev's inequality, we see that the first sum on the right tends to zero by Proposition 3.1, and the second one does so by (C2).
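In a bit more detail (our elaboration of the Chebyshev step): since $\|r_{nj} - 2n^{-1/2} a_j f_j^{-1/2}\|_{P_{j,\theta,g}}^2 \le 4 \|f_j^{1/2}(\theta_n,g_n) - f_j^{1/2}(\theta,g) - n^{-1/2} a_j\|_{\mu_j}^2$, we have

$\sum_{j=1}^n P^{(n)}_{\theta,g}\Big\{|r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2}| > \dfrac{\varepsilon}{2}\Big\} \le \dfrac{16}{\varepsilon^2} \sum_{j=1}^n \big\|f_j^{1/2}(\theta_n,g_n) - f_j^{1/2}(\theta,g) - n^{-1/2} a_j\big\|_{\mu_j}^2 \to 0$

by (3.2), while

$\sum_{j=1}^n P^{(n)}_{\theta,g}\Big\{|a_j f_j^{-1/2}| > \dfrac{n^{1/2} \varepsilon}{2}\Big\} \le \dfrac{4}{\varepsilon^2} \cdot \dfrac{1}{n} \sum_{j=1}^n E^{(n)}_{\theta,g}\big[a_j^2 f_j^{-1}\, 1\{a_j^2 f_j^{-1} > n \varepsilon^2/4\}\big] \to 0$

by the Lindeberg condition (3.5).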
Proof of (4.2). Using the useful inequality

(4.5)    $|ab| \le \alpha a^2/2 + b^2/(2\alpha)$, $\alpha > 0$,

we obtain

$E^{(n)}_{\theta,g}\Big|\sum_{j=1}^n r_{nj}^2 - \dfrac{4}{n} \sum_{j=1}^n a_j^2 f_j^{-1}\Big| \le \dfrac{\alpha}{2} \sum_{j=1}^n E^{(n)}_{\theta,g}\big(r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2}\big)^2 + \dfrac{1}{\alpha}\Big[\sum_{j=1}^n \|r_{nj}\|_{P_{j,\theta,g}}^2 + \dfrac{4}{n} \sum_{j=1}^n \|a_j\|_{\mu_j}^2\Big]$

for any $\alpha > 0$. By Proposition 3.1 and the condition that $n^{-1} \sum_{j=1}^n \|a_j\|_{\mu_j}^2 \to \sigma^2$ (which also keeps $\sum_{j=1}^n \|r_{nj}\|_{P_{j,\theta,g}}^2$ bounded), the right hand side of the last inequality can be made arbitrarily small if we let $n \to \infty$ first and then $\alpha \to \infty$. The relative stability of $n^{-1} \sum_{j=1}^n a_j^2 f_j^{-1}(X_j)$, which is guaranteed by (C2) (cf. Gnedenko and Kolmogorov (1968), Corollary 2 on p. 141), ensures

$\dfrac{1}{n} \sum_{j=1}^n a_j^2 f_j^{-1}(X_j) \to \sigma^2$ in $P^{(n)}_{\theta,g}$-probability,

since by Lemma 3.1 we have

(4.6)    $\lim_{n \to \infty} \dfrac{1}{n} \sum_{j=1}^n \int_{\{f_j = 0\}} a_j^2\, d\mu_j = 0.$

Combining the last two displays yields (4.2).

Proof of (4.3). From Proposition 3.1,

$\sum_{j=1}^n \big\|r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2}\big\|_{P_{j,\theta,g}}^2 = o(1),$

where $\|\cdot\|_{P_{j,\theta,g}}$ and $\langle\cdot,\cdot\rangle_{P_{j,\theta,g}}$ denote the usual norm and inner product in $L^2(P_{j,\theta,g})$. Using (4.5) again we can see

(4.7)    $\Big|\sum_{j=1}^n \big\langle r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2},\ 2 n^{-1/2} a_j f_j^{-1/2}\big\rangle_{P_{j,\theta,g}}\Big| \le \dfrac{\alpha}{2} \sum_{j=1}^n \big\|r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2}\big\|_{P_{j,\theta,g}}^2 + \dfrac{2}{\alpha n} \sum_{j=1}^n \|a_j\|_{\mu_j}^2.$

Letting $n \to \infty$ first and then $\alpha \to \infty$ ensures that both of the two terms on the right hand side of (4.7) tend to zero, which entails

(4.8)    $\lim_{n \to \infty} \sum_{j=1}^n \|r_{nj}\|_{P_{j,\theta,g}}^2 = 4\sigma^2$

because of (4.6). Now Lemma 3.1 together with (4.8) implies that

(4.9)    $\lim_{n \to \infty} E^{(n)}_{\theta,g}\Big(\sum_{j=1}^n r_{nj}\Big) = -\sigma^2,$

after going over to expectations in the identity

$\sum_{j=1}^n r_{nj}^2 = 4 \sum_{j=1}^n \Big[\dfrac{f_j(X_j,\theta_n,g_n)}{f_j(X_j,\theta,g)} - 1\Big] - 4 \sum_{j=1}^n r_{nj}.$

Hence, for sufficiently large $n$, we get

$P^{(n)}_{\theta,g}\Big\{\Big|\sum_{j=1}^n r_{nj} - 2 n^{-1/2} \sum_{j=1}^n a_j f_j^{-1/2} + \sigma^2\Big| > 2\varepsilon\Big\} \le P^{(n)}_{\theta,g}\Big\{\Big|\sum_{j=1}^n (r_{nj} - E^{(n)}_{\theta,g} r_{nj}) - 2 n^{-1/2} \sum_{j=1}^n a_j f_j^{-1/2}\Big| > \varepsilon\Big\} \le \varepsilon^{-2} \sum_{j=1}^n \big\|r_{nj} - E^{(n)}_{\theta,g} r_{nj} - 2 n^{-1/2} a_j f_j^{-1/2}\big\|_{P_{j,\theta,g}}^2.$

Therefore (4.3) follows from Proposition 3.1. □
REFERENCES

Begun, J.M., Hall, W.J., Huang, W.-M. and Wellner, J.A. (1983). Information and asymptotic efficiency in parametric-nonparametric models. Ann. Statist. 11, 432-452.

Bickel, P.J., Klaassen, C.A.J., Ritov, Y. and Wellner, J.A. (1986). Efficient and adaptive inference in semiparametric models. Preprint.

Efron, B. (1977). The efficiency of Cox's likelihood function for censored data. J. Amer. Statist. Assoc. 72, 557-565.

Gnedenko, B.V. and Kolmogorov, A.N. (1968). Limit Distributions for Sums of Independent Random Variables. Addison-Wesley, Reading, Mass.

Hájek, J. (1970). A characterization of limiting distributions of regular estimates. Z. Wahrsch. verw. Gebiete 14, 323-330.

Hájek, J. (1972). Local asymptotic minimax and admissibility in estimation. Proc. Sixth Berkeley Symp. Math. Statist. Probab. 1, 175-194. Univ. of California Press, Berkeley.

Ibragimov, I.A. and Khas'minskii, R.Z. (1975). Local asymptotic normality for non-identically distributed observations. Theory Probab. Appl. 20, 246-260.

Kushnir, A.F. (1968). Asymptotically optimal tests for a regression problem of testing hypotheses. Theory Probab. Appl. 13, 647-666.

Le Cam, L. (1960). Locally asymptotically normal families of distributions. Univ. of California Publications in Statistics 3, No. 2, 37-98.

Le Cam, L. (1972). Limits of experiments. Proc. Sixth Berkeley Symp. Math. Statist. Probab. 1, 245-261. Univ. of California Press, Berkeley.

Le Cam, L. (1979). On a theorem of J. Hájek. In Contributions to Statistics: Jaroslav Hájek Memorial Volume (J. Jurečková, ed.), 119-135. Reidel, Dordrecht.

Le Cam, L. (1986). Asymptotic Methods in Statistical Decision Theory. Springer, New York.

Oosterhoff, J. and van Zwet, W.R. (1979). A note on contiguity and Hellinger distance. In Contributions to Statistics: Jaroslav Hájek Memorial Volume (J. Jurečková, ed.), 157-166. Reidel, Dordrecht.

Pfanzagl, J. (1982). Contributions to a General Asymptotic Statistical Theory. Lecture Notes in Statistics. Springer, New York.

Philippou, A.N. and Roussas, G.G. (1973). Asymptotic distribution of the likelihood function in the independent not identically distributed case. Ann. Statist. 1, 454-471.

Tsiatis, A.A. (1981). A large sample study of Cox's regression model. Ann. Statist. 9, 93-108.