LIKELIHOOD RATIO TESTS FOR
ORDER RESTRICTIONS IN EXPONENTIAL FAMILIES

by

Tim Robertson
Department of Statistics
University of Iowa
and
University of North Carolina at Chapel Hill

and

Edward J. Wegman*
Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 995
April, 1975

* The work of this author was supported in part by the Air Force Office of
Scientific Research, under Contract No. AFOSR-75-2840.
ABSTRACT

This report is a continuation of the work discussed in Robertson and Wegman
(1975). The results of a Monte Carlo study of the power of the likelihood
ratio statistic considered in the previous paper are discussed. The asymptotic
distributions of the likelihood ratio statistics for testing homogeneity
against trend and trend against "otherwise" are given for the case in which
the sampled distributions belong to an exponential family.
1. Introduction: This report is a continuation of the work discussed in
Robertson and Wegman (1975). In Section 2 we discuss the results of a Monte
Carlo study of the power of the likelihood ratio statistic considered in the
previous paper. In order to be able to make comparisons we included in this
study a statistic proposed by Van Eeden (1958) for testing the same hypothesis.
In Section 3, we consider tests for trend in parameters when the parameters
involved arise from a distribution of the exponential type. The distribution
of the likelihood ratio statistic for testing a trend hypothesis about normal
means is shown to be the asymptotic distribution of the likelihood ratio
statistic for the analogous test whenever the sampled distributions are
members of an exponential family.
2. Monte Carlo Study: In this section, we report the results of a Monte Carlo
study of the power of the likelihood ratio statistic considered by Robertson
and Wegman (1975). Following the notation of Barlow, Bartholomew, Bremner and
Brunk (1972), suppose we have independent random samples from each of $k$
normal populations indexed by $x_i$: $i = 1, 2, \ldots, k$. Suppose $\mu(x_i)$
is the mean of the population indexed by $x_i$ and $\ll$ is a partial order on
$S = \{x_1, x_2, \ldots, x_k\}$. A function $r(\cdot)$ on $S$ is isotone
provided $r(x_i) \le r(x_j)$ whenever $x_i \ll x_j$. Robertson and Wegman
consider the likelihood ratio for testing the null hypothesis $H_1$:
$\mu(\cdot)$ is isotone with respect to $\ll$, versus $H_2 - H_1$, where $H_2$
places no restriction on $\mu(\cdot)$.

American Mathematical Society 1970 Subject Classification: Primary 62F05;
Secondary 62E15.
Key Words and Phrases: Isotonic tests, Power, Exponential families, Monte Carlo.
In this Monte Carlo study, we restrict our attention to a null hypothesis
which specifies a linear order, i.e. $H_1$: $\mu(x_1) \ge \mu(x_2) \ge \cdots
\ge \mu(x_k)$. We also take the variances $\sigma^2(x_i)$ to be one and draw
the same number of items from each population. Computation of the likelihood
ratio test statistic is discussed and a table of critical values for this null
hypothesis is given in Robertson and Wegman.

Van Eeden (1958) proposed another statistic for testing $H_1$ against
$H_2 - H_1$, namely $T'_{12} = \max_{1 \le i \le k-1}\,(\bar X(x_{i+1}) -
\bar X(x_i))$, where $\bar X(x_i)$ is the sample mean of the $i$-th population.
Let $\alpha$ be a "target" significance level and let $\alpha^* = \alpha/(k-1)$.
The critical point for Van Eeden's test when $\sigma^2(x_i) = 1$ is
$t_{\alpha^*} = \xi_{\alpha^*}\sqrt{2/n}$, where $n$ is the number of
observations made on each population and where $\xi_{\alpha^*}$ is defined by

$$(1/\sqrt{2\pi}) \int_{\xi_{\alpha^*}}^{\infty} e^{-x^2/2}\,dx = \alpha^* .$$

The true significance level
$\alpha_0 = \sup_{\mu(\cdot)\in H_1} P[T'_{12} \ge t_{\alpha^*} \mid \mu(\cdot)]$
is bounded above by $\alpha$ and below by $\alpha - \tfrac{1}{2}\alpha^2$, so
that $.04875 \le \alpha_0 \le .05$ for $\alpha = .05$ and
$.00995 \le \alpha_0 \le .01$ for $\alpha = .01$. For this study we chose
$\alpha = .05$ and $.01$.
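For concreteness, the following sketch (in modern Python with NumPy and SciPy, which of course postdate this report) computes the two competing statistics for the linear-order null: the likelihood ratio statistic via the pool-adjacent-violators algorithm, and Van Eeden's $T'_{12}$ together with the conservative critical point described above. The function names and conventions (unit variances, equal sample sizes) are ours, and the critical-point formula reflects our reading of the preceding paragraph rather than a formula reproduced from it.

    import numpy as np
    from scipy.stats import norm

    def pava(y, w):
        """Weighted least-squares fit of a nondecreasing sequence to y
        (pool-adjacent-violators algorithm)."""
        vals, wts, cnts = [], [], []
        for yi, wi in zip(np.asarray(y, float), np.asarray(w, float)):
            vals.append(yi); wts.append(wi); cnts.append(1)
            while len(vals) > 1 and vals[-2] > vals[-1]:
                pooled = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / (wts[-2] + wts[-1])
                wts[-2:] = [wts[-2] + wts[-1]]
                cnts[-2:] = [cnts[-2] + cnts[-1]]
                vals[-2:] = [pooled]
        return np.repeat(vals, cnts)

    def lr_statistic(xbar, n, sigma2=1.0):
        """Likelihood ratio statistic for H1: mu(x_1) >= ... >= mu(x_k) against
        H2 - H1 with known variances: sum_i (n/sigma2) * (xbar_i - mubar_i)^2,
        where mubar is the nonincreasing (antitonic) regression of the sample means."""
        xbar = np.asarray(xbar, float)
        w = np.full(len(xbar), n / sigma2)
        mubar = -pava(-xbar, w)
        return float(np.sum(w * (xbar - mubar) ** 2))

    def van_eeden_statistic(xbar):
        """T'_12: the largest difference between adjacent sample means."""
        return float(np.max(np.diff(xbar)))

    def van_eeden_critical_point(alpha, k, n, sigma2=1.0):
        """Conservative critical point t_{alpha*} with alpha* = alpha/(k-1);
        under equal means each adjacent difference is N(0, 2*sigma2/n).
        This is our reading of the critical point described in the text."""
        return float(norm.isf(alpha / (k - 1)) * np.sqrt(2.0 * sigma2 / n))

The antitonic direction matches the linear-order null as written above; reversing the order of the sample means gives the isotone version.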
Normal pseudo-random variates were generated according to the well-known
Box-Muller transform, and sample means $\bar X(x_i)$ based on $n = 100$
observations were calculated. For three different types of alternative
hypotheses, estimates of the power and of the standard error of the estimated
power were calculated based on 1000 replications of the Monte Carlo experiment.
In the first study the means were taken to be linear according to the rule
$\mu(x_i) = \beta \cdot i$; $i = 1, 2, \ldots, k$; $\beta = 1, 1/2, 1/3,
\ldots, 1/10, 1/20, \ldots, 1/80$; $k = 3, 6, 9, 12$; and finally
$\alpha = .01$ and $.05$. Results of this study are given in Tables 1 and 2
and Figures 1 and 2.
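The simulation loop itself is short; the sketch below mirrors the design described above (Box-Muller variates, sample means based on $n$ observations, 1000 replications, power estimated by the rejection frequency and its standard error by $\sqrt{\hat p(1-\hat p)/N}$). It reuses lr_statistic and van_eeden_statistic from the previous sketch, and lr_crit stands in for the tabled critical values of Robertson and Wegman (1975), which are not reproduced here.

    import numpy as np

    def box_muller(rng, size):
        """Standard normal pseudo-random variates via the Box-Muller transform."""
        u1 = 1.0 - rng.random(size)          # in (0, 1], avoids log(0)
        u2 = rng.random(size)
        return np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)

    def estimate_power(mu, n, lr_crit, t_crit, n_reps=1000, seed=1975):
        """Monte Carlo estimates of the power of the two tests against the mean
        vector mu (unit variances, n observations per population); returns
        (power, standard error) for the likelihood ratio test and for T'_12."""
        rng = np.random.default_rng(seed)
        mu = np.asarray(mu, float)
        k = len(mu)
        rej_lr = rej_t = 0
        for _ in range(n_reps):
            xbar = mu + box_muller(rng, (k, n)).mean(axis=1)
            rej_lr += lr_statistic(xbar, n) >= lr_crit
            rej_t += van_eeden_statistic(xbar) >= t_crit
        p_lr, p_t = rej_lr / n_reps, rej_t / n_reps
        se = lambda p: np.sqrt(p * (1.0 - p) / n_reps)
        return (p_lr, se(p_lr)), (p_t, se(p_t))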
As we might reasonably expect, the likelihood ratio statistic beats $T'_{12}$,
often impressively so, as illustrated by Figures 1 and 2. For example, for
$k = 12$, $\beta = 1/10$ and $\alpha = .05$, the power of $T'_{12}$ is
approximately .27 while the power of the likelihood ratio statistic is still 1.
For alternatives of this type, the powers of both tests increase as $k$
increases and, of course, as $\beta$ increases. The case $\beta = 0$
corresponds to the null hypothesis $H_1^*$: $\mu(x_i) = 0$;
$i = 1, 2, \ldots, k$, and hence here the power is an estimate of the
significance level. These estimates of the significance levels for $T'_{12}$
generally underestimate the "target" significance level, as Van Eeden's theory
predicts, but in most cases the estimate is within two standard deviations of
the target level.
Since $T'_{12}$ is based on differences between adjacent sample means, one
might reasonably expect it to be more sensitive to alternatives where one or
more of the differences between adjacent population means is large. Table 3
gives estimated power for slippage alternatives of the type
$\mu(x_2) = \mu(x_3) = \cdots = \mu(x_k) = 0$ and $\mu(x_1) = -1/90$, $-2/80$,
$-3/70$, $-4/60$, $-5/50$, $-6/40$, $-7/30$, $-8/20$ and $-9/10$. Table 4
gives further data for slippage alternatives: $\mu(x_i) = -.35$ while
$\mu(x_j) = 0$ for $j \ne i$; $i = 1, 2, \ldots, 12$. Also given in Table 4 is
a step type alternative for which $\mu(x_j) = -.35$ for $j \le i$ and
$\mu(x_j) = 0$ for $j > i$; $i = 1, 2, \ldots, 12$.

In Table 3 the likelihood ratio statistic is more powerful than $T'_{12}$
except for $\mu(x_1) = -(1/90)$ or $-(2/80)$. In these two instances the
slippage is so small that the power is essentially equal to the size of the
test. As expected, the differences in power between $T'_{12}$ and the
likelihood ratio statistic in Table 3 are not nearly so dramatic as those in
Tables 1 and 2. Heuristically, one might predict this since the likelihood
ratio statistic is based on all the means simultaneously and hence should be
more sensitive to the sorts of alternatives in Tables 1 and 2 than to those in
Table 3.
We may, in Table 4, compare powers as the location, $i$, of the slipped mean
ranges from 1 through 12. The power of the likelihood ratio statistic
monotonically decreases with this location shift whereas the power of
$T'_{12}$ stays relatively constant. For $i = 10, 11$, $T'_{12}$ beats the
likelihood ratio, and significantly so for $i = 11$. Notice that as the
location of the slippage increases the alternative comes closer to satisfying
the null, and in fact for $i = 12$, $H_1$ is satisfied so that the powers
approximate the size of the test.

Finally, for the step alternatives, the likelihood ratio statistic has maximum
power near $k/2$ and its power decreases in both directions, while $T'_{12}$
has essentially constant power. Again notice that the case $i = 12$ satisfies
$H_1$, so we have another estimate of the size of the test.
[Table 1: rows give $\beta = 1, 1/2, 1/3, \ldots, 1/10, 1/20, 1/30, \ldots,
1/80$ and $0$; columns give $k = 3, 6, 9, 12$ for the likelihood ratio test
statistic $T_{12}$ and again for $T'_{12}$; entries are estimated powers with
standard errors in parentheses.]

TABLE 1. Monte Carlo estimates of power and standard errors (in parentheses)
for the likelihood ratio test and for the test statistic $T'_{12}$.
Significance level $\alpha = .05$ and alternatives $\mu(x_i) = \beta \cdot i$.
[Table 2: same layout as Table 1; rows give $\beta$, columns give
$k = 3, 6, 9, 12$ for the likelihood ratio test statistic $T_{12}$ and again
for $T'_{12}$; entries are estimated powers with standard errors in
parentheses.]

TABLE 2. Monte Carlo estimates of power and standard errors (in parentheses)
for the likelihood ratio test and the test statistic $T'_{12}$. Significance
level $\alpha = .01$ and alternatives $\mu(x_i) = \beta \cdot i$.
[Table 3: rows give $\mu(x_1) = -1/90, -2/80, -3/70, -4/60, -5/50, -6/40,
-7/30, -8/20, -9/10$, for $\alpha = .05$ and for $\alpha = .01$; columns give
$k = 3, 6, 9, 12$ for the likelihood ratio test statistic $T_{12}$ and again
for $T'_{12}$; entries are estimated powers with standard errors in
parentheses.]

TABLE 3. Monte Carlo estimates of power and standard errors (in parentheses)
for the likelihood ratio test and the test statistic $T'_{12}$, with slippage
alternatives $\mu(x_i) = 0$, $i = 2, 3, \ldots, k$, and $\mu(x_1)$ as
indicated.
[Table 4: rows give the location $i = 1, 2, \ldots, 12$; column groups give
the step alternative and the slippage alternative, each with the likelihood
ratio test and $T'_{12}$ at $\alpha = .05$ and $\alpha = .01$; entries are
estimated powers with standard errors in parentheses.]

TABLE 4. Monte Carlo estimates of power and standard errors (in parentheses)
for the likelihood ratio test and the test statistic $T'_{12}$, with slippage
or step located at $i$. Size of slippage or jump is $-.35$ and $k$ is 12.
[Figure 1: power curves for $T_{12}$ and $T'_{12}$ plotted against $\beta$, on
a reciprocal scale running from $\beta = 1$ down to $\beta = 1/100$.]

FIGURE 1. Power as a function of $\beta$ ($\mu(x_i) = \beta \cdot i$) for the
Likelihood Ratio Test Statistic, $T_{12}$, and for $T'_{12}$; $\alpha = .01$
and $k = 3$.
[Figure 2: power curves for $T_{12}$ and $T'_{12}$ plotted against $\beta$, on
the same reciprocal scale as Figure 1.]

FIGURE 2. Power as a function of $\beta$ ($\mu(x_i) = \beta \cdot i$) for the
Likelihood Ratio Test Statistic, $T_{12}$, and for $T'_{12}$; $\alpha = .01$
and $k = 12$.
3. Tests of Trend for an Exponential Class of Distributions: Now let us turn
our attention to extensions of the likelihood ratio test to distributions of
the exponential type. Suppose $\nu(\cdot)$ is a $\sigma$-finite measure on the
Borel subsets of the real line and consider a regular exponential family of
distributions defined by probability densities of the form

(3.1)    $f(x;\theta,\tau) = \exp[\,p_1(\theta)p_2(\tau)K(x) + S(x,\tau) + q(\theta;\tau)\,]$

with respect to $\nu$, with $\theta \in (\theta_1,\theta_2)$,
$-\infty \le \theta_1 < \theta_2 \le \infty$, and $\tau \in T$.
We make the following assumptions:

(3.2)    $p_1(\cdot)$ and $q(\cdot\,;\tau)$ both have continuous second derivatives on $(\theta_1,\theta_2)$ for all $\tau \in T$;

(3.3)    $p_1'(\theta) > 0$ and $p_2(\tau) > 0$ for all $\theta \in (\theta_1,\theta_2)$ and $\tau \in T$;

(3.4)    $q'(\theta;\tau) = -\theta\, p_1'(\theta)p_2(\tau)$ for all $\theta \in (\theta_1,\theta_2)$ and $\tau \in T$.
We are thinking of $\tau$ as fixed, so that all derivatives are with respect
to $\theta$. If $X$ is any random variable having density function
$f(x;\theta,\tau)$ then, using Theorem 9 on page 52 of Lehmann (1959), the
integral $\int f(x;\theta,\tau)\,d\nu(x) = 1$ can be twice differentiated with
respect to $\theta$ under the integral sign, obtaining $E[K(X)] = \theta$ and
$V[K(X)] = [p_1'(\theta)p_2(\tau)]^{-1}$.
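For the reader's convenience we record the calculation behind the last sentence; it takes (3.4) in the form written above (our reconstruction of that assumption) together with the identity obtained by differentiating it, $q''(\theta;\tau) = -\theta p_1''(\theta)p_2(\tau) - p_1'(\theta)p_2(\tau)$, which is also the form in which (3.4) is invoked in the proofs below. Differentiating $\int f(x;\theta,\tau)\,d\nu(x) = 1$ under the integral sign once and twice gives

$$0 = \int \bigl[p_1'(\theta)p_2(\tau)K(x) + q'(\theta;\tau)\bigr] f(x;\theta,\tau)\,d\nu(x) = p_1'(\theta)p_2(\tau)\bigl(E[K(X)] - \theta\bigr),$$

$$0 = \int \Bigl[\bigl(p_1'(\theta)p_2(\tau)K(x) + q'(\theta;\tau)\bigr)^2 + p_1''(\theta)p_2(\tau)K(x) + q''(\theta;\tau)\Bigr] f(x;\theta,\tau)\,d\nu(x) = \bigl(p_1'(\theta)p_2(\tau)\bigr)^2 V[K(X)] - p_1'(\theta)p_2(\tau),$$

so that $E[K(X)] = \theta$ and $V[K(X)] = [p_1'(\theta)p_2(\tau)]^{-1}$.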
Suppose we have independent random samples from each of $k$ populations
belonging to the above exponential family, where the $i$-th population has
parameters $\theta(x_i)$ and $\tau_i$ ($\tau_i$ is known). Let the items of
the random sample from the $i$-th population be denoted by $X_{ij}$:
$j = 1, 2, \ldots, n_i$, and suppose $\ll$ is a partial order on
$S = \{x_1, x_2, \ldots, x_k\}$. Consider the following hypotheses:

    $H_0$: $\theta(x_1) = \theta(x_2) = \cdots = \theta(x_k)$;
    $H_1$: $\theta(\cdot)$ is isotone with respect to $\ll$;

and $H_2$ places no restriction on $\theta(\cdot)$.
We consider a likelihood ratio statistic for testing $H_1$ against
$H_2 - H_1$. The maximum likelihood estimate of $\theta(\cdot)$ under $H_2$ is
given by $\hat\theta(\cdot)$, where
$\hat\theta(x_i) = n_i^{-1}\sum_{j=1}^{n_i} K(X_{ij})$. Furthermore, the
maximum likelihood estimate of the common value of $\theta(\cdot)$ under $H_0$
is given by

$$\hat\theta_0 = \Bigl[\sum_{i=1}^{k} n_i p_2(\tau_i)\Bigr]^{-1} \sum_{i=1}^{k} n_i p_2(\tau_i)\hat\theta(x_i),$$

so that from Robertson and Wright (1975) it follows that the maximum
likelihood estimate of $\theta(\cdot)$ under $H_1$ is
$\bar\theta(\cdot) = E[\hat\theta(\cdot)\mid L]$, where $L$ is the
$\sigma$-lattice of subsets of $S$ induced by $\ll$ (cf. Barlow et al.
(1972)). The expectation is taken with respect to the space $(S, 2^S, \omega)$,
where $\omega$ is the probability measure on the collection $2^S$ of all
subsets of $S$ which assigns mass
$n_i p_2(\tau_i)\big/\sum_{j=1}^{k} n_j p_2(\tau_j)$ to the singleton
$\{x_i\}$. Maximum likelihood estimation of parameters of distributions
belonging to an exponential family was first discussed by Brunk (1955). For a
discussion of this and related work see Barlow et al. (1972).
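To make the estimates concrete, here is a small sketch for the special case of a linear order $x_1 \ll x_2 \ll \cdots \ll x_k$ (a general partial order requires the more general projection algorithms discussed in Barlow et al. (1972)). It reuses the pava helper from the Section 2 sketch; K and the sample layout are placeholders for whatever member of the family (3.1) is in hand, and the function names are ours.

    import numpy as np

    def unrestricted_mles(samples, K):
        """theta_hat(x_i): the average of K over the i-th sample (the MLE under H2)."""
        return np.array([np.mean([K(x) for x in sample]) for sample in samples])

    def pooled_mle(theta_hat, n, p2_tau):
        """Common-value MLE under H0: the n_i * p2(tau_i)-weighted average of theta_hat."""
        w = np.asarray(n, float) * np.asarray(p2_tau, float)
        return float(np.sum(w * theta_hat) / np.sum(w))

    def restricted_mle_linear_order(theta_hat, n, p2_tau):
        """MLE under H1 for the linear order x_1 << x_2 << ... << x_k: the
        weighted isotonic (nondecreasing) regression of theta_hat with weights
        n_i * p2(tau_i), computed with the pava helper from the Section 2
        sketch; this realizes E[theta_hat | L] for that order."""
        w = np.asarray(n, float) * np.asarray(p2_tau, float)
        return pava(np.asarray(theta_hat, float), w)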
If $\Lambda_{12}$ is the likelihood ratio for testing $H_1$ against
$H_2 - H_1$ and $T_{12} = -2\ln\Lambda_{12}$, then, expanding $p_1(\cdot)$ and
$q(\cdot\,;\tau_i)$ about $\hat\theta(x_i)$ by using Taylor's Theorem with
second degree remainder term, substituting for $p_1(\bar\theta(x_i))$ and
$q(\bar\theta(x_i);\tau_i)$, and noting that from (3.4),
$q'(\hat\theta(x_i);\tau_i) = -\hat\theta(x_i)p_1'(\hat\theta(x_i))p_2(\tau_i)$,
so that the first order terms cancel, we obtain

(3.5)    $T_{12} = -2\sum_{i=1}^{k}\bigl[\,n_i\hat\theta(x_i)p_2(\tau_i)p_1''(\alpha_i)\cdot 2^{-1} + n_i q''(\beta_i;\tau_i)\cdot 2^{-1}\,\bigr]\bigl[\bar\theta(x_i) - \hat\theta(x_i)\bigr]^2,$

where $\alpha_i$ and $\beta_i$ converge almost surely to $\theta(x_i)$. This
convergence follows from well-known properties of $\hat\theta(x_i)$ and
$\bar\theta(x_i)$.
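In more detail (a routine verification, written here with our notation $\ell_i(t)$ for the contribution of the $i$-th sample to the log likelihood when $\theta(x_i)$ is set equal to $t$): from (3.1),

$$\ell_i(t) = n_i\bigl[p_1(t)p_2(\tau_i)\hat\theta(x_i) + q(t;\tau_i)\bigr] + \text{terms not involving } t,
\qquad
T_{12} = -2\ln\Lambda_{12} = 2\sum_{i=1}^{k}\bigl[\ell_i(\hat\theta(x_i)) - \ell_i(\bar\theta(x_i))\bigr],$$

and Taylor's Theorem gives

$$\ell_i(\bar\theta(x_i)) = \ell_i(\hat\theta(x_i)) + n_i\bigl[p_1'(\hat\theta(x_i))p_2(\tau_i)\hat\theta(x_i) + q'(\hat\theta(x_i);\tau_i)\bigr]\bigl(\bar\theta(x_i) - \hat\theta(x_i)\bigr) + n_i\bigl[p_1''(\alpha_i)p_2(\tau_i)\hat\theta(x_i) + q''(\beta_i;\tau_i)\bigr]\,2^{-1}\bigl(\bar\theta(x_i) - \hat\theta(x_i)\bigr)^2 .$$

By (3.4) the coefficient of the linear term vanishes, and (3.5) follows.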
Theorem 3.1. If $f(x;\theta,\tau)$ is of the form (3.1), where $p_1(\cdot)$
and $q(\cdot\,;\tau)$ satisfy (3.2) - (3.4), and if
$\theta(x_1) = \theta(x_2) = \cdots = \theta(x_k)$ and
$n_1 = n_2 = \cdots = n_k = n$, then as $n \to \infty$,
$$T_{12} \text{ converges in distribution to } \sum_{i=1}^{k} p_2(\tau_i)\bigl[E(X(\cdot)\mid L)(x_i) - X(x_i)\bigr]^2,$$

where $X(x_1), X(x_2), \ldots, X(x_k)$ are independent normal random variables
having zero means and $V(X(x_i)) = p_2(\tau_i)^{-1}$. The expectation
$E(X(\cdot)\mid L)$ is taken regarding $X(\cdot)$ as a function on the space
$(S, 2^S, \omega)$. (Note that, since $n_1 = \cdots = n_k$,
$\omega(\{x_i\}) = p_2(\tau_i)\big/\sum_{j=1}^{k} p_2(\tau_j)$.)
Proof: Let the common value of $\theta(\cdot)$ be $\theta_0$. Then from (3.5),
using well-known properties of the conditional expectation operator,

$$T_{12} = -\sum_{i=1}^{k}\bigl[\hat\theta(x_i)p_1''(\alpha_i)p_2(\tau_i) + q''(\beta_i;\tau_i)\bigr]\Bigl[E\bigl(\sqrt{n}\,(\hat\theta(\cdot) - \theta_0)\,\big|\,L\bigr)(x_i) - \sqrt{n}\,\bigl(\hat\theta(x_i) - \theta_0\bigr)\Bigr]^2 .$$

Now $\hat\theta(x_i)$ is the sample mean of $n$ i.i.d. random variables having
mean $\theta_0$ and variance $[p_1'(\theta_0)p_2(\tau_i)]^{-1}$. Let $Z_n$ be
the $2k$-dimensional random vector defined by

$$Z_{ni} = \hat\theta(x_i)p_1''(\alpha_i)p_2(\tau_i) + q''(\beta_i;\tau_i), \quad i = 1, 2, \ldots, k;$$
$$Z_{ni} = \sqrt{n}\,\bigl(\hat\theta(x_{i-k}) - \theta_0\bigr), \quad i = k+1, k+2, \ldots, 2k.$$

Using the Law of Large Numbers, the Central Limit Theorem and Theorem 4.4 of
Billingsley (1968), $Z_n$ converges weakly to $Z$, where

$$Z_i = \theta_0\, p_1''(\theta_0)p_2(\tau_i) + q''(\theta_0;\tau_i), \quad i = 1, 2, \ldots, k;$$
$$Z_i = Y(x_{i-k}), \quad i = k+1, k+2, \ldots, 2k,$$

and $Y(x_1), Y(x_2), \ldots, Y(x_k)$ are independent normal random variables
having zero means and $V(Y(x_i)) = [p_1'(\theta_0)p_2(\tau_i)]^{-1}$. The
conditional expectation operator is continuous, so that $T_{12}$ is a
continuous function of $Z_n$. It follows from Corollary 1 of Theorem 5.1 of
Billingsley (1968) that $T_{12}$ converges in distribution to

$$-\sum_{i=1}^{k}\bigl[\theta_0\, p_1''(\theta_0)p_2(\tau_i) + q''(\theta_0;\tau_i)\bigr]\bigl[E(Y(\cdot)\mid L)(x_i) - Y(x_i)\bigr]^2 .$$

The desired result now follows since
$q''(\theta_0;\tau_i) = -\theta_0\, p_1''(\theta_0)p_2(\tau_i) - p_1'(\theta_0)p_2(\tau_i)$
from (3.4), so that the coefficient of the $i$-th term is
$p_1'(\theta_0)p_2(\tau_i)$ and we may take
$X(x_i) = [p_1'(\theta_0)]^{1/2}\, Y(x_i)$.
Theorem 5 of Robertson and Wegman (1975) now yields

Corollary 3.2. If the hypotheses of Theorem 3.1 are satisfied, then for each
real number $t$,

$$\lim_{n\to\infty} P[T_{12} \ge t] = \sum_{\ell=1}^{k} P(\ell,k)\, P[\chi^2_{k-\ell} \ge t],$$

where $\chi^2_{k-\ell}$ is a $\chi^2$ random variable having $k - \ell$
degrees of freedom and, as in Barlow et al. (1972), $P(\ell,k)$ is the
probability that $E(X(\cdot)\mid L)$ takes on $\ell$ levels. The probabilities
$P(\ell,k)$ depend on the partial order $\ll$ and on the weights $p_2(\tau_i)$.
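When the level probabilities $P(\ell,k)$ are not available in closed form for the partial order and weights at hand, they can be estimated by simulation and then combined with chi-square tail probabilities. That device is standard in this literature but is not prescribed by the report, and the sketch below (reusing the pava helper from the Section 2 sketch, so it covers the linear order only) is ours.

    import numpy as np
    from scipy.stats import chi2

    def level_probabilities(weights, n_sim=100000, seed=0):
        """Monte Carlo estimates of P(l, k): the probability that the weighted
        isotonic regression (linear order, weights w_i = p2(tau_i)) of
        independent N(0, 1/w_i) variables takes on exactly l levels."""
        rng = np.random.default_rng(seed)
        w = np.asarray(weights, float)
        k = len(w)
        counts = np.zeros(k + 1)
        for _ in range(n_sim):
            x = rng.normal(0.0, 1.0 / np.sqrt(w))
            counts[len(np.unique(pava(x, w)))] += 1
        return counts / n_sim

    def lr_tail_probability(t, weights, n_sim=100000):
        """Approximate lim P[T_12 >= t] = sum_l P(l,k) * P[chi2_{k-l} >= t]."""
        k = len(weights)
        P = level_probabilities(weights, n_sim)
        tail = P[k] * (1.0 if t <= 0 else 0.0)   # l = k contributes zero degrees of freedom
        for l in range(1, k):
            tail += P[l] * chi2.sf(t, k - l)
        return float(tail)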
We now show that Corollary 3.2 provides the large sample approximation to the
critical level for testing $H_1$ against $H_2 - H_1$. As with the proof of
Theorem 2 of Robertson and Wegman (1975), this property is a consequence of
the fact that our isotonic estimators can be viewed as projections onto closed
convex cones in the Hilbert space of all functions on
$S = \{x_1, x_2, \ldots, x_k\}$ with inner product defined by
$(\gamma(\cdot), \eta(\cdot)) = \sum_{i=1}^{k} \gamma(x_i)\eta(x_i)w_i$ and
$w_i = n_i p_2(\tau_i)\big/\sum_{j=1}^{k} n_j p_2(\tau_j)$. Suppose
$\theta(\cdot)$ satisfies $H_1$ but not $H_0$, let $v_1, v_2, \ldots, v_h$ be
the distinct values among $\theta(x_1), \theta(x_2), \ldots, \theta(x_k)$,
and let $S_i = \{x_j;\ \theta(x_j) = v_i\}$; $i = 1, 2, \ldots, h$. Define the
partial order $\ll_\theta$ on $S$ by $x_\alpha \ll_\theta x_\beta$ if and only
if $x_\alpha \ll x_\beta$ and $x_\alpha, x_\beta \in S_i$ for some $i$. Let
$L(\theta)$ be the $\sigma$-lattice of subsets of $S$ induced by $\ll_\theta$,
and let $I(\theta)$ $(I)$ be the collection of all functions on $S$ which are
isotone with respect to $\ll_\theta$ $(\ll)$. The collection $I(\theta)$ $(I)$
is a closed convex cone in the Hilbert space of all functions on $S$, and
$E(\eta(\cdot)\mid L(\theta))$ $(E(\eta(\cdot)\mid L))$ is the projection onto
$I(\theta)$ $(I)$ in this space (cf. Brunk (1965)). Furthermore,

(3.6)    $I \subset I(\theta),$

and, using Corollary 2.3 of Brunk (1965), if
$E(\eta(\cdot)\mid L(\theta)) \in I$ then
$E(\eta(\cdot)\mid L(\theta)) = E(\eta(\cdot)\mid L)$.
Lemma 3.3. If $\min_{x_i\in S_1} \eta(x_i) \ge \max_{x_i\in S_2} \eta(x_i)$,
$\min_{x_i\in S_2} \eta(x_i) \ge \max_{x_i\in S_3} \eta(x_i)$, $\ldots$,
$\min_{x_i\in S_{h-1}} \eta(x_i) \ge \max_{x_i\in S_h} \eta(x_i)$, then
$E(\eta(\cdot)\mid L(\theta)) = E(\eta(\cdot)\mid L)$.

Proof: It suffices to show that $E(\eta(\cdot)\mid L(\theta)) \in I$. Suppose
$\psi(\cdot) = E(\eta(\cdot)\mid L(\theta))$ and $x_\alpha \ll x_\beta$. If
$x_\alpha, x_\beta \in S_i$ for some $i$, then $x_\alpha \ll_\theta x_\beta$
and $\psi(x_\alpha) \le \psi(x_\beta)$. Suppose $x_\alpha \in S_i$,
$x_\beta \in S_j$ and $i \ne j$. Since $\theta(\cdot)$ is isotone with respect
to $\ll$, the sets $S_1 + S_2 + \cdots + S_i$ and $S_1 + S_2 + \cdots + S_j$
are in $L$, so $x_\beta \in S_1 + S_2 + \cdots + S_i$ and therefore $i > j$.
Now $\psi(x_\alpha)$ $(\psi(x_\beta))$ is an average of the values of
$\eta(\cdot)$ at points in $S_i$ $(S_j)$, so
$\psi(x_\alpha) \le \psi(x_\beta)$ and $\psi(\cdot) \in I$.
For any $\theta(\cdot)$, let $P_\theta(E)$ be the probability of the event $E$
computed under the assumption that $\theta(\cdot)$ is the true vector of
parameter values, and let $P_0(E)$ be the probability of $E$ computed under
$H_0$.

Theorem 3.4. If $\theta(\cdot)$ satisfies $H_1$ and
$n_1 = n_2 = \cdots = n_k = n$, then for each real number $t$,

$$\lim_{n\to\infty} P_\theta[T_{12} \ge t] \le \lim_{n\to\infty} P_0[T_{12} \ge t].$$

Proof: Define $L(\theta)$ and the sets $S_1, S_2, \ldots, S_h$ as before. Now
$\hat\theta(x_i) \to \theta(x_i)$ almost surely, so for sufficiently large
$n$, with probability one,

$$\min_{x_\ell\in S_1}\hat\theta(x_\ell) \ge \max_{x_\ell\in S_2}\hat\theta(x_\ell),\quad \min_{x_\ell\in S_2}\hat\theta(x_\ell) \ge \max_{x_\ell\in S_3}\hat\theta(x_\ell),\quad \ldots,\quad \min_{x_\ell\in S_{h-1}}\hat\theta(x_\ell) \ge \max_{x_\ell\in S_h}\hat\theta(x_\ell),$$

and from (3.5) and Lemma 3.3,

$$T_{12} = -2\sum_{i=1}^{k}\bigl[\,n_i\hat\theta(x_i)p_2(\tau_i)p_1''(\alpha_i)\cdot 2^{-1} + n_i q''(\beta_i;\tau_i)\cdot 2^{-1}\,\bigr]\bigl[E(\hat\theta(\cdot)\mid L(\theta))(x_i) - \hat\theta(x_i)\bigr]^2$$

for sufficiently large $n$ with probability one. Using an argument similar to
the one used for Theorem 3.1, we obtain that under $P_\theta$,

$$T_{12} \text{ converges in distribution to } \sum_{i=1}^{k} p_2(\tau_i)\bigl[E(X(\cdot)\mid L(\theta))(x_i) - X(x_i)\bigr]^2,$$

where $X(\cdot)$ is defined as in Theorem 3.1. Here it is necessary to use the
fact that $p_1'(\theta(\cdot))$ is positive and constant on the sets
$S_1, S_2, \ldots, S_h$. The desired result follows from (3.6) since

$$\sum_{i=1}^{k} p_2(\tau_i)\bigl[E(X(\cdot)\mid L(\theta))(x_i) - X(x_i)\bigr]^2 = \|E(X(\cdot)\mid L(\theta)) - X(\cdot)\|^2 \le \|E(X(\cdot)\mid L) - X(\cdot)\|^2 .$$
It seems clear that the hypothesis $n_1 = n_2 = \cdots = n_k$ could be
relaxed. It was required in order to take $n$ inside the conditional
expectation. However, the measure on $2^S$ on which the expectations depend
also depends on the $n_i$, so that such a relaxation would still require some
assumption about the way the $n_i$ go to infinity.
Likelihood ratio tests for testing $H_0$ against $H_1$ are discussed in Barlow
et al. (1972). However, we have been unable to find an analogue of Theorem 3.4
for this test. An argument similar to the argument given for Theorem 3.1
yields
Theorem 3.5. Let $T_{01} = -2\ln\Lambda_{01}$, where $\Lambda_{01}$ is the
likelihood ratio for testing $H_0$ against $H_1$, and assume the hypotheses of
Theorem 3.1 are satisfied. Then $T_{01}$ converges in distribution to

$$\sum_{i=1}^{k} p_2(\tau_i)\bigl[E(X(\cdot)\mid L)(x_i) - \tilde X\bigr]^2, \qquad \tilde X = \sum_{j=1}^{k} p_2(\tau_j)X(x_j)\Big/\sum_{j=1}^{k} p_2(\tau_j),$$

and $X(\cdot)$ is as in Theorem 3.1.

Corollary 3.6. If the hypotheses of Theorem 3.1 are satisfied, then

$$\lim_{n\to\infty} P[T_{01} \ge t] = \sum_{\ell=1}^{k} P(\ell,k)\, P[\chi^2_{\ell-1} \ge t]$$

for all $t$ (cf. Theorem 3.1 of Barlow et al. (1972)).
In closing, it is worthwhile to point out that among the families with
densities of the form given by (3.1) are the Normal, Binomial, Poisson and
Exponential families. In particular, in the normal case, if the $\mu(x_i)$
are known and $\theta(x_i)$ is taken to be $\sigma^2(x_i)$, then one may form
likelihood ratio tests for trend in the variances. Such a test is useful in
the analysis-of-residuals procedure to determine, for example, whether there
is a trend in the variances and hence whether weighted least squares is
appropriate. The isotonized variance estimate can then be used to construct
the weights for the weighted least squares procedure.
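A minimal sketch of that suggestion for the linear order, once more reusing the pava helper from Section 2: the per-group mean squared deviations about the known means are isotonized, and their reciprocals are used as weights in a weighted least squares fit. The data layout, the nondecreasing direction of the fitted trend and the function names are ours, purely for illustration.

    import numpy as np

    def isotonized_variances(groups, known_means):
        """Per-group mean squared deviations about the known means, smoothed by a
        weighted isotonic (nondecreasing) regression with group sizes as weights."""
        s2 = np.array([np.mean((np.asarray(g, float) - m) ** 2)
                       for g, m in zip(groups, known_means)])
        n = np.array([len(g) for g in groups], dtype=float)
        return pava(s2, n)

    def weighted_least_squares(X, y, var_estimates):
        """WLS with weights 1/var_i (one variance estimate per observation, e.g.
        the isotonized group estimate repeated over that group): ordinary least
        squares on the rescaled data (sqrt(w_i) x_i, sqrt(w_i) y_i)."""
        w = 1.0 / np.asarray(var_estimates, float)
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(sw[:, None] * np.asarray(X, float),
                                   sw * np.asarray(y, float), rcond=None)
        return beta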
REFERENCES

Barlow, R. E., Bartholomew, D. J., Bremner, J. M. and Brunk, H. D. (1972).
Statistical Inference Under Order Restrictions, Wiley, New York.

Billingsley, P. (1968). Convergence of Probability Measures, Wiley, New York.

Brunk, H. D. (1955). Maximum likelihood estimates of ordered parameters.
Ann. Math. Statist. 26, 607-616.

Brunk, H. D. (1965). Conditional expectation given a sigma-lattice and
applications. Ann. Math. Statist. 36, 1339-1350.

Lehmann, E. L. (1959). Testing Statistical Hypotheses, Wiley, New York.

Robertson, T. and Wegman, E. J. (1975). A likelihood ratio test for ordered
normal means. North Carolina Institute of Statistics Mimeo Series.

Van Eeden, C. (1958). Testing and Estimating Ordered Parameters of Probability
Distributions (doctoral dissertation, University of Amsterdam), Amsterdam.