ON BARTLETT'S TEST AND LEHMANN'S TEST FOR
HOMOGENEITY OF VARIANCES

by

Nariaki Sugiura                          Hisao Nagao
University of North Carolina             Hiroshima University
and Hiroshima University

Department of Statistics
University of North Carolina at Chapel Hill
Institute of Statistics Mimeo Series No. 598
November 1968

This research was supported by the National Science Foundation, Grant No. GU-2059, and the Sakko-kai Foundation.
1. Introduction and summary. The purpose of this paper is to compare two tests for the homogeneity of variances of k normal populations, due to Bartlett [2] (= M test) and Lehmann [4] (= L test). The unbiasedness of the M test was established by Pitman [5], whereas the L test is shown to be biased in Section 2. It is well known that under the hypothesis these two test statistics have asymptotically the same χ² distribution with k − 1 degrees of freedom for large sample sizes. In Section 3 we shall derive the limiting distributions of the two test criteria under a sequence of alternative hypotheses that converges to the hypothesis at an arbitrary rate as the sample sizes tend to infinity. The asymptotic expansions of the two test criteria under a fixed alternative hypothesis, and also the asymptotic expansion of the L test under the null hypothesis, are obtained in Section 4 with some numerical examples. The asymptotic non-null (standardized) distributions of the two test criteria under a fixed alternative hypothesis are shown to be normal distributions.
2. Biasedness of the L test. Let X_{i1}, X_{i2}, ..., X_{iN_i} be a random sample from a normal distribution with mean μ_i and variance σ_i² (i = 1, 2, ..., k). For testing the hypothesis

H : σ₁² = σ₂² = ... = σ_k²

against all alternatives σ_i² ≠ σ_j² for some i, j (i ≠ j), with unspecified means μ_i, the L test criterion due to Lehmann [4] is given by
(2.1)    L = (1/2) Σ_{α=1}^k n_α [Z_α − (1/n) Σ_{β=1}^k n_β Z_β]²,

where S_j = Σ_{a=1}^{N_j} (X_{ja} − X̄_j)², Z_j = log(S_j/n_j), n_j = N_j − 1 and n = Σ_{α=1}^k n_α.
The M test criterion due to Bartlett [2], without correction factor, is given by

(2.2)    M = n log(Σ_{α=1}^k S_α/n) − Σ_{α=1}^k n_α log(S_α/n_α),

with the same notation as above. The L (or M) test rejects the hypothesis H when the observed value of L (or M) is larger than a preassigned constant.
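For fixed data the two criteria are straightforward to evaluate. The following sketch (ours, not part of the paper; the function name is illustrative) computes L from (2.1) and M from (2.2).

```python
import math

def l_and_m_statistics(samples):
    """Compute Lehmann's L from (2.1) and Bartlett's uncorrected M from (2.2).

    samples: a list of k lists of observations, one per population.
    """
    k = len(samples)
    n_a = [len(x) - 1 for x in samples]              # degrees of freedom n_alpha = N_alpha - 1
    n = sum(n_a)
    # within-sample sums of squares S_alpha
    S = []
    for x in samples:
        mean = sum(x) / len(x)
        S.append(sum((v - mean) ** 2 for v in x))
    Z = [math.log(S[i] / n_a[i]) for i in range(k)]  # Z_alpha = log(S_alpha / n_alpha)
    Zbar = sum(n_a[i] * Z[i] for i in range(k)) / n
    L = 0.5 * sum(n_a[i] * (Z[i] - Zbar) ** 2 for i in range(k))
    M = n * math.log(sum(S) / n) - sum(n_a[i] * Z[i] for i in range(k))
    return L, M
```

For k = 2 the identity L = (n₁n₂/(2n)){log(n₂S₁/(n₁S₂))}², used in the proof of Theorem 2.1, can serve as a check on the implementation.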
It may be remarked that the M test is equivalent to the modified likelihood ratio criterion, whose unbiasedness was proved by Pitman [5]; the modification means changing the sample sizes N_α to the degrees of freedom n_α. The following theorem shows that the L test is not always unbiased.
Theorem 2.1. In the two-sample problem (k = 2), the L test is unbiased if and only if the sample sizes are equal. In this case (n₁ = n₂), the L test is equivalent to the M test.
Proof. If k = 2, L = (1/(2n)) n₁n₂ {log(n₂S₁/(n₁S₂))}², and the acceptance region of the L test is simplified as

(2.3)    1/c ≤ n₂S₁/(n₁S₂) ≤ c

for some constant c (c > 1). Ramachandran [6] showed that the acceptance region

(2.4)    c₁ ≤ S₁/S₂ ≤ c₂

for constants c₁ and c₂ such that c₁ < c₂ gives an unbiased test if and only if the condition

(2.5)    c₂^{n₁} (1 + n₁c₂/n₂)^{−n} = c₁^{n₁} (1 + n₁c₁/n₂)^{−n}

is satisfied. This is proved by putting the derivative of the power function at the null hypothesis equal to zero. In our case c₂ = 1/c₁, and the condition is expressed as

(2.6)    c₁^{n₁−n₂} = [(n₂ + n₁c₁)/(n₁ + n₂c₁)]^n,    0 < c₁ < 1.

We shall show that this condition is not satisfied unless n₁ = n₂. Taking logarithms on both sides of (2.6), we put

(2.7)    f(c₁) = (n₁ − n₂) log c₁ − n log[(n₂ + n₁c₁)/(n₁ + n₂c₁)];

then the derivative of f(c₁) is given by

(2.8)    f′(c₁) = (n₁ − n₂) n₁n₂ (c₁ − 1)² / {c₁ (n₁ + n₂c₁)(n₂ + n₁c₁)}.

Noting that f(1) = 0, we can conclude that for any c₁ lying between zero and one, c₁^{n₁−n₂} is less than [(n₂ + n₁c₁)/(n₁ + n₂c₁)]^n if n₁ is larger than n₂, and the reverse inequality holds if n₁ is less than n₂, which contradicts the condition (2.6). If n₁ = n₂, the condition (2.6) is obviously satisfied. In this case

M = (n/2) log[(1/4)(1 + S₁/S₂)² (S₂/S₁)],

the acceptance region of which is equivalent to 1/c ≤ S₁/S₂ ≤ c for some c. Hence our proof is completed.
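The sign argument in the proof is easy to check numerically. The sketch below (our illustration, not part of the paper) evaluates f(c₁) of (2.7); the unbiasedness condition (2.6) requires f(c₁) = 0, which by the monotonicity shown in (2.8) happens on (0, 1) only when n₁ = n₂.

```python
import math

def f(c1, n1, n2):
    """f(c1) of (2.7); the unbiasedness condition (2.6) reads f(c1) = 0."""
    n = n1 + n2
    return (n1 - n2) * math.log(c1) - n * math.log((n2 + n1 * c1) / (n1 + n2 * c1))
```

Since f(1) = 0 and f′ has the sign of n₁ − n₂ by (2.8), f is of one sign on (0, 1) unless the sample sizes are equal.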
3. Limiting distributions under sequences of alternative hypotheses. Since the statistic S_α/σ_α² = Σ_{β=1}^{N_α} (X_{αβ} − X̄_α)²/σ_α² has a χ² distribution with n_α degrees of freedom under the alternative K, the statistic T_α = [(S_α/σ_α²) − n_α]/√(2n_α) is distributed asymptotically according to the standard normal distribution as n_α tends to infinity. We shall express the test statistics L and M given in (2.1) and (2.2) in terms of the statistics T_α (α = 1, 2, ..., k). Putting n_α = p_α n with Σ_{α=1}^k p_α = 1, we can easily see that
(3.1)    Z_α = log(S_α/n_α) = log σ_α² + log(1 + √(2/n_α) T_α),

which implies, for large n with fixed p_α (p_α > 0),

(3.2)    L = (n/2) Σ_{α=1}^k p_α(σ̃_α − σ̃)²
           + n Σ_{α=1}^k p_α(σ̃_α − σ̃){log(1 + √(2/n_α)T_α) − Σ_{β=1}^k p_β log(1 + √(2/n_β)T_β)}
           + (n/2) Σ_{α=1}^k p_α{log(1 + √(2/n_α)T_α) − Σ_{β=1}^k p_β log(1 + √(2/n_β)T_β)}²
         = (n/2) Σ_{α=1}^k p_α(σ̃_α − σ̃)² + √(2n) Σ_{α=1}^k √p_α (σ̃_α − σ̃) T_α
           + Σ_{α=1}^k {1 − (σ̃_α − σ̃)} T_α² − (Σ_{α=1}^k √p_α T_α)² + O_p(n^{−1/2}),

where σ̃_α = log σ_α² and σ̃ = Σ_{α=1}^k p_α σ̃_α. In the same way we can express the test statistic M as

(3.3)    M = n(log σ̄² − σ̃) + √(2n) Σ_{α=1}^k √p_α (v_α − 1) T_α
           + Σ_{α=1}^k T_α² − (Σ_{α=1}^k √p_α v_α T_α)² + O_p(n^{−1/2}),

where σ̄² = Σ_{α=1}^k p_α σ_α² and v_α = σ_α²/σ̄².
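The stochastic expansion (3.2) can be checked numerically by fixing the values of the T_α and letting n grow; the remainder should then shrink like n^{−1/2}. The sketch below (ours; all configuration values are illustrative) compares the exact statistic L, computed from the Z_α of (3.1), with the right-hand side of (3.2).

```python
import math

# illustrative configuration: k = 2 populations under a fixed alternative
p = [0.5, 0.5]
sigma2 = [1.0, 2.0]                        # population variances
T = [0.7, -0.3]                            # fixed values of the standardized T_alpha
n = 10 ** 6
n_a = [int(p_i * n) for p_i in p]

st = [math.log(s) for s in sigma2]         # sigma-tilde_alpha = log sigma_alpha^2
st_bar = sum(pi * s for pi, s in zip(p, st))
e = [s - st_bar for s in st]

# exact L from Z_alpha = sigma-tilde_alpha + log(1 + sqrt(2/n_alpha) T_alpha)
Z = [st[i] + math.log(1.0 + math.sqrt(2.0 / n_a[i]) * T[i]) for i in range(2)]
Zbar = sum(n_a[i] * Z[i] for i in range(2)) / n
L_exact = 0.5 * sum(n_a[i] * (Z[i] - Zbar) ** 2 for i in range(2))

# right-hand side of (3.2) without the O_p(n^{-1/2}) remainder
A = (n / 2.0) * sum(p[i] * e[i] ** 2 for i in range(2))
lin = math.sqrt(2.0 * n) * sum(math.sqrt(p[i]) * e[i] * T[i] for i in range(2))
quad = (sum((1.0 - e[i]) * T[i] ** 2 for i in range(2))
        - sum(math.sqrt(p[i]) * T[i] for i in range(2)) ** 2)
L_approx = A + lin + quad
```

With n = 10⁶ the two quantities, each of size roughly 6 × 10⁴ here, agree to a few parts in 10⁴, consistent with an O_p(n^{−1/2}) remainder.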
Now we shall specify the sequence of alternatives K_θ as

K_θ : σ_α = σ + a_α n^{−θ}    for α = 1, 2, ..., k,

with θ > 0, where not all the a_α are assumed to be equal. If 0 < θ < ½, we can rewrite the expression of L in (3.2) as

(3.4)    L = A(σ₁, ..., σ_k) + (2√2/σ) n^{½−θ} Σ_{α=1}^k √p_α (a_α − ā) T_α + O_p(n^{½−2θ} + 1),

where A(σ₁, ..., σ_k) = (n/2) Σ_{α=1}^k p_α(σ̃_α − σ̃)² and ā = Σ_{α=1}^k p_α a_α. Since σ̃_α − σ̃ = (2/σ)(a_α − ā)n^{−θ} + O(n^{−2θ}), we have A(σ₁, ..., σ_k) = O(n^{1−2θ}). This means that the statistic n^{θ−½}[L − A(σ₁, ..., σ_k)] has asymptotically a normal distribution with mean zero and variance (8/σ²) Σ_{α=1}^k p_α (a_α − ā)². If θ > ½, we can write the statistic L from (3.2) as

(3.5)    L = Σ_{α=1}^k T_α² − (Σ_{α=1}^k √p_α T_α)² + O_p(n^{½−θ}),

which shows that L has asymptotically a χ² distribution with k−1 degrees of freedom. In this case the sequence of alternatives K_θ converges so fast to the hypothesis that the limiting distribution of L is unchanged from that under the null hypothesis. On the boundary θ = ½, we can write

(3.6)    L = Σ_{α=1}^k {T_α + (√2/σ)√p_α (a_α − ā)}² − (Σ_{α=1}^k √p_α T_α)² + o_p(1).

Thus the statistic L has asymptotically a noncentral χ² distribution with k−1 degrees of freedom and noncentrality parameter δ_L² = (2/σ²) Σ_{α=1}^k p_α (a_α − ā)². Summarizing the above results, we have the following theorem.
Theorem 3.1. Under the sequence of alternatives K_θ : σ_α = σ + a_α n^{−θ} for α = 1, 2, ..., k, where not all the a_α are equal, the limiting distribution of the test statistic L given by (2.1) for large n with fixed p_α = n_α/n > 0 is the following.

(1) If 0 < θ < ½, [L − A(σ₁, ..., σ_k)]/n^{½−θ} is distributed asymptotically according to the normal distribution with mean zero and variance (8/σ²) Σ_{α=1}^k p_α (a_α − ā)², where A(σ₁, ..., σ_k) = (n/2) Σ_{α=1}^k p_α(σ̃_α − σ̃)² = O(n^{1−2θ}) with σ̃_α = log σ_α², σ̃ = Σ_{α=1}^k p_α σ̃_α, and ā = Σ_{α=1}^k p_α a_α.

(2) If θ > ½, L has asymptotically a χ² distribution with k−1 degrees of freedom.

(3) If θ = ½, L has asymptotically a noncentral χ² distribution with k−1 degrees of freedom and noncentrality parameter δ_L² = (2/σ²) Σ_{α=1}^k p_α (a_α − ā)².
The result (3) in the above theorem has often been used in discussing the asymptotic relative efficiency of nonparametric tests (Deshpande [3], Sugiura [7], etc.); however, we have stated it in the theorem for completeness. By the same argument, we have the following results for the test statistic M given in (2.2), from the expression of M in (3.3).
Theorem 3.2. Under the same assumptions as in Theorem 3.1, the limiting distribution of the test statistic M under K_θ is the following.

(1) If 0 < θ < ½, [M − B(σ₁, ..., σ_k)]/n^{½−θ} is distributed asymptotically according to the normal distribution with mean zero and variance (8/σ²) Σ_{α=1}^k p_α (a_α − ā)², where B(σ₁, ..., σ_k) = n(log Σ_{α=1}^k p_α σ_α² − Σ_{α=1}^k p_α log σ_α²) = O(n^{1−2θ}).

(2) If θ > ½, M has asymptotically a χ² distribution with k−1 degrees of freedom.

(3) If θ = ½, M has asymptotically a noncentral χ² distribution with k−1 degrees of freedom and noncentrality parameter δ_M² = (2/σ²) Σ_{α=1}^k p_α (a_α − ā)².
Noting that the two noncentrality parameters δ_L² and δ_M² in Theorem 3.1 and Theorem 3.2 are equal, we immediately have the following corollary.

Corollary. The Pitman asymptotic relative efficiency of the L test with respect to the M test is equal to 1.
The limiting distributions of L and M under the sequence of alternatives K_θ make no difference when θ ≥ ½. Even when 0 < θ < ½, the asymptotic variances τ_L² and τ_M² are equal. Hence we are interested in comparing the asymptotic means A(σ₁, ..., σ_k) and B(σ₁, ..., σ_k) in case (1). We may expect the L test to have the larger asymptotic power when A(σ₁, ..., σ_k) > B(σ₁, ..., σ_k), and the smaller asymptotic power when A(σ₁, ..., σ_k) < B(σ₁, ..., σ_k). We can easily see that

A(σ₁, ..., σ_k) − B(σ₁, ..., σ_k) = −(4n^{1−3θ}/(3σ³)) Σ_{α=1}^k p_α (a_α − ā)³ + O(n^{1−4θ}).

Hence the first main terms in the expansions of A(σ₁, ..., σ_k) and B(σ₁, ..., σ_k) are equal. Putting θ = ¼ and a₁ = a₂ = ... = a_{k−1} = a (equality of the first k−1 variances), we have

A(σ₁, ..., σ_k) − B(σ₁, ..., σ_k) = (4n^{1/4}/(3σ³)) p_k(1 − p_k)(1 − 2p_k)(a − a_k)³ + O(1).

Thus for large n, A(σ₁, ..., σ_k) > B(σ₁, ..., σ_k) when p_k < ½ and a > a_k, whereas the reversed inequality holds when p_k < ½ and a < a_k. We cannot state a preference between the two tests L and M against all alternatives from the asymptotic powers near the hypothesis.
4. Asymptotic expansions of the distributions of L and M.

4.1. Expansion of the null distributions of L and M. If the hypothesis H is true, the statistic M given by (2.2) is well known to have asymptotically a χ² distribution with k−1 degrees of freedom, and further this approximation is improved by multiplying the statistic M by the correction factor c due to Bartlett [2], where

(4.1)    c = 1 − (1/(3(k−1))) Σ_{α=1}^k (1/n_α − 1/n).

Putting m = cn, we know the following asymptotic expansion of the distribution of cM, as in Anderson [1]:

(4.2)    P(cM ≤ z) = P(χ²_{k−1} ≤ z) + ((k−1)(1−c)²/(4c²)) {P(χ²_{k−1} ≤ z) − P(χ²_{k+3} ≤ z)} + O(m⁻³),

where the symbol χ²_f means a random variable having the χ² distribution with f degrees of freedom.
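The correction factor (4.1) is a one-line computation. The following sketch (ours; the function name is illustrative) evaluates c from the degrees of freedom alone.

```python
def bartlett_correction(n_a):
    """Bartlett's correction factor c from (4.1).

    n_a: list of degrees of freedom n_alpha; n = sum(n_a).
    """
    k = len(n_a)
    n = sum(n_a)
    return 1.0 - sum(1.0 / na - 1.0 / n for na in n_a) / (3.0 * (k - 1))
```

For k = 2 and n₁ = n₂ = 50 (the setting of Example 4.1) this gives c = 149/150, so that m = cn = 99.33.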
We shall first derive the asymptotic expansion of the null distribution of L given by (2.1). The statistic L is rewritten as

(4.3)    L = (1/2) [Σ_{α=1}^k n_α Z_α² − (1/n)(Σ_{α=1}^k n_α Z_α)²].

Under the hypothesis H we may assume that S_α has a χ² distribution with n_α degrees of freedom. Thus the statistic y_α = √(n_α/2) Z_α = √(n_α/2) log(S_α/n_α) has the density function

(4.4)    c_{n_α} exp[√(n_α/2) y − (n_α/2) e^{√(2/n_α) y}],    −∞ < y < ∞,

where c_n = (n/2)^{(n−1)/2} (Γ(n/2))⁻¹. We can express the characteristic
function of L as

(4.5)    C(t) = (Π_{α=1}^k c_{n_α}) ∫...∫ exp[it Σ_{α=1}^k y_α² − it(Σ_{α=1}^k √p_α y_α)²] Π_{α=1}^k exp[√(n_α/2) y_α − (n_α/2) e^{√(2/n_α) y_α}] dy₁ ... dy_k,

since L = Σ_{α=1}^k y_α² − (Σ_{α=1}^k √p_α y_α)². The second exponential part of the integrand in (4.5) is expanded asymptotically for large n by the formula

(4.6)    (n_α/2) e^{√(2/n_α) y_α} = n_α/2 + √(n_α/2) y_α + y_α²/2 + (√2/6) y_α³/√n_α + y_α⁴/(12 n_α) + O(n_α^{−3/2}).

We can write

(4.7)    C(t) = (Π_{α=1}^k c_{n_α} e^{−n_α/2}) ∫...∫ exp[(it − ½) Σ_{α=1}^k y_α² − it(Σ_{α=1}^k √p_α y_α)²]
              × {1 − (√2/6) Σ_{α=1}^k y_α³/√n_α − (1/12) Σ_{α=1}^k y_α⁴/n_α + (1/36)(Σ_{α=1}^k y_α³/√n_α)²} dy₁ ... dy_k + O(n^{−3/2}).

The quadratic form (½ − it) Σ_{α=1}^k y_α² + it(Σ_{α=1}^k √p_α y_α)² can be written as
(1/2) y′Σ⁻¹y, where y′ = (y₁, ..., y_k), Σ⁻¹ = (1−2it)I_k + 2it(√p)(√p)′ with √p = (√p₁, ..., √p_k)′, and

(4.8)    Σ = (1/(1−2it)) ×
         | 1−2p₁it         −2it√(p₁p₂)    ...    −2it√(p₁p_k) |
         | −2it√(p₂p₁)     1−2p₂it        ...    −2it√(p₂p_k) |
         |     ...             ...        ...        ...      |
         | −2it√(p_kp₁)    −2it√(p_kp₂)   ...    1−2p_kit     |.

The symmetric matrix Σ has a simple characteristic root of one (with characteristic vector √p) and k−1 roots of 1/(1−2it), so that |Σ|^{1/2} = (1−2it)^{−(k−1)/2}. Noting that all characteristic roots of Σ⁻¹ have positive real parts, we can use the following well-known formulas, based on the moments of the k-variate normal distribution with mean zero and covariance matrix Σ = (σ_{αβ}):

(4.9)    ∫...∫ exp[−(1/2) y′Σ⁻¹y] y_α^ℓ dy₁...dy_k = (2π)^{k/2}|Σ|^{1/2} × { σ_{αα}  if ℓ = 2;  0  if ℓ = 3;  3σ_{αα}²  if ℓ = 4;  15σ_{αα}³  if ℓ = 6 },

and, for α ≠ β,

∫...∫ exp[−(1/2) y′Σ⁻¹y] y_α³ y_β³ dy₁...dy_k = (2π)^{k/2}|Σ|^{1/2} (9σ_{αα}σ_{ββ}σ_{αβ} + 6σ_{αβ}³).
Hence we can simplify the characteristic function of L in (4.7), with σ_{αα} = (1−2itp_α)/(1−2it) and σ_{αβ} = −2it√(p_αp_β)/(1−2it), as

(4.10)   C(t) = (Π_{α=1}^k c_{n_α} e^{−n_α/2}) (2π)^{k/2} (1−2it)^{−(k−1)/2}
              × [1 − (1/(4n)) Σ_{α=1}^k σ_{αα}²/p_α + (5/(12n)) Σ_{α=1}^k σ_{αα}³/p_α
                 − (it/(2n(1−2it))) {(Σ_{α=1}^k σ_{αα})² − Σ_{α=1}^k σ_{αα}²}
                 − (4(it)³/(3n(1−2it)³)) (1 − Σ_{α=1}^k p_α²)] + O(n^{−3/2}).

In the above expansion, we can easily see that the term of order n^{−3/2} vanishes, because any product moment of odd degree from a normal population with mean zero always vanishes. Applying Stirling's formula log Γ(x) = log √(2π) + (x − ½) log x − x + 1/(12x) + O(x⁻²) to the coefficient c_{n_α} in (4.10), we can obtain

(4.11)   Π_{α=1}^k c_{n_α} e^{−n_α/2} = (2π)^{−k/2} {1 − (1/(6n)) Σ_{α=1}^k (1/p_α) + O(n⁻²)}.

Arranging the second factor of the characteristic function in (4.10) according to the magnitude of the negative power of (1−2it), together with the above result, we can get the following asymptotic formula:

(4.12)   C(t) = (1−2it)^{−(k−1)/2} [1 + (1/(12n)) {2(1−ρ) + (3k²+6k−6−3ρ)(1−2it)⁻² − (3k²+6k−4−5ρ)(1−2it)⁻³}] + O(n⁻²),

where ρ = Σ_{α=1}^k (1/p_α).
Inverting this characteristic function, we can get the following theorem.

Theorem 4.1. The null distribution of Lehmann's test statistic L given by (2.1), expanded asymptotically in terms of the χ² distribution for large n with fixed p_α (positive), is

(4.13)   P(L ≤ z) = P(χ²_{k−1} ≤ z) + (1/(12n)) [2(1−ρ) P(χ²_{k−1} ≤ z) + (3k²+6k−6−3ρ) P(χ²_{k+3} ≤ z) − (3k²+6k−4−5ρ) P(χ²_{k+5} ≤ z)] + O(n⁻²),

where ρ = Σ_{α=1}^k (1/p_α) = Σ_{α=1}^k n/n_α.
From this theorem we can easily get the asymptotic mean of the statistic L when the hypothesis H is true:

(4.14)   E[L | H] = k−1 + (3/2) Σ_{α=1}^k 1/n_α − k(k+2)/(2n) + O(n⁻²).

This result can also be obtained by calculating directly the asymptotic means of the statistics Z_α and Z̄ defined in (2.1). We shall determine a correction factor d such that E[dL | H] = k−1, that is, such that the expectation of dL is equal to the mean of the limiting distribution. We have

(4.15)   d = [1 + (1/(2(k−1))) {3 Σ_{α=1}^k 1/n_α − k(k+2)/n}]⁻¹.

Then the statistic dL is expected to show a better approximation by the χ² variate with k−1 degrees of freedom for large n. But we could not choose a correction factor such that the term of order n⁻¹ in the asymptotic expansion of the null distribution of dL vanishes, as is the case with Bartlett's test shown in (4.2). We shall examine the effectiveness of this correction d in the following example.
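Formula (4.15) can be sketched as follows (our illustration; the function name is ours).

```python
def lehmann_correction(n_a):
    """Correction factor d from (4.15), so that E[dL | H] = k - 1 + O(n^{-2})."""
    k = len(n_a)
    n = sum(n_a)
    return 1.0 / (1.0 + (3.0 * sum(1.0 / na for na in n_a) - k * (k + 2.0) / n)
                  / (2.0 * (k - 1)))
```

For k = 2 and n₁ = n₂ = 50 this gives d = 1/1.02, and 3.84146/d = 3.91829 is then the adjusted five percent point used in Example 4.1.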
Example 4.1. When k = 2 and n₁ = n₂ = 50, the correction factor d is given by 1/1.02. The five percent point of the χ² distribution with one degree of freedom is 3.84146. From Theorem 4.1 and the formula (4.2) for Bartlett's test, we have

(4.16)   P(L ≥ 3.84146) = 0.0526 + O(n⁻²),
         P(dL ≥ 3.84146) = 0.0503 + O(n⁻²),
         P(cM ≥ 3.84146) = 0.0500 + O(m⁻³).

Thus the value 3.84146/d may be used as an approximate five percent point for the L test in this case.
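The figure 0.0526 in (4.16) can be reproduced from (4.13). The sketch below (ours, not from the paper) implements the χ² distribution function through the standard series for the regularized lower incomplete gamma function and then evaluates the expansion for k = 2, n₁ = n₂ = 50.

```python
import math

def chi2_cdf(x, f):
    """P(chi^2_f <= x) via the series for the regularized lower incomplete gamma."""
    a, y = f / 2.0, x / 2.0
    term = 1.0 / a
    total = term
    for j in range(1, 200):
        term *= y / (a + j)
        total += term
    return total * math.exp(-y + a * math.log(y) - math.lgamma(a))

k, n, z = 2, 100, 3.84146
rho = 4.0                                  # rho = sum 1/p_alpha for p_1 = p_2 = 1/2
tail = (1.0 - chi2_cdf(z, k - 1)) - (
    2.0 * (1.0 - rho) * chi2_cdf(z, k - 1)
    + (3.0 * k * k + 6.0 * k - 6.0 - 3.0 * rho) * chi2_cdf(z, k + 3)
    - (3.0 * k * k + 6.0 * k - 4.0 - 5.0 * rho) * chi2_cdf(z, k + 5)
) / (12.0 * n)
```

The computed tail probability agrees with the value 0.0526 quoted in (4.16).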
4.2. Expansion of the non-null distribution of L. We shall now consider the asymptotic expansion of the non-null distribution of L under a fixed alternative. Putting

(4.17)   L′ = L − (n/2) Σ_{α=1}^k p_α (σ̃_α − σ̃)²

in (3.2), we can easily see that

(4.18)   n^{−1/2} L′ = Σ_{α=1}^k √(2p_α) (σ̃_α − σ̃) T_α + O_p(n^{−1/2}).

Hence the statistic L′/√n converges in law to the normal distribution with mean zero and variance τ_L² = 2 Σ_{α=1}^k p_α (σ̃_α − σ̃)².
More precisely we have

(4.19)   L′/√n = ℓ₀(T) + n^{−1/2} ℓ₁(T) + n⁻¹ ℓ₂(T) + O_p(n^{−3/2}),

where each term is given by

ℓ₀(T) = Σ_{α=1}^k √(2p_α) (σ̃_α − σ̃) T_α,
ℓ₁(T) = Σ_{α=1}^k a_α T_α² − (Σ_{α=1}^k √p_α T_α)²    with a_α = σ̃ − σ̃_α + 1,
ℓ₂(T) = Σ_{α=1}^k a′_α T_α³ + (Σ_{α=1}^k √(2p_α) T_α)(Σ_{α=1}^k T_α²)    with a′_α = √2 p_α^{−1/2} {(2/3)(σ̃_α − σ̃) − 1}.

Hence the characteristic function of L′/(√n τ_L) (τ_L > 0) is expressed as

(4.20)   C_L(t) = E[e^{it ℓ₀(T)/τ_L} {1 + n^{−1/2} it ℓ₁(T)/τ_L + n⁻¹ (it ℓ₂(T)/τ_L + ½ (it)² ℓ₁(T)²/τ_L²)}] + O(n^{−3/2}).

Since T_α = (S_α/σ_α² − n_α)/√(2n_α) comes from the χ² distribution with n_α degrees of freedom, we can easily obtain the following formulas:
(4.21)
(1)  E[e^{tT_α}] = (1 − √(2/n_α) t)^{−n_α/2} exp(−√(n_α/2) t)
     = {1 + (√2/(3√n_α)) t³ + (1/n_α)(½ t⁴ + (1/9) t⁶)} e^{t²/2} + O(n_α^{−3/2}),
(2)  E[T_α e^{tT_α}] = t (1 − √(2/n_α) t)⁻¹ E[e^{tT_α}]
     = {t + √(2/n_α) ((1/3) t⁴ + t²)} e^{t²/2} + O(n_α⁻¹),
(3)  E[T_α² e^{tT_α}] = {(1 + t²) + √(2/n_α) ((1/3) t⁵ + (7/3) t³ + 2t)} e^{t²/2} + O(n_α⁻¹),
(4)  E[T_α³ e^{tT_α}] = (t³ + 3t) e^{t²/2} + O(n_α^{−1/2}),
(5)  E[T_α⁴ e^{tT_α}] = (t⁴ + 6t² + 3) e^{t²/2} + O(n_α^{−1/2}).
Applying the first formula (1) to the first term in (4.20), with the abbreviated notation b_α = √(2p_α) (it/τ_L)(σ̃_α − σ̃) in ℓ₀(T), we have

(4.22)   E[e^{it ℓ₀(T)/τ_L}] = e^{−t²/2} [1 + (√2/(3√n)) Σ b_α³/√p_α + (1/n){(1/9)(Σ b_α³/√p_α)² + ½ Σ b_α⁴/p_α}] + O(n^{−3/2}).

Noting that Σ √p_α b_α = 0, we can write each expectation in (4.20) by the formulas (4.21) as

(4.23)   E[ℓ₁(T) e^{it ℓ₀(T)/τ_L}] = e^{−t²/2} [Σ a_α b_α² + Σ a_α − 1
         + √2 n^{−1/2} {(1/3)(Σ b_α³/√p_α)(Σ a_α b_α² + Σ a_α − 1) + 2 Σ (a_α/√p_α)(b_α + b_α³)}] + O(n⁻¹),

(4.24)   E[ℓ₂(T) e^{it ℓ₀(T)/τ_L}] = e^{−t²/2} [Σ a′_α b_α³ + 3 Σ a′_α b_α] + O(n^{−1/2}),

(4.25)   E[ℓ₁(T)² e^{it ℓ₀(T)/τ_L}] = e^{−t²/2} [(Σ a_α b_α²)² + 4 Σ a_α² b_α² + 2 Σ a_α b_α² (Σ a_α − 1)
         + 2 Σ a_α² + (Σ a_α)² − 4 Σ a_α p_α − 2 Σ a_α + 3] + O(n^{−1/2}),

where the symbol Σ means the summation Σ_{α=1}^k.
It follows that the characteristic function of L′/(√n τ_L) is expanded asymptotically as

(4.26)   C_L(t) = e^{−t²/2} [1 + n^{−1/2} {(it/τ_L)(Σ_{α=1}^k (σ̃ − σ̃_α) + k − 1) + (it/τ_L)³ (τ_L² − (2/3) Σ_{α=1}^k p_α(σ̃_α − σ̃)³)}
         + n⁻¹ Σ_{j=1}^3 g_{2j} (it/τ_L)^{2j}] + O(n^{−3/2}),

where the coefficients g₂, g₄, g₆ are given by

(4.27)   g₂ = Σ(σ̃_α − σ̃)² + ½ {Σ(σ̃_α − σ̃)}² − (k+3) Σ(σ̃_α − σ̃) + ½(k² − 1),

         g₄ = (2/3) Σ p_α(σ̃_α − σ̃)⁴ + (2/3) {Σ(σ̃_α − σ̃)} Σ p_α(σ̃_α − σ̃)³ − (2/3)(k+5) Σ p_α(σ̃_α − σ̃)³ − {Σ(σ̃_α − σ̃) − (k+1)} τ_L²,

         g₆ = (2/9) {Σ p_α(σ̃_α − σ̃)³}² − (2/3) τ_L² Σ p_α(σ̃_α − σ̃)³ + ½ τ_L⁴.
Inverting this characteristic function, we have the following theorem.

Theorem 4.2. Under the fixed alternative K, the distribution of the statistic L′ = L − (n/2) Σ_{α=1}^k p_α(σ̃_α − σ̃)², where L is given by (2.1) with σ̃_α = log σ_α² and σ̃ = Σ_{α=1}^k p_α σ̃_α, is expanded asymptotically for large n as

(4.28)   P(L′/(√n τ_L) ≤ z) = Φ(z) − n^{−1/2} [Φ⁽¹⁾(z) τ_L⁻¹ (Σ_{α=1}^k (σ̃ − σ̃_α) + k − 1)
         + Φ⁽³⁾(z) τ_L⁻³ (τ_L² − (2/3) Σ_{α=1}^k p_α(σ̃_α − σ̃)³)]
         + n⁻¹ Σ_{j=1}^3 Φ^{(2j)}(z) g_{2j} τ_L^{−2j} + O(n^{−3/2}),

where τ_L² = 2 Σ_{α=1}^k p_α(σ̃_α − σ̃)², Φ^{(j)}(z) means the j-th derivative of the standard normal distribution function Φ(z), and the coefficients g_{2j} are given by (4.27).
4.3. Expansion of the non-null distribution of cM. We shall now consider the asymptotic expansion of the distribution of cM (= Bartlett's test) under a fixed alternative. Putting m_α = cn_α for α = 1, 2, ..., k, with m = Σ_{α=1}^k m_α = cn, where the correction factor c is given by (4.1), we can write cM as

cM = m log(Σ_{α=1}^k S_α/m) − Σ_{α=1}^k m_α log(S_α/m_α).

Let U_α = [(S_α/σ_α²) − m_α]/√(2m_α); then cM is expressed as
(4.29)   cM = m(log σ̄² − σ̃) + √m q₀(U) + q₁(U) + m^{−1/2} q₂(U) + O_p(m⁻¹),

where σ̄² = Σ_{α=1}^k p_α σ_α², σ̃ = Σ_{α=1}^k p_α log σ_α², v_α = σ_α²/σ̄², and

(4.30)   q₀(U) = Σ_{α=1}^k √(2p_α)(v_α − 1) U_α,
         q₁(U) = Σ_{α=1}^k U_α² − (Σ_{α=1}^k √p_α v_α U_α)²,
         q₂(U) = (2√2/3) {(Σ_{α=1}^k √p_α v_α U_α)³ − Σ_{α=1}^k U_α³/√p_α}

for abbreviation. Note that the random variables U₁, ..., U_k are independent and each of them has asymptotically the standard normal distribution as m → ∞. Thus the statistic M′/√m = (cM − m(log σ̄² − σ̃))/√m is distributed asymptotically according to the normal distribution with zero mean and variance τ_M² = 2 Σ_{α=1}^k p_α(v_α − 1)². Further, the characteristic function of M′/(√m τ_M) (τ_M > 0) can be expressed as

(4.31)   C_M(t) = E[e^{it q₀(U)/τ_M} {1 + m^{−1/2} it q₁(U)/τ_M + m⁻¹ (it q₂(U)/τ_M + ½ (it)² q₁(U)²/τ_M²)}] + O(m^{−3/2}).
Corresponding to the formulas (4.21) for
(1)
(2)
(4.32)
(3)
(4)
tU
E[e a]
=
e
t
2
/2[
1. +
-.1..
m~2( ~ J2
T
a
, we have
l::,.pat +
~ J2t3 }
2
+ o(m-3/ ) •
16
where 6
= n(l-c) = 0(1).
If we put 6
=0
and change
fornnllas, we have the same results as in (4.21).
the abbreviated notation ba
we have
= J2Pa( 'Va-l)it/TM
2
E[e itCJo(U)/TM)
= e -t
/2[1 +
rna
to
n
a
in these
After some computation with
in
Clc (U)
and
I'aa
k
=
E aa '
a=l
~ J2Tm D>~JP;; + m-l%(D>~/JP;;)2
(4.33)
3 2
et a + kEt,2}]
2
a + o(m- / ) .
+ 1.,Eb4/ p
2
Putting aa = ~ 'Va
ql (U) and ~(U) in (4.30), we have
2
itQo(U)/TM
-t /2[
E[ql (u)e
) =e
D>~ - (I'aaba)2 + k - I'a~
in
(4.34)
-i .[{1.(,Eb~.[Pa)(D>~ ..
+ m
[Eaa ba ]2) +
- 2(EaabdJ"Pa)(Eaaba) +
itQo(U)/T
E[~(u)e
M)
=e
_t2 /2 2
(I:bd.[Pa)(~
2,Eba(1-a~)/.[po:
J:
3" .J21.( Eao:bo:)3
+ 2 ..
~I'a~)
l
.. t:J:aa.[Pa Eaaba }] + o(m- )
,r
.. D>d"Pa + 3I'a~ I'aabo:
(4.35)
which implies the following asymptotic formula of the characteristic function
17
where tIle coefficients
l~ = ~1(1:Pa~)2
h
-
11 , 114 and 11
are given by
2
6
41:PaV~
-
(k+2)1:Pa~
4
4
= 21:p (v _1)4 + -3~P (v _1)3(k +
a
a
a
a
~~
+ k(k+2)/2 -
4 - 1:paa
V2)
(4.38)
+
h6 =
~~
+ 1 - 41:Pa va (Va -l)2} +
~(1:Pa(Va-l)3)2
+
~1:Pa(Va-l)3(2
~~(31:Pa~ -
k - 5) +
~) ~ + ~T~(2
_
~T~
~)2
.
Inverting this characteristic function, we have the following theorem.
Theorem 4.3.
statistic
Under fixed alternative
M' = cM - m(log ~-;),
given by (2.2) and (4.1) with ~
where
k
=
cM is Bartlett's test statistic
Pa~ and ;
1:
- a.:J..
expanded asymptotica11.y for large m(- = nc)
(4.39)
+
P(M' /(..[m T )
M
~
z)
~(3)(Z)T~~ ~
Where
= ~(z)
P (v -l)3 +
L3a=1 a a
~
1: PaC va-I) 2 with
a=l
by (4.38) with 6
T~~
k
=
1: P log ~, can be
a
a=U.
as
m-i[~(l) (z)T;l(k
-
~ Pa~)
CX=1
~ - ~~}J + m-l ~ ~(~)(Z)haoIT~
-
k
=2
-
K, the distribution of the
va
= c?J~
+ o(m-3/ 2 )
a=l
and
h2CX (a
= 1,
2, 3)
are given
= n(l-c).
limiting distribution of the statistic
M in multivariate model has
been obtained by Sugiura [8] and coincides with the first term of the formula
(4.39) in Theorem 4.3.
Since asymptotic variances
~
and
~
vanish when
the hypothesis is true, these asymptotic formulas for the distribution of
and M do not give good approximation, when the alternative hypothesis
is near to the null hypothesis.
K
L
18
4.4
Nwnerical example.
He s11&11 finally show some ,lunerical values of
= dL)
the asymptotic power of Lelunann's test (
and Bartlett's test (
= cM)
in the following special cases.
Exapple 4.2.
When
k
=2
are equivalent by Theorem 2.1.
and
= n2 = 50,
nl
the L test and the M test
From asymptotic formula (4.28) and (4.39) with
the result in Example 4.1, we IlB.ve the following approximate powers when
of
= 2~
•
PK(dL ~ 3.84146)
PK(cM
?:
3.84146)
first term
0.6649
0.6642
second term
0.0135
0.0124
third term
0.0014
0.0020
approx. pOl·rer
0.68:>
0.679
These two powers should be equal, because
k = 2
and
n = n , wi. thin the
l
2
accuracy of five percent point of two tests given in Example 4.1. Thus our
result gives a reasonable approximation to this problem.
Exanwle 4.3.
power of
When k
=2
and
~
= 4,
~
= 20,
exact values of the
cM test for some alternatives have been given by Ramachandran [6J.
From his table, we can also obtain exact five percent point of
Formula (4.2) shows
PH(CM
~ 3.795)
= 0.0502 + 0(m-3 ).
the following approximate powers of
K : 0"2
2
= 80"1
2
Formula (4.39) shows
cM test for the alternatives
•
P (CM ~ 3.795)
8
8 = 10
cM as 3.795.
8
=5
8
=10/3
first term
0.6572
0.3224
0.1398
second term
0.0748
0.0804
O.U45
19
third term
0.0001
-0.0013
-0.0170
approx. power
0.732
0.402
0.237
exact power
0.729
0.397
0.230
When
term.
8 is less than 10/3,
tIle first term becomes smaller than the second
Thus we cannot apply our fornn.l1a effectively for alternatives near
Ex.a.nwle 4.4.
When
=3
k
and
2
= 50, n2 = 100,
nl
~
= 150,
PH(dL ~ 5.99147)
= 0.0507
0.0500 + O(m-3)
from (4.2)
We shall specif'y the alternatives
0"2 2 = 80"12
0"3 2 = 82 0"12
•
and
+ o(n- )
H.
we have
from (4.13) and PH(cM ~ 5.99147)
=
K as
Then the fornn.l1as (4.28) and (4.39) give the
following approximate powers.
P8(dL
~
5.99147)
P8(CM
~
5.99147)
8 = 1.5
8 = 0.7
8 = 1.5
8 = 0.7
first term
0.8483
0.7562
0.8430
0.7658
second term
0.0783
0.0556
0.0700
0.0615
third term
-0.0014
0.0070
0.0028
0.0077
0.925
0.819
0.916
0.835
approx. power
This example seems to show that for
8
= 1. 5
larger than that of Bartlett's test and for
holds though the differences are small.
the power of Lehman's test is
8
= 0.7
the reverse inequality
20
REFERENCES
[1) Anderson, T.
w.,
~
(1958),
Introduction to MUltivariate Statistical
Analysis • Wiley, New York.
[2) Bartlett, M.
s., (1937), Property of sufficiency and statistical tests.
!I2£....!!.2Z.
[3]
Desl~ande,
§.2£.
& 160, 268-282.
Jayant V., (1965),
hypothesis.
Some nonparametric tests of statistical
Dissertation for Ph.D. degree,
University of
Poona.
[4) Lehmann, E. L., (1959),
Testing Statistical Hypotheses.
[5] Pitman, E. J. G., (1939),
scale parameters.
Wiley, New York.
Tests of hypotheses concerning location and
Biometrika, 31, 200-215.
[6] Ramachandran, K. V., (1958),
A test of variances.
J. Aner.
~.
Assoc.
53, 741-747.
[7]
Sugiura, N., (1965),
Mu1tisamp1e and m1tivariate nonparametric tests
based on U statistics and their asymptotic efficiencies.
Osaka i[. Math.
[8]
Sugiura, N., (1968),
2, 385-426.
Asymptotic expansions of the distributions of the
likelihood ratio criteria for covariance matrix.
Ann. Math.
-
Statist.
Submitted to
© Copyright 2026 Paperzz