De Waal, D. J. (1975): "On the asymptotic distributions of the elementary symmetric functions considered as test statistics in two multivariate tests."

ON THE ASYMPTOTIC DISTRIBUTIONS OF THE ELEMENTARY SYMMETRIC
FUNCTIONS CONSIDERED AS TEST STATISTICS IN TWO MULTIVARIATE TESTS

D. J. de Waal*

Department of Statistics
University of North Carolina at Chapel Hill and
University of the Orange Free State

Institute of Statistics Mimeo Series No. 975
January, 1975

* This research was partially supported by the C.S.I.R.
1. INTRODUCTION
Tomsky (1974) applied Roy's union-intersection principle to various multivariate tests by considering the index set as a set consisting of matrices of order k x p, 1 ≤ k ≤ p. This index set, from which the component hypotheses are established, is usually considered as a set of vectors, i.e. k = 1 (see Roy (1957), Morrison (1967) p. 118). By representing these component test-statistics in terms of the elementary symmetric functions of the characteristic roots of the matrix appearing in the likelihood ratio component test-statistic, he derived a new class of multivariate test-statistics for various tests.
We shall consider two tests here, namely the test of equality of mean vectors from samples from multivariate normal populations and the test of independence between subvectors that are normally distributed. The asymptotic distributions of the elementary symmetric functions proposed as test-statistics will be derived here.
2. TESTING EQUALITY OF SEVERAL MEAN VECTORS
The test of equality of mean vectors of several normal populations can be framed into the following situation: Let X(p x q) be distributed N(M(p x q), Σ ⊗ I_q) and A(p x p) be distributed independently as W(Σ,n). The random matrix of interest in testing M = 0 against M ≠ 0 is either

2.1   F = X'A^{-1}X   for q ≤ p

or

2.2   V = B^{1/2} A^{-1} B^{1/2}   for q ≥ p,

where B = XX' and B^{1/2} is the symmetric square root of B, B being distributed according to W(Σ,q,Ω), Ω = Σ^{-1}MM'. Both F(q x q) and V(p x p) are distributed as noncentral multivariate beta with (p, n + q - p) and (q,n) degrees of freedom respectively. In the case of F the noncentrality parameter is taken as Ω = M'Σ^{-1}M.
Tomsky (1974) obtained the following class of test statistics for this problem:

2.3   T_j = tr_j(I + V),   j = 1,...,k,   k ≤ p,

where λ_1(V) ≥ λ_2(V) ≥ ... ≥ λ_p(V) are the characteristic roots of V and tr_j(·) is the j-th esf of the characteristic roots λ_i(V), i = 1,...,k. If k = p, this is the class of statistics we are considering here. It is interesting to note that T_p = Π_{i=1}^{p}(1 + λ_i(V)), the likelihood ratio statistic, and, with k = 1, T_1 = 1 + λ_1(V), Roy's "maximum root" statistic.
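As a purely illustrative numerical sketch (the Σ = I simulation setup, dimensions and seed below are assumptions, not values from the paper), the statistics tr_j(I + V) can be computed from the characteristic roots of V; the j = p case reproduces the determinant |I + V| that underlies the likelihood ratio statistic:

```python
import numpy as np

def esf(roots, j):
    # j-th elementary symmetric function of the given roots; np.poly
    # returns the monic polynomial coefficients c with c[j] = (-1)^j e_j
    c = np.poly(roots)
    return ((-1) ** j * c[j]).real

def sym_sqrt(B):
    # symmetric square root of a p.d.s. matrix via its spectral decomposition
    w, U = np.linalg.eigh(B)
    return U @ np.diag(np.sqrt(w)) @ U.T

rng = np.random.default_rng(0)
p, q, n = 3, 5, 40
X = rng.standard_normal((p, q))      # plays the role of X(p x q) with M = 0
Z = rng.standard_normal((p, n))
A = Z @ Z.T                          # plays the role of A ~ W(I, n)
Bh = sym_sqrt(X @ X.T)               # B^{1/2}, with B = XX'
V = Bh @ np.linalg.inv(A) @ Bh       # V = B^{1/2} A^{-1} B^{1/2}  (q >= p case)

lam = np.linalg.eigvalsh(V)          # characteristic roots of V
T = [esf(1.0 + lam, j) for j in range(1, p + 1)]   # T_j = tr_j(I + V), k = p

# T_p recovers |I + V|, and T_1 recovers p + tr V
assert np.isclose(T[-1], np.linalg.det(np.eye(p) + V))
assert np.isclose(T[0], p + np.trace(V))
```

Here `np.poly` converts the roots 1 + λ_i(V) into monic polynomial coefficients, whose alternating signs encode the elementary symmetric functions.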
We shall now consider the asymptotic distribution of tr_j(I + F).

Lemma 2.1. (de Waal (1974)) Let B(q x q) be p.d.s., let ∂ = (½(1 + δ_rs) ∂/∂e_rs), r,s = 1,...,q, be the matrix differential operator with respect to the symmetric matrix E, and let R(q x q) be any fixed p.d.s. matrix; then

2.4   etr(B∂) exp(ν tr_j(I + ν^{-1}E)) |_{E = νR^{-1}} = etr(BΓ_j) exp(ν tr_j(I + R^{-1})) + O(ν^{-1}),

where

Γ_j = (-1)^{j-1} Σ_{i=1}^{j} (-1)^{j-i} R^{1-i} tr_{j-i}(R^{-1}),   tr_0 ≡ 1.
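Read as a matrix function of its argument, Γ_j acts as the gradient of the j-th esf: the first-order change of tr_j under a symmetric perturbation H is tr(HΓ_j). A small numerical check of this reading (the test matrix, direction and step size below are illustrative assumptions, not part of the lemma):

```python
import numpy as np

def esf(A, j):
    # j-th elementary symmetric function tr_j of the eigenvalues of A (tr_0 = 1)
    c = np.poly(np.linalg.eigvalsh(A))
    return (-1) ** j * c[j]

def gamma(S, j):
    # Gamma_j = (-1)^{j-1} sum_{i=1}^{j} (-1)^{j-i} S^{i-1} tr_{j-i}(S),
    # the lemma's Gamma_j with S standing for R^{-1}
    G = sum((-1) ** (j - i) * np.linalg.matrix_power(S, i - 1) * esf(S, j - i)
            for i in range(1, j + 1))
    return (-1) ** (j - 1) * G

rng = np.random.default_rng(1)
q = 4
M = rng.standard_normal((q, q)); S = M @ M.T + q * np.eye(q)   # p.d.s. matrix
H = rng.standard_normal((q, q)); H = (H + H.T) / 2             # symmetric direction
h = 1e-6
for j in range(1, q + 1):
    # central difference of t -> tr_j(S + tH) at t = 0 against tr(H Gamma_j)
    num = (esf(S + h * H, j) - esf(S - h * H, j)) / (2 * h)
    assert np.isclose(num, np.trace(H @ gamma(S, j)), rtol=1e-4, atol=1e-6)
```

For j = 1 this reduces to Γ_1 = I, the gradient of the trace.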
Lemma 2.2. (de Waal (1974)) If F(q x q) is defined as in 2.1, then for any fixed R(q x q) p.d.s. and ν = itn,

2.5   E etr(νR^{-1}F) = etr(-½Ω) etr(½(1 - 2it)^{-1} ΩR^{-1}) (1 - 2it)^{-pq/2} + O(n^{-1}),

where Ω = M'Σ^{-1}M.

Remark: If, however, F is replaced by V defined in 2.2, then 2.5 stays the same except that R is then of order p and Ω is defined as Ω = Σ^{-1/2} MM' Σ^{-1/2}.
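A quick Monte Carlo illustration of the chi-square behaviour behind lemma 2.2 in the central case M = 0, so that Ω = 0 (the choice R^{-1} = I, the dimensions, replication count and seed are assumptions for this sketch, not values from the paper): n tr(F) should then be approximately chi-square with pq degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(4)
p, q, n, reps = 3, 2, 200, 2000
stats = np.empty(reps)
for r in range(reps):
    X = rng.standard_normal((p, q))            # X ~ N(0, I ⊗ I), i.e. M = 0
    Z = rng.standard_normal((p, n))
    A = Z @ Z.T                                # A ~ W(I, n)
    # n tr(R^{-1} F) with R^{-1} = I and F = X'A^{-1}X
    stats[r] = n * np.trace(X.T @ np.linalg.solve(A, X))

f = p * q
# the sample mean and variance should be near those of a central
# chi-square with f degrees of freedom (mean f, variance 2f)
print(round(stats.mean(), 2), round(stats.var(), 2), f)
```

The agreement is only up to the O(n^{-1}) term of the lemma, so the sample mean sits slightly above f for finite n.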
Theorem 2.1: If F is defined as in 2.1, Ω = M'Σ^{-1}M and

2.6   ξ_1 = (n/τ_(j)) { tr_j(I + F) + trR^{-1}Γ_j - tr_j(I + R^{-1}) },

then

P(ξ_1 < x) = P(χ²_f(δ²) < x) + O(n^{-1}),

where f = pq, δ² = trΩ, τ_(j) = trΩΓ_j/trΩ, R^{-1} is any fixed p.d.s. matrix and Γ_j is given in lemma 2.1.
Proof: Let ν = itn and let τ be any constant. The characteristic function of (ν/τ) tr_j(I + F), after expanding it into a Taylor series at

2.7   F = R^{-1}

and applying lemma 2.1, can be written as

E exp((ν/τ) tr_j(I + F)) = E etr((ν/τ)(F - R^{-1})Γ_j) exp((ν/τ) tr_j(I + R^{-1})) + O(n^{-1})
                         = exp((ν/τ)(tr_j(I + R^{-1}) - trR^{-1}Γ_j)) E etr((ν/τ) F Γ_j) + O(n^{-1}).

Hence, using lemma 2.2, the characteristic function of ξ_1 defined in 2.6 can be written as

E exp(it ξ_1) = etr(-½Ω) etr(½(1 - 2it)^{-1} ΩΓ_j/τ) (1 - 2it)^{-pq/2} + O(n^{-1}).

If we let

τ = τ_(j) = trΩΓ_j / trΩ,   say,

so that trΩΓ_j/τ_(j) = trΩ, this is the characteristic function of χ²_f(δ²) up to O(n^{-1}), and the theorem is proved.
Corollary 2.1. If V(p x p) is defined as in 2.2, Ω = Σ^{-1/2} MM' Σ^{-1/2} and

2.8   ξ_2 = (n/τ_(j)) { tr_j(I + V) + trR^{-1}Γ_j - tr_j(I + R^{-1}) },

then

2.9   P(ξ_2 < x) = P(χ²_f(δ²) < x) + O(n^{-1}),

where f, δ² and τ_(j) are defined in theorem 2.1 and, for R^{-1}(p x p) p.d.s., Γ_j stays the same as in lemma 2.1.

Proof: This corollary follows directly from the remark on lemma 2.2 and the theorem.
Since plim_{n→∞} nF = Ω, we would like to choose R^{-1}, given in 2.7, as close to the nil matrix as possible. So let N* be a fixed large value and let

2.10   R^{-1} = Ω/N*.

Then ξ_1 defined in 2.6 becomes

2.11   ξ_1 = (nN*^{j-1} trΩ / trΩΩ_(j)) { tr_j(I + F) + N*^{-j} trΩΩ_(j) - tr_j(I + Ω/N*) },

where

2.12   Ω_(j) = (-1)^{j-1} Σ_{i=1}^{j} (-1)^{j-i} Ω^{i-1} tr_{j-i}(Ω).

We note that ξ_1 still involves the unknown Ω; since plim nF = Ω, it can in practice be replaced by its consistent estimate nF.
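The quantity trΩΩ_(j), with Ω_(j) built from the powers of Ω and its esf's as in 2.12, has a simple closed form: by Newton's identities, trΩΩ_(j) = j tr_j(Ω). This is a property of the definition itself and can be checked numerically (the test matrix below is an arbitrary assumption):

```python
import numpy as np

def esf(A, j):
    # j-th elementary symmetric function tr_j of the eigenvalues of A (tr_0 = 1)
    c = np.poly(np.linalg.eigvalsh(A))
    return (-1) ** j * c[j]

def omega_j(O, j):
    # Omega_(j) = (-1)^{j-1} sum_{i=1}^{j} (-1)^{j-i} Omega^{i-1} tr_{j-i}(Omega)
    G = sum((-1) ** (j - i) * np.linalg.matrix_power(O, i - 1) * esf(O, j - i)
            for i in range(1, j + 1))
    return (-1) ** (j - 1) * G

rng = np.random.default_rng(2)
p = 5
M = rng.standard_normal((p, p))
O = M @ M.T                          # an arbitrary p.d.s. stand-in for Omega
for j in range(1, p + 1):
    # Newton's identities give tr(Omega Omega_(j)) = j tr_j(Omega)
    assert np.isclose(np.trace(O @ omega_j(O, j)), j * esf(O, j))
```

For j = 2, for instance, Ω_(2) = (trΩ)I - Ω, so trΩΩ_(2) = (trΩ)² - trΩ² = 2 tr_2(Ω).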
3. TEST FOR INDEPENDENCE
Let A(p x p) be distributed W(Σ,n) and let A and Σ be partitioned as

3.1   A = [ A_11(q x q)  A_12 ]        Σ = [ Σ_11  Σ_12 ]
          [ A_21         A_22 ],           [ Σ_21  Σ_22 ];

then (Troskie (1969)), conditional on A_22,

3.2   G = A_12 A_22^{-1} A_21 is distributed as noncentral Wishart with p - q degrees of freedom and noncentrality parameter Ω, independent of A_11.2 = A_11 - G, which is distributed W(Σ_11.2, n - p + q),

where

B = Σ_12 Σ_22^{-1},   Σ_11.2 = Σ_11 - Σ_12 Σ_22^{-1} Σ_21   and   Ω = Σ_11.2^{-1/2} B A_22 B' Σ_11.2^{-1/2}.

Let

3.3   R = A_11^{-1/2} A_12 A_22^{-1} A_21 A_11^{-1/2};

then R is called the generalized correlation matrix. The following esf's are among the class of test-statistics obtained by Tomsky (1974) for testing independence, i.e. Σ_12 = 0:

3.4   tr_j((I - R)^{-1}),   j = 1,...,q.
The following theorem can now be proved:

Theorem 3.1. Let R = A_11^{-1/2} A_12 A_22^{-1} A_21 A_11^{-1/2} and Ω = Σ_11.2^{-1/2} B A_22 B' Σ_11.2^{-1/2}; then, conditional on A_22, the characteristic function of

3.5   (n/τ) tr_j((I - R)^{-1})

can be written as

3.6   E exp((itn/τ) tr_j((I - R)^{-1})) = exp((itn/τ)(tr_j(I + Q) - trQΓ_j)) etr(-½Ω) etr(½(1 - 2it)^{-1} ΩΓ_j/τ) (1 - 2it)^{-q(p-q)/2} + O(n^{-1}),

where Γ_j is given in 3.8.

Proof: The characteristic roots of (I - R)^{-1} are also the characteristic roots of I + A_11.2^{-1}G. Expanding the characteristic function of 3.5 as a Taylor series at A_11.2^{-1}G = Q, for Q(q x q) p.d.s. fixed, it can be written, conditional on A_22, as

3.7   E exp((itn/τ) tr_j((I - R)^{-1})) = exp((itn/τ)(tr_j(I + Q) - trQΓ_j)) E etr((itn/τ) A_11.2^{-1} G Γ_j) + O(n^{-1}),

where

3.8   Γ_j = (-1)^{j-1} Σ_{i=1}^{j} (-1)^{j-i} Q^{i-1} tr_{j-i}(Q).

This is true since the characteristic roots of A_11.2^{-1}G are also the characteristic roots of G^{1/2} A_11.2^{-1} G^{1/2}, which is distributed, conditional on A_22, as a noncentral multivariate beta with p - q and n - p + q degrees of freedom and noncentrality parameter Ω = Σ_11.2^{-1/2} B A_22 B' Σ_11.2^{-1/2}. Hence 3.6 follows by applying lemma 2.2, remembering that there F is distributed as a noncentral multivariate beta with p and n + q - p degrees of freedom and noncentrality parameter Ω = M'Σ^{-1}M.   Q.E.D.

But since A_22((p - q) x (p - q)) is distributed W(Σ_22, n), we can find the unconditional characteristic function of (n/τ) tr_j((I - R)^{-1}) and hence the unconditional distribution function.
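The root identity invoked at the start of the proof — the characteristic roots of (I - R)^{-1} coincide with those of I + A_11.2^{-1}G — follows from (I - R)^{-1} = A_11^{1/2} A_11.2^{-1} A_11^{1/2} and is easy to confirm numerically (the Σ = I simulation setup, dimensions and seed below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
p, q, n = 6, 2, 50
Z = rng.standard_normal((p, n))
A = Z @ Z.T                                    # A ~ W(I, n), then partitioned
A11, A12 = A[:q, :q], A[:q, q:]
A21, A22 = A[q:, :q], A[q:, q:]

w, U = np.linalg.eigh(A11)
A11_mh = U @ np.diag(w ** -0.5) @ U.T          # symmetric A11^{-1/2}
G = A12 @ np.linalg.solve(A22, A21)            # G = A12 A22^{-1} A21
R = A11_mh @ G @ A11_mh                        # generalized correlation matrix
A11_2 = A11 - G                                # A11.2

# roots of (I - R)^{-1} versus roots of I + A11.2^{-1} G
lhs = np.sort(np.linalg.eigvalsh(np.linalg.inv(np.eye(q) - R)))
rhs = np.sort(np.linalg.eigvals(np.eye(q) + np.linalg.solve(A11_2, G)).real)
assert np.allclose(lhs, rhs)
```

Since A is positive definite, I - R = A_11^{-1/2} A_11.2 A_11^{-1/2} is positive definite as well, so the inverse always exists.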
Theorem 3.2. Let R = A_11^{-1/2} A_12 A_22^{-1} A_21 A_11^{-1/2} and

3.9   ξ_3 = (n trΘ / trΘΘ_(j)) { tr_j((I - R)^{-1}) + trΘΘ_(j) - tr_j(I + Θ) };

then

P(ξ_3 < x) = P(χ²_f(δ_1²) < x) + O(n^{-1}),

where f = q(p - q), δ_1² = trΘ, and Θ and Θ_(j) are defined in 3.13 and 3.17 below.
Proof: From 3.6 the unconditional characteristic function of (n/τ) tr_j((I - R)^{-1}) is given by

3.10   E exp((itn/τ) tr_j((I - R)^{-1})) = exp((itn/τ)(tr_j(I + Q) - trQΓ_j)) (1 - 2it)^{-q(p-q)/2} E_{A_22} etr(W A_22) + O(n^{-1}),

where

W = -½ B' Σ_11.2^{-1} B + ½(1 - 2it)^{-1} τ^{-1} B' Σ_11.2^{-1/2} Γ_j Σ_11.2^{-1/2} B

and the expectation is taken with respect to the density of A_22. But

3.11   E_{A_22} etr(W A_22) = |Σ_22^{-1} - 2W|^{-n/2} |Σ_22|^{-n/2} = |I - 2Σ_22^{1/2} W Σ_22^{1/2}|^{-n/2}.

If we define the population generalized correlation matrix as

3.12   P = Σ_11^{-1/2} Σ_12 Σ_22^{-1} Σ_21 Σ_11^{-1/2}

and assume that

3.13   P = Θ/n   (see Sugiura (1969)),

then

3.14   Σ_11.2^{-1} = Σ_11^{-1} + O(n^{-1}).

Writing Θ_1 = n^{1/2} Σ_22^{1/2} B' Σ_11^{-1/2}, so that Θ_1Θ_1' has the characteristic roots of Θ, and using the relation 3.14, we obtain

Σ_22^{1/2} W Σ_22^{1/2} = -(1/2n) Θ_1Θ_1' + (1 - 2it)^{-1} (1/2nτ) Θ_1 Γ_j Θ_1' + O(n^{-2}),

so that 3.11 can be written as

3.15   E_{A_22} etr(W A_22) = etr(-½Θ) etr(½(1 - 2it)^{-1} ΘΓ_j/τ) + O(n^{-1}),

where τ = τ_(j) is chosen such that trΘΓ_j/τ_(j) = trΘ, i.e.

τ_(j) = trΘΓ_j / trΘ,   say.

Substituting 3.15 in 3.10, it follows that the characteristic function of (n/τ_(j)) tr_j((I - R)^{-1}) is

3.16   exp((itn/τ_(j))(tr_j(I + Q) - trQΓ_j)) etr(-½Θ) etr(½(1 - 2it)^{-1} ΘΓ_j/τ_(j)) (1 - 2it)^{-q(p-q)/2} + O(n^{-1}).

Since we assumed 3.13, plim_{n→∞} n A_11.2^{-1/2} G A_11.2^{-1/2} = Θ. Therefore, let Q = Θ in 3.16 and

3.17   Θ_(j) = Γ_j |_{Q=Θ} = (-1)^{j-1} Σ_{i=1}^{j} (-1)^{j-i} Θ^{i-1} tr_{j-i}(Θ),

and the theorem follows.
We notice that

plim_{n→∞} n^{-1} ξ_3 = (trΘ / trΘΘ_(j)) { tr_j(I) + trΘΘ_(j) - tr_j(I + Θ) }.
REFERENCES

[1] de Waal, D. J. (1974): An asymptotic distribution for the j-th esf of the generalised Hotelling's beta matrix. Inst. of Statist. Mimeo Series no. 966, University of North Carolina.

[2] Morrison, D. F. (1967): Multivariate Statistical Methods, McGraw-Hill, New York.

[3] Roy, S. N. (1957): Some Aspects of Multivariate Analysis, Wiley, New York.

[4] Sugiura, N. (1969): Asymptotic non-null distributions of the likelihood ratio criteria for covariance matrix under local alternatives, Inst. of Statist. Mimeo Series no. 609, University of North Carolina.

[5] Tomsky, J. L. (1974): A new class of multivariate tests based on the union-intersection principle. Tech. Report no. 83, Department of Statistics, Stanford University.

[6] Troskie, C. G. (1969): The generalised multiple correlation matrix. S.A. Statist. J. 3, 109-121.