UNIVERSITY OF NORTH CAROLINA
Department of Statistics
Chapel Hill, N. C.

MONOTONICITY OF THE POWER FUNCTIONS OF SOME TESTS FOR A PARTICULAR
KIND OF MULTICOLLINEARITY, AND UNBIASEDNESS AND RESTRICTED MONOTONICITY
OF THE POWER FUNCTIONS OF THE STEP DOWN PROCEDURES
FOR MANOVA AND INDEPENDENCE

by

C. G. Khatri

April 1964

Contract No. AF-AFOSR-84-63

This research was supported by the Mathematics Division of
the Air Force Office of Scientific Research.

Institute of Statistics
Mimeo Series No. 388
MONOTONICITY OF THE POWER FUNCTIONS OF SOME TESTS FOR A PARTICULAR
KIND OF MULTICOLLINEARITY, AND UNBIASEDNESS AND RESTRICTED MONOTONICITY
OF THE POWER FUNCTIONS OF THE STEP DOWN PROCEDURES
FOR MANOVA AND INDEPENDENCE

by

C. G. Khatri
University of North Carolina and Gujarat University

1. Introduction:
The problems of a particular kind of multicollinearity of means defined by Roy [7] and extended by the author [4, 5] to that of regression coefficients were reduced to a single form [5] in the following way. Let the joint density function of the elements of the random matrices Y: (p+q)×n (n ≥ p+q) and X: (p+q)×m be

(1)  MN(X; βZ, Σ) MN(Y; 0, Σ),

where Z: u×m is a known matrix of rank u (≤ m), Σ: (p+q)×(p+q) is an unknown covariance matrix and β: (p+q)×u is an unknown regression matrix.
Let us write

(2)  Y' = (Y1'  Y2'),  X' = (X1'  X2')  and  β' = (β1'  β2'),

where Y1: p×n, Y2: q×n, X1: p×m, X2: q×m, β1: p×u and β2: q×u, and let Σ be partitioned correspondingly with Σ11: p×p, Σ12: p×q and Σ22: q×q. Then, to test the hypothesis of multicollinearity, H0, given by

(3)  β1.2 = β1 - Σ12 Σ22^{-1} β2 = 0,

against the alternative H(β1.2 ≠ 0), let us consider the following three test procedures:
(4)  the likelihood ratio test [3, 5] whose acceptance region is

     ∏_{i=1}^{p} (1 + c_i) ≤ λ_1, a constant;

(5)  the trace test [3] whose acceptance region is

     ∑_{i=1}^{p} c_i ≤ λ_2, a constant; and

(6)  the maximum root test [3, 4] whose acceptance region is

     c_1 ≤ λ_3, a constant,

where c_1 ≥ c_2 ≥ ... ≥ c_p are the characteristic (ch.) roots of S_h S_e^{-1}, S_h and S_e being the matrices of sums of products due to hypothesis and due to error, respectively, with

(7)  S_h = [X1 - Y1 Y2'(Y2 Y2')^{-1} X2][I + X2'(Y2 Y2')^{-1} X2]^{-1}[X1 - Y1 Y2'(Y2 Y2')^{-1} X2]'

and

(8)  S_e = Y1 Y1' - Y1 Y2'(Y2 Y2')^{-1} Y2 Y1'.
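As a numerical illustration of (4)-(8), the three acceptance statistics can be formed directly from S_h and S_e. The sketch below (NumPy; all variable names and the dimensions are mine, and the data are simulated rather than drawn from any particular alternative) builds S_h and S_e as in (7) and (8) and then the likelihood ratio, trace and maximum root statistics:

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, m, n = 2, 3, 4, 12            # illustrative dimensions, n >= p + q

# Simulated matrices playing the roles of Y: (p+q) x n and X: (p+q) x m.
Y = rng.standard_normal((p + q, n))
X = rng.standard_normal((p + q, m))
Y1, Y2 = Y[:p], Y[p:]               # Y' = (Y1', Y2'), Y1: p x n, Y2: q x n
X1, X2 = X[:p], X[p:]               # X' = (X1', X2'), X1: p x m, X2: q x m

# (7): S_h = [X1 - Y1 Y2'(Y2 Y2')^{-1} X2][I + X2'(Y2 Y2')^{-1} X2]^{-1}[...]'
G = np.linalg.inv(Y2 @ Y2.T)
X12 = X1 - Y1 @ Y2.T @ G @ X2
S_h = X12 @ np.linalg.inv(np.eye(m) + X2.T @ G @ X2) @ X12.T

# (8): S_e = Y1 Y1' - Y1 Y2'(Y2 Y2')^{-1} Y2 Y1'
S_e = Y1 @ Y1.T - Y1 @ Y2.T @ G @ Y2 @ Y1.T

# c_1 >= ... >= c_p, the ch. roots of S_h S_e^{-1}
c = np.sort(np.linalg.eigvals(S_h @ np.linalg.inv(S_e)).real)[::-1]

lr_stat    = np.prod(1.0 + c)   # (4) likelihood ratio: accept if <= lambda_1
trace_stat = np.sum(c)          # (5) trace:            accept if <= lambda_2
root_stat  = c[0]               # (6) maximum root:     accept if <= lambda_3
```

Since S_h is positive semi-definite and S_e positive definite (with probability one when n - q ≥ p), the roots c_i are nonnegative, so the likelihood ratio statistic is at least 1 and the maximum root never exceeds the trace.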
We consider, in this paper, the step down procedures for two stages only. The results for more than two stages are similar. The step down procedures for MANOVA (refer J. Roy [6]) can be stated as follows: Let the first stage parameters be β2 and the second stage parameters be β1.2 = β1 - Σ12 Σ22^{-1} β2. Then, the null hypothesis for MANOVA is

(9)  H0(β = 0) = [H01(β2 = 0)] ∩ [H02(β1.2 = 0)],

and the alternative hypothesis is

     H(β ≠ 0) = [H1(β2 ≠ 0)] ∪ [H2(β1.2 ≠ 0)].

We propose below only three test criteria for testing (9), but many more can be stated depending on likelihood, trace and maximum root tests:
(10)  an intersection procedure based on the likelihood ratio test, at each stage, whose acceptance region is the intersection of

      ∏_{j=1}^{q} (1 + c_{1,j}) ≤ λ_{1,1}  and  ∏_{i=1}^{p} (1 + c_i) ≤ λ_{2,1},

      λ_{1,1} and λ_{2,1} being constants;

(11)  an intersection procedure based on the trace test, at each stage, whose acceptance region is the intersection of

      ∑_{j=1}^{q} c_{1,j} ≤ λ_{1,2}  and  ∑_{i=1}^{p} c_i ≤ λ_{2,2},

      λ_{1,2} and λ_{2,2} being constants; and

(12)  an intersection procedure based on the maximum root test, at each stage, whose acceptance region is the intersection of

      c_{1,1} ≤ λ_{1,3}  and  c_1 ≤ λ_{2,3},

      λ_{1,3} and λ_{2,3} being constants,

where c_{1,1} ≥ c_{1,2} ≥ ... ≥ c_{1,q} are the ch. roots of (X2 X2')(Y2 Y2')^{-1} and c_1 ≥ c_2 ≥ ... ≥ c_p are the ch. roots of S_h S_e^{-1}, with S_h and S_e as defined in (7) and (8).
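The two-stage intersection regions (10)-(12) can be sketched numerically as follows; the first stage uses the ch. roots of (X2 X2')(Y2 Y2')^{-1} and the second stage those of S_h S_e^{-1}. All names, dimensions and simulated data below are mine, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
p, q, m, n = 2, 3, 4, 12

Y = rng.standard_normal((p + q, n))
X = rng.standard_normal((p + q, m))
Y1, Y2, X1, X2 = Y[:p], Y[p:], X[:p], X[p:]

# First stage roots c_{1,1} >= ... >= c_{1,q}: ch. roots of (X2 X2')(Y2 Y2')^{-1}.
c1 = np.sort(np.linalg.eigvals(X2 @ X2.T @ np.linalg.inv(Y2 @ Y2.T)).real)[::-1]

# Second stage roots c_1 >= ... >= c_p: ch. roots of S_h S_e^{-1} from (7)-(8).
G = np.linalg.inv(Y2 @ Y2.T)
X12 = X1 - Y1 @ Y2.T @ G @ X2
S_h = X12 @ np.linalg.inv(np.eye(m) + X2.T @ G @ X2) @ X12.T
S_e = Y1 @ Y1.T - Y1 @ Y2.T @ G @ Y2 @ Y1.T
c = np.sort(np.linalg.eigvals(S_h @ np.linalg.inv(S_e)).real)[::-1]

def accept_lr(lam11, lam21):
    """(10): intersection of the two likelihood-ratio acceptance regions."""
    return np.prod(1 + c1) <= lam11 and np.prod(1 + c) <= lam21

def accept_trace(lam12, lam22):
    """(11): intersection of the two trace acceptance regions."""
    return np.sum(c1) <= lam12 and np.sum(c) <= lam22

def accept_root(lam13, lam23):
    """(12): intersection of the two maximum-root acceptance regions."""
    return c1[0] <= lam13 and c[0] <= lam23
```

The overall hypothesis is accepted only when both stages accept, which is exactly the intersection structure of (10)-(12).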
Now, the step down procedures at two stages for the independence of sets as considered by Roy and Bargmann [8] can be summarized as below: Let X' = (X1'  X2'  X3'), X: (p1+p2+p3)×n, X1: p1×n, X2: p2×n and X3: p3×n have a joint density function as

(13)  MN(X; 0, Σ),

where

          ( Σ11  Σ12  Σ13 )  p1
     Σ =  ( Σ21  Σ22  Σ23 )  p2
          ( Σ31  Σ32  Σ33 )  p3

is positive definite. Here the first stage parameters are Σ12 and the second stage parameters are (Σ13 and Σ23).
Then, the null hypothesis for independence is

(14)  H0(Σij = 0 for i ≠ j) = [H01(Σ12 = 0)] ∩ [H02(Σ13 = 0, Σ23 = 0)]

against the alternative

      H(Σij ≠ 0 for i ≠ j) = [H1(Σ12 ≠ 0)] ∪ [H2(Σ13 ≠ 0, Σ23 ≠ 0)].

As before, we give below three test procedures:
(15)  An intersection procedure based on the likelihood ratio test, at each stage, whose acceptance region is the intersection of

      ∏_{j=1}^{p2} (1 + w_{1,j}) ≤ λ_{1,4}  and  ∏_{i=1}^{p3} (1 + w_{2,i}) ≤ λ_{2,4},

      λ_{1,4} and λ_{2,4} being constants;

(16)  an intersection procedure based on the trace test, at each stage, whose acceptance region is the intersection of

      ∑_{j=1}^{p2} w_{1,j} ≤ λ_{1,5}  and  ∑_{i=1}^{p3} w_{2,i} ≤ λ_{2,5},

      λ_{1,5} and λ_{2,5} being constants; and

(17)  an intersection procedure based on the maximum root test, at each stage, whose acceptance region is the intersection of

      w_{1,1} ≤ λ_{1,6}  and  w_{2,1} ≤ λ_{2,6},

      λ_{1,6} and λ_{2,6} being constants,

where w_{1,1} ≥ w_{1,2} ≥ ... ≥ w_{1,p2} are the ch. roots of S_{1,h} S_{1,e}^{-1} and w_{2,1} ≥ w_{2,2} ≥ ... ≥ w_{2,p3} are the ch. roots of S_{2,h} S_{2,e}^{-1}, with

(18)  S_{1,h} = (X2 X1')(X1 X1')^{-1}(X1 X2')  and  S_{1,e} = X2 X2' - S_{1,h},

and

(19)  S_{2,h} = (X3 X0')(X0 X0')^{-1}(X0 X3')  and  S_{2,e} = X3 X3' - S_{2,h},

where X0: (p1+p2)×n is given by X0' = (X1'  X2').
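A numerical sketch of the two-stage roots w_{1,j} and w_{2,k}, under the assumption that S_{1,h} and S_{1,e} are the regression sums of products of X2 on X1 and S_{2,h}, S_{2,e} the analogous quantities of X3 on (X1, X2), as in the Roy-Bargmann step down construction. The simulated data, names and dimensions below are mine:

```python
import numpy as np

rng = np.random.default_rng(2)
p1, p2, p3, n = 2, 2, 3, 20

X = rng.standard_normal((p1 + p2 + p3, n))
X1, X2, X3 = X[:p1], X[p1:p1 + p2], X[p1 + p2:]
X0 = np.vstack([X1, X2])                     # the first two sets stacked

# Stage 1:  S_{1,h} = (X2 X1')(X1 X1')^{-1}(X1 X2'),  S_{1,e} = X2 X2' - S_{1,h}
S1h = X2 @ X1.T @ np.linalg.inv(X1 @ X1.T) @ X1 @ X2.T
S1e = X2 @ X2.T - S1h

# Stage 2:  S_{2,h} = (X3 X0')(X0 X0')^{-1}(X0 X3'),  S_{2,e} = X3 X3' - S_{2,h}
S2h = X3 @ X0.T @ np.linalg.inv(X0 @ X0.T) @ X0 @ X3.T
S2e = X3 @ X3.T - S2h

# w_{1,1} >= ... >= w_{1,p2} and w_{2,1} >= ... >= w_{2,p3}
w1 = np.sort(np.linalg.eigvals(S1h @ np.linalg.inv(S1e)).real)[::-1]
w2 = np.sort(np.linalg.eigvals(S2h @ np.linalg.inv(S2e)).real)[::-1]

# (17): maximum-root intersection region  w_{1,1} <= lam16  and  w_{2,1} <= lam26
def accept_root(lam16, lam26):
    return w1[0] <= lam16 and w2[0] <= lam26
```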
Results on the monotonicity of the test procedures when q = 0 and p3 = 0 in the above cases were first established by Roy and Mikhail [9, 10], Srivastava [11] and then by a different method due to Das Gupta, Anderson and Mudholkar [1] and Anderson and Das Gupta [2]. In this paper, we prove the monotonicity of the test procedures for multicollinearity and restricted monotonicity and unbiasedness of the step down procedures for MANOVA and independence. The orders of the identity matrix I and the null matrix 0 will be understood by the context.
2. Multicollinearity and step down procedures for MANOVA:
Lemma 1: The ch. roots of S_h S_e^{-1} and (X2 X2')(Y2 Y2')^{-1} are invariant under the transformations (G1 X1 + G2 X2) F1, G3 X2 F1, (G1 Y1 + G2 Y2) F2 and G3 Y2 F2, where G1: p×p and G3: q×q are non-singular, F1: m×m and F2: n×n are orthogonal matrices and G2: p×q.

Proof: Let A1 = (G1 X1 + G2 X2) F1, A2 = G3 X2 F1, B1 = (G1 Y1 + G2 Y2) F2 and B2 = G3 Y2 F2. Then

     A2'(B2 B2')^{-1} A2 = F1' X2'(Y2 Y2')^{-1} X2 F1,

     A1 - B1 B2'(B2 B2')^{-1} A2 = G1[X1 - Y1 Y2'(Y2 Y2')^{-1} X2] F1,

and

     B1 B1' - B1 B2'(B2 B2')^{-1} B2 B1' = (G1 Y1 + G2 Y2)[I - Y2'(Y2 Y2')^{-1} Y2](G1 Y1 + G2 Y2)'
                                         = G1[Y1 Y1' - Y1 Y2'(Y2 Y2')^{-1} Y2 Y1'] G1'.

Moreover, the ch. roots of a product AB are the ch. roots of BA except for some zero ch. roots (see Roy [7]). Using these results, lemma 1 is obvious.

Lemma 2:
There exist matrices U: p×m, W: p×(n-q), R: q×m and V: q×n having a joint density function

(20)  MN(U; Θ, I) MN(W; 0, I) MN(R; η2, I) MN(V; 0, I),

where η2: q×m is such that the possibly nonzero ch. roots θ²_{2,ii} (i = 1, 2, ...) of η2 η2' are the possibly nonzero ch. roots of β2 Z Z' β2' Σ22^{-1}; Θ = (θ_ij): p×m with θ_ij = 0 for i ≠ j, the θ²_ii being the possibly nonzero ch. roots of η1[I + R'(V V')^{-1} R]^{-1} η1', where η1: p×m is such that the possibly nonzero ch. roots r_i = θ²_{1,ii} (i = 1, 2, ...) of η1 η1' are those of β1.2 Z Z' β1.2' Σ1.2^{-1}, with Σ1.2 = Σ11 - Σ12 Σ22^{-1} Σ21. Further, the c_i's, the ch. roots of S_h S_e^{-1}, are the ch. roots of (U U')(W W')^{-1}, and the ch. roots of (X2 X2')(Y2 Y2')^{-1} are the ch. roots of (R R')(V V')^{-1}.
Proof: Let Σ = T T', where

     T = ( T1  T2 )
         ( 0   T3 )

with T1: p×p and T3: q×q non-singular, so that Σ1.2 = T1 T1', Σ22 = T3 T3' and Σ12 Σ22^{-1} = T2 T3^{-1}. Let D1 = T1^{-1} β1.2 Z: p×m and D2 = T3^{-1} β2 Z: q×m. Then we can write (refer [3])

(21)  D1 = Γ1 Θ1 Δ1'  and  D2 = Γ2 Θ2 Δ2',

where Γ1: p×p, Γ2: q×q, Δ1: m×m and Δ2: m×m are orthogonal matrices, Θ1 = (θ_{1,ij}) and Θ2 = (θ_{2,ij}) with θ_{1,ij} = 0 and θ_{2,ij} = 0 for i ≠ j; the θ²_{1,ii} are the possibly nonzero ch. roots of D1 D1', or of β1.2 Z Z' β1.2' Σ1.2^{-1}, and the θ²_{2,ii} are the possibly nonzero ch. roots of D2 D2', or of β2 Z Z' β2' Σ22^{-1}. Let us apply the transformation

(22)  (U0'  R')' = G T^{-1} X  and  (L0'  V')' = G T^{-1} Y,  where G = diag(Γ1', Γ2'),

so that U0: p×m, R: q×m, L0: p×n and V: q×n. The jacobian of the transformation is |Σ|^{(m+n)/2} and, by lemma 1, the ch. roots of S_h S_e^{-1} and of (X2 X2')(Y2 Y2')^{-1} are invariant; in particular, the latter are the ch. roots of (R R')(V V')^{-1}. Since E(X) = βZ, we get E(U0) = Γ1' T1^{-1} β1.2 Z = Θ1 Δ1' = η1 and E(R) = Γ2' T3^{-1} β2 Z = Θ2 Δ2' = η2, and the joint density of U0, L0, R and V is

(23)  MN(U0; η1, I) MN(L0; 0, I) MN(R; η2, I) MN(V; 0, I).

The matrix I - V'(V V')^{-1} V is idempotent of rank (n-q), so that we can write I - V'(V V')^{-1} V = K' K, where K: (n-q)×n is such that K K' = I and K V' = 0. Let

(24)  W1 = L0 K'  and  L = U0 - L0 V'(V V')^{-1} R.

For given R and V, the matrices W1 and L are independently normally distributed; W1 is MN(W1; 0, I), E(L) = η1, and the covariance matrix between the i-th and j-th columns of L is [I + R'(V V')^{-1} R]_{ij} I, so that the conditional distribution of L' is MN[L'; η1', I + R'(V V')^{-1} R]. Moreover, the c_i's, the ch. roots of S_h S_e^{-1}, are the ch. roots of

(25)  {L[I + R'(V V')^{-1} R]^{-1} L'}(W1 W1')^{-1}.

Using the transformation U1 = L[I + R'(V V')^{-1} R]^{-1/2}, the conditional distribution of U1 is

(26)  MN(U1; η1[I + R'(V V')^{-1} R]^{-1/2}, I),

and the c_i's are the ch. roots of (U1 U1')(W1 W1')^{-1}. Let us write (refer [3])

(27)  η1[I + R'(V V')^{-1} R]^{-1/2} = Γ3 Θ Δ5',

where Γ3: p×p and Δ5: m×m are orthogonal matrices, Θ = (θ_ij), θ_ij = 0 for i ≠ j, and the θ²_ii are the possibly nonzero ch. roots of η1[I + R'(V V')^{-1} R]^{-1} η1'. Finally, using the transformation

(28)  U = Γ3' U1 Δ5  and  W = Γ3' W1,

we can write the joint density function of U, W, R and V in the form (20). Thus, lemma 2 is established.

Theorem 1:
An invariant test of multicollinearity for which the acceptance region is convex in each column vector of U, for each set of fixed W, R, V and fixed values of the other column vectors of U, has a power function which is monotonically increasing in each r_i = θ²_{1,ii}.
Proof: It follows from theorem 3 of Das Gupta, Anderson and Mudholkar [1] that, for given R and V, the conditional probability of the acceptance region monotonically decreases in each θ²_ii (i = 1, 2, ..., p), where the θ²_ii are the possibly nonzero ch. roots of η1 P^{-1} η1' and P = I + R'(V V')^{-1} R is a symmetric positive definite matrix. Let η1* = Θ1* Δ1' be the matrix obtained from η1 = Θ1 Δ1' by changing the diagonal elements θ_{1,ii} of Θ1 to θ*_{1,ii}, where r*_i = θ*²_{1,ii} ≥ r_i = θ²_{1,ii} (i = 1, 2, ...). Then Θ1* Θ1*' - Θ1 Θ1' is positive semi-definite at least and, since P^{-1} is positive definite, each ch. root of η1* P^{-1} η1*' is not less than the corresponding ch. root of η1 P^{-1} η1', that is, θ*²_ii ≥ θ²_ii for each i, and conversely. Hence, for any fixed R and V, the conditional probability of the acceptance region decreases monotonically in each r_i. The theorem 1 now follows from the fact that the marginal distribution of R and V does not contain any r_i.
Now let W_k be the sum of all different products of (1+c_1), (1+c_2), ..., (1+c_p) taken k at a time (k = 1, 2, ..., p). Then, by using theorem 1 above and the lemmas 2 and 3 of [1], we have the following corollaries:
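The W_k just defined are the elementary symmetric functions of (1+c_1), ..., (1+c_p); a minimal sketch (function and variable names mine), noting that W_p recovers the likelihood ratio statistic of (4) and W_1 = p + ∑ c_i is the trace statistic of (5) shifted by p:

```python
from itertools import combinations
from math import prod

def W(k, c):
    """Sum of all different products of (1+c_1), ..., (1+c_p) taken k at a time."""
    return sum(prod(1 + c[i] for i in idx) for idx in combinations(range(len(c)), k))

c = [0.5, 0.2, 0.1]
# W_p is the full product prod(1 + c_i), i.e. the likelihood ratio statistic of (4).
assert abs(W(3, c) - (1.5 * 1.2 * 1.1)) < 1e-12
# W_1 = p + sum(c_i), linking to the trace statistic of (5).
assert abs(W(1, c) - (3 + 0.8)) < 1e-12
```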
Corollary 1: The maximum root test given by (6) has a power function which is monotonically increasing in each r_i.

Corollary 2: A test having the acceptance region ∑_{k=1}^{p} b_k W_k ≤ μ (b_k ≥ 0) has a power function which is monotonically increasing in each r_i.

Corollary 3: The power function of the likelihood ratio test given by (4) increases as each r_i increases.

Corollary 4: The power function of the trace test given by (5) increases as each r_i increases.
Theorem 2: An invariant test for which the acceptance region is sectionwise convex at each stage of the step down procedure for MANOVA is unbiased and moreover is monotonically increasing with respect to the ch. roots of the parameters at the i-th stage when the parameters of the (i-1) stages are fixed and all the parameters above the i-th stage are zero. [This type of monotonicity will be called the restricted monotonic property.]

This follows from theorem 1, lemma 2 and theorem 3 of [1] for the two stages of the step down procedure for MANOVA. The generalization is immediate.
Let W_{1,j} be the sum of all different products of (1+c_{1,1}), ..., (1+c_{1,q}) taken j at a time (j = 1, 2, ..., q). Then, using theorem 2, we get the following corollaries:
Corollary 5: A test having the acceptance region

     ∑_{j=1}^{q} a_j W_{1,j} ≤ μ1  and  ∑_{k=1}^{p} b_k W_k ≤ μ2    (a_j ≥ 0, b_k ≥ 0)

of the step down procedure for MANOVA at two stages is unbiased and has the restricted monotonic property.
Corollary 6: The test procedures given by (10), (11) and (12) are unbiased and each has the restricted monotonic property.
It may be noted that the regions described above are all, in fact, not
only conditionally sectionwise convex, but also conditionally sectionwise
ellipsoids, so that, for proving the above results, we could as well have
used Roy and Mikhail [9, 10].
3. Step down procedure for independence:
Let Σ2.1 = Σ22 - Σ21 Σ11^{-1} Σ12 and Σ3.12 = Σ33 - (Σ31  Σ32) Σ0^{-1} (Σ31  Σ32)', where

     Σ0 = ( Σ11  Σ12 )
          ( Σ21  Σ22 )

and X0' = (X1'  X2') as in (13) and (19). Since the ch. roots of S_{i,h} S_{i,e}^{-1} (i = 1, 2) given by (18) and (19) are invariant, we use the transformation Y2 = Σ2.1^{-1/2} X2 and Y3 = Σ3.12^{-1/2} X3, and then write the joint density function of X1, Y2 and Y3 as

(29)  MN(X1; 0, Σ11) MN(Y2; B X1, I) MN(Y3; F X0, I),

where B = Σ2.1^{-1/2} Σ21 Σ11^{-1}: p2×p1 and F = Σ3.12^{-1/2}(Σ31  Σ32) Σ0^{-1}: p3×(p1+p2). Since I - X1'(X1 X1')^{-1} X1 and I - X0'(X0 X0')^{-1} X0 are idempotent matrices of ranks (n - p1) and (n - p1 - p2) respectively, we can write them respectively as A1' A1 and A2' A2, where A1: (n-p1)×n and A2: (n-p1-p2)×n are orthonormal (i.e. A1 A1' = I and A2 A2' = I) and are such that A1 X1' = 0 and A2 X0' = 0. Let us write

(30)  C1 = X1'(X1 X1')^{-1/2}  and  C2 = X0'(X0 X0')^{-1/2},

so that (C1  A1') and (C2  A2') are orthogonal matrices. Now transform Y2 and Y3 by

(31)  (W1  W2) = Y2 (C1  A1')  and  (V1  V2) = Y3 (C2  A2').

Since the transformations are conditional in X1 and X2, we have the jacobian of the transformation J(Y2, Y3; W1, W2, V1, V2) = J(Y2; W1, W2) J(Y3; V1, V2) = 1, and so the density function of X1, W1, W2, V1 and V2 can be written as

(32)  MN(X1; 0, Σ11) MN(W1; B(X1 X1')^{1/2}, I) MN(W2; 0, I) MN(V1; F(X0 X0')^{1/2}, I) MN(V2; 0, I),

where the ch. roots of S_{1,h} S_{1,e}^{-1} are the ch. roots of (W1 W1')(W2 W2')^{-1} and the ch. roots of S_{2,h} S_{2,e}^{-1} are the ch. roots of (V1 V1')(V2 V2')^{-1}. Let us write (refer [3])

(33)  B(X1 X1')^{1/2} = Γ2 η1 Γ1'  and  F(X0 X0')^{1/2} = Γ3 η2 Γ4',

where Γ1: p1×p1, Γ2: p2×p2, Γ3: p3×p3 and Γ4: (p1+p2)×(p1+p2) are orthogonal matrices, η1 = (η_{1,ij}): p2×p1 with η_{1,ij} = 0 for i ≠ j, the η²_{1,ii} being the possibly nonzero ch. roots of B(X1 X1')B', and η2 = (η_{2,ij}): p3×(p1+p2) with η_{2,ij} = 0 for i ≠ j, the η²_{2,ii} being the possibly nonzero ch. roots of F(X0 X0')F'. Now we apply the transformation

     U1 = Γ2' W2,  U2 = Γ2' W1 Γ1,  U3 = Γ3' V2  and  U4 = Γ3' V1 Γ4.

Since the transformations are conditional in X1 and X2, we get the jacobian of the transformation J(W1, W2, V1, V2; U1, U2, U3, U4) = 1. Hence the joint density function of X1, U1, U2, U3 and U4 is

(34)  MN(X1; 0, Σ11) MN(U1; 0, I) MN(U2; η1, I) MN(U3; 0, I) MN(U4; η2, I),

where the ch. roots of S_{1,h} S_{1,e}^{-1} are the ch. roots of (U2 U2')(U1 U1')^{-1} and the ch. roots of S_{2,h} S_{2,e}^{-1} are the ch. roots of (U4 U4')(U3 U3')^{-1}. On account of (34), we may note the following, which are derivable by arguments similar to those of Section 2.
(35)  If Σ11, Σ12 and Σ22 are fixed, then the power of the test due to the ch. roots of (U4 U4')(U3 U3')^{-1} increases with each η_{2,ii}, and hence due to each ch. root of Σ3.12^{-1}(Σ33 - Σ3.12).

(36)  If Σ13 = 0 and Σ23 = 0 and Σ11 is fixed, then the power of the test due to the ch. roots of (U2 U2')(U1 U1')^{-1} increases with each η_{1,ii}, and hence due to each ch. root of Σ2.1^{-1}(Σ22 - Σ2.1).

From (35) and (36), we have the following theorem.
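The population roots entering (35) and (36), namely the ch. roots of Σ3.12^{-1}(Σ33 - Σ3.12) and of Σ2.1^{-1}(Σ22 - Σ2.1), can be computed directly from a given Σ; a small numerical sketch (the example Σ, partition sizes and names are mine):

```python
import numpy as np

# An example positive definite covariance matrix, partitioned as in (13)
# with p1 = p2 = p3 = 1.
S = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.0]])
S11, S12 = S[:1, :1], S[:1, 1:2]
S21, S22 = S[1:2, :1], S[1:2, 1:2]
S31, S32, S33 = S[2:, :1], S[2:, 1:2], S[2:, 2:]
S0 = S[:2, :2]                       # covariance matrix of the first two sets

# Sigma_{2.1} = S22 - S21 S11^{-1} S12  and  Sigma_{3.12} = S33 - (S31 S32) S0^{-1} (S31 S32)'
S2_1 = S22 - S21 @ np.linalg.inv(S11) @ S12
C = np.hstack([S31, S32])
S3_12 = S33 - C @ np.linalg.inv(S0) @ C.T

# ch. roots appearing in (36) and (35) respectively
roots_stage1 = np.linalg.eigvals(np.linalg.inv(S2_1) @ (S22 - S2_1)).real
roots_stage2 = np.linalg.eigvals(np.linalg.inv(S3_12) @ (S33 - S3_12)).real
```

These roots vanish exactly when the corresponding stage parameters (Σ12 for the first stage; Σ13, Σ23 for the second) are zero, which is the sense in which the restricted monotonicity statements are parametrized.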
Theorem 3: An invariant test for which the acceptance region is sectionwise
convex at each stage of the step down procedure for independence is unbiased and has the restricted monotonic property.
Let W_{1,j} be the sum of all different products of (1 + w_{1,1}), (1 + w_{1,2}), ..., (1 + w_{1,p2}) taken j at a time (j = 1, 2, ..., p2), and W_{2,k} the sum of all different products of (1 + w_{2,1}), (1 + w_{2,2}), ..., (1 + w_{2,p3}) taken k at a time (k = 1, 2, ..., p3). Then, using theorem 3, we have the following corollaries:
Corollary 7: A test having the acceptance region

     ∑_{j=1}^{p2} a_j W_{1,j} ≤ μ1  and  ∑_{k=1}^{p3} b_k W_{2,k} ≤ μ2    (a_j ≥ 0, b_k ≥ 0)

of the step down procedure for independence at two stages is unbiased and has the restricted monotonic property.
Corollary 8: The test procedures given by (15), (16) and (17) are unbiased and each has the restricted monotonic property.
The same remarks
(involving "the conditionally sectionwise ellipsoidal nature of the
regions") apply here as at the end of section 2.
4. Acknowledgement: I am extremely thankful to Professor S. N. Roy for suggesting the step down procedures and for helpful discussions with him in the course of preparation of this paper.
REFERENCES

[1] Das Gupta, S., Anderson, T. W. and Mudholkar, G. S. (1964). Monotonicity of the power functions of some tests of the multivariate linear hypothesis. Ann. Math. Statist., 35, 200-205.

[2] Anderson, T. W. and Das Gupta, S. (1964). Monotonicity of the power functions of some tests of independence between two sets of variates. Ann. Math. Statist., 35, 206-208.

[3] Khatri, C. G. (1960). On certain problems in multivariate analysis (multicollinearity of means and power series distributions). Ph.D. thesis, M. S. University of Baroda (India).

[4] Khatri, C. G. (1961). Simultaneous confidence bounds on the departures from a particular kind of multicollinearity. Annals of the Institute of Statist. Math., 13, 239-242.

[5] Khatri, C. G. Non-null distribution of the likelihood ratio for a particular kind of multicollinearity. (Sent for publication.)

[6] Roy, J. (1958). Step-down procedure in multivariate analysis. Ann. Math. Statist., 29, 1177-1187.

[7] Roy, S. N. (1958). Some Aspects of Multivariate Analysis, Wiley, New York.

[8] Roy, S. N. and Bargmann, R. E. (1958). Tests of multiple independence and the associated confidence bounds. Ann. Math. Statist., 29, 491-503.

[9] Roy, S. N. and Mikhail, W. F. (1961). On the monotonic character of the power functions of two multivariate tests. Ann. Math. Statist., 32, 1145-1151.

[10] Roy, S. N. and Mikhail, W. F. (1964). Correction to and comment on section 4 of the paper "On the monotonic character of the power functions of two multivariate tests" by Roy, S. N. and Mikhail, W. F. (Ann. Math. Statist., 32, 1145-1151). Ann. Math. Statist., 35.

[11] Srivastava, J. N. (1962). On the monotonicity property of the three main tests for multivariate analysis of variance. Institute of Statistics Mimeo Series No. 342, University of North Carolina.