INSTITUTE OF STATISTICS
BOX 5457
STATE COLLEGE STATION
RALEIGH, NORTH CAROLINA
UNIVERSITY OF NORTH CAROLINA
Department of Statistics
Chapel Hill, N. C.
DISTRIBUTION OF THE LARGEST OR THE SMALLEST CHARACTERISTIC ROOT
UNDER NULL HYPOTHESIS CONCERNING COMPLEX MULTIVARIATE NORMAL POPULATIONS
by
C. G. Khatri
April 1964
Contract No. AF-AFOSR-84-63
This research was supported by the Mathematics Division of the Air
Force Office of Scientific Research.
DISTRIBUTION OF THE LARGEST OR THE SMALLEST CHARACTERISTIC ROOT
UNDER NULL HYPOTHESIS CONCERNING COMPLEX MULTIVARIATE NORMAL POPULATIONS
by
C. G. Khatri
University of North Carolina and Gujarat University
1. Introduction:

It has been pointed out by the author [1] that one can handle all the
classical problems of point estimation and testing of hypotheses concerning
the parameters of complex multivariate normal populations in a manner
similar to that for multivariate normal populations in real variates. In
[1, 2], the author has derived asymptotic formulae for certain likelihood
test procedures, and in [2] the author has mentioned the maximum
characteristic root statistic for testing the reality of a covariance
matrix.
The distribution of the characteristic roots under the null hypothesis
established in those two papers can be written in a general form as

(1)   c_1 \Big\{ \prod_{j=1}^{q} w_j^{m} (1-w_j)^{n} \Big\}
          \Big\{ \prod_{j=1}^{q-1} \prod_{k=j+1}^{q} (w_j - w_k)^{2} \Big\}
          \, dw_1 \cdots dw_q ,
      \qquad 0 \le w_1 \le w_2 \le \cdots \le w_q \le 1,

where

      c_1 = \prod_{j=1}^{q} \Gamma(n+m+q+j)
            \Big/ \big\{ \Gamma(n+j)\,\Gamma(m+j)\,\Gamma(j) \big\} .

We may also note that when n is large, the joint distribution of
f_j = n w_j  (j = 1, 2, \ldots, q),  0 \le f_1 \le \cdots \le f_q < \infty,
can be written as

(2)   c_2 \Big\{ \prod_{j=1}^{q} f_j^{m} \exp(-f_j) \Big\}
          \Big\{ \prod_{j=1}^{q-1} \prod_{k=j+1}^{q} (f_j - f_k)^{2} \Big\}
          \, df_1 \cdots df_q ,

where

      c_2 = 1 \Big/ \big\{ \prod_{j=1}^{q} \Gamma(m+j)\,\Gamma(j) \big\} .
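As an aside not in the original paper: for integer m and n the density (1)
is polynomial, so the constant c_1 can be checked in exact rational
arithmetic. The sketch below (all helper names are ours, invented for
illustration) verifies for q = 2 that c_1 normalizes (1); by symmetry of the
integrand, the ordered integral of the kernel reduces to M_0 M_2 - M_1^2
with M_k = \int_0^1 w^{m+k} (1-w)^n dw.

```python
from fractions import Fraction
from math import factorial

def beta_moment(a, b):
    # ∫_0^1 w^a (1-w)^b dw = a! b! / (a+b+1)! for nonnegative integers a, b
    return Fraction(factorial(a) * factorial(b), factorial(a + b + 1))

def c1(m, n, q):
    # c1 = prod_{j=1}^{q} Gamma(n+m+q+j) / {Gamma(n+j) Gamma(m+j) Gamma(j)},
    # with Gamma(k) = (k-1)! at integer arguments
    val = Fraction(1)
    for j in range(1, q + 1):
        val *= Fraction(factorial(n + m + q + j - 1),
                        factorial(n + j - 1) * factorial(m + j - 1) * factorial(j - 1))
    return val

# For q = 2 the ordered integral of the kernel in (1) reduces, by symmetry
# of the integrand, to M0*M2 - M1**2 with M_k = ∫_0^1 w^{m+k} (1-w)^n dw.
m, n = 2, 3
M = [beta_moment(m + k, n) for k in range(3)]
total = c1(m, n, 2) * (M[0] * M[2] - M[1] ** 2)
assert total == 1
```

The same check works for any integer m, n; only the q = 2 reduction above is
special to the bivariate case.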
In this paper, we derive the distribution of w_q (or f_q) and w_1 (or f_1).
The percentage points will be given and some applications will be discussed
in another paper.

2. Distribution of w_q or w_1:

For the distribution of w_q, we shall require the following two lemmas:
Lemma 1:

      \sum \int \cdots \int_{\,0 \le x_1 \le \cdots \le x_s \le x}
      \prod_{j=1}^{s} \big[ x_j^{m_j'} (1 - x_j)^{n_j'} \big] \, dx_j
      = \prod_{j=1}^{s} \Big[ \int_0^{x} x_j^{m_j} (1 - x_j)^{n_j} \, dx_j \Big]
      \qquad (x \le 1),

where on the left-hand side (m_1', n_1'), \ldots, (m_s', n_s') is any
permutation of (m_1, n_1), \ldots, (m_s, n_s), and the summation is taken
over all such permutations. For proof, one may refer to Roy
[3, (A.9.3), p. 203].
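Lemma 1 is a finite identity and can be verified mechanically for small s.
The sketch below (ours, not the paper's; the helper names are invented)
checks the s = 2 case with exact rational polynomial integration: the two
ordered integrals, summed over the permutations of the exponent pairs,
reproduce the product of the two unrestricted integrals.

```python
from fractions import Fraction
from math import comb

def poly_mul(p, q):
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_antider(p):
    # antiderivative with zero constant term
    return [Fraction(0)] + [c / (k + 1) for k, c in enumerate(p)]

def poly_eval(p, x):
    return sum(c * x ** k for k, c in enumerate(p))

def kernel(m, n):
    # coefficient list of t^m (1-t)^n
    return [Fraction(0)] * m + [Fraction((-1) ** k * comb(n, k)) for k in range(n + 1)]

def ordered_pair(inner, outer, x):
    # ∫_0^x [ ∫_0^{x2} k_inner(x1) dx1 ] k_outer(x2) dx2, i.e. 0 <= x1 <= x2 <= x
    F = poly_antider(kernel(*inner))
    G = poly_antider(poly_mul(F, kernel(*outer)))
    return poly_eval(G, x)

pairs = [(1, 3), (2, 1)]            # (m_1, n_1), (m_2, n_2)
x = Fraction(1, 2)
lhs = (ordered_pair(pairs[0], pairs[1], x)      # identity permutation
       + ordered_pair(pairs[1], pairs[0], x))   # swapped permutation
rhs = (poly_eval(poly_antider(kernel(*pairs[0])), x)
       * poly_eval(poly_antider(kernel(*pairs[1])), x))
assert lhs == rhs
```

Because everything is a polynomial with rational coefficients, the equality
holds exactly, not merely to floating-point tolerance.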
Lemma 2:

      \prod_{j=1}^{q-1} \prod_{k=j+1}^{q} (w_j - w_k)^{2}
      = \sum_{E} \begin{vmatrix}
          w_{j_1}^{2q-2} & w_{j_2}^{2q-3} & \cdots & w_{j_q}^{q-1} \\
          w_{j_1}^{2q-3} & w_{j_2}^{2q-4} & \cdots & w_{j_q}^{q-2} \\
          \vdots         & \vdots         &        & \vdots        \\
          w_{j_1}^{q-1}  & w_{j_2}^{q-2}  & \cdots & w_{j_q}^{0}
        \end{vmatrix} ,

where \sum_E means the summation over (j_1, j_2, \ldots, j_q), the
permutations of (1, 2, \ldots, q), and |A| means the determinant of A.

Proof: It is well known that a Vandermonde determinant

      \begin{vmatrix}
        w_1^{q-1} & w_2^{q-1} & \cdots & w_q^{q-1} \\
        w_1^{q-2} & w_2^{q-2} & \cdots & w_q^{q-2} \\
        \vdots    & \vdots    &        & \vdots    \\
        w_1       & w_2       & \cdots & w_q       \\
        1         & 1         & \cdots & 1
      \end{vmatrix}
      = \prod_{j=1}^{q-1} \prod_{k=j+1}^{q} (w_j - w_k) = a \quad \text{(say)} .

Then, since a^2 = |V|\,|V'| = |V V'| for the matrix V above, the above
expression can be written as

      a^2 = \begin{vmatrix}
          \sum_{j=1}^{q} w_j^{2q-2} & \sum_{j=1}^{q} w_j^{2q-3} & \cdots & \sum_{j=1}^{q} w_j^{q-1} \\
          \sum_{j=1}^{q} w_j^{2q-3} & \sum_{j=1}^{q} w_j^{2q-4} & \cdots & \sum_{j=1}^{q} w_j^{q-2} \\
          \vdots                    & \vdots                    &        & \vdots                   \\
          \sum_{j=1}^{q} w_j^{q-1}  & \sum_{j=1}^{q} w_j^{q-2}  & \cdots & q
        \end{vmatrix}
      = \sum_{j_1, j_2, \ldots, j_q} \begin{vmatrix}
          w_{j_1}^{2q-2} & w_{j_2}^{2q-3} & \cdots & w_{j_q}^{q-1} \\
          \vdots         & \vdots         &        & \vdots        \\
          w_{j_1}^{q-1}  & w_{j_2}^{q-2}  & \cdots & w_{j_q}^{0}
        \end{vmatrix} ,

each of j_1, j_2, \ldots, j_q running independently over 1, 2, \ldots, q.
If on the right-hand side any two of j_1, j_2, \ldots, j_q are equal, then
the value of the determinant is zero. Hence the summation on the right-hand
side over (j_1, j_2, \ldots, j_q) reduces to the permutations of
(1, 2, \ldots, q), which establishes Lemma 2.
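Lemma 2 is likewise a polynomial identity that can be confirmed directly for
a small q. The sketch below (ours; `det` is a naive Leibniz-expansion
determinant, adequate for q = 3) compares the product of squared differences
with the sum over permutations of the determinant whose (i, k) entry is
w_{j_k}^{2q-i-k}, in exact rational arithmetic.

```python
from fractions import Fraction
from itertools import permutations

def det(M):
    # Leibniz expansion; fine for small q
    n = len(M)
    total = Fraction(0)
    for perm in permutations(range(n)):
        sign = 1
        for a in range(n):
            for b in range(a + 1, n):
                if perm[a] > perm[b]:
                    sign = -sign
        term = Fraction(sign)
        for i in range(n):
            term *= M[i][perm[i]]
        total += term
    return total

q = 3
w = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 5)]

# left-hand side: product of squared differences
lhs = Fraction(1)
for j in range(q - 1):
    for k in range(j + 1, q):
        lhs *= (w[j] - w[k]) ** 2

# right-hand side: sum over permutations (j_1, ..., j_q) of the determinant
# with (i, k) entry w_{j_k}^{2q - i - k} (i, k counted from 1)
rhs = Fraction(0)
for js in permutations(range(q)):
    M = [[w[js[k]] ** (2 * q - (i + 1) - (k + 1)) for k in range(q)]
         for i in range(q)]
    rhs += det(M)

assert lhs == rhs
```

Restricting the sum to permutations is exactly the content of the lemma:
tuples with a repeated index contribute a vanishing determinant.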
Now we shall prove the following theorem:

Theorem 1: If the joint distribution of w_1, w_2, \ldots, w_q is given by
(1), then

(3)   \Pr(w_q \le x) = c_1 \begin{vmatrix}
          \beta_0     & \beta_1 & \cdots & \beta_{q-1}  \\
          \beta_1     & \beta_2 & \cdots & \beta_q      \\
          \vdots      & \vdots  &        & \vdots       \\
          \beta_{q-1} & \beta_q & \cdots & \beta_{2q-2}
        \end{vmatrix}
      = c_1 \, \big| (\beta_{i+j-2}) \big| ,

where c_1 is defined in (1),

      \beta_{i+j-2} = \int_0^{x} w^{\,m+i+j-2} (1-w)^{n} \, dw
      \qquad \text{for } i, j = 1, 2, \ldots, q,

and (\beta_{i+j-2}) is a q \times q matrix.

Proof: By definition, we have

      \Pr(w_q \le x) = \Pr(0 \le w_1 \le \cdots \le w_q \le x) ,

the region of integration being D: (0 \le w_1 \le w_2 \le \cdots \le w_q
\le x, \; x \le 1). Using Lemma 2, the above expression can be written as

(4)   \Pr(w_q \le x) = c_1 \sum_{E} \int_{D}
        \begin{vmatrix}
          w_{j_1}^{2q-2} & w_{j_2}^{2q-3} & \cdots & w_{j_q}^{q-1} \\
          \vdots         & \vdots         &        & \vdots        \\
          w_{j_1}^{q-1}  & w_{j_2}^{q-2}  & \cdots & w_{j_q}^{0}
        \end{vmatrix}
        \prod_{j=1}^{q} \big[ w_j^{m} (1-w_j)^{n} \, dw_j \big] ,

where \sum_E means the summation over (j_1, \ldots, j_q), the permutations
of (1, 2, \ldots, q). Now the determinant under the integral sign of (4)
can be written as

      \sum_{1} \operatorname{sign}(t_1, \ldots, t_q)
      \prod_{k=1}^{q} w_{j_k}^{\,q-k+t_k} ,

where (t_1, \ldots, t_q) is a permutation of (0, 1, \ldots, q-1),
\operatorname{sign}(t_1, \ldots, t_q) is positive if the permutation is even
and negative if the permutation is odd, and \sum_1 means the summation over
all such permutations. Then (4) becomes

      \Pr(w_q \le x) = c_1 \sum_{E} \sum_{1}
      \operatorname{sign}(t_1, \ldots, t_q) \int_{D} \prod_{k=1}^{q}
      \big[ w_{j_k}^{\,m+q-k+t_k} (1-w_{j_k})^{n} \, dw_{j_k} \big] .

First taking the summation over (j_1, j_2, \ldots, j_q), the permutations of
(1, 2, \ldots, q), and applying Lemma 1, we get

      \Pr(w_q \le x) = c_1 \sum_{1} \operatorname{sign}(t_1, \ldots, t_q)
      \prod_{k=1}^{q} \int_0^{x} w^{\,m+q-k+t_k} (1-w)^{n} \, dw
      = c_1 \, \big| (\beta_{i+j-2}) \big| ,

which proves Theorem 1.
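Theorem 1 can be spot-checked for small q. The sketch below (ours, not the
paper's; helper names are invented) computes Pr(w_2 <= x) for q = 2 both as
a direct iterated integral of the density (1) and from the determinant
formula (3), in exact rational arithmetic, and also confirms that the total
probability at x = 1 is 1.

```python
from fractions import Fraction
from math import comb, factorial

def poly_mul(p, q):
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_antider(p):
    return [Fraction(0)] + [c / (k + 1) for k, c in enumerate(p)]

def poly_eval(p, x):
    return sum(c * x ** k for k, c in enumerate(p))

def kernel(a, b):
    # coefficient list of w^a (1-w)^b
    return [Fraction(0)] * a + [Fraction((-1) ** k * comb(b, k)) for k in range(b + 1)]

m, n = 1, 2
x = Fraction(3, 4)

def beta_k(k, upper):
    # beta_k = ∫_0^upper w^{m+k} (1-w)^n dw
    return poly_eval(poly_antider(kernel(m + k, n)), upper)

# direct iterated integral over 0 <= w1 <= w2 <= x, expanding (w1 - w2)^2
direct = Fraction(0)
for a, b, coef in [(2, 0, 1), (1, 1, -2), (0, 2, 1)]:
    F = poly_antider(kernel(m + a, n))     # inner integral in w1, a polynomial in w2
    G = poly_antider(poly_mul(F, kernel(m + b, n)))
    direct += coef * poly_eval(G, x)

# c1 for q = 2 (Gamma at integer arguments: Gamma(k) = (k-1)!)
c1 = Fraction(factorial(n + m + 2) * factorial(n + m + 3),
              factorial(n) * factorial(n + 1) * factorial(m) * factorial(m + 1))

lhs = c1 * direct                                              # Pr(w_2 <= x), direct
rhs = c1 * (beta_k(0, x) * beta_k(2, x) - beta_k(1, x) ** 2)   # c1 |(beta_{i+j-2})|
assert lhs == rhs
one = Fraction(1)
assert c1 * (beta_k(0, one) * beta_k(2, one) - beta_k(1, one) ** 2) == 1
```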
It may be noted here that

      \Pr(w_1 \le x) = 1 - \Pr(w_1 \ge x) .

Going back to the c.d.f. of (w_1, \ldots, w_q) and using the transformation
w_j = 1 - z_j  (j = 1, 2, \ldots, q), we have

(5)   \Pr(w_1 \le x) = 1 - \Pr(x \le w_1 \le \cdots \le w_q \le 1)
      = 1 - c_1 \, \big| (\delta_{i+j-2}) \big| ,

where

      \delta_{i+j-2} = \int_0^{1-x} z^{\,n+i+j-2} (1-z)^{m} \, dz

and (\delta_{i+j-2}) is a q \times q matrix.

Theorem 2: If the distribution of f_1, \ldots, f_q is given by (2), then

(6)   \Pr(f_q \le x) = c_2 \, \big| (\gamma_{i+j-2}) \big| ,

where

      \gamma_{i+j-2} = \int_0^{x} w^{\,m+i+j-2} \exp(-w) \, dw ,

(\gamma_{i+j-2}) is a q \times q matrix, and c_2 is defined in (2).

Proof is similar to that of Theorem 1.

The author thanks Professor S. N. Roy for discussion and help.
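Theorem 2 can be checked numerically in the same spirit (a sketch of ours,
with invented helper names; the exponential kernel rules out exact rational
arithmetic, so a composite Simpson rule is compared against the closed form
of the lower incomplete gamma function at integer parameters).

```python
import math

def lower_inc_gamma(k, x):
    # ∫_0^x w^k e^{-w} dw for integer k >= 0 (closed form)
    s = sum(x ** i / math.factorial(i) for i in range(k + 1))
    return math.factorial(k) * (1.0 - math.exp(-x) * s)

def simpson(f, a, b, n=400):
    # composite Simpson rule (n even)
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3.0

m, x = 1, 2.0
c2 = 1.0 / (math.factorial(m) * math.factorial(m + 1))   # c2 for q = 2

def outer(f2):
    # inner integral over f1 in [0, f2], times the f2 factor of the kernel
    inner = simpson(lambda f1: f1 ** m * math.exp(-f1) * (f1 - f2) ** 2, 0.0, f2)
    return inner * f2 ** m * math.exp(-f2)

direct = c2 * simpson(outer, 0.0, x)                     # Pr(f_2 <= x), direct
gam = [lower_inc_gamma(m + k, x) for k in range(3)]
via_det = c2 * (gam[0] * gam[2] - gam[1] ** 2)           # c2 |(gamma_{i+j-2})|
assert abs(direct - via_det) < 1e-7
```

The 1e-7 tolerance is generous for Simpson's rule at this resolution; the two
quantities agree to roughly machine precision for these smooth integrands.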
References

[1]  Khatri, C. G., "Classical statistical analysis based on a certain
     multivariate Gaussian distribution". (Sent for publication.)

[2]  Khatri, C. G., "A test for reality of a covariance matrix in a certain
     complex Gaussian distribution". (Sent for publication.)

[3]  Roy, S. N. (1958). Some Aspects of Multivariate Analysis. John Wiley
     and Sons, Inc.