THE DISTRIBUTIONS OF QUADRATIC
FORMS ON NORMAL RANDOM VARIABLES

by

Bruno Baldessari
Department of Statistics
University of North Carolina
and
Università di Roma

Institute of Statistics Mimeo Series No. 476
April 1966
CONTENTS:
1. Introduction and notation.
2. A lemma in matrix theory.
3. The moment generating function of the r.v. X'SX.
4. The distribution of the r.v. X'SX and the matrix SV.
5. Numerical method.
ACKNOWLEDGEMENTS
REFERENCES
This research was supported by the Mathematics Division of the Air Force Office of Scientific Research, Contract No. AF-AFOSR-760-65.
DEPARTMENT OF STATISTICS
UNIVERSITY OF NORTH CAROLINA
Chapel Hill, N. C.
1. Introduction and Notation
In this article we obtain a necessary and sufficient condition under which a quadratic form in normal random variables is distributed as a given linear combination of independent chi-square variates; in this linear combination the coefficients are arbitrary constants (Theorem 2). This result is a generalization of the known theorems that a quadratic form in a set of normal random variables is distributed as a chi-square variable (or a difference between two independent chi-squares) if and only if the product of the matrix of the quadratic form and the variance-covariance matrix is idempotent (or tripotent). In addition we give a numerical method for finding the coefficients of this linear combination (Section 5). In regard to this type of problem we refer to the text and to the references of [3] and [4] and to Chapter 4 of [2].

We shall use the following abbreviations: r.v. for random variable; r.vt. for random vector; p.d. for positive definite; s.d. for spectral decomposition; d.f. for degrees of freedom.

$X$ will denote an $n$-dimensional r.vt. and we assume that $X$ is $N(\mu, V)$ (i.e., $X$ is distributed like $N(\mu, V)$, an $n$-variate normal distribution with mean vector $\mu$ and variance-covariance matrix $V$, $V$ p.d.). Also we denote by $\chi'^2(n, \lambda)$ a noncentral chi-square r.v. with $n$ d.f. and noncentrality parameter $\lambda$; by $S$ an $n \times n$ symmetric matrix; by $\Lambda$ an $n \times n$ diagonal matrix; and by $I$ the $n \times n$ identity matrix.
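As a concrete instance of the classical special case cited above: with $V = I$ and $S$ idempotent of rank $r$, the form $X'SX$ is distributed as $\chi'^2(r, \mu'S\mu)$. The following is a minimal simulation check, not part of the original paper; it assumes Python with NumPy, and the projection matrix used is an arbitrary illustrative choice.

    import numpy as np

    rng = np.random.default_rng(0)
    n, N = 4, 100_000
    # An idempotent S: orthogonal projection onto a 2-dimensional subspace.
    U, _ = np.linalg.qr(rng.standard_normal((n, 2)))
    S = U @ U.T                                  # S @ S == S, rank 2
    mu = rng.standard_normal(n)

    X = rng.multivariate_normal(mu, np.eye(n), size=N)
    qf = np.einsum('ij,jk,ik->i', X, S, X)       # X'SX for each sample
    y = rng.noncentral_chisquare(2, mu @ S @ mu, size=N)
    print(qf.mean(), y.mean())                   # both approx. 2 + mu'S mu

Here NumPy's noncentrality convention (mean of $\chi'^2(r,\lambda)$ equal to $r + \lambda$) matches the convention used throughout this paper.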
2. A lemma in matrix theory

Lemma 1. If $S$ and $V$ are $n \times n$ real symmetric matrices and if $V$ is p.d., then the matrix $SV$ has a spectral decomposition.

Proof: By hypothesis $V^{-1}$ exists and is real, symmetric, and p.d.; also $S$ is real symmetric. Hence there exists an $n \times n$ real matrix $M$, with $|M| \neq 0$, such that

$$M'V^{-1}M = I, \qquad M'SM = \Lambda,$$

where $\Lambda$ is a real diagonal matrix whose diagonal elements are the $n$ roots $\lambda_1, \dots, \lambda_n$ of the equation $|\lambda V^{-1} - S| = 0$, [1]. (The roots of the equation $|\lambda V^{-1} - S| = 0$ are, of course, identical to the roots of the equation $|\lambda I - SV| = 0$.) Let $a_j$, $j = 1, \dots, s$ ($s \geq 1$), be the distinct roots of this equation, and let $r_j$, $j = 1, \dots, s$, be their respective orders of multiplicity. Let $B_j$, $j = 1, \dots, s$, be the $n \times n$ diagonal matrix which has elements 1 in the same places in which $\Lambda$ has the element $a_j$, and 0 otherwise. So we have:

(1) $\Lambda = \sum_{j=1}^{s} a_j B_j.$

From $M'V^{-1}M = I$ it follows that $V = MM'$, and from $M'SM = \Lambda$ that $S = M'^{-1}\Lambda M^{-1}$, so that

$$SV = M'^{-1}\Lambda M^{-1}MM' = M'^{-1}\Lambda M',$$

and, putting $P = M'^{-1}$, we see that: $P^{-1}(SV)P = \Lambda$. Putting

(2) $E_j = P B_j P^{-1}, \qquad j = 1, \dots, s,$

we obtain the required spectral decomposition [1]:

(3) $SV = \sum_{j=1}^{s} a_j E_j.$

Remark 1. The spectral decomposition (3) is unique except for the fact that, in $\Lambda$, we may have $n!/(r_1! \cdots r_s!)$ different orders of the numbers $a_j$, $j = 1, \dots, s$. We consider that one of these orders is chosen and not subsequently changed, so that we can consider (3) as the unique s.d. of $SV$.

Remark 2. The matrices $M$, $P$, $\Lambda$, $B_j$, $E_j$, $j = 1, \dots, s$, of Lemma 1 will also be found useful in the sequel. We shall not repeat these definitions, it being understood that they are the matrices appearing in the proof of Lemma 1.

Corollary 1. If $S$ and $V$ are $n \times n$ real symmetric matrices and if $V$ is p.d., then $SV$ has the s.d. $SV = \sum_{j=1}^{s} a_j E_j$, where $\mathrm{rank}(E_j) = r_j$, $j = 1, \dots, s$, if and only if:

$$|\lambda I - SV| = \prod_{j=1}^{s} (\lambda - a_j)^{r_j}.$$

Proof: Follows from Lemma 1 and from the definition of s.d.
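The construction in the proof of Lemma 1 is directly computable. Below is a minimal numerical sketch, not part of the original paper: it assumes Python with NumPy and SciPy, uses scipy.linalg.eigh on the generalized eigenproblem $Sm = \lambda V^{-1}m$ to produce a matrix $M$ with $M'V^{-1}M = I$ and $M'SM = \Lambda$, and groups numerically equal roots with an illustrative tolerance; the helper name spectral_decomposition_SV and the example matrices are our own choices.

    import numpy as np
    from scipy.linalg import eigh

    def spectral_decomposition_SV(S, V, tol=1e-8):
        """Return distinct roots a_j, multiplicities r_j, and idempotents E_j
        with SV = sum_j a_j E_j, following the construction of Lemma 1."""
        # scipy.linalg.eigh(A, B) solves A m = lambda B m and normalizes the
        # eigenvectors so that M' B M = I; with B = V^{-1} this gives exactly
        # M' V^{-1} M = I and M' S M = Lambda.
        Vinv = np.linalg.inv(V)
        lam, M = eigh(S, Vinv)
        P = np.linalg.inv(M.T)                   # P = M'^{-1}, P^{-1}(SV)P = Lambda
        Pinv = M.T
        roots, mults, idempotents = [], [], []
        used = np.zeros(len(lam), dtype=bool)
        for i in range(len(lam)):
            if used[i]:
                continue
            group = np.abs(lam - lam[i]) < tol   # numerically equal roots
            used |= group
            B_j = np.diag(group.astype(float))   # 1's where Lambda has root a_j
            roots.append(lam[i])
            mults.append(int(group.sum()))
            idempotents.append(P @ B_j @ Pinv)   # E_j = P B_j P^{-1}
        return roots, mults, idempotents

    # Quick check on an arbitrary symmetric S and p.d. V:
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    S = (A + A.T) / 2
    V = A @ A.T + 4 * np.eye(4)
    a, r, E = spectral_decomposition_SV(S, V)
    assert np.allclose(S @ V, sum(aj * Ej for aj, Ej in zip(a, E)))

Grouping by a tolerance stands in for the exact multiplicities $r_j$, which a finite-precision eigensolver cannot identify with certainty.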
3. The moment generating function of the r.v. $X'SX$

Lemma 2. If the r.vt. $X$ is $N(\mu, V)$ with $V$ p.d., and $S$ is a real symmetric matrix, then the moment generating function $m_{X'SX}(Q; \mu, V)$ of the r.v. $X'SX$ is:

(4) $m_{X'SX}(Q; \mu, V) = \prod_{j=1}^{s} (1 - 2Qa_j)^{-r_j/2} \exp\Big\{\sum_{j=1}^{s} \frac{\mu' E_j V^{-1} \mu}{2} \cdot \frac{2Qa_j}{1 - 2Qa_j}\Big\},$

where $a_j$, $j = 1, \dots, s$, are the distinct roots of the equation $|\lambda I - SV| = 0$ (with multiplicity $r_j$), and where $E_j$ is defined by the s.d.: $SV = \sum_{j=1}^{s} a_j E_j$.

Proof: By definition, we have:

$$m_{X'SX}(Q; \mu, V) = (2\pi)^{-n/2} |V|^{-1/2} \int \exp\{Q x'Sx - \tfrac{1}{2}(x - \mu)'V^{-1}(x - \mu)\}\, dx,$$

where $\int$ is an $n$-fold integral and where $dx = dx_1 \cdots dx_n$. We make the transformation $x = My + \mu$; since $M'V^{-1}M = I$ implies $|V| = |M|^2$, the Jacobian cancels the normalizing constant, and, completing the square in each coordinate, we obtain:

(5) $m_{X'SX}(Q; \mu, V) = \prod_{i=1}^{n} (1 - 2Q\lambda_i)^{-1/2} \exp\Big\{\sum_{i=1}^{n} \frac{c_i^2}{2} \cdot \frac{2Q\lambda_i}{1 - 2Q\lambda_i}\Big\}, \qquad c = M^{-1}\mu.$

As in the proof of Lemma 1, suppose that we have $r_j$ roots $\lambda_i$ equal to $a_j$, $j = 1, \dots, s$, and let:

$$c_j^2 = \sum_{(j)} c_i^2,$$

where $\sum_{(j)}$ is the sum on the $r_j$ subscripts $i$ such that $\lambda_i = a_j$. So we have:

$$\prod_{j=1}^{s} (1 - 2Qa_j)^{r_j} = |I - 2QSV|,$$

for $|\lambda I - SV| = \prod_{j=1}^{s} (\lambda - a_j)^{r_j}$ with $\sum_{j=1}^{s} r_j = n$, and the equality follows putting $\lambda = (2Q)^{-1}$ for $Q \neq 0$; for $Q = 0$ the equality is verified directly. In order to prove the lemma we must show that:

(6) $\sum_{j=1}^{s} \frac{c_j^2}{2} \cdot \frac{2Qa_j}{1 - 2Qa_j} = \sum_{j=1}^{s} \frac{\mu' E_j V^{-1} \mu}{2} \cdot \frac{2Qa_j}{1 - 2Qa_j},$

where the matrices $E_j$ are defined by the s.d. $SV = \sum_{j=1}^{s} a_j E_j$, by means of the matrix $M$.

Decomposing the matrix $M^{-1}$ into its row vectors $m_1', \dots, m_n'$, it follows from $c = M^{-1}\mu$ that $c_i = m_i'\mu$. We will denote by $B^{(i)}$ the diagonal matrix which has all elements equal to 0 except for the $i$-th element of the diagonal, which has the value 1. By calculation it can be verified that:

$$c_i^2 = \mu' m_i m_i' \mu = \mu' M'^{-1} B^{(i)} M^{-1} \mu.$$

So we have:

$$c_j^2 = \sum_{(j)} c_i^2 = \sum_{(j)} \mu' m_i m_i' \mu = \mu' M'^{-1} \Big(\sum_{(j)} B^{(i)}\Big) M^{-1} \mu = \mu' M'^{-1} B_j M^{-1} \mu, \qquad j = 1, \dots, s.$$

From (2) we have $E_j = M'^{-1} B_j M'$, so that $M'^{-1} B_j M^{-1} = E_j V^{-1}$, $j = 1, \dots, s$; therefore:

$$c_j^2 = \mu' E_j V^{-1} \mu, \qquad j = 1, \dots, s,$$

which proves (6) and so also Lemma 2.

Remark 3. The matrices $E_j V^{-1}$, $j = 1, \dots, s$, are symmetric and positive semidefinite of rank $r_j$. This follows from the proof of Lemma 2, but it may also be verified as follows: $E_j V^{-1} = M'^{-1} B_j M^{-1}$ and so the matrix $E_j V^{-1}$ is symmetric; the matrix $B_j$ is positive semidefinite and so there exists a real matrix $C_j$ such that $B_j = C_j' C_j$, $\mathrm{rank}(C_j) = \mathrm{rank}(B_j) = r_j$, and therefore $E_j V^{-1} = M'^{-1} B_j M^{-1} = M'^{-1} C_j' C_j M^{-1} = (C_j M^{-1})'(C_j M^{-1})$, which is a positive semidefinite matrix of rank equal to the rank of $C_j$.
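Formula (4) lends itself to a direct numerical sanity check against a Monte Carlo estimate of $E\exp\{Q\,X'SX\}$. The sketch below is not part of the original paper; it assumes Python with NumPy and reuses the hypothetical spectral_decomposition_SV helper from the sketch after Lemma 1, and the example matrices, the point $Q$, and the sample size are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 3
    A = rng.standard_normal((n, n))
    S = (A + A.T) / 2                       # real symmetric S
    V = A @ A.T + 3 * np.eye(n)             # p.d. variance-covariance matrix V
    mu = rng.standard_normal(n)
    Vinv = np.linalg.inv(V)

    a, r, E = spectral_decomposition_SV(S, V)   # s.d.: SV = sum_j a_j E_j

    Q = 0.05                                # must keep 1 - 2*Q*a_j > 0 for all j
    # Formula (4): note (mu'E_jV^{-1}mu / 2) * (2Qa_j / (1-2Qa_j))
    #            = (mu'E_jV^{-1}mu) * Qa_j / (1-2Qa_j).
    mgf = np.prod([(1 - 2*Q*aj) ** (-rj / 2) for aj, rj in zip(a, r)]) * np.exp(
        sum((mu @ Ej @ Vinv @ mu) * Q * aj / (1 - 2*Q*aj) for aj, Ej in zip(a, E)))

    X = rng.multivariate_normal(mu, V, size=200_000)
    mc = np.exp(Q * np.einsum('ij,jk,ik->i', X, S, X)).mean()
    print(mgf, mc)                          # the two values should agree closely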
4. The distribution of the r.v. $X'SX$ and the matrix $SV$

Theorem 1. If the r.vt. $X$ is $N(\mu, V)$, with $V$ p.d., and if $S$ is a real symmetric matrix, then the r.v. $X'SX$ is distributed as

$$Y = \sum_{j=1}^{s} a_j\, \chi_j'^2(r_j, \lambda_j),$$

where the $\chi'^2$'s are mutually independent and where $\lambda_j \geq 0$, $j = 1, \dots, s$, if and only if the matrix $SV$ has the s.d. $SV = \sum_{j=1}^{s} a_j E_j$, with:

$$\lambda_j = \mu' E_j V^{-1} \mu, \quad \mathrm{rank}(E_j) = r_j, \quad j = 1, \dots, s, \qquad \sum_{j=1}^{s} r_j = n.$$

Proof of sufficiency: If $SV$ has the s.d. $SV = \sum_{j=1}^{s} a_j E_j$, with $\mathrm{rank}(E_j) = r_j$, $j = 1, \dots, s$, $\sum_{j=1}^{s} r_j = n$, then, by Corollary 1, we have:

$$|\lambda I - SV| = \prod_{j=1}^{s} (\lambda - a_j)^{r_j},$$

and therefore, putting $\lambda = \frac{1}{2Q}$, we have:

(7) $|I - 2QSV| = \prod_{j=1}^{s} (1 - 2Qa_j)^{r_j}.$

Also, by Lemma 2, we have:

(8) $m_{X'SX}(Q; \mu, V) = \prod_{j=1}^{s} (1 - 2Qa_j)^{-r_j/2} \exp\Big\{\sum_{j=1}^{s} \frac{\mu' E_j V^{-1} \mu}{2} \cdot \frac{2Qa_j}{1 - 2Qa_j}\Big\},$

and so, by (7) and by the hypothesis $\lambda_j = \mu' E_j V^{-1} \mu$, $j = 1, \dots, s$, we see that the r.v. $X'SX$ is distributed like the r.v. $Y$.

Proof of necessity: If the r.v. $X'SX$ is distributed like the r.v. $Y$, then:

(9) $m_{X'SX}(Q; \mu, V) = \prod_{j=1}^{s} (1 - 2Qa_j)^{-r_j/2} \exp\Big\{\sum_{j=1}^{s} \frac{\lambda_j}{2} \cdot \frac{2Qa_j}{1 - 2Qa_j}\Big\}.$

We observe, now, that the s.d. of $SV$ is independent of $\mu$ (being dependent on $S$ and $V$ only, through $M$), and so the s.d. of $SV$ corresponding to the r.v. $X'SX$, where $X$ is $N(\mu, V)$, is the same as the s.d. of $SV$ corresponding to the r.v. $Z'SZ$, where the r.vt. $Z$ is $N(0, V)$.1/ But:

$$m_{Z'SZ}(Q; 0, V) = |I - 2QSV|^{-1/2} = \prod_{j=1}^{s} (1 - 2Qa_j)^{-r_j/2},$$

in which we take $2Q = \frac{1}{\lambda}$, so that: $|\lambda I - SV| = \prod_{j=1}^{s} (\lambda - a_j)^{r_j}$, and therefore, by Corollary 1, we have the s.d.: $SV = \sum_{j=1}^{s} a_j E_j$, $\mathrm{rank}(E_j) = r_j$, $j = 1, \dots, s$, $\sum_{j=1}^{s} r_j = n$. From the first part of the proof of the sufficiency we know that this implies (formulae (7) and (8)):

$$m_{X'SX}(Q; \mu, V) = \prod_{j=1}^{s} (1 - 2Qa_j)^{-r_j/2} \exp\Big\{\sum_{j=1}^{s} \frac{\mu' E_j V^{-1} \mu}{2} \cdot \frac{2Qa_j}{1 - 2Qa_j}\Big\},$$

so that, by the hypothesis (9), we must have, for all $Q$ sufficiently small:

(10) $\sum_{j=1}^{s} \big(\lambda_j - \mu' E_j V^{-1} \mu\big)\, \frac{Qa_j}{1 - 2Qa_j} = 0;$

here we can, obviously, suppose that $a_j \neq 0$, $j = 1, \dots, s$. Expanding $(1 - 2Qa_j)^{-1}$ in a geometric series, it is easily seen that (10) implies:

$$\lambda_j = \mu' E_j V^{-1} \mu, \qquad j = 1, \dots, s.$$

This completes the proof of Theorem 1.

1/ This could also be demonstrated by noting that if $R(Q)$ and $S_1(Q), \dots, S_k(Q)$ are ratios of polynomial functions of $Q$, then $\exp\{R(Q)\}$ cannot be expanded in a finite number of the $S_\ell(Q)$'s (except in the trivial case $R(Q) = \text{constant}$).
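Before stating Theorem 2, we note that Theorem 1 can be exercised by simulation: sample $X'SX$ directly and compare it with the stated mixture of independent non-central chi-squares. The following minimal sketch is not part of the original paper; it assumes Python with NumPy, again reuses the hypothetical spectral_decomposition_SV helper from the sketch after Lemma 1, and compares only the first two moments as an illustrative shortcut rather than a distributional test.

    import numpy as np

    rng = np.random.default_rng(2)
    n, N = 3, 200_000
    A = rng.standard_normal((n, n))
    S, V = (A + A.T) / 2, A @ A.T + 3 * np.eye(n)
    mu = rng.standard_normal(n)
    a, r, E = spectral_decomposition_SV(S, V)    # from the Lemma 1 sketch

    # Left side: the quadratic form itself.
    X = rng.multivariate_normal(mu, V, size=N)
    qf = np.einsum('ij,jk,ik->i', X, S, X)

    # Right side: sum_j a_j chi'^2(r_j, lambda_j), lambda_j = mu'E_jV^{-1}mu.
    Vinv = np.linalg.inv(V)
    lam = [mu @ Ej @ Vinv @ mu for Ej in E]
    y = sum(aj * rng.noncentral_chisquare(rj, lj, size=N) if lj > 1e-12
            else aj * rng.chisquare(rj, size=N)  # central case when lambda_j = 0
            for aj, rj, lj in zip(a, r, lam))

    # The two samples should agree in distribution; compare a few moments.
    print(qf.mean(), y.mean())      # both approx. tr(SV) + mu'S mu
    print(qf.std(), y.std())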
In the following Theorem 2 we will use the notation:

$$Y = \sum_{j=1}^{s} a_j\, \chi_j'^2(\cdot, \cdot), \qquad (a_j \neq a_{j'},\ j \neq j'),$$

to mean that the r.v. $Y$ is a linear combination, with coefficients $a_j$ ($a_j \neq a_{j'}$, $j \neq j'$), of $s$ mutually independent non-central chi-square r.v.'s, in which we do not specify the individual d.f.'s and non-centrality parameters except for the fact that the d.f.'s are positive and sum to $n$, and the non-centrality parameters are non-negative.

Theorem 2. If the r.vt. $X$ is $N(\mu, V)$, with $V$ p.d., and if $S$ is a real symmetric matrix, then the r.v. $X'SX$ is distributed like the r.v. $Y = \sum_{j=1}^{s} a_j \chi_j'^2(\cdot, \cdot)$ if and only if:

(11) a) $\prod_{j=1}^{s} (SV - a_j I) = 0$;

b) $\prod_{j=1,\, j \neq k}^{s} (SV - a_j I) \neq 0, \qquad k = 1, \dots, s.$

Proof of necessity: First we show that if $X'SX$ is distributed like $Y$, then the condition (11,a) is satisfied. In fact, by Theorem 1 and Lemma 1, since the r.v. $X'SX$ is distributed like $Y$, the matrix $SV$ has a s.d.: $SV = \sum_{j=1}^{s} a_j E_j$, where $\mathrm{rank}(E_j)$, $j = 1, \dots, s$, is unspecified except for the fact that: $\mathrm{rank}(E_j) \geq 1$, $j = 1, \dots, s$, $\sum_{j=1}^{s} \mathrm{rank}(E_j) = n$. This implies that, for every polynomial $p(x)$, we have [1]: $p(SV) = \sum_{j=1}^{s} p(a_j) E_j$, and so, for the polynomial $p(x) = \prod_{j=1}^{s} (x - a_j)$, which vanishes at each $a_j$:

$$\prod_{j=1}^{s} (SV - a_j I) = \sum_{j=1}^{s} p(a_j) E_j = 0,$$

and then the condition (11,a) is satisfied.
We prove, now, that if the r.v. $X'SX$ is distributed like the r.v. $Y$, then the condition (11,b) is satisfied. For if we suppose that within $1, \dots, s$ there exists a member $k$ such that:

$$\prod_{j=1,\, j \neq k}^{s} (SV - a_j I) = 0,$$

then, by $SV = \sum_{t=1}^{s} a_t E_t$ and $I = \sum_{t=1}^{s} E_t$, we would have:

(12) $\sum_{t=1}^{s} \Big(\prod_{j=1,\, j \neq k}^{s} (a_t - a_j)\Big) E_t = 0,$

so that, multiplying (12) by $E_k$ and recalling that $E_k E_t = 0$, $t \neq k$, we would have:

$$\prod_{j=1,\, j \neq k}^{s} (a_k - a_j)\, E_k = 0,$$

which implies that either one of the members $a_j$, $j \neq k$, is equal to $a_k$ (contradicting the hypothesis: $a_j \neq a_k$, $j \neq k$), or $E_k = 0$, which contradicts the hypothesis that each of the chi-square r.v.'s involved in the r.v. $Y$ has at least one d.f., since this would imply (Theorem 1) that $\mathrm{rank}(E_k) = 0$.
Proof of sufficiency: First we prove that the condition (11,a) implies that the r.v. $X'SX$ is distributed, at most, like the r.v. $Y = \sum_{j=1}^{s} a_j \chi_j'^2(\cdot, \cdot)$, $a_j \neq a_{j'}$, $j \neq j'$. By this we mean that the r.v. $X'SX$ is distributed like a r.v.

$$Z = \sum_{p=1}^{q} a_{j_p}\, \chi_{j_p}'^2(\cdot, \cdot), \qquad 1 \leq q \leq s,$$

where the members $a_{j_1}, \dots, a_{j_q}$ constitute any non-empty subset of the set $\{a_1, \dots, a_s\}$.

Now suppose that the condition (11,a) is satisfied and that the r.v. $X'SX$ is distributed like the r.v. $T = \sum_{t=1}^{k} b_t \chi_t'^2(\cdot, \cdot)$, with $b_t \neq b_{t'}$, $t \neq t'$. Then, by Theorem 1 and Lemma 1, the matrix $SV$ has the s.d.: $SV = \sum_{t=1}^{k} b_t F_t$, where $F_t F_{t'} = 0$, $t \neq t'$. By (11,a) we have:

$$0 = \prod_{j=1}^{s} (SV - a_j I) = \prod_{j=1}^{s} \Big(\sum_{t=1}^{k} b_t F_t - a_j \sum_{t=1}^{k} F_t\Big),$$

and, multiplying by $F_t$, $t = 1, \dots, k$, and continuing the multiplication, we would have:

$$\prod_{j=1}^{s} (b_t - a_j)\, F_t = 0, \qquad t = 1, \dots, k,$$

but this is possible only if either $F_t = 0$, which is excluded, or $b_t$, $t = 1, \dots, k$, is equal to one of the members $a_j$ ($a_j \neq a_{j'}$, $j \neq j'$), in which case the characteristic roots of the equation $|\lambda I - SV| = 0$ would take their values among the members $a_1, \dots, a_s$. Therefore, by Theorem 1, the r.v. $X'SX$ is distributed like a linear combination of non-central chi-square r.v.'s, and the coefficients of this linear combination are among the members $a_1, \dots, a_s$; i.e., the r.v. $X'SX$ is distributed, at most, like the r.v. $Y$.

To complete the proof of the sufficiency: (11,a) implies that the r.v. $X'SX$ is distributed, at most, like the r.v. $Y$, and this fact leaves open two possibilities: either the r.v. $X'SX$ is distributed like the r.v. $Y$ (and in this case sufficiency follows), or the r.v. $X'SX$ is distributed like the r.v. $Z = \sum_{p=1}^{q} a_{j_p} \chi_{j_p}'^2(\cdot, \cdot)$, $q \leq s - 1$, $a_{j_p} \neq a_{j_{p'}}$, $p \neq p'$, and we have a contradiction. In fact, in the latter case, by Theorem 1 and Lemma 1, we have the s.d.: $SV = \sum_{p=1}^{q} a_{j_p} E_{j_p}$, and this implies that:

$$(SV - a_{j_1} I) \cdots (SV - a_{j_q} I) = 0, \qquad q \leq s - 1,$$

which contradicts, at least, one of the equations (11,b).
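Conditions (11,a) and (11,b) can be checked directly once the distinct roots of $|\lambda I - SV| = 0$ are in hand. The following minimal sketch is not part of the original paper; it assumes Python with NumPy, and the rounding used to group the roots is an illustrative numerical choice.

    import numpy as np

    def check_conditions_11(S, V, coeffs, tol=1e-8):
        """Check (11,a): prod_j (SV - a_j I) = 0, and
        (11,b): prod_{j != k} (SV - a_j I) != 0 for every k."""
        SV, n = S @ V, S.shape[0]
        full = np.eye(n)
        for a in coeffs:
            full = full @ (SV - a * np.eye(n))
        cond_a = np.allclose(full, 0, atol=tol)
        cond_b = True
        for k in range(len(coeffs)):
            part = np.eye(n)
            for j, a in enumerate(coeffs):
                if j != k:
                    part = part @ (SV - a * np.eye(n))
            cond_b &= not np.allclose(part, 0, atol=tol)
        return cond_a, cond_b

    # With coeffs = the distinct eigenvalues of SV, both conditions should hold;
    # dropping any one of them would break (11,a).
    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4))
    S, V = (A + A.T) / 2, A @ A.T + 4 * np.eye(4)
    distinct = np.unique(np.round(np.linalg.eigvals(S @ V).real, 8))
    print(check_conditions_11(S, V, distinct))   # expect (True, True)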
5. Numerical Method

Theorem 2 can be used, also, to justify the following numerical method. Suppose that we know that the r.vt. $X$ is $N(\mu, V)$, with $V$ p.d., and that we want the distribution of the r.v. $X'SX$, where the matrix $S$ is real symmetric. We know that the r.v. $X'SX$ is a linear combination of independent non-central chi-square r.v.'s. A numerical method which gives us the number of $\chi'^2(\cdot, \cdot)$ r.v.'s involved and the coefficients of the linear combination is the following. Consider the sequence of equations, where $a, b, c, \dots$ are real numbers:

(13) $(SV - aI) = 0,$
$(SV - aI)(SV - bI) = 0,$
$(SV - aI)(SV - bI)(SV - cI) = 0,$
$\cdots$

and find the first of the equations (13) which is satisfied by a finite set of real numbers $a, b, c, \dots$. Then the r.v. $X'SX$ is distributed like the r.v.

$$Y = a\,\chi_1'^2(\cdot, \cdot) + b\,\chi_2'^2(\cdot, \cdot) + c\,\chi_3'^2(\cdot, \cdot) + \cdots.$$

This method is, obviously, justified by Theorem 2. If the number of chi-squares involved in the linear combination is reasonably low, then we may obtain the d.f. and the non-centrality parameters by equating the moments of the r.v. $X'SX$ and of the linear combination of chi-squares, as sketched below.
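One way to carry the whole method out numerically is sketched below; it is not part of the original paper. It assumes Python with NumPy; the coefficients are taken as the distinct roots of $|\lambda I - SV| = 0$ (the smallest set satisfying (13)), and the moment-matching step uses the standard cumulant identity for quadratic forms in normal variables, $\kappa_m(X'SX) = 2^{m-1}(m-1)!\,[\mathrm{tr}((SV)^m) + m\,\mu'(SV)^{m-1}S\mu]$, which is not derived in the paper. Equating cumulants gives a system that is linear in the unknowns $(r_j, \lambda_j)$.

    import numpy as np

    def qf_as_chisquare_mixture(S, V, mu):
        """Sketch of the method of Section 5: find the coefficients a_j as the
        distinct roots of |lambda I - SV| = 0, then recover the d.f. r_j and
        non-centralities lambda_j by equating cumulants of X'SX with those of
        sum_j a_j chi'^2(r_j, lambda_j)."""
        SV = S @ V
        a = np.unique(np.round(np.linalg.eigvals(SV).real, 8))  # coefficients
        s = len(a)
        # Cumulant of order m of X'SX: 2^{m-1}(m-1)! [tr((SV)^m) + m mu'(SV)^{m-1}S mu];
        # of the mixture: 2^{m-1}(m-1)! sum_j a_j^m (r_j + m lambda_j).
        # Equating them for m = 1, ..., 2s is linear in (r_j, lambda_j).
        rows, rhs = [], []
        for m in range(1, 2 * s + 1):
            rows.append(np.concatenate([a**m, m * a**m]))
            rhs.append(np.trace(np.linalg.matrix_power(SV, m))
                       + m * mu @ np.linalg.matrix_power(SV, m - 1) @ S @ mu)
        sol = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
        r = np.round(sol[:s]).astype(int)        # d.f.'s are integers
        lam = np.clip(sol[s:], 0, None)          # non-centralities are >= 0
        return a, r, lam

    rng = np.random.default_rng(4)
    A = rng.standard_normal((3, 3))
    S, V = (A + A.T) / 2, A @ A.T + 3 * np.eye(3)
    mu = rng.standard_normal(3)
    print(qf_as_chisquare_mixture(S, V, mu))  # coefficients, d.f.'s, non-centralities

The rounding and clipping at the end are illustrative safeguards against floating-point error; with exact arithmetic the linear system determines $(r_j, \lambda_j)$ directly.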
ACKNOWLEDGEMENTS
The author expresses his gratitude to Professor N. L. Johnson for his kind encouragement, helpful discussions, and for revising this article.
REFERENCES

[1] Ayres, F. (1962). Theory and Problems of Matrices, Schaum, New York.

[2] Graybill, F. A. (1961). An Introduction to Linear Statistical Models, Vol. I, McGraw-Hill, New York.

[3] Graybill, F. A. and Marsaglia, G. (1957). Idempotent matrices and quadratic forms in the general linear hypothesis, Ann. Math. Statist., 28, 678-686.

[4] Luther, N. Y. (1965). Decomposition of symmetric matrices and distributions of quadratic forms, Ann. Math. Statist., 36, 683-690.