SOME PROPERTIES AND GENERALIZATIONS
OF MULTIVARIATE MORGENSTERN DISTRIBUTIONS*
by
Stamatis Cambanis
Department of Statistics
University of North Carolina at Chapel Hill
Chapel Hill, North Carolina 27514
May 1976
Institute of Statistics Mimeo Series #1069
* This research was supported by the Air Force Office of Scientific Research
under Grant AFOSR-75-2796.
SOME PROPERTIES AND GENERALIZATIONS
OF MULTIVARIATE MORGENSTERN DISTRIBUTIONS*
Stamatis Cambanis
Department of Statistics
University of North Carolina
Chapel Hill, North Carolina 27514
ABSTRACT
The admissible values of the coefficient in a bivariate Morgenstern distribution are found.  For multivariate Morgenstern distributions, necessary and sufficient conditions on their coefficients are given, and their conditional distributions are found and shown to belong to a family of distributions further extending the multivariate Morgenstern family.
1. INTRODUCTION
Morgenstern [1] introduced a family of bivariate distributions H(x_1,x_2), with marginals F_1(x_1) and F_2(x_2), of the form

    H(x_1,x_2) = F_1(x_1) F_2(x_2) \{ 1 + \alpha [1 - F_1(x_1)][1 - F_2(x_2)] \},                (1)

where α is a real constant.  In Section 2 we find all values of α for which H as defined by (1) is a bivariate distribution, assuming of course that F_1 and F_2 are univariate distributions.
Johnson and Kotz [2] introduced a multivariate Morgenstern family of distributions H(x_1,x_2,...,x_n), with marginals F_1(x_1), F_2(x_2),...,F_n(x_n), by

    H(x_1,...,x_n) = F_1(x_1) F_2(x_2) \cdots F_n(x_n) \Big\{ 1
        + \sum_{j_1 < j_2} \alpha_{j_1 j_2} [1 - F_{j_1}(x_{j_1})][1 - F_{j_2}(x_{j_2})]
        + \sum_{j_1 < j_2 < j_3} \alpha_{j_1 j_2 j_3} [1 - F_{j_1}(x_{j_1})][1 - F_{j_2}(x_{j_2})][1 - F_{j_3}(x_{j_3})]
        + \cdots + \alpha_{12 \cdots n} [1 - F_1(x_1)][1 - F_2(x_2)] \cdots [1 - F_n(x_n)] \Big\},                (2)

where the coefficients α are real constants.  In Section 2 we also give necessary and sufficient conditions on the coefficients α so that (2) defines an n-dimensional distribution, assuming again that F_1,...,F_n are univariate distributions.
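As a quick numerical illustration of (2) (with (1) as the special case n = 2), the following Python sketch evaluates H at a point.  The function name, the dictionary encoding of the coefficients α by 0-based index tuples, and the exponential marginals are illustrative assumptions, not taken from the text.

```python
from math import exp

def morgenstern_cdf(x, marginals, alpha):
    """Evaluate H(x_1,...,x_n) as in (2).

    marginals : list of univariate CDFs F_1,...,F_n (0-based indexing here)
    alpha     : dict mapping index tuples (j_1 < ... < j_k), k >= 2,
                to the coefficients alpha_{j_1...j_k}; missing tuples count as 0
    """
    F = [Fi(xi) for Fi, xi in zip(marginals, x)]
    bracket = 1.0
    for idx, a in alpha.items():
        term = a
        for j in idx:
            term *= 1.0 - F[j]
        bracket += term
    prod = 1.0
    for v in F:
        prod *= v
    return prod * bracket

# Bivariate case (1): exponential marginals and a single coefficient alpha_12 = 0.4.
F1 = lambda t: 1.0 - exp(-t) if t > 0 else 0.0
F2 = lambda t: 1.0 - exp(-2.0 * t) if t > 0 else 0.0
print(morgenstern_cdf((1.0, 0.5), [F1, F2], {(0, 1): 0.4}))
```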
In Section 3 we introduce a family of distributions closely related to the multivariate Morgenstern family, by inserting within the brackets on the right hand side of (2) the first order terms Σ_{j=1}^n β_j[1 - F_j(x_j)], and we show in Section 4 that if X_1,...,X_n have an n-dimensional Morgenstern distribution, then the conditional distribution of X_1,...,X_k given X_{k+1} = x_{k+1},...,X_n = x_n (k = 1,...,n-1) belongs to this family.
2. THE VALUES OF THE COEFFICIENTS α
For a distribution function F_i(x) let E_i be the set of all values of F_i(x) with the exception of 0 and 1, i.e., E_i is the subset of (0,1) defined by

    E_i = \{ F_i(x), \; -\infty < x < +\infty \} - \{0,1\},

and let

    m_i = \inf E_i,    M_i = \sup E_i.

Then clearly 0 ≤ m_i ≤ M_i ≤ 1 and we have the following

THEOREM 1.  H(x_1,x_2) defined by (1) is a bivariate distribution if and only if α_min ≤ α ≤ α_max, where

    \alpha_{\min} = -[\max\{M_1 M_2,\, (1-m_1)(1-m_2)\}]^{-1},
    \qquad
    \alpha_{\max} = [\max\{M_1(1-m_2),\, M_2(1-m_1)\}]^{-1}.
Proof.  H(x_1,x_2) is a bivariate distribution if and only if for all x_1 < x_1' and x_2 < x_2' we have

    \Delta_{x_1}^{x_1'} \Delta_{x_2}^{x_2'} H
        = H(x_1',x_2') - H(x_1,x_2') - H(x_1',x_2) + H(x_1,x_2) \;\geq\; 0.

But by (1),

    \Delta_{x_1}^{x_1'} \Delta_{x_2}^{x_2'} H
        = \Delta_{x_1}^{x_1'} F_1 \, \Delta_{x_2}^{x_2'} F_2
          + \alpha \, \Delta_{x_1}^{x_1'}\{F_1(1-F_1)\} \, \Delta_{x_2}^{x_2'}\{F_2(1-F_2)\},

where \Delta_x^{x'} F = F(x') - F(x) and

    \Delta_x^{x'}\{F(1-F)\} = F(x')[1-F(x')] - F(x)[1-F(x)]
        = \Delta_x^{x'} F \cdot [1 - F(x) - F(x')],

and thus

    \Delta_{x_1}^{x_1'} \Delta_{x_2}^{x_2'} H
        = \Delta_{x_1}^{x_1'} F_1 \, \Delta_{x_2}^{x_2'} F_2 \,
          \{ 1 + \alpha A_1(x_1,x_1') A_2(x_2,x_2') \},

where A_i(x,x') = 1 - F_i(x) - F_i(x').  Hence H(x_1,x_2) is a bivariate distribution if and only if

    1 + \alpha A_1(x_1,x_1') A_2(x_2,x_2') \;\geq\; 0                (3)

whenever \Delta_{x_1}^{x_1'} F_1 > 0 and \Delta_{x_2}^{x_2'} F_2 > 0.
We now show that whenever \Delta_x^{x'} F > 0 we have

    -M \;\leq\; A(x,x') \;\leq\; 1 - m,                (4)

where the bounds are tight, but they are not necessarily achieved.  It suffices to prove the left hand side inequality, the proof of the right hand side inequality being similar.  It is clear that A(x,x') is decreasing (i.e. non-increasing) in x and x'.  Thus it suffices to show that as x → +∞, x' → +∞, A(x,x') ↓ -M.  If M = 1 then for each x there exists x' > x such that \Delta_x^{x'} F > 0, and thus as x → +∞, A(x,x') = 1 - F(x) - F(x') ↓ 1 - 1 - 1 = -1 = -M.  If M < 1 then for all

    x < \sup\{u : F(u) \leq M\} = F^{-1}(M) < x'

we have F(x) ≤ M < 1 = F(x') and \Delta_x^{x'} F > 0, and thus as x ↑ F^{-1}(M), A(x,x') ↓ 1 - M - 1 = -M.

Now (4) implies that whenever \Delta_{x_1}^{x_1'} F_1 > 0 and \Delta_{x_2}^{x_2'} F_2 > 0 we have

    -\max\{M_1(1-m_2),\, M_2(1-m_1)\} \;\leq\; A_1(x_1,x_1') A_2(x_2,x_2') \;\leq\; \max\{M_1 M_2,\, (1-m_1)(1-m_2)\},

where again the bounds are tight (but not necessarily achieved), and the result of the theorem follows immediately from (3).   □
Notice that α_min ≤ -1 and 1 ≤ α_max, and that in fact α_min = -1 if and only if M_1 = M_2 = 1 or m_1 = m_2 = 0, while α_max = 1 if and only if (M_1 = 1 and m_2 = 0) or (m_1 = 0 and M_2 = 1).

When the marginals are identical, F_1 = F_2 = F, the admissible values of α are

    - [\max(M, 1-m)]^{-2} \;\leq\; \alpha \;\leq\; [M(1-m)]^{-1}.

As an example, when F(x) = 0 for x < 0, = p for 0 ≤ x < 1, = 1 for 1 ≤ x, with 0 < p < 1, then M = m = p and the admissible values of α are

    - [\max(p, 1-p)]^{-2} \;\leq\; \alpha \;\leq\; [p(1-p)]^{-1}.

This example is considered by Johnson and Kotz [2].
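The interval in Theorem 1 is easy to evaluate numerically.  The short Python sketch below (function name and sample values are illustrative assumptions) reproduces the two-point example above and the absolutely continuous case treated next.

```python
def alpha_interval(m1, M1, m2, M2):
    """Admissible range of alpha in (1) according to Theorem 1 (sketch)."""
    lower = -1.0 / max(M1 * M2, (1.0 - m1) * (1.0 - m2))
    upper = 1.0 / max(M1 * (1.0 - m2), M2 * (1.0 - m1))
    return lower, upper

# Two-point example of Johnson and Kotz [2]: F1 = F2 = F with a single
# intermediate value p, so m = M = p.
p = 0.3
print(alpha_interval(p, p, p, p))        # (-1/max(p,1-p)**2, 1/(p*(1-p)))
# Marginals with densities: m = 0, M = 1 gives the interval (-1, 1).
print(alpha_interval(0.0, 1.0, 0.0, 1.0))
```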
If F has a density then m = 0 and M = 1.  Thus if both F_1 and F_2 have densities then the admissible values of α are -1 ≤ α ≤ 1, a result obtained by Johnson and Kotz [2].  It is clear however that we may have m = 0 and M = 1 even when F is not absolutely continuous.  For instance, if F is a discrete distribution with (positive) jumps at the integers (or at points x_n with inf x_n = -∞ and sup x_n = +∞), then m = 0 and M = 1.  Thus for such marginal distributions the admissible values are again -1 ≤ α ≤ 1.

As a final example consider a discrete distribution F with mass p_n at each x_n, n = 1,2,..., where x_n < x_{n+1}.  Then m = p_1 and M = 1.  Thus if both marginals F_1, F_2 are of this type the admissible values of α are

    -1 \;\leq\; \alpha \;\leq\; [1 - \min(m_1, m_2)]^{-1}.
The same method can be used to obtain necessary and sufficient conditions on the coefficients α so that H(x_1,...,x_n) defined by (2) is an n-dimensional distribution.  As in the proof of Theorem 1 we have

    \Delta_{x_1}^{x_1'} \cdots \Delta_{x_n}^{x_n'} H
        = \Delta_{x_1}^{x_1'} F_1 \cdots \Delta_{x_n}^{x_n'} F_n
          \Big\{ 1 + \sum_{j_1 < j_2} \alpha_{j_1 j_2} A_{j_1}(x_{j_1},x_{j_1}') A_{j_2}(x_{j_2},x_{j_2}')
          + \cdots + \alpha_{12 \cdots n} A_1(x_1,x_1') \cdots A_n(x_n,x_n') \Big\}.

Hence H is an n-dimensional distribution if and only if

    1 + \sum_{j_1 < j_2} \alpha_{j_1 j_2} A_{j_1}(x_{j_1},x_{j_1}') A_{j_2}(x_{j_2},x_{j_2}')
        + \cdots + \alpha_{12 \cdots n} A_1(x_1,x_1') \cdots A_n(x_n,x_n') \;\geq\; 0

whenever \Delta_{x_i}^{x_i'} F_i > 0 for i = 1,...,n.  Since by (4),

    -M_i \;\leq\; A_i(x_i,x_i') \;\leq\; 1 - m_i

whenever \Delta_{x_i}^{x_i'} F_i > 0, it follows that the necessary and sufficient conditions on the α's are the following 2^n inequalities

    1 + \sum_{j_1 < j_2} \varepsilon_{j_1} \varepsilon_{j_2} \alpha_{j_1 j_2}
        + \cdots + \varepsilon_1 \cdots \varepsilon_n \alpha_{12 \cdots n} \;\geq\; 0,                (5)

where for each i = 1,...,n, ε_i = -M_i or 1 - m_i.  These necessary and sufficient conditions were obtained by Johnson and Kotz [2] under the assumption that all marginal distributions F_i have densities, in which case of course ε_i = ±1 in (5).
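Since (5) is a finite set of 2^n linear inequalities in the coefficients, it can be checked by direct enumeration.  The following sketch does so; the function name, the 0-based index tuples, and the trivariate coefficient values are illustrative assumptions.

```python
from itertools import product

def morgenstern_admissible(alpha, m, M):
    """Check the 2**n inequalities (5) for the coefficients of (2).

    alpha : dict mapping 0-based index tuples (size >= 2) to coefficients;
            missing tuples are treated as zero.
    m, M  : lists of the quantities m_i and M_i of the marginals.
    """
    for eps in product(*[(-Mi, 1.0 - mi) for mi, Mi in zip(m, M)]):
        total = 1.0
        for idx, a in alpha.items():
            term = a
            for j in idx:
                term *= eps[j]
            total += term
        if total < 0.0:
            return False
    return True

# Trivariate example with absolutely continuous marginals (m_i = 0, M_i = 1).
alpha = {(0, 1): 0.3, (0, 2): -0.2, (1, 2): 0.2, (0, 1, 2): 0.1}
print(morgenstern_admissible(alpha, m=[0.0, 0.0, 0.0], M=[1.0, 1.0, 1.0]))  # True
```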
If H(x_1,...,x_n) has the following (simplest possible symmetric) form

    H(x_1,...,x_n) = F_1(x_1) \cdots F_n(x_n) \{ 1 + \alpha [1 - F_1(x_1)] \cdots [1 - F_n(x_n)] \},

i.e., if α_{12...n} = α is the only nonzero coefficient, then the admissible values of α are

    - [\max \gamma_1 \cdots \gamma_n]^{-1} \;\leq\; \alpha \;\leq\; [\max \delta_1 \cdots \delta_n]^{-1},

where the maxima are taken over all products with each γ_i = M_i or 1 - m_i and an even number of the γ_i's equal to M_i, and each δ_i = M_i or 1 - m_i and an odd number of the δ_i's equal to M_i.
3. A RELATED FAMILY OF DISTRIBUTIONS

In this section we introduce and study a family of distributions which constitutes a natural generalization of the multivariate Morgenstern family.  This family has some interest on its own, but its raison d'être here is the fact (shown in Section 4) that it contains all conditional distributions of the multivariate Morgenstern family.
8
Let FI (xl) ~ ... ~Fn (xn )
be univariate distributions·ailcV·let us
consider the family M
1 of multivariate distributions
H(xI?.?~)
defined by
(6)
where the coefficients
S are real constants.
Notice that (6) differs
from (2) only by the introduction of the first order tenns;
rj=ll3 j [l-Fj(Xj )]. i\s a result the marginal distributions HI(xIL .. "?I-~(~)
of H defined by (6) are now given by
H.(x.) = F.(x.){l + j3.[l-F.(x.)]}
1
1
1
1
(7)
111
and are not equal to the original set of univariate distributions
unless all
Clearly the family M
I
M of multivariate ~brgenstem distributions and in fact a distribution
H in M1
l3 i
Fi
contains the family
equal zero.
(given by (6)) belongs to
are the distributions
M if and only if its marginals
FI, ... ~Fn.
The method used in Section 2 can give us the necessary and sufficient conditions on the coefficients β for H defined by (6) to be an n-dimensional distribution.  Let us first concentrate on the marginal distributions H_i.  It is an immediate consequence of

    \Delta_x^{x'} H_i = \Delta_x^{x'} F_i \{ 1 + \beta_i A_i(x,x') \}

and (4) that each H_i is a univariate distribution if and only if

    - (1 - m_i)^{-1} \;\leq\; \beta_i \;\leq\; M_i^{-1}.                (8)
The necessary and sufficient conditions on the remaining coefficients β are of course the (corresponding to (5)) 2^n inequalities

    1 + \sum_{j=1}^{n} \varepsilon_j \beta_j
        + \sum_{j_1 < j_2} \varepsilon_{j_1} \varepsilon_{j_2} \beta_{j_1 j_2}
        + \cdots + \varepsilon_1 \cdots \varepsilon_n \beta_{12 \cdots n} \;\geq\; 0,

where for each i = 1,...,n, ε_i = -M_i or 1 - m_i.  When n = 2 the admissible values of β_{12} are b_{12} ≤ β_{12} ≤ B_{12}, where

    b_{12} = \max\Big\{ -\frac{1 + \beta_1(1-m_1) + \beta_2(1-m_2)}{(1-m_1)(1-m_2)},
                        \; -\frac{1 - \beta_1 M_1 - \beta_2 M_2}{M_1 M_2} \Big\},

    B_{12} = \min\Big\{ \frac{1 + \beta_1(1-m_1) - \beta_2 M_2}{(1-m_1) M_2},
                        \; \frac{1 - \beta_1 M_1 + \beta_2(1-m_2)}{M_1(1-m_2)} \Big\}.

In particular, when M_i = 1 and m_i = 0, i = 1,2, we have by (8) that -1 ≤ β_i ≤ 1, and the admissible values of β_{12} are

    |\beta_1 + \beta_2| - 1 \;\leq\; \beta_{12} \;\leq\; 1 - |\beta_1 - \beta_2|

(it is easily seen that this interval of admissible values is always nonempty).
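For n = 2 the bounds b_{12} and B_{12} can be computed directly.  The sketch below (function name and numerical values are illustrative assumptions) also recovers the interval |β_1 + β_2| - 1 ≤ β_{12} ≤ 1 - |β_1 - β_2| of the absolutely continuous case.

```python
def beta12_interval(b1, b2, m1, M1, m2, M2):
    """Admissible range [b_12, B_12] of beta_12 in (6) for n = 2 (sketch)."""
    b12 = max(-(1.0 + b1 * (1.0 - m1) + b2 * (1.0 - m2)) / ((1.0 - m1) * (1.0 - m2)),
              -(1.0 - b1 * M1 - b2 * M2) / (M1 * M2))
    B12 = min((1.0 + b1 * (1.0 - m1) - b2 * M2) / ((1.0 - m1) * M2),
              (1.0 - b1 * M1 + b2 * (1.0 - m2)) / (M1 * (1.0 - m2)))
    return b12, B12

# Marginals with densities (m_i = 0, M_i = 1): prints approximately (-0.8, 0.4),
# i.e. |b1 + b2| - 1 and 1 - |b1 - b2| for b1 = 0.4, b2 = -0.2.
print(beta12_interval(0.4, -0.2, 0.0, 1.0, 0.0, 1.0))
```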
Let X_1,...,X_n be random variables with joint distribution in M_1.  It is easily seen from (6) that the joint distribution of each subset of X_1,...,X_n is also in the family M_1.  Also, from (6) and (7), it is easily seen that the random variables X_1,...,X_n are independent if and only if for k = 2,...,n and j_1 < ... < j_k we have

    \beta_{j_1 \cdots j_k} = \beta_{j_1} \beta_{j_2} \cdots \beta_{j_k}.
The full meaning of the coefficients β is given by the following relationship: for k = 2,...,n and j_1 < ... < j_k,

    E\{(X_{j_1} - \mu'_{j_1}) \cdots (X_{j_k} - \mu'_{j_k})\}
        = (-1)^k c_{j_1} \cdots c_{j_k}\, \beta_{j_1 \cdots j_k},                (9)

where

    c_i = \int_{-\infty}^{+\infty} F_i(x_i)\{1 - F_i(x_i)\}\, dx_i
        = -\int_{-\infty}^{+\infty} x_i\{1 - 2F_i(x_i)\}\, dF_i(x_i),                (10)

    \mu'_i = \int_{-\infty}^{+\infty} x_i\, dF_i(x_i),                (11)

and where the following assumption is needed to guarantee that all integrals and expectations are finite:

    \int_{-\infty}^{+\infty} |x_i|\, dF_i(x_i) < \infty,   i = 1,...,n.

The functionals c of F were first introduced by Johnson and Kotz [3] in evaluating the regression E(X_1|X_2) when X_1, X_2 have a bivariate Morgenstern distribution.  Notice from (7) that

    dH_i(x_i) = dF_i(x_i)\{1 + \beta_i[1 - 2F_i(x_i)]\}

and thus by (10) and (11) we have

    \mu_i = E(X_i) = \int_{-\infty}^{+\infty} x_i\{1 + \beta_i[1 - 2F_i(x_i)]\}\, dF_i(x_i) = \mu'_i - c_i \beta_i.                (12)

Thus μ'_i is the mean of X_i under F_i, while μ_i is the (true) mean of X_i under H_i.  For the Morgenstern family we have from (12) that μ_i = μ'_i, and (9) reads

    E\{(X_{j_1} - \mu_{j_1}) \cdots (X_{j_k} - \mu_{j_k})\}
        = (-1)^k c_{j_1} \cdots c_{j_k}\, \alpha_{j_1 \cdots j_k}.
In order to prove (9) we simply notice that (6) implies

    dH(x_1,...,x_n) = dF_1(x_1) \cdots dF_n(x_n) \Big\{ 1 + \sum_{j=1}^{n} \beta_j [1 - 2F_j(x_j)]
        + \cdots + \beta_{12 \cdots n} [1 - 2F_1(x_1)] \cdots [1 - 2F_n(x_n)] \Big\}

and that by (10) and (11) we have

    \int_{-\infty}^{+\infty} (x_i - \mu'_i)\, dF_i(x_i) = 0,

    \int_{-\infty}^{+\infty} (x_i - \mu'_i)[1 - 2F_i(x_i)]\, dF_i(x_i)
        = -c_i - \mu'_i\, \{F_i(x_i)[1 - F_i(x_i)]\}\Big|_{-\infty}^{+\infty} = -c_i;

(9) now follows immediately.

For k = 2, using (12) and some simple algebra, (9) can be written in the following form:

    \mathrm{Cov}(X_{j_1}, X_{j_2}) = c_{j_1} c_{j_2} (\beta_{j_1 j_2} - \beta_{j_1} \beta_{j_2}).

From this it follows that if X_{j_1} and X_{j_2} are uncorrelated then they are independent.
As a final property of the family M_1 we mention that it is closed with respect to conditioning, in the sense that all conditional distributions of an element of M_1 belong to M_1.  This can be seen in a way identical to that by which Theorem 2 is established in Section 4.

We conclude this section with the following remark.  The family M_1 consists of multivariate distributions H(x_1,...,x_n) of the form (6) with marginals H_1,...,H_n given by (7).  At first glance it may seem that (7) is a restriction on the univariate marginals of distributions in the family M_1.  However this is not the case, as every univariate distribution H(x) can be written in the form

    H(x) = F(x)\{1 + \beta[1 - F(x)]\}                (13)

for some univariate distribution F(x) and some real number β.  This representation is of course far from unique.  In fact it can be easily seen that, given H and given any real number β, there is a univariate distribution F(x) such that (13) holds.  F depends of course on H and β, but is not necessarily uniquely determined by them.
For β = 0 we can clearly take F ≡ H.  When β ≠ 0, (13) can be written as

    \beta F^2(x) - (1 + \beta) F(x) + H(x) = 0,

and the nondecreasing, right continuous root

    r(x) = (2\beta)^{-1}\Big(1 + \beta - \sqrt{(1+\beta)^2 - 4\beta H(x)}\Big)

is the obvious candidate for F(x).  For -1 ≤ β ≤ 1 this root has the proper limits at ±∞, but when 1 < β and H(x) = 1 we have a choice between 1 and 1/β, and, similarly, when β < -1 and H(x) = 0 we have a choice between 0 and 1 + 1/β.  Hence F may not be uniquely determined by H and β, but a solution can always be found as follows:

    for -1 ≤ β ≤ 1 (β ≠ 0):   F(x) = r(x)  for all x;

    for 1 < β:    F(x) = r(x)  for 0 ≤ H(x) < 1,   F(x) = 1  for H(x) = 1;

    for β < -1:   F(x) = r(x)  for 0 < H(x) ≤ 1,   F(x) = 0  for H(x) = 0.
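The construction of F from H and β can be coded directly from the root r(x).  In the sketch below the function name and the uniform H are illustrative assumptions; the final loop verifies (13) by a round trip.

```python
def f_from_h(H, beta, x):
    """Return a value F(x) solving (13) for a univariate CDF H and a real beta."""
    h = H(x)
    if beta == 0.0:
        return h
    if beta > 1.0 and h == 1.0:
        return 1.0            # resolve the choice between 1 and 1/beta
    if beta < -1.0 and h == 0.0:
        return 0.0            # resolve the choice between 0 and 1 + 1/beta
    return (1.0 + beta - ((1.0 + beta) ** 2 - 4.0 * beta * h) ** 0.5) / (2.0 * beta)

H = lambda t: min(max(t, 0.0), 1.0)         # uniform CDF on [0, 1]
beta = 0.6
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    F = f_from_h(H, beta, x)
    print(x, F * (1.0 + beta * (1.0 - F)))  # F(x){1 + beta[1 - F(x)]} = H(x)
```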
4. THE CONDITIONAL DISTRIBUTIONS OF THE MULTIVARIATE MORGENSTERN FAMILY

In this section we compute the (regular) conditional distribution of X_1,...,X_k given X_{k+1} = x_{k+1},...,X_n = x_n (k = 1,...,n-1) when X_1,...,X_n have a multivariate Morgenstern distribution, and show that the former always belongs to the family M_1.
Let the random variables X_1,...,X_n have a distribution in M given by (2).  For j_1 < j_2 < ... < j_m we denote by H_{j_1 j_2 ... j_m} the distribution of X_{j_1}, X_{j_2},...,X_{j_m} (which also belongs to M).  For convenience we will use the same symbol for a distribution as for its corresponding (Lebesgue-Stieltjes) probability measure.  Also, if the finite (possibly signed) measure λ is absolutely continuous with respect to the finite measure ν, λ ≪ ν, we will denote by [dλ/dν] the corresponding Radon-Nikodym derivative.  Notice that F(1-F) ≪ F and [d\{F(1-F)\}/dF](x) = 1 - 2F(x).  It then follows from (2) that H_{1...n} ≪ F_1 × ... × F_n with

    \Big[\frac{dH_{1 \cdots n}}{d(F_1 \times \cdots \times F_n)}\Big](x_1,...,x_n)
        = 1 + \sum_{j_1 < j_2} \alpha_{j_1 j_2} [1 - 2F_{j_1}(x_{j_1})][1 - 2F_{j_2}(x_{j_2})]
          + \cdots + \alpha_{12 \cdots n} [1 - 2F_1(x_1)] \cdots [1 - 2F_n(x_n)].                (14)

For j_1 < ... < j_m we will use the following notation

    d_{j_1 \cdots j_m}(x_{j_1},...,x_{j_m})
        = \Big[\frac{dH_{j_1 \cdots j_m}}{d(F_{j_1} \times \cdots \times F_{j_m})}\Big](x_{j_1},...,x_{j_m}),

and of course we have an expression similar to (14) for d_{j_1 ... j_m}.
Let us recall that a regular conditional distribution H_{1...k/k+1...n}(x_1,...,x_k | x_{k+1},...,x_n) of X_1,...,X_k given X_{k+1} = x_{k+1},...,X_n = x_n is a function from R^n to [0,1] which is Borel measurable in x_{k+1},...,x_n for each fixed x_1,...,x_k; a multivariate distribution in x_1,...,x_k for each fixed x_{k+1},...,x_n; and which satisfies

    \int_{-\infty}^{x_{k+1}} \cdots \int_{-\infty}^{x_n}
        H_{1 \cdots k / k+1 \cdots n}(x_1,...,x_k \,|\, u_{k+1},...,u_n)\,
        dH_{k+1 \cdots n}(u_{k+1},...,u_n) = H_{1 \cdots n}(x_1,...,x_n).                (15)
THEOREM 2.  With the above assumptions and notation the function H_{1...k/k+1...n}(x_1,...,x_k | x_{k+1},...,x_n) defined by

    H_{1 \cdots k / k+1 \cdots n}(x_1,...,x_k \,|\, x_{k+1},...,x_n)
        = \frac{ \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_k}
                 d_{1 \cdots n}(u_1,...,u_k, x_{k+1},...,x_n)\, dF_1(u_1) \cdots dF_k(u_k) }
               { d_{k+1 \cdots n}(x_{k+1},...,x_n) }                (16)

when d_{k+1...n}(x_{k+1},...,x_n) > 0, and otherwise by, say,

    H_{1 \cdots k / k+1 \cdots n}(x_1,...,x_k \,|\, x_{k+1},...,x_n) = F_1(x_1) \cdots F_k(x_k),                (17)

is a regular conditional distribution of X_1,...,X_k given X_{k+1} = x_{k+1},...,X_n = x_n and belongs to the family M_1.

Proof.  It is quite clear from its definition that H_{1...k/k+1...n}(x_1,...,x_k | x_{k+1},...,x_n) is Borel measurable in x_{k+1},...,x_n for each fixed x_1,...,x_k, and a distribution in x_1,...,x_k for each fixed x_{k+1},...,x_n.  Thus we need only prove that (15) is satisfied.  Fix x_1,...,x_n.  For brevity we let A = (-∞,x_1] × ... × (-∞,x_k] and B = (-∞,x_{k+1}] × ... × (-∞,x_n], and we omit the variables throughout.  Notice that if E = {d_{k+1...n} > 0} ⊂ R^{n-k}, then on its complement E' we have d_{k+1...n} = 0 and thus H_{k+1...n}(E') = ∫_{E'} d_{k+1...n} dF_{k+1} ... dF_n = 0.  It follows that H_{1...n}(R^k × E') = H_{k+1...n}(E') = 0 and, since A × (B ∩ E') is a subset of R^k × E', H_{1...n}{A × (B ∩ E')} = 0.
Now (15) is obtained as follows:

    \int_B H_{1 \cdots k / k+1 \cdots n}\, dH_{k+1 \cdots n}
        = \int_{B \cap E} H_{1 \cdots k / k+1 \cdots n}\, dH_{k+1 \cdots n}
        = \int_{B \cap E} \frac{\int_A d_{1 \cdots n}\, dF_1 \cdots dF_k}{d_{k+1 \cdots n}}\, dH_{k+1 \cdots n}
        = \int_{B \cap E} \Big( \int_A d_{1 \cdots n}\, dF_1 \cdots dF_k \Big)\, dF_{k+1} \cdots dF_n
        = \int_{A \times (B \cap E)} d_{1 \cdots n}\, dF_1 \cdots dF_n
        = H_{1 \cdots n}\{A \times (B \cap E)\} = H_{1 \cdots n}(A \times B).
That H_{1...k/k+1...n} belongs to M_1, i.e., is of the form (6), is now clear from (14), (16) and (17) and from

    \int_{-\infty}^{x} [1 - 2F(u)]\, dF(u) = F(x)[1 - F(x)].

For instance, the term α_{1n}[1 - 2F_1(x_1)][1 - 2F_n(x_n)] in (14) will give rise, via the integration in (16), to the term α_{1n} F_1(x_1)[1 - F_1(x_1)][1 - 2F_n(x_n)], and thus α_{1n}[1 - 2F_n(x_n)] will be one term in the expression representing the value of the coefficient β_1 in (6).   □
Since values (x_{k+1},...,x_n) with d_{k+1...n}(x_{k+1},...,x_n) = 0 are taken by (X_{k+1},...,X_n) with probability zero, the expression of H_{1...k/k+1...n} given by (16) is the one of interest, and it is only for such (x_{k+1},...,x_n)'s that expressions will be written out in the sequel.

The coefficients β in the representation (6) of H_{1...k/k+1...n}(x_1,...,x_k | x_{k+1},...,x_n) depend of course on x_{k+1},...,x_n.  Using (14) and (15) we can express the coefficients β in terms of the (constant) coefficients α and of [1 - 2F_{k+1}(x_{k+1})],...,[1 - 2F_n(x_n)].  For instance, for k = 1, n = 2 we have

    \beta(x_2) = \alpha_{12}[1 - 2F_2(x_2)].
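By (6), this k = 1, n = 2 case reads H_{1/2}(x_1 | x_2) = F_1(x_1){1 + α_{12}[1 - 2F_2(x_2)][1 - F_1(x_1)]}.  For uniform marginals this can be checked against the partial derivative ∂H/∂x_2 of (1), a standard verification; the numerical values in the sketch below are illustrative assumptions.

```python
def fgm_cdf(x1, x2, alpha):
    """Bivariate Morgenstern CDF (1) with uniform marginals on [0, 1]."""
    return x1 * x2 * (1.0 + alpha * (1.0 - x1) * (1.0 - x2))

def cond_cdf(x1, x2, alpha):
    """H_{1/2}(x1 | x2) = F1(x1){1 + alpha[1 - 2 F2(x2)][1 - F1(x1)]}."""
    return x1 * (1.0 + alpha * (1.0 - 2.0 * x2) * (1.0 - x1))

# With uniform marginals the conditional CDF equals dH/dx2, which gives an
# independent numerical check of the coefficient beta(x2) = alpha_12[1 - 2F2(x2)].
alpha, x1, x2, h = 0.6, 0.3, 0.8, 1e-6
numeric = (fgm_cdf(x1, x2 + h, alpha) - fgm_cdf(x1, x2 - h, alpha)) / (2.0 * h)
print(numeric, cond_cdf(x1, x2, alpha))
```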
For k = n-1 the coefficients β in the representation of H_{1...n-1/n} are given by

    \beta_j(x_n) = \alpha_{jn}[1 - 2F_n(x_n)],   j = 1,...,n-1,
    \qquad
    \beta_{j_1 j_2}(x_n) = \alpha_{j_1 j_2} + \alpha_{j_1 j_2 n}[1 - 2F_n(x_n)],   j_1 < j_2 \leq n-1,                (18)

etc.  From the form of the coefficients β it is also clear that

    H_{1 \cdots n-1/n}(x_1,...,x_{n-1} \,|\, x_n) = H_{1 \cdots n-1}(x_1,...,x_{n-1})
        + F_1(x_1) \cdots F_{n-1}(x_{n-1}) [1 - 2F_n(x_n)]
          \Big\{ \sum_{j=1}^{n-1} \alpha_{jn}[1 - F_j(x_j)]
          + \sum_{j_1 < j_2} \alpha_{j_1 j_2 n}[1 - F_{j_1}(x_{j_1})][1 - F_{j_2}(x_{j_2})]
          + \cdots + \alpha_{1 \cdots n-1, n}[1 - F_1(x_1)] \cdots [1 - F_{n-1}(x_{n-1})] \Big\}.

Notice that when x_n is the median of F_n then H_{1...n-1/n} ≡ H_{1...n-1}.
Also, whenever f(X_1,...,X_{n-1}) has finite expectation we have

    E\{f(X_1,...,X_{n-1}) \,|\, X_n = x_n\} = E\{f(X_1,...,X_{n-1})\}
        + [1 - 2F_n(x_n)] \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty}
          f(x_1,...,x_{n-1})\, h(x_1,...,x_{n-1})\, dF_1(x_1) \cdots dF_{n-1}(x_{n-1}),                (19)

where

    h(x_1,...,x_{n-1}) = \sum_{j=1}^{n-1} \alpha_{jn}[1 - 2F_j(x_j)]
        + \cdots + \alpha_{1 \cdots n-1, n}[1 - 2F_1(x_1)] \cdots [1 - 2F_{n-1}(x_{n-1})].
In particular, (9), (18) and (19) imply that if the means μ_i = E(X_i) exist then

    E(X_i \,|\, X_n = x_n) = \mu_i - c_i\, \alpha_{in} [1 - 2F_n(x_n)],   i = 1,...,n-1.

The expression for E(X_1|X_n) has been calculated by Johnson and Kotz [3].
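The conditional mean just displayed can be checked numerically in the bivariate uniform case, where (14) gives the conditional density 1 + α(1 - 2x_1)(1 - 2x_2) of X_1 given X_2 = x_2, μ_1 = 1/2 and c_1 = 1/6.  The midpoint-rule comparison below is an illustrative sketch with assumed values.

```python
# Check E(X1 | X2 = x2) = mu_1 - c_1 * alpha_12 * [1 - 2 F2(x2)] for uniform
# marginals, where mu_1 = 1/2 and c_1 = 1/6 follow from (10) and (11).
alpha, x2, n = 0.8, 0.2, 100_000
total = 0.0
for i in range(n):
    x1 = (i + 0.5) / n                      # midpoint rule on [0, 1]
    total += x1 * (1.0 + alpha * (1.0 - 2.0 * x1) * (1.0 - 2.0 * x2))
numeric = total / n
closed_form = 0.5 - (1.0 / 6.0) * alpha * (1.0 - 2.0 * x2)
print(numeric, closed_form)
```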
Also, we can easily calculate the conditional covariance.  As a further example, the coefficients β(x_{n-1},x_n) in the representation of H_{1...n-2/n-1,n} (n ≥ 3) are given by

    \beta_{1 \cdots n-2}(x_{n-1},x_n)
        = \frac{ \alpha_{1 \cdots n-2}
                 + \alpha_{1 \cdots n-1} [1 - 2F_{n-1}(x_{n-1})]
                 + \alpha_{1 \cdots n-2, n} [1 - 2F_n(x_n)]
                 + \alpha_{1 \cdots n} [1 - 2F_{n-1}(x_{n-1})][1 - 2F_n(x_n)] }
               { 1 + \alpha_{n-1,n} [1 - 2F_{n-1}(x_{n-1})][1 - 2F_n(x_n)] },

etc.
Also, H_{1...n-2/n-1,n} may be expressed as follows:

    H_{1 \cdots n-2/n-1,n} = H_{1 \cdots n-2}
        + \frac{F_1 \cdots F_{n-2}}{d_{n-1,n}}
          \Big\{ (1 - 2F_{n-1}) K^{(n-1)}_{1 \cdots n-2}
          + (1 - 2F_n) K^{(n)}_{1 \cdots n-2}
          + (1 - 2F_{n-1})(1 - 2F_n) K^{(n-1,n)}_{1 \cdots n-2} \Big\},

where

    K^{(m)}_{1 \cdots n-2} = \sum_{j=1}^{n-2} \alpha_{jm}(1 - F_j)
        + \cdots + \alpha_{1 \cdots n-2, m}(1 - F_1) \cdots (1 - F_{n-2})

for m = n-1, n, and similarly for K^{(n-1,n)}_{1...n-2} (replace m in the expression above by n-1,n).  Notice that the denominator d_{n-1,n} becomes 1 when X_{n-1} and X_n are independent.
In particular,

    H_{1/n-1,n}(x_1 \,|\, x_{n-1},x_n) = F_1(x_1)\{1 + \beta_1(x_{n-1},x_n)[1 - F_1(x_1)]\},

where

    \beta_1(x_{n-1},x_n)
        = \frac{ \alpha_{1,n-1}[1 - 2F_{n-1}(x_{n-1})] + \alpha_{1n}[1 - 2F_n(x_n)]
                 + \alpha_{1,n-1,n}[1 - 2F_{n-1}(x_{n-1})][1 - 2F_n(x_n)] }
               { d_{n-1,n}(x_{n-1},x_n) },

and thus whenever f(X_1) has finite expectation we have

    E\{f(X_1) \,|\, X_{n-1} = x_{n-1}, X_n = x_n\}
        = E\{f(X_1)\} + \beta_1(x_{n-1},x_n) \int_{-\infty}^{+\infty} f(x_1)[1 - 2F_1(x_1)]\, dF_1(x_1).
As a final example we give the expression of H_{1/2...n}:

    H_{1/2 \cdots n}(x_1 \,|\, x_2,...,x_n) = F_1(x_1)\{1 + \beta(x_2,...,x_n)[1 - F_1(x_1)]\},

where

    \beta(x_2,...,x_n) = \frac{h_{2 \cdots n}(x_2,...,x_n)}{d_{2 \cdots n}(x_2,...,x_n)}

with

    h_{2 \cdots n} = \sum_{j=2}^{n} \alpha_{1j}(1 - 2F_j)
        + \sum_{2 \leq j_1 < j_2} \alpha_{1 j_1 j_2}(1 - 2F_{j_1})(1 - 2F_{j_2})
        + \cdots + \alpha_{12 \cdots n}(1 - 2F_2) \cdots (1 - 2F_n).

Again, d_{2...n} ≡ 1 when X_2,...,X_n are independent, in which case the expression of the conditional distribution, and hence that of the regression, is greatly simplified.
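For n = 3 the coefficient β(x_2,x_3) = h_{23}/d_{23} can be written out explicitly.  The sketch below evaluates it for uniform marginals (so F_j(x_j) = x_j); the function name and the numerical inputs are illustrative assumptions.

```python
def beta_1_given_23(a12, a13, a23, a123, F2x2, F3x3):
    """beta(x2, x3) = h_23 / d_23 for the conditional H_{1/23} when n = 3."""
    g2, g3 = 1.0 - 2.0 * F2x2, 1.0 - 2.0 * F3x3
    h23 = a12 * g2 + a13 * g3 + a123 * g2 * g3
    d23 = 1.0 + a23 * g2 * g3
    return h23 / d23

# Uniform marginals on [0, 1]: F_2(x_2) = x_2, F_3(x_3) = x_3.
print(beta_1_given_23(0.3, -0.2, 0.2, 0.1, 0.7, 0.4))
```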
ACKNOWLEDGMENT
The author wishes to thank Professor Norman L. Johnson for introducing
him to the problems considered in this paper.
REFERENCES

[1] MORGENSTERN, D. (1956). Einfache Beispiele zweidimensionaler Verteilungen. Mitt. für Math. Statistik 8, 234-235.

[2] JOHNSON, N.L. AND KOTZ, S. (1975). On some generalized Farlie-Gumbel-Morgenstern distributions. Comm. Statist. 4, 415-427.

[3] JOHNSON, N.L. AND KOTZ, S. (1976). Private communication.