SOME NONPARAMETRIC TESTS FOR THE MULTIVARIATE
SEVERAL SAMPLE LOCATION PROBLEM

by

V. P. Bhapkar
University of North Carolina
and
University of Poona

Institute of Statistics Mimeo Series No. 415

December 1964
(Dedicated to the memory of
Professor S. N. Roy)
This research was supported by the Mathematics Division of the Air Force Office of Scientific Research, Contract No. AF-AFOSR-760-65.

DEPARTMENT OF STATISTICS
UNIVERSITY OF NORTH CAROLINA
Chapel Hill, N. C.
1. Summary. This paper offers nonparametric tests of the null hypothesis $F_1 = F_2 = \cdots = F_c$ against alternatives of the form $F_i(\mathbf{x}) = F(\mathbf{x} - \boldsymbol{\delta}_i)$, $i = 1, 2, \ldots, c$, where the $\boldsymbol{\delta}_i$'s are not all equal and $F_i$ is the unknown continuous cumulative distribution function of the $p$-variate population from which the $i$-th random sample comes. The statistics offered are multivariate analogues of some univariate rank-order test-statistics, and all of them are shown to be asymptotically distributed as $\chi^2$ with $p(c-1)$ degrees of freedom when the null hypothesis holds.
2. Introduction. Quite a few nonparametric tests are available for the several-sample (say $c$-sample) location problem in the univariate case. Among them are the H-test of Kruskal-Wallis [12], the M-test of Mood [13], and the tests based on $c$-plets, viz. the V and W tests offered by the author ([1], [2]) and the L-test offered by Deshpande [5]. Some other tests and references to earlier ones may be found in Dwass [6], Kiefer [10] and Kruskal-Wallis [12].

Not much work, though, has been done in the corresponding multivariate case. Hodges [8], Vincze [14], and Chatterjee and Sen [4] have developed nonparametric tests for the two-sample problem in the bivariate case. The author [3] had offered a test for the several-sample bivariate problem using a "step-down procedure" in which the regression of one variable on the other is assumed to be linear. But there the roles of the two variables do not appear to be symmetrical. The present paper offers symmetric nonparametric tests for the general several-sample multivariate location problem. These will be seen to be multivariate analogues of the univariate V, L, W and H tests.

Let $\{\mathbf{X}_{ij},\ j = 1, 2, \ldots, n_i\}$ be $n_i$ independent observations from a $p$-variate population with continuous nonsingular c.d.f. $F_i$, $i = 1, 2, \ldots, c$. Let the components of the observation $\mathbf{X}_{ij}$ be denoted by $X_{ij}^{(\alpha)}$, $\alpha = 1, 2, \ldots, p$; $\mathbf{X}'_{ij}$ then denotes the row-vector $(X_{ij}^{(1)}, \ldots, X_{ij}^{(p)})$, while $\mathbf{X}_{ij}$ denotes the corresponding column-vector. The samples are assumed to be independent.
We consider nonparametric tests of the hypothesis

$H_0: \quad F_1 = F_2 = \cdots = F_c$

against alternatives of the form $F_i(\mathbf{x}) = F(\mathbf{x} - \boldsymbol{\delta}_i)$ with the vectors $\boldsymbol{\delta}_i$ not all equal.

Let $v_i^{(\alpha)}$ be the number of $c$-plets $(\mathbf{X}_{1t_1}, \mathbf{X}_{2t_2}, \ldots, \mathbf{X}_{ct_c})$ that can be formed by choosing one observation from each sample such that $X_{it_i}^{(\alpha)}$ is the smallest among $\{X_{kt_k}^{(\alpha)},\ k = 1, 2, \ldots, c\}$. Similarly, let $b_i^{(\alpha)}$ be the number of $c$-plets that can be formed such that $X_{it_i}^{(\alpha)}$ is the largest among $\{X_{kt_k}^{(\alpha)},\ k = 1, \ldots, c\}$, and let $n_{ir}^{(\alpha)}$ be the number of $c$-plets $(\mathbf{X}_{1t_1}, \ldots, \mathbf{X}_{ct_c})$ such that $X_{it_i}^{(\alpha)}$ has rank $r$ among $(X_{kt_k}^{(\alpha)},\ k = 1, 2, \ldots, c)$, $r = 1, 2, \ldots, c$. Then we note that $v_i^{(\alpha)} = n_{i1}^{(\alpha)}$ and $b_i^{(\alpha)} = n_{ic}^{(\alpha)}$. Further, let $l_i^{(\alpha)}$ denote $b_i^{(\alpha)} - v_i^{(\alpha)}$. Finally, $R_{ij}^{(\alpha)}$ denotes the rank of $X_{ij}^{(\alpha)}$ among $\{X_{kt}^{(\alpha)},\ t = 1, \ldots, n_k;\ k = 1, \ldots, c\}$, i.e., among all the $N = \sum_i n_i$ observations on the $\alpha$-th variate, and $\bar{R}_i^{(\alpha)} = \sum_{j=1}^{n_i} R_{ij}^{(\alpha)}/n_i$ denotes the mean rank for the $i$-th sample.
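For concreteness, a brute-force sketch of these counts (Python; samples[i] is assumed to be an $n_i \times p$ array, and enumerating all $n_1 n_2 \cdots n_c$ $c$-plets is practical only for small samples):

from itertools import product
import numpy as np

def cplet_counts(samples):
    """Brute-force v_i^(alpha), b_i^(alpha), l_i^(alpha) for each sample i
    and each variate alpha; samples[i] is an (n_i x p) array."""
    c = len(samples)
    p = samples[0].shape[1]
    v = np.zeros((c, p))          # counts where X_{i t_i}^(alpha) is smallest
    b = np.zeros((c, p))          # counts where X_{i t_i}^(alpha) is largest
    for rows in product(*(range(len(s)) for s in samples)):
        plet = np.array([samples[i][rows[i]] for i in range(c)])  # c x p
        v[np.argmin(plet, axis=0), np.arange(p)] += 1
        b[np.argmax(plet, axis=0), np.arange(p)] += 1
    return v, b, b - v            # l = b - v

The proportions $u_i^{(\alpha)} = v_i^{(\alpha)}/n_1 n_2 \cdots n_c$ used below are obtained by dividing these counts by the total number of $c$-plets.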
Then the test-statistics now being proposed are

(2.1) $V = \dfrac{N(c-1)}{c^{2}} \displaystyle\sum_{i=1}^{c} p_i\,(\mathbf{u}_i - \bar{\mathbf{u}})'\,\boldsymbol{\Sigma}_v^{-1}\,(\mathbf{u}_i - \bar{\mathbf{u}})$,

where $N = \sum_i n_i$, $p_i = n_i/N$, $u_i^{(\alpha)} = v_i^{(\alpha)}/n_1 n_2 \cdots n_c$, $\mathbf{u}'_i = (u_i^{(1)}, \ldots, u_i^{(p)})$, $\bar{\mathbf{u}} = \sum_i p_i \mathbf{u}_i$, and $\boldsymbol{\Sigma}_v$ is a matrix of estimated variances and covariances of the $u_i^{(\alpha)}$'s given in Section 4;

(2.2) $B$ and (2.3) $L$, the analogous quadratic forms with $u_i^{(\alpha)}$ replaced by $b_i^{(\alpha)}/n_1 n_2 \cdots n_c$ and by $l_i^{(\alpha)}/n_1 n_2 \cdots n_c$, respectively, and with the corresponding matrices $\boldsymbol{\Sigma}_b$ and $\boldsymbol{\Sigma}_l$;

(2.4) $W = \dfrac{N(c-1)}{c^{2}} \displaystyle\sum_{i=1}^{c} p_i\,(\mathbf{w}_i - \bar{\mathbf{w}})'\,\boldsymbol{\Sigma}_w^{-1}\,(\mathbf{w}_i - \bar{\mathbf{w}})$,

where $w_i^{(\alpha)} = \sum_{r=1}^{c} (r-1)\,n_{ir}^{(\alpha)}/n_1 n_2 \cdots n_c$ and $\bar{\mathbf{w}} = \sum_i p_i \mathbf{w}_i$; and

(2.5) $H = \dfrac{12}{N} \displaystyle\sum_{i=1}^{c} p_i\,\Big(\bar{\mathbf{R}}_i - \dfrac{N+1}{2}\,\mathbf{j}\Big)'\,\boldsymbol{\Sigma}_H^{-1}\,\Big(\bar{\mathbf{R}}_i - \dfrac{N+1}{2}\,\mathbf{j}\Big)$,

where $\bar{\mathbf{R}}'_i = (\bar{R}_i^{(1)}, \ldots, \bar{R}_i^{(p)})$, $\mathbf{j}' = (1)_{1 \times p}$, and $\boldsymbol{\Sigma}_H$ is the corresponding matrix computed from the ranks.
The V-test consists in rejecting $H_0$ at a significance level $\alpha$ if $V$ exceeds some predetermined number $V_\alpha$; the same holds for the other statistics. In the next section it is shown that, when $H_0$ is true, each of these statistics is asymptotically distributed as a $\chi^2$ variable with $p(c-1)$ degrees of freedom. Thus large-sample approximations for $V_\alpha$, $B_\alpha$, $L_\alpha$, $W_\alpha$ and $H_\alpha$ are provided by the upper $\alpha$-point of the $\chi^2$ distribution with $p(c-1)$ degrees of freedom. These tests are thus seen to be the multivariate analogues of the corresponding univariate tests.
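In practice the rule can be stated directly through the $\chi^2$ quantile; a minimal sketch, assuming SciPy is available and the statistic has already been computed:

from scipy.stats import chi2

def reject_H0(statistic, p, c, alpha=0.05):
    """Large-sample rule: reject H0 when the statistic exceeds the upper
    alpha-point of the chi-square distribution with p(c-1) d.f."""
    critical = chi2.ppf(1.0 - alpha, df=p * (c - 1))
    return statistic > critical, critical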
If $\bar{\mathbf{X}}_i = \sum_j \mathbf{X}_{ij}/n_i$, $\bar{\mathbf{X}} = \sum_{i,j} \mathbf{X}_{ij}/N$, and the populations are $p$-variate normal with covariance matrix $\boldsymbol{\Sigma}$, then it is well known that the statistic

(2.6) $\displaystyle\sum_i n_i\,(\bar{\mathbf{X}}_i - \bar{\mathbf{X}})'\,\boldsymbol{\Sigma}^{-1}\,(\bar{\mathbf{X}}_i - \bar{\mathbf{X}})$

is distributed as $\chi^2$ with $p(c-1)$ degrees of freedom under $H_0$ and, hence,

(2.7) $\displaystyle\sum_i n_i\,(\bar{\mathbf{X}}_i - \bar{\mathbf{X}})'\,\mathbf{S}^{-1}\,(\bar{\mathbf{X}}_i - \bar{\mathbf{X}})$

is asymptotically distributed as $\chi^2$ with $p(c-1)$ degrees of freedom under $H_0$, where $\mathbf{S}$ is the covariance matrix obtained from the samples. The statistics being offered now are thus seen to be rank-order analogues of (2.7).
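For comparison, a direct sketch of (2.7) (here $S$ is taken as the covariance matrix of the pooled observations; a pooled within-sample covariance could equally be substituted):

import numpy as np

def normal_theory_statistic(samples):
    """Compute sum_i n_i (xbar_i - xbar)' S^{-1} (xbar_i - xbar), as in (2.7)."""
    X = np.vstack(samples)                 # all N observations, N x p
    S = np.cov(X, rowvar=False)            # one choice of the sample covariance S
    S_inv = np.linalg.inv(S)
    grand_mean = X.mean(axis=0)
    stat = 0.0
    for sample in samples:
        d = sample.mean(axis=0) - grand_mean
        stat += len(sample) * d @ S_inv @ d
    return stat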
3. The asymptotic distributions. In general, suppose

(3.1) $U_{iN}^{(\alpha)} = \dfrac{1}{n_1 n_2 \cdots n_c} \displaystyle\sum_{t_1=1}^{n_1} \cdots \sum_{t_c=1}^{n_c} \phi_i^{(\alpha)}\big(\mathbf{X}_{1t_1}, \mathbf{X}_{2t_2}, \ldots, \mathbf{X}_{ct_c}\big), \qquad i = 1, 2, \ldots, c; \quad \alpha = 1, 2, \ldots, p,$
and suppose that

(3.2) $\phi_i^{(\alpha)}(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_c) = \phi_i\big(x_1^{(\alpha)}, x_2^{(\alpha)}, \ldots, x_c^{(\alpha)}\big)$

and further

(3.3) $\phi_i(x_1, x_2, \ldots, x_c) = \phi\big(x_i; \{x_j,\ j \neq i\}\big)$,

which means that $\phi$ is a function of $x_i$ and the set of remaining $x_j$'s and, moreover, is symmetric in the $x_j$'s, $j = 1, 2, \ldots, c$ except $i$. We shall also assume that $\phi$ is bounded. Then the following special cases (3.4), (3.5), (3.6) and (3.7) will lead to the $u$'s in the statistics (2.1), (2.2), (2.3) and (2.4), respectively:

(3.4) $\phi_i(x_1, \ldots, x_c) = \begin{cases} 1 & \text{if } x_i < x_j,\ j = 1, \ldots, c \text{ except } i, \\ 0 & \text{otherwise;} \end{cases}$

(3.5) $\phi_i(x_1, \ldots, x_c) = \begin{cases} 1 & \text{if } x_i > x_j,\ j = 1, \ldots, c \text{ except } i, \\ 0 & \text{otherwise;} \end{cases}$

(3.6) $\phi_i(x_1, \ldots, x_c) = \begin{cases} 1 & \text{if } x_i > x_j,\ j = 1, \ldots, c \text{ except } i, \\ -1 & \text{if } x_i < x_j,\ j = 1, \ldots, c \text{ except } i, \\ 0 & \text{otherwise;} \end{cases}$

(3.7) $\phi_i(x_1, \ldots, x_c) = r - 1$, if the rank of $x_i$ among $(x_1, \ldots, x_c)$ is $r$, so that

$\phi_i(x_1, \ldots, x_c) = \displaystyle\sum_{j=1}^{c} h(x_i - x_j)$,

where the function $h$ is defined by $h(y) = 1$ if $y > 0$, $= 0$ otherwise.
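A compact sketch of these four kernels (Python, written for a single variate; x is a sequence containing one value from each of the c samples, and the function names are illustrative):

def phi_V(i, x):   # (3.4): 1 if x[i] is the strict minimum of x
    return 1 if all(x[i] < x[j] for j in range(len(x)) if j != i) else 0

def phi_B(i, x):   # (3.5): 1 if x[i] is the strict maximum of x
    return 1 if all(x[i] > x[j] for j in range(len(x)) if j != i) else 0

def phi_L(i, x):   # (3.6): +1 if maximum, -1 if minimum, 0 otherwise
    return phi_B(i, x) - phi_V(i, x)

def phi_W(i, x):   # (3.7): rank of x[i] among x, minus one
    return sum(1 for xj in x if x[i] > xj)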
From (3.1) it is seen that $u_i^{(\alpha)}$ is a generalized U-statistic corresponding to $\phi_i^{(\alpha)}$. For studying the asymptotic distributions we shall write $U_{iN}^{(\alpha)}$ for $u_i^{(\alpha)}$ based on the samples of total size $N$; also, the random variables will be denoted by capital letters and the variables held fixed will be denoted by the corresponding small letters.

Let $\mathbf{U}'_N = (\mathbf{U}'_{1N}, \mathbf{U}'_{2N}, \ldots, \mathbf{U}'_{cN})$, $\mathbf{U}'_{iN} = (U_{iN}^{(1)}, \ldots, U_{iN}^{(p)})$, and $\boldsymbol{\eta}' = (\boldsymbol{\eta}'_1, \ldots, \boldsymbol{\eta}'_c)$, $\boldsymbol{\eta}'_i = (\eta_i^{(1)}, \ldots, \eta_i^{(p)})$, where $\eta_i^{(\alpha)} = E\,\phi_i^{(\alpha)}(\mathbf{X}_1, \ldots, \mathbf{X}_c)$ and the $\mathbf{X}_i$'s are independent random variables with c.d.f. $F_i$, $i = 1, 2, \ldots, c$, respectively. From the $c$-sample version (e.g. see [1]) of Hoeffding's theorem [9] concerning U-statistics it then follows that, in the limit as $n_i \to \infty$ in such a way that $n_i = n p_i$, the $p$'s being fixed positive numbers such that $\sum_i p_i = 1$, $\sqrt{N}\,[\mathbf{U}_N - \boldsymbol{\eta}]$ is normally distributed with zero mean vector and covariance matrix

(3.9) $\boldsymbol{\Sigma} = \begin{pmatrix} \boldsymbol{\zeta}_{11} & \boldsymbol{\zeta}_{12} & \cdots & \boldsymbol{\zeta}_{1c} \\ \boldsymbol{\zeta}_{21} & \boldsymbol{\zeta}_{22} & \cdots & \boldsymbol{\zeta}_{2c} \\ \vdots & & & \vdots \\ \boldsymbol{\zeta}_{c1} & \boldsymbol{\zeta}_{c2} & \cdots & \boldsymbol{\zeta}_{cc} \end{pmatrix}$,

where

$\boldsymbol{\zeta}_{ij} = \big[\sigma_{ij}^{(\alpha,\beta)}\big]$, $\qquad \sigma_{ij}^{(\alpha,\beta)} = \displaystyle\sum_{k=1}^{c} \frac{1}{p_k}\,\zeta_{ij}^{(\alpha,\beta)}(k)$, $\qquad \alpha, \beta = 1, 2, \ldots, p;\ i, j = 1, 2, \ldots, c,$

and $\zeta_{ij}^{(\alpha,\beta)}(k)$ is the covariance of $\phi_i^{(\alpha)}(\mathbf{X}_1, \ldots, \mathbf{X}_c)$ and $\phi_j^{(\beta)}(\mathbf{Y}_1, \ldots, \mathbf{Y}_c)$, where the $\mathbf{X}_i$'s and $\mathbf{Y}_i$'s are independent random variables with c.d.f. $F_i$ $(i = 1, 2, \ldots, c)$, except that $\mathbf{X}_k = \mathbf{Y}_k$.
Now, when $H_0$ holds, $F_1 = F_2 = \cdots = F_c = F$, say. Hereafter in this section we shall denote by $F^{(\alpha,\beta)}$ the marginal c.d.f. of the $\alpha$-th and $\beta$-th components of $\mathbf{X}$; here $Y$ and the $X$'s are independent random variables each with c.d.f. $F$.

Theorem 3.1. Suppose the functions $\phi_i^{(\alpha)}$, $i = 1, 2, \ldots, c$ and $\alpha = 1, 2, \ldots, p$, satisfy the following conditions:

(i) $\phi_i^{(\alpha)}(\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_c) = \phi_i\big(x_1^{(\alpha)}, x_2^{(\alpha)}, \ldots, x_c^{(\alpha)}\big)$;

(ii) $\phi_i(x_1, x_2, \ldots, x_c) = \phi\big(x_i; \{x_j,\ j \neq i\}\big)$;

(iii) there exists a constant $A$ such that $|\phi| < A$;

(iv) $\sum_{i=1}^{c} \phi_i(x_1, x_2, \ldots, x_c) = cd$, where $d$ is some constant; this is the only linear constraint on the $\phi_i$'s;

(v) the statistic $\phi(X_1; \{X_j,\ j = 2, 3, \ldots, c\})$ is distribution-free in the class of continuous univariate distributions, i.e., the probability distribution of $\phi(X_1; \{X_j,\ j = 2, \ldots, c\})$, where the $X$'s are independent random variables each with univariate continuous c.d.f. $G$, is independent of $G$;

(vi) the common distribution of the independent random variables $\mathbf{X}$'s is nonsingular.

Then, as $n_i \to \infty$ in such a way that $n_i = n p_i$, $i = 1, \ldots, c$, the $p$'s being fixed positive numbers such that $\sum_i p_i = 1$, and under $H_0$, the asymptotic distribution of the random variable

(3.12) $T_N = \dfrac{N(c-1)}{c^{2}\mu} \displaystyle\sum_{i=1}^{c} p_i\,(\mathbf{U}_{iN} - \bar{\mathbf{U}}_N)'\,\boldsymbol{\Gamma}^{-1}\,(\mathbf{U}_{iN} - \bar{\mathbf{U}}_N)$,

where $\bar{\mathbf{U}}_N = \sum_i p_i \mathbf{U}_{iN}$ and $\boldsymbol{\Gamma} = (\rho_{\alpha\beta})$, $\alpha, \beta = 1, 2, \ldots, p$, with

(3.13) $\mu = \operatorname{Var}\big[\psi^{(\alpha)}(X^{(\alpha)})\big]$, $\qquad \mu\,\rho_{\alpha\beta} = \operatorname{Cov}\big[\psi^{(\alpha)}(X^{(\alpha)}),\ \psi^{(\beta)}(X^{(\beta)})\big]$, $\qquad \psi^{(\alpha)}(x) = E\,\phi\big(x; \{X_j^{(\alpha)},\ j = 2, \ldots, c\}\big) - d$,

$(X^{(\alpha)}, X^{(\beta)})$ having c.d.f. $F^{(\alpha,\beta)}$, is $\chi^2$ with degrees of freedom $p(c-1)$.
Proof. The theorem will follow by the application of the multisample extension of Hoeffding's theorem and noting that under $H_0$, $F_1 = F_2 = \cdots = F_c = F$, say. We first see that, in view of conditions (i), (ii) and (v), $E\big[\phi_i^{(\alpha)}(\mathbf{X}_1, \ldots, \mathbf{X}_c)\big]$ is a constant, $d$, independent of any univariate continuous c.d.f. $G$. Now, for $i \neq k$, let

(3.10) $\psi^{(\alpha)}\big(x^{(\alpha)}, y^{(\alpha)}\big) = E\,\phi\big(x^{(\alpha)}; \big\{y^{(\alpha)}, X_j^{(\alpha)},\ j = 1, \ldots, c \text{ except } i \text{ and } k\big\}\big) - d$,

the conditional expectation of $\phi_i^{(\alpha)}$ given its own argument and the argument in the $k$-th position, and let (3.11) be the analogous conditional expectation when only an argument other than the $i$-th is held fixed. Then from (3.11), under $H_0$, the covariances $\zeta_{ij}^{(\alpha,\beta)}(k)$ of (3.9) can be evaluated in terms of these functions,
where $\rho_{\alpha\beta}$ is the correlation coefficient of $\psi^{(\alpha)}(X)$ and $\psi^{(\beta)}(Y)$, $(X, Y)$ having c.d.f. $F^{(\alpha,\beta)}$; in general it will depend on $F$. We also note that $\rho_{\alpha\beta}$ can be expressed in the form (3.13). Also, analogous functions are defined for $i \neq k$ with the fixed argument entering the conditioned set. But from (i) and (iv) we have

$\displaystyle\sum_{i=1}^{c} \phi_i^{(\alpha)}(x_1, \ldots, x_c) = cd,$

and hence, taking conditional expectations given two of the arguments, we obtain

$\psi^{(\alpha)}\big(x_1^{(\alpha)}, x_2^{(\alpha)}\big) + \psi^{(\alpha)}\big(x_2^{(\alpha)}, x_1^{(\alpha)}\big) + \displaystyle\sum_{i=3}^{c} E\,\phi_i\big(x^{(\alpha)}; \big\{x_1^{(\alpha)}, x_2^{(\alpha)}, X_j^{(\alpha)},\ j = 3, \ldots, c \text{ except } i\big\}\big) = (c-2)d.$

If we let

(3.17) $\psi^{(\alpha)}\big(x^{(\alpha)}\big) = E\,\phi\big(x^{(\alpha)}; \big\{X_j^{(\alpha)},\ j = 2, \ldots, c\big\}\big) - d$,

(3.18) then gives the corresponding identity for the two-argument functions $\psi^{(\alpha)}(x_1^{(\alpha)}, x_2^{(\alpha)})$. Replacing $x_2^{(\alpha)}$ by $X_2^{(\alpha)}$ in (3.19) and taking expectations we obtain the relation (3.20) connecting the one- and two-argument functions. From (3.17), (3.20) and (3.16) it then follows that, under $H_0$, all the covariances $\zeta_{ij}^{(\alpha,\beta)}(k)$ reduce to multiples of $w\rho_{\alpha\beta}$, where $w$ denotes the common variance of the $\psi^{(\alpha)}(X^{(\alpha)})$'s (the constant $\mu$ of (3.13)), the multiplying constants depending only on $c$ and on whether $k$ coincides with $i$, with $j$, or with neither.
It then follows that, under $H_0$,

(3.24) $\sigma_{ij}^{(\alpha,\beta)} = \sigma_{ij}\,\rho_{\alpha\beta}$, $\qquad i, j = 1, 2, \ldots, c;\ \alpha, \beta = 1, 2, \ldots, p$,

where $q = \sum_k (1/p_k)$ and the $\sigma_{ij}$ depend only on $w$, $q$, $c$ and the $p_i$'s. Let $\boldsymbol{\Sigma}^* = (\sigma_{ij})$, $i, j = 1, 2, \ldots, c$, where the $\sigma$'s are given by (3.24). Then

(3.25) $\dfrac{1}{w}\,\boldsymbol{\Sigma}^* = c^{2}\,\mathbf{D}^{-1} - c\,\mathbf{g}\,\mathbf{j}' - c\,\mathbf{j}\,\mathbf{g}' + q\,\mathbf{J}$,

where $\mathbf{D} = \operatorname{diagonal}(p_i,\ i = 1, 2, \ldots, c)$, $\mathbf{g} = (1/p_1, \ldots, 1/p_c)'$, $\mathbf{j}$ is the column vector of units and $\mathbf{J} = \mathbf{j}\,\mathbf{j}' = (1)_{c \times c}$, and

(3.26) $\boldsymbol{\Sigma} = \boldsymbol{\Gamma} \otimes \boldsymbol{\Sigma}^*$,

the Kronecker (or the direct) product of the matrices $\boldsymbol{\Gamma} = (\rho_{\alpha\beta})$ and $\boldsymbol{\Sigma}^*$. Thus by the extension of Hoeffding's theorem it follows that, under the conditions of Theorem 3.1, $\sqrt{N}\,(\mathbf{U}_N - d\,\mathbf{j})$ has a limiting normal distribution with zero mean vector and covariance matrix given by (3.26); here again $\mathbf{j}$ stands for a column vector of the appropriate order with unit elements.
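A small numerical sketch of the structure in (3.26), assuming the matrices $\Gamma$ and $\Sigma^*$ have already been formed (which Kronecker factor comes first depends on how $\mathbf{U}_N$ is stacked):

import numpy as np

def limiting_covariance(Gamma, Sigma_star):
    """Covariance matrix of the form (3.26): Kronecker (direct) product of the
    p x p correlation matrix Gamma = (rho_ab) and the c x c matrix Sigma*.
    Under H0 this limit is singular, with rank p(c-1)."""
    Sigma = np.kron(Gamma, Sigma_star)
    return Sigma, np.linalg.matrix_rank(Sigma)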
Now from (i), (ii) and (iv) it follows that $\sum_i \mathbf{U}_{iN} = cd\,\mathbf{j}$ and, moreover, this is the only linear restriction on the $\mathbf{U}_{iN}$'s. Hence the distribution of $\mathbf{U}_N$ is singular of rank $p(c-1)$, provided the distribution $F$ of $\mathbf{X}$ is nonsingular in the sense that the unit probability mass is not contained in any lower-dimensional space. The limiting normal distribution of $\sqrt{N}(\mathbf{U}_N - d\,\mathbf{j})$ is then also singular of rank $p(c-1)$, which is thus also the rank of $\boldsymbol{\Sigma}$ given by (3.26). In fact, it can be verified that $\boldsymbol{\Sigma}^*\mathbf{j} = \mathbf{0}$ and that $\boldsymbol{\Sigma}^*$ is of rank $(c-1)$. It is then seen that, as expected, $\boldsymbol{\Gamma}$ is nonsingular; it could be singular if and only if there were some linear constraint on the random variables $\psi(X^{(\alpha)})$, $\alpha = 1, 2, \ldots, p$, defined in (3.17), which is impossible since the distribution of $\mathbf{X}$ is assumed to be nonsingular.

If we consider $\mathbf{U}_{0N} = (\mathbf{U}'_{1N}, \ldots, \mathbf{U}'_{c-1,N})'$, then $\sqrt{N}(\mathbf{U}_{0N} - d\,\mathbf{j})$ is asymptotically normal with zero mean vector and covariance matrix

$\boldsymbol{\Sigma}_0 = \boldsymbol{\Gamma} \otimes \boldsymbol{\Sigma}_0^*$, where $\boldsymbol{\Sigma}_0^* = (\sigma_{ij})$, $i, j = 1, 2, \ldots, c-1$.

Also $\boldsymbol{\Sigma}_0^{-1} = \boldsymbol{\Gamma}^{-1} \otimes \boldsymbol{\Sigma}_0^{*-1}$. Let $\boldsymbol{\Sigma}_0^{*-1} = (\sigma^{ij})$. Then the quadratic form $N(\mathbf{U}_{0N} - d\,\mathbf{j})'\,\boldsymbol{\Sigma}_0^{-1}\,(\mathbf{U}_{0N} - d\,\mathbf{j})$ has a limiting $\chi^2$ distribution with $p(c-1)$ degrees of freedom and, after simplification as in [1], can be shown to reduce to an expression which is equal to the expression (3.12). The theorem then follows.
Remark. As far as the above theorem is concerned, the condition (iii) can be replaced by the weaker condition that $E\big[\phi(X_1, X_2, \ldots, X_c)\big]^2$ be finite. The functions $\phi$ defined by (3.4) to (3.7) satisfy the stronger condition (iii).
We note that in $T_N$, given by (3.12), even though $\rho_{\alpha\beta}$ is independent of $F$, it does depend, in general, on $F$ through (3.13), and thus $T_N$ is not a distribution-free statistic. But we can construct unbiased and consistent estimators $\hat{\rho}_{\alpha\beta}$ of $\rho_{\alpha\beta}$, i.e., an unbiased and consistent estimator $\hat{\boldsymbol{\Gamma}} = (\hat{\rho}_{\alpha\beta})$ of $\boldsymbol{\Gamma}$, as follows. Let

$\mu\,\hat{\rho}_{\alpha\beta}(i) = \big[n_i(n_i-1)\cdots(n_i-2c+2)\big]^{-1} \displaystyle\sum_{P} \phi\big(X_{ij_1}^{(\alpha)}; \{X_{ij_k}^{(\alpha)},\ k = 2, 3, \ldots, c\}\big)\,\phi\big(X_{ij_1}^{(\beta)}; \{X_{ij_k}^{(\beta)},\ k = c+1, \ldots, 2c-1\}\big) - d^{2}$,

where $\sum_P$ denotes the summation over all permutations of $(2c-1)$ integers $(j_1, j_2, \ldots, j_{2c-1})$ that can be chosen out of $(1, 2, \ldots, n_i)$. It can be seen that $\hat{\rho}_{\alpha\beta}(i)$ is a U-statistic and hence (see e.g. [7], p. 142) a minimum variance unbiased estimator, that can be formed from the $i$-th sample, of $\rho_{\alpha\beta}$. Moreover, it is well known that $\hat{\rho}_{\alpha\beta}(i) \to \rho_{\alpha\beta}$ in probability as $n_i \to \infty$. So we can have an unbiased and consistent estimator of $\rho_{\alpha\beta}$ given by $\sum_i \hat{\rho}_{\alpha\beta}(i)/c$. We shall prefer another consistent and unbiased estimator

$\hat{\rho}_{\alpha\beta} = \dfrac{\displaystyle\sum_{i=1}^{c} n_i(n_i-1)\cdots(n_i-2c+2)\,\hat{\rho}_{\alpha\beta}(i)}{\displaystyle\sum_{i=1}^{c} n_i(n_i-1)\cdots(n_i-2c+2)}$.
If we let

(3.28) $T_N^* = \dfrac{N(c-1)}{c^{2}\mu} \displaystyle\sum_{i=1}^{c} p_i\,(\mathbf{U}_{iN} - \bar{\mathbf{U}}_N)'\,\hat{\boldsymbol{\Gamma}}^{-1}\,(\mathbf{U}_{iN} - \bar{\mathbf{U}}_N)$,

it can easily be shown that $T_N - T_N^* \to 0$ in probability under the conditions of Theorem 3.1,
and hence $T_N^*$ also has a limiting $\chi^2$ distribution with $p(c-1)$ degrees of freedom. Thus we have

Theorem 3.2. Under the conditions of Theorem 3.1, the statistic $T_N^*$ given by (3.28) has a limiting $\chi^2$ distribution, under $H_0$, with degrees of freedom $p(c-1)$.
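A sketch of how a statistic of the form (3.28) might be assembled once the $\mathbf{U}_{iN}$ vectors, an estimate $\hat{\Gamma} = (\hat{\rho}_{\alpha\beta})$ and the constant $\mu$ are in hand (function and argument names are illustrative):

import numpy as np

def T_star(U, n_sizes, Gamma_hat, mu):
    """Quadratic form of the type (3.28).
    U        : (c x p) array, row i = U_iN
    n_sizes  : sample sizes n_1,...,n_c
    Gamma_hat: (p x p) estimated correlation matrix (rho_hat_ab)
    mu       : the known constant Var psi for the chosen kernel."""
    n = np.asarray(n_sizes, dtype=float)
    N, c = n.sum(), len(n)
    p_i = n / N
    U_bar = p_i @ U                       # weighted mean vector, length p
    G_inv = np.linalg.inv(Gamma_hat)
    quad = sum(p_i[i] * (U[i] - U_bar) @ G_inv @ (U[i] - U_bar) for i in range(c))
    return N * (c - 1) / (c**2 * mu) * quad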
4. The particular cases. (i) For $\phi$ given by (3.4), $d = c^{-1}$,

$\psi^{(\alpha)}(x) = P\big[x < X_j^{(\alpha)},\ j = 2, 3, \ldots, c\big] - c^{-1} = \big[1 - F^{(\alpha)}(x)\big]^{c-1} - c^{-1}$,

$\mu = \displaystyle\int_{-\infty}^{\infty} \big[1 - F^{(\alpha)}(x)\big]^{2c-2}\, dF^{(\alpha)}(x) - c^{-2} = (c-1)^2 / c^{2}(2c-1)$,

$\mu\rho_{\alpha\beta} = \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \big[1 - F^{(\alpha)}(x)\big]^{c-1}\big[1 - F^{(\beta)}(y)\big]^{c-1}\, dF^{(\alpha,\beta)}(x, y) - c^{-2}$,

so that, if $N^* = \sum_{i=1}^{c} n_i(n_i-1)\cdots(n_i-2c+2)$, we have

(4.1) $c^{-2} + \mu\hat{\rho}_{\alpha\beta} = \dfrac{1}{N^*}\,\big[\text{the number of } (2c-1)\text{-tuples } \mathbf{X}_{ij_1}, \ldots, \mathbf{X}_{ij_{2c-1}} \text{ such that } X_{ij_1}^{(\alpha)} < X_{ij_k}^{(\alpha)},\ k = 2, \ldots, c, \text{ and } X_{ij_1}^{(\beta)} < X_{ij_k}^{(\beta)},\ k = c+1, \ldots, 2c-1\big]$,

and we get the V-statistic (2.1) by suppressing $N$ in the subscript of $U$.
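For illustration, a brute-force count of the $(2c-1)$-tuples appearing in (4.1) for a single sample (names are illustrative; enumerating ordered tuples of distinct indices is feasible only for very small $n_i$):

from itertools import permutations

def count_tuples_V(sample, c, alpha, beta):
    """Count ordered (2c-1)-tuples of distinct rows of one sample such that
    the first row is smallest on variate alpha among positions 1..c and
    smallest on variate beta among positions 1, c+1..2c-1 (the count in (4.1))."""
    n = len(sample)
    total = 0
    for idx in permutations(range(n), 2 * c - 1):
        x1 = sample[idx[0]]
        ok_alpha = all(x1[alpha] < sample[k][alpha] for k in idx[1:c])
        ok_beta = all(x1[beta] < sample[k][beta] for k in idx[c:])
        total += ok_alpha and ok_beta
    return total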
(ii) For $\phi$ given by (3.5), $d = c^{-1}$,

$\mu = (c-1)^2 / c^{2}(2c-1)$,

$\mu\rho_{\alpha\beta} = \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \big[F^{(\alpha)}(x)\big]^{c-1}\big[F^{(\beta)}(y)\big]^{c-1}\, dF^{(\alpha,\beta)}(x, y) - c^{-2}$,

and we have

(4.2) $c^{-2} + \mu\hat{\rho}_{\alpha\beta} = \dfrac{1}{N^*}\,\big[\text{the number of } (2c-1)\text{-tuples } \mathbf{X}_{ij_1}, \ldots, \mathbf{X}_{ij_{2c-1}} \text{ such that } X_{ij_1}^{(\alpha)} > X_{ij_k}^{(\alpha)},\ k = 2, \ldots, c, \text{ and } X_{ij_1}^{(\beta)} > X_{ij_k}^{(\beta)},\ k = c+1, \ldots, 2c-1\big]$,

and suppressing $N$ we get the B-statistic (2.2).
(iii) Similarly, for $\phi$ given by (3.6), $d = 0$,

$\psi^{(\alpha)}(x) = P\big[x > X_j^{(\alpha)},\ j = 2, \ldots, c\big] - P\big[x < X_j^{(\alpha)},\ j = 2, \ldots, c\big] = \big[F^{(\alpha)}(x)\big]^{c-1} - \big[1 - F^{(\alpha)}(x)\big]^{c-1}$,

$\mu = \dfrac{2}{2c-1} - \dfrac{2\,(c-1)!\,(c-1)!}{(2c-1)!} = \dfrac{2}{2c-1}\left[1 - \binom{2c-2}{c-1}^{-1}\right]$,

$\mu\rho_{\alpha\beta} = \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \Big\{\big[F^{(\alpha)}(x)\big]^{c-1} - \big[1 - F^{(\alpha)}(x)\big]^{c-1}\Big\}\Big\{\big[F^{(\beta)}(y)\big]^{c-1} - \big[1 - F^{(\beta)}(y)\big]^{c-1}\Big\}\, dF^{(\alpha,\beta)}(x, y)$,

and we have

(4.3) $\mu\hat{\rho}_{\alpha\beta} = \dfrac{1}{N^*}\,\big[N_1^*(\alpha,\beta) + N_2^*(\alpha,\beta) - N_3^*(\alpha,\beta) - N_4^*(\alpha,\beta)\big]$,

where $N_1^*(\alpha,\beta)$ and $N_2^*(\alpha,\beta)$ are the numbers of $(2c-1)$-tuples mentioned in (4.1) and (4.2), respectively; $N_3^*(\alpha,\beta)$ is the number of $(2c-1)$-tuples with $X_{ij_1}^{(\alpha)} > X_{ij_k}^{(\alpha)}$, $k = 2, 3, \ldots, c$, and $X_{ij_1}^{(\beta)} < X_{ij_k}^{(\beta)}$, $k = c+1, \ldots, 2c-1$; and $N_4^*(\alpha,\beta)$ is the number like $N_3^*(\alpha,\beta)$ obtained from inequalities with signs reversed, and we get the L-statistic (2.3).
(iv) Finally, for $\phi$ given by (3.7), $d = (c-1)/2$,

$\psi^{(\alpha)}(x) = \displaystyle\sum_{j=2}^{c} P\big[X_j^{(\alpha)} < x\big] - \dfrac{c-1}{2} = (c-1)\,F^{(\alpha)}(x) - \dfrac{c-1}{2}$,

$\mu = (c-1)^2 \displaystyle\int_{-\infty}^{\infty} \big[F^{(\alpha)}(x)\big]^{2}\, dF^{(\alpha)}(x) - (c-1)^2/4 = (c-1)^2/12$,

$\rho_{\alpha\beta} = 3 \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \big[2F^{(\alpha)}(x) - 1\big]\big[2F^{(\beta)}(y) - 1\big]\, dF^{(\alpha,\beta)}(x, y)$,

which is the "grade correlation coefficient" ([7], p. 259) between $X^{(\alpha)}$ and $X^{(\beta)}$; and we have in (4.4) an estimator of $\rho_{\alpha\beta}$ expressed through the counts $\nu_i^{(\alpha,\beta)}$, where $\nu_i^{(\alpha,\beta)}$ is the number of triples of observations from the $i$-th sample such that $X_{ij_1}^{(\alpha)} > X_{ij_2}^{(\alpha)}$ and $X_{ij_1}^{(\beta)} > X_{ij_2}^{(\beta)}$, and we get the W-statistic (2.4).

In this case $U_i^{(\alpha)}$ reduces to $\sum_{j \neq i} V_{ij}^{(\alpha)}/n_i n_j$, where $V_{ij}^{(\alpha)}$ is the number of pairs $(\mathbf{X}_{it}, \mathbf{X}_{jt'})$ such that $X_{it}^{(\alpha)} > X_{jt'}^{(\alpha)}$. If $n_1 = n_2 = \cdots = n_c = n$, say, then

(4.5) $U_i^{(\alpha)} - \dfrac{c-1}{2} = \dfrac{1}{n^{2}}\Big[n\bar{R}_i^{(\alpha)} - \dfrac{n(n+1)}{2}\Big] - \dfrac{c-1}{2} = \dfrac{c}{N}\Big[\bar{R}_i^{(\alpha)} - \dfrac{N+1}{2}\Big]$.
It has been observed [2] that, in the univariate case, even with unequal $n_i$'s, the H-statistic [11] is the same as the one obtained from the W-statistic by making the transformation (4.5) and, hence, in the multivariate case it is conjectured that the appropriate H-statistic will be given from the W-statistic by using (4.5), i.e., by (2.5). A rigorous proof for this conjecture can be given by considering the asymptotic normal distribution of $N^{-1/2}\big[\bar{\mathbf{R}}_N - \mathbf{j}\,(N+1)/2\big]$, where $\bar{\mathbf{R}}'_N = (\bar{\mathbf{R}}'_{1N}, \ldots, \bar{\mathbf{R}}'_{cN})$, under $H_0$; the asymptotic normal distribution under $H_0$ can be obtained either by an appeal to the multivariate extension of the Wald-Wolfowitz theorem (see e.g. [7], p. 239), or by noting that

$n_i\,\bar{R}_i^{(\alpha)} = \dfrac{n_i(n_i+1)}{2} + \displaystyle\sum_{j \neq i} n_i n_j\,Y_{ij}^{(\alpha)}$, $\qquad Y_{ij}^{(\alpha)} = \dfrac{1}{n_i n_j}\,V_{ij}^{(\alpha)}$,

where $Y_{ij}^{(\alpha)}$ is a two-sample U-statistic corresponding to the function $\phi(x, y) = h(x - y)$, obtained for the sample pair $(i, j)$, and then making an appeal to the joint limiting normal distribution of the $\sqrt{N}\big[Y_{ij}^{(\alpha)} - 1/2\big]$'s.
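As an illustration of the rank-based statistic conjectured in (2.5), a sketch along the lines of the usual multivariate Kruskal-Wallis construction (the choice of the rank-covariance matrix and the scaling here are assumptions, not the paper's exact prescription):

import numpy as np
from scipy.stats import rankdata

def H_statistic(samples):
    """Rank-analogue quadratic form: componentwise ranks over the pooled data,
    mean rank vectors per sample, and a quadratic form in their deviations from
    (N+1)/2, weighted by the pooled rank covariance."""
    X = np.vstack(samples)                                        # N x p pooled data
    N, p = X.shape
    sizes = [len(s) for s in samples]
    R = np.column_stack([rankdata(X[:, a]) for a in range(p)])    # ranks per variate
    V = np.atleast_2d(np.cov(R, rowvar=False, bias=True))         # pooled rank covariance
    V_inv = np.linalg.inv(V)
    center = (N + 1) / 2.0
    stat, start = 0.0, 0
    for n_i in sizes:
        d = R[start:start + n_i].mean(axis=0) - center
        stat += n_i * d @ V_inv @ d
        start += n_i
    return stat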
5. Remarks. It is seen that the U's occurring in these statistics are in the nature of "between sample" comparisons, while the $\hat{\rho}$'s are in the nature of "within sample" comparisons. The comparisons themselves are with respect to a particular function $\phi$ defined appropriately in each case.

In the univariate case it has been observed [1] that the V-statistic is more efficient than the H-statistic (or the L and B statistics), in the Pitman sense, for populations bounded below (e.g., the exponential distribution $f(y, a) = e^{-(y-a)}$, $y \geq a$). It is expected that the B-statistic is similarly more efficient for populations bounded above (e.g., the reversed exponential distribution $f(y, a) = e^{(y-a)}$, $y \leq a$). Both of these are fairly efficient (and the L-statistic even much more so) for distributions bounded on both sides (e.g., the uniform distribution $f(x, \alpha, \beta) = 1/(\beta - \alpha)$, $\alpha \leq x \leq \beta$). The W-statistic is seen [2] to be as efficient as the H-statistic, and these two appear to be more efficient for unbounded distributions. It is conjectured that the same will be true for the corresponding multivariate analogues.

These tests are also expected to be consistent against the relevant class of alternatives and, especially, against the class of translation alternatives. Work is in progress on these problems and will be presented in a subsequent communication. Since the distributions are assumed to be continuous, the probability that any two observations are equal is zero. But, in practice, ties do occur. This problem will also be considered in the next communication.
REFERENCES
[1] Bhapkar, V. P., "A nonparametric test for the problem of several samples," Ann. Math. Stat., vol. 32 (1961), pp. 1108-1117.

[2] Bhapkar, V. P., "A nonparametric test for the several sample location problem," North Carolina Institute of Statistics, Mimeo Series No. 411 (1964).

[3] Bhapkar, V. P., "Some nonparametric median procedures," Ann. Math. Stat., vol. 32 (1961), pp. 846-863.

[4] Chatterjee, S. K. and Sen, P. K., "Nonparametric tests for the bivariate two-sample location problem," Calcutta Stat. Asso. Bull., vol. 13 (1964), pp. 18-58.

[5] Deshpande, Jayant V., "A nonparametric test based on U-statistics for several samples," (abstract) Ann. Math. Stat., vol. 34 (1963), p. 1624.

[6] Dwass, Meyer, "Some k-sample rank-order tests," Contributions to Probability and Statistics, Stanford University Press, Stanford, California (1960), pp. 198-202.

[7] Fraser, D. A. S., Nonparametric Methods in Statistics, John Wiley and Sons, New York, 1957.

[8] Hodges, J. L., Jr., "A bivariate sign test," Ann. Math. Stat., vol. 26 (1955), pp. 523-527.

[9] Hoeffding, Wassily, "A class of statistics with asymptotically normal distribution," Ann. Math. Stat., vol. 19 (1948), pp. 293-325.

[10] Kiefer, J., "K-sample analogues of the Kolmogorov-Smirnov and Cramer-v. Mises tests," Ann. Math. Stat., vol. 30 (1959), pp. 420-447.

[11] Kruskal, William H., "A nonparametric test for the several sample problem," Ann. Math. Stat., vol. 23 (1952), pp. 525-540.

[12] Kruskal, William H. and Wallis, W. Allen, "Use of ranks in one-criterion variance analysis," J. Amer. Stat. Asso., vol. 47 (1952), pp. 583-621.

[13] Mood, Alexander McFarlane, Introduction to the Theory of Statistics, McGraw-Hill Book Co., New York, 1950.

[14] Vincze, I., "On two-sample tests based on order statistics," Proc. Fourth Berkeley Symp. on Prob. and Math. Stat., University of California, Berkeley, pp. 695-.