ON THE INVARIANCE PRINCIPLE FOR EXCHANGEABLE RANDOM VARIABLES
by

Tien-chung Hu*
Department of Mathematics
National Tsing-Hua University
Hsinchu, Taiwan 30043

and

Neville C. Weber**
Department of Mathematical Sciences
The University of Sydney
N.S.W., Australia 2006

ABSTRACT:
This note provides a more general version of Corollary 1 in Weber (1980) and explains why, in the case of exchangeable random variables, one can get by with conditions on the full sum of $n$ terms only.
*Part of this work was completed while visiting the Department of Statistics, University of North Carolina, Chapel Hill.

**Now visiting the Department of Statistics, University of British Columbia.
Abbreviated title: Invariance Principle for Exchangeable Random Variables

AMS 1980 subject classification: 60F05.
1. INTRODUCTION
Let $\{X_{ni}\colon 1 \le i \le m,\ m \ge n \ge 1\}$ be an array of zero mean random variables such that for each $n$, $X_{n1}, X_{n2}, \ldots, X_{nm}$ are exchangeable. Define the $\sigma$-fields
$$\mathcal{F}_{nj} = \sigma\Big\{X_{n1}, X_{n2}, \ldots, X_{nj}, \sum_{i=j+1}^{m} X_{ni}\Big\}, \qquad j = 1, 2, \ldots, m.$$
Our first result explains why one can replace the condition
$$\sum_{i=1}^{[nt]} X_{ni}^{2} \stackrel{p}{\to} t,$$
used in invariance principles for martingale difference arrays (see, e.g., Scott (1973), Theorem 2), by the condition $\sum_{i=1}^{n} X_{ni}^{2} \stackrel{p}{\to} 1$ when the $\{X_{ni}\}$ are exchangeable. We then use this result to obtain sufficient conditions for an invariance principle for exchangeable random variables which are weaker than those given in Weber (1980).
2. MAIN RESULTS
First we need the following lemma.
Lemma 1. Let $\{a_{ni}\colon 1 \le i \le n,\ n \ge 1\}$ be an array of positive numbers satisfying

(i) $\max_{i \le n} a_{ni} \to 0$, and

(ii) $\sum_{i=1}^{n} a_{ni} \to 1$, as $n \to \infty$.

Then we have, for any $\ell > 1$,
$$\sum_{i=1}^{n} a_{ni}^{\ell} \to 0, \qquad (1)$$
and for any integer $k \ge 1$,
$$\sum_{i_1 \ne i_2 \ne \cdots \ne i_k} a_{ni_1} a_{ni_2} \cdots a_{ni_k} \to 1 \qquad (2)$$
as $n \to \infty$.
Proof. Note
$$\sum_{i=1}^{n} a_{ni}^{\ell} \le \Big(\max_{i \le n} a_{ni}\Big)^{\ell - 1} \sum_{i=1}^{n} a_{ni} \to 0$$
as $n \to \infty$, which gives (1). Since $\big(\sum_{i=1}^{n} a_{ni}\big)^{k}$ differs from $\sum_{i_1 \ne i_2 \ne \cdots \ne i_k} a_{ni_1} a_{ni_2} \cdots a_{ni_k}$ only by a finite sum of terms, each bounded by a product of a factor $\sum_{i=1}^{n} a_{ni}^{\ell}$ with some $\ell \ge 2$ and powers of $\sum_{i=1}^{n} a_{ni}$, by (1) and (ii) we have that (2) holds.
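The limits (1) and (2) can be checked numerically for a concrete array (an illustrative check, not part of the original argument; the array $a_{ni} = 1/n$ is a hypothetical choice satisfying (i) and (ii)):

```python
from itertools import permutations

def sum_of_powers(n, ell):
    """sum_{i=1}^n a_{ni}^ell for the hypothetical array a_{ni} = 1/n."""
    return sum((1.0 / n) ** ell for _ in range(n))

def distinct_index_sum(n, k):
    """Sum over distinct indices i_1 != ... != i_k of a_{ni_1}*...*a_{ni_k}
    for a_{ni} = 1/n; analytically this equals n(n-1)...(n-k+1)/n^k."""
    total = 0.0
    for idx in permutations(range(n), k):
        prod = 1.0
        for _ in idx:
            prod *= 1.0 / n
        total += prod
    return total

# (1): for ell = 2 the sum equals n^(1-ell) = 1/n, small for large n.
p = sum_of_powers(1000, 2)       # about 0.001
# (2): for k = 2 the sum equals (n-1)/n, close to 1 for large n.
d = distinct_index_sum(200, 2)   # about 0.995
```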
Theorem 1. Given $\{X_{ni}\}$ as above, if

(i) $\max_{i \le n} |X_{ni}| \stackrel{p}{\to} 0$, and

(ii) $\sum_{i=1}^{n} X_{ni}^{2} \stackrel{p}{\to} 1$,

then for $t \in [0,1]$,
$$\sum_{i=1}^{[nt]} X_{ni}^{2} \stackrel{p}{\to} t$$
as $n \to \infty$.
Proof. Let $\{Y_{ni}\colon 1 \le i \le n\}$ be an array of real numbers such that
$$\max_{i \le n} |Y_{ni}| \to 0 \quad \text{and} \quad \sum_{i=1}^{n} Y_{ni}^{2} \to 1.$$
Let $\pi_n$ be a random permutation of $\{1, 2, \ldots, n\}$ and in the first case suppose that $X_{ni} = Y_{n,\pi_n(i)}$. We claim that for $t \in [0,1]$ and $k \ge 1$
$$E\Big(\sum_{i=1}^{[nt]} X_{ni}^{2}\Big)^{k} \to t^{k} \qquad (3)$$
and
$$\sum_{i=1}^{[nt]} X_{ni}^{2} \stackrel{p}{\to} t. \qquad (4)$$
To see that (3) holds, note
$$E\Big(\sum_{i=1}^{[nt]} X_{ni}^{2}\Big)^{k} = [nt]\, E X_{n1}^{2k} + \cdots + [nt]([nt]-1)\cdots([nt]-k+1)\, E\big(X_{n1}^{2} X_{n2}^{2} \cdots X_{nk}^{2}\big)$$
$$= \frac{[nt]}{n} \sum_{i=1}^{n} Y_{ni}^{2k} + \cdots + \frac{[nt]([nt]-1)\cdots([nt]-k+1)}{n(n-1)\cdots(n-k+1)} \sum_{i_1 \ne \cdots \ne i_k} Y_{ni_1}^{2} \cdots Y_{ni_k}^{2} \to t^{k},$$
using Lemma 1 with $a_{ni} = Y_{ni}^{2}$.
Now (4) follows by using the method of moments as described, for
example, in Billingsley (1979, p. 342).
Finally, using Lemma 1.1 of
Kallenberg (1973) we immediately obtain the result for an arbitrary array of
row-wise exchangeable random variables.
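The permutation construction in the proof can also be illustrated by simulation (an illustrative sketch, not part of the original argument; the array with $Y_{ni}^{2} = 2i/(n(n+1))$ is a hypothetical choice with $\max_{i \le n} |Y_{ni}|$ small and $\sum_{i=1}^{n} Y_{ni}^{2} = 1$):

```python
import math
import random

def permuted_partial_sum(y, t, rng):
    """Set X_{ni} = Y_{n,pi(i)} for a uniform random permutation pi and
    return sum_{i=1}^{[nt]} X_{ni}^2, as in the proof of Theorem 1."""
    x = y[:]
    rng.shuffle(x)
    return sum(v * v for v in x[: int(len(y) * t)])

n, t = 5000, 0.3
# Hypothetical array: Y_{ni}^2 = 2i/(n(n+1)), so the squares sum to 1
# and the largest square is 2/(n+1), which is small.
y = [math.sqrt(2.0 * i / (n * (n + 1))) for i in range(1, n + 1)]

rng = random.Random(0)
samples = [permuted_partial_sum(y, t, rng) for _ in range(20)]
avg = sum(samples) / len(samples)
# The partial sums of squares concentrate near t = 0.3, in line with (4).
```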
The above result can now be used to obtain the following invariance
principle for exchangeable random variables under weaker conditions than
those in Corollary 1 of Weber (1980).
Theorem 2. Given $\{X_{ni}, \mathcal{F}_{ni}\}$ as above, if

(i) $n\, E X_{n1}^{2} \to 1$,

(ii) $n^{2}\, E X_{n1} X_{n2} \to 0$,

(iii) $\sum_{j=1}^{n} X_{nj}^{2} \stackrel{p}{\to} 1$,

(iv) $\max_{j \le n} |X_{nj}| \stackrel{p}{\to} 0$, and

(v) $n/m \to 0$,

then
$$W_n(t) = \sum_{j=1}^{[nt]} X_{nj} \Rightarrow W(t)$$
as $n \to \infty$, where $W(t)$ is a standard Wiener process on $D[0,1]$.
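Before turning to the proof, conditions (i), (ii) and (v) can be computed exactly for a concrete exchangeable array (an illustrative sketch, not part of the original argument): take $X_{n1}, \ldots, X_{nn}$ as the first $n$ values of a uniform random ordering of a hypothetical zero-mean population $v_1, \ldots, v_m$ with $m = n^{2}$ and $\sum_i v_i^{2} = n$, so that $n\, E X_{n1}^{2} = 1$.

```python
def moment_conditions(v, n):
    """Exact values of the quantities in conditions (i), (ii) and (v) of
    Theorem 2, for X's obtained by a uniform random ordering of the
    finite population v (with m = len(v) >= n)."""
    m = len(v)
    s1 = sum(v)
    s2 = sum(x * x for x in v)
    ex2 = s2 / m                          # E X_{n1}^2
    exx = (s1 * s1 - s2) / (m * (m - 1))  # E X_{n1} X_{n2}
    return n * ex2, n * n * exx, n / m

n = 50
m = n * n
# Hypothetical population: alternating +/- sqrt(n/m), so it has zero mean
# and sum of squares n, giving n E X^2 = 1 exactly.
v = [(-1.0) ** i * (n / m) ** 0.5 for i in range(m)]
cond_i, cond_ii, cond_v = moment_conditions(v, n)
# cond_i = 1, cond_ii = -n/(n^2 - 1) -> 0, cond_v = 1/n -> 0.
```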
Proof. Let $Y_{nj} = X_{nj} - E(X_{nj} \mid \mathcal{F}_{n,j-1})$ so that $\{\sum_{j=1}^{k} Y_{nj}, \mathcal{F}_{nk}\}$ is a martingale. Now, by Theorem 2 of Scott (1973), $\sum_{j=1}^{[nt]} Y_{nj} \Rightarrow W(t)$ if

(a) $\sum_{j=1}^{n} E Y_{nj}^{2} \to 1$,

(b) $\max_{j \le n} |Y_{nj}| \stackrel{p}{\to} 0$, and

(c) $\sum_{j=1}^{[nt]} Y_{nj}^{2} \stackrel{p}{\to} t$.

Since $E(X_{nj} \mid \mathcal{F}_{n,j-1}) = (m-j+1)^{-1} \sum_{i=j}^{m} X_{ni}$,
$$\sum_{j=1}^{n} E\big(E(X_{nj} \mid \mathcal{F}_{n,j-1})\big)^{2} = \sum_{j=1}^{n} \frac{E X_{n1}^{2}}{m-j+1} + \sum_{j=1}^{n} \frac{(m-j)\, E X_{n1} X_{n2}}{m-j+1}$$
$$\le E X_{n1}^{2}\, (1 + \log n) + n\, |E X_{n1} X_{n2}| \to 0$$
as $n \to \infty$ by (i) and (ii). Thus
$$\sum_{j=1}^{n} \big(E(X_{nj} \mid \mathcal{F}_{n,j-1})\big)^{2} \stackrel{L_1}{\to} 0, \qquad (5)$$
and so (a) follows from (i). Next,
$$\max_{j \le n} |Y_{nj}| \le \max_{j \le n} |X_{nj}| + \max_{j \le n} \big|E(X_{nj} \mid \mathcal{F}_{n,j-1})\big|$$
and
$$\max_{j \le n} \big|E(X_{nj} \mid \mathcal{F}_{n,j-1})\big| \le \Big[\sum_{j=1}^{n} \big(E(X_{nj} \mid \mathcal{F}_{n,j-1})\big)^{2}\Big]^{1/2},$$
so (b) follows from (iv) and (5).
Using Theorem 1 we have that (iii), (iv) and (5) together imply (c).
Thus to complete the proof we need to show that
$$\sum_{j=1}^{[nt]} E(X_{nj} \mid \mathcal{F}_{n,j-1}) \stackrel{p}{\to} 0. \qquad (6)$$
But
$$E\Big[\sum_{j=1}^{[nt]} E(X_{nj} \mid \mathcal{F}_{n,j-1})\Big]^{2} = E \sum_{j=1}^{[nt]} \big(E(X_{nj} \mid \mathcal{F}_{n,j-1})\big)^{2} + 2 \sum_{j=1}^{[nt]-1} \sum_{k=j+1}^{[nt]} E\big[X_{nk}\, E(X_{nj} \mid \mathcal{F}_{n,j-1})\big].$$
The first of these terms goes to 0 using (5) and the second sum is bounded by
$$\sum_{j=1}^{[nt]-1} \sum_{k=j+1}^{[nt]} (m-j+1)^{-1} \big[E X_{n1}^{2} + (m-j)\, |E X_{n1} X_{n2}|\big] \le \frac{[nt]^{2}\, E X_{n1}^{2}}{m-n+1} + [nt]^{2}\, |E X_{n1} X_{n2}| \to 0,$$
using (i), (ii) and (v).
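The identity $E(X_{nj} \mid \mathcal{F}_{n,j-1}) = (m-j+1)^{-1} \sum_{i=j}^{m} X_{ni}$ that drives the proof can be verified by brute force for a small exchangeable sequence (an illustrative check, not part of the original argument; the population values are hypothetical):

```python
from collections import defaultdict
from itertools import permutations

def max_conditional_gap(pop, j):
    """For the exchangeable sequence given by a uniform random ordering of
    pop, compare E(X_j | X_1, ..., X_{j-1}) with the remaining average
    (m - j + 1)^{-1} * sum_{i=j}^m X_i over all prefixes.  (Here the total
    sum is fixed, so conditioning on the prefix determines the rest-sum.)"""
    m = len(pop)
    groups = defaultdict(list)
    for perm in permutations(pop):
        groups[perm[: j - 1]].append(perm[j - 1])
    gaps = []
    for prefix, jth_values in groups.items():
        empirical = sum(jth_values) / len(jth_values)
        claimed = (sum(pop) - sum(prefix)) / (m - j + 1)
        gaps.append(abs(empirical - claimed))
    return max(gaps)

# Hypothetical zero-mean population of m = 5 distinct values:
pop = (-2.0, -1.0, 0.0, 1.0, 2.0)
worst = max(max_conditional_gap(pop, j) for j in (1, 2, 3))
# worst is zero up to rounding: the two sides agree exactly.
```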
REFERENCES

Billingsley, P. (1979), Probability and Measure, John Wiley, New York.

Kallenberg, O. (1973), Canonical representations and convergence criteria for processes with interchangeable increments, Z. Wahr. verw. Geb., 27, 23-36.

Scott, D.J. (1973), Central limit theorems for martingales and for processes with stationary increments using a Skorokhod representation approach, Adv. Appl. Probab., 5, 119-137.

Weber, N.C. (1980), A martingale approach to central limit theorems for exchangeable random variables, J. Appl. Probab., 17, 662-673.