Rootzén, H. (1975). A note on convergence to mixtures of normal distributions.

A NOTE ON CONVERGENCE TO MIXTURES
OF NORMAL DISTRIBUTIONS*

by

Holger Rootzén
Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series #1045

December, 1975

* Research sponsored in part by the Office of Naval Research, Contract N00014-75-C-0809.
ABSTRACT: Double arrays of random variables obtained by normalizing a sequence that is asymptotically close to a martingale difference sequence are considered, and conditions ensuring that the row sums converge in distribution to a mixture of normal distributions are found. The main condition is that the sums of squares in each row converge in probability to a random variable.

AMS 1970 subject classifications: Primary 60F05; Secondary 60G45.

Key words and phrases: Central limit theorem, martingales, mixtures of normal distributions.
1. INTRODUCTION.
Let $\{X_{n,i};\ 1 \le i \le k_n,\ n \ge 1\}$ be an array of random variables on a probability space $(\Omega, \mathcal{B}, P)$ and let $\mathcal{B}_{n,i}$ be the sub-sigma-algebra that is generated by $X_{n,1}, \ldots, X_{n,i}$ ($\mathcal{B}_{n,0} = \{\emptyset, \Omega\}$). If $E(X_{n,i} \mid \mathcal{B}_{n,i-1}) = 0$, $1 \le i \le k_n$, $n \ge 1$, then $\{X_{n,i}\}$ is a martingale difference array (m.d.a.), and if furthermore it is obtained by normalizing a single sequence of martingale differences, then

(1)  $\mathcal{B}_{n,i} \subset \mathcal{B}_{n+1,i}$ for $n \ge 1$, $1 \le i \le \min(k_n, k_{n+1})$.
It is well known that if $\sum_{i=1}^{k_n} X_{n,i}^2$ converges in probability to a constant ($= a$, say) and certain other conditions are satisfied, then the distribution of $\sum_{i=1}^{k_n} X_{n,i}$ converges to a normal distribution with mean zero and variance $a$ (see e.g. McLeish (1974)). In this note it is shown that if $\sum_{i=1}^{k_n} X_{n,i}^2$ converges in probability to some random variable $\zeta$, then under similar conditions the distribution of $\sum_{i=1}^{k_n} X_{n,i}$ converges to a mixture of normal distributions.
Eagleson (1975) observed that if $\zeta$ is measurable $\mathcal{B}_1 = \bigcap_{n=1}^{\infty} \mathcal{B}_{n,1}$, then $\{X_{n,i}\}$ is a martingale difference array also after conditioning on $\zeta$, and thus $\sum_{i=1}^{k_n} X_{n,i} \overset{d}{\to} \Phi(\cdot/\sqrt{\zeta})$* under $P(\cdot \mid \zeta)$ (at least along subsequences satisfying $\sum_{i=1}^{k_n} X_{n,i}^2 \overset{a.s.}{\to} \zeta$). By taking expectations it follows that $\sum_{i=1}^{k_n} X_{n,i} \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF(x)$, where $F$ is the distribution of $\zeta$. The main result of this note shows that the restriction that $\zeta$ be measurable $\mathcal{B}_1$ is superfluous; in addition, our other conditions are somewhat weaker than those of Eagleson. (In fact, Eagleson's results are phrased in terms of conditional variances instead of sums of squares, but the translation is straightforward.) Finally, it is perhaps worth mentioning that the condition that $\sum_{i=1}^{k_n} X_{n,i}^2$ converges in probability cannot be weakened very much: Dvoretsky (1972) has given an example which shows that it is not enough that $\sum_{i=1}^{k_n} X_{n,i}^2$ converges in distribution.

* $\Phi$ is the standard normal distribution function.
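To fix ideas, here is a minimal illustration (ours, not from the original text): take $\zeta \ge 0$ with distribution $F$, let $\eta_1, \eta_2, \ldots$ be i.i.d. with $P(\eta_i = \pm 1) = \tfrac12$, independent of $\zeta$, and put $k_n = n$, $X_{n,i} = \sqrt{\zeta/n}\,\eta_i$. Then

$$\sum_{i=1}^{n} X_{n,i}^2 = \zeta \quad \text{exactly, while} \quad \sum_{i=1}^{n} X_{n,i} = \sqrt{\zeta}\; n^{-1/2} \sum_{i=1}^{n} \eta_i \overset{d}{\to} \sqrt{\zeta}\, Z$$

with $Z$ standard normal and independent of $\zeta$, so that the limit has distribution function $x \mapsto \int \Phi(x/\sqrt{y})\,dF(y)$; this is normal only when $\zeta$ is a.s. constant.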
2. CONVERGENCE TO MIXTURES OF NORMAL DISTRIBUTIONS.
We are mainly concerned with sums of martingale differences, but as is argued in Rootzén (1975) the interesting case is when the truncated variables are asymptotically martingale differences. Thus we do not initially assume that $\{X_{n,i}\}$ is a m.d.a., but only that, e.g.,

(2)  $\sum_{i=1}^{k_n} E\{X_{n,i} I(|X_{n,i}| \le 1) \mid \mathcal{B}_{n,i-1}\} \overset{P}{\to} 0$ as $n \to \infty$.

Furthermore define

$$\varepsilon_{n,i} = E\{X_{n,i} I(|X_{n,i}| \le 1) \mid \mathcal{B}_{n,i-1}\}, \qquad \xi_{n,i} = X_{n,i} I(|X_{n,i}| \le 1) - \varepsilon_{n,i}.$$

Then $\{\xi_{n,i}\}$ is a m.d.a. and, since $|\varepsilon_{n,i}| \le 1$ a.s., $|\xi_{n,i}| \le 2$ a.s.

THEOREM 1. Assume (1) and (2) are satisfied. If furthermore $\max_{1 \le i \le k_n} |X_{n,i}| \overset{P}{\to} 0$ and there is a random variable $\zeta$ with distribution $F$ such that

(3)  $\sum_{i=1}^{k_n} \xi_{n,i}^2 \overset{P}{\to} \zeta$ as $n \to \infty$,

then $\sum_{i=1}^{k_n} X_{n,i} \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF(x)$ as $n \to \infty$.
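Remark (an illustration, not part of the original argument). In the simplest normalized case the conditions are easy to check: let $Y_1, Y_2, \ldots$ be i.i.d. with $EY_1 = 0$, $EY_1^2 = 1$, and put $k_n = n$, $X_{n,i} = Y_i/\sqrt{n}$. By independence the conditional expectations in (2) are constants, and since $EY_1 = 0$,

$$\sum_{i=1}^{n} \bigl| E\{X_{n,i} I(|X_{n,i}| \le 1)\} \bigr| = \sqrt{n}\,\bigl| E\,Y_1 I(|Y_1| > \sqrt{n}) \bigr| \le E\,Y_1^2 I(|Y_1| > \sqrt{n}) \to 0,$$

so (2) holds; the weak law of large numbers gives $\sum_{i=1}^{n} \xi_{n,i}^2 \overset{P}{\to} 1$, so (3) holds with $\zeta \equiv 1$ and $F$ degenerate at 1, and Theorem 1 reduces to the classical central limit theorem.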
PROOF. By definition

$$\sum_{i=1}^{k_n} X_{n,i} - \sum_{i=1}^{k_n} \xi_{n,i} = \sum_{i=1}^{k_n} E\{X_{n,i} I(|X_{n,i}| \le 1) \mid \mathcal{B}_{n,i-1}\} + \sum_{i=1}^{k_n} X_{n,i} I(|X_{n,i}| > 1),$$

and thus from (2) and $\max_{1 \le i \le k_n} |X_{n,i}| \overset{P}{\to} 0$ we obtain $\sum_{i=1}^{k_n} X_{n,i} - \sum_{i=1}^{k_n} \xi_{n,i} \overset{P}{\to} 0$ as $n \to \infty$. Hence it is enough to prove

(4)  $\sum_{i=1}^{k_n} \xi_{n,i} \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF(x)$ as $n \to \infty$.

Write $X'_{n,i} = X_{n,i} I(|X_{n,i}| \le 1)$. Then $\max_{1 \le i \le k_n} |X'_{n,i}| \overset{P}{\to} 0$, and since

$$\max_{1 \le i \le k_n} |E(X'_{n,i} \mid \mathcal{B}_{n,i-1})| \le \max_{0 \le i < k_n} E\Bigl(\max_{1 \le j \le k_n} |X'_{n,j}| \,\Big|\, \mathcal{B}_{n,i}\Bigr),$$

and $E(\max_{1 \le j \le k_n} |X'_{n,j}| \mid \mathcal{B}_{n,i})$, $i = 0, 1, \ldots, k_n$, is a martingale, it follows from Kolmogorov's inequality for martingales that $\max_{1 \le i \le k_n} |\varepsilon_{n,i}| = \max_{1 \le i \le k_n} |E(X'_{n,i} \mid \mathcal{B}_{n,i-1})| \overset{P}{\to} 0$ (the martingale is nonnegative with expectation $E \max_{1 \le j \le k_n} |X'_{n,j}| \to 0$, by bounded convergence), and thus also $\max_{1 \le i \le k_n} |\xi_{n,i}| \overset{P}{\to} 0$ as $n \to \infty$.
To facilitate the remainder of the proof we will, without loss of generality, assume that each row in the m.d.a. $\{\xi_{n,i}\}$ is infinite, and that for $i > k_n$ we have $\xi_{n,i} = \pm 1/n$ with probability $\tfrac{1}{2}$ each, independently of everything else; the extended rows are martingale differences with respect to the correspondingly enlarged $\mathcal{B}_{n,i-1}$. Hence we have $\max_{1 \le i} |\xi_{n,i}| \overset{P}{\to} 0$ as $n \to \infty$ and $\sum_{i=1}^{k} \xi_{n,i}^2 \overset{a.s.}{\to} \infty$ as $k \to \infty$, for each $n$. Let $S_n(k) = \sum_{i=1}^{k} \xi_{n,i}$ and introduce the "natural time-scale"

$$\tau_n(t) = \inf\Bigl\{k : \sum_{i=1}^{k} \xi_{n,i}^2 > t\Bigr\}$$

of the summation process based on $\{\xi_{n,i}\}$. Then

(5)  $\sum_{i=1}^{k_n} \xi_{n,i} = S_n \circ \tau_n\Bigl(\sum_{i=1}^{k_n} \xi_{n,i}^2\Bigr) - \xi_{n,k_n+1}$,

since $\xi_{n,i}^2 = 1/n^2 > 0$ for $i > k_n$ forces $\tau_n(\sum_{i=1}^{k_n} \xi_{n,i}^2) = k_n + 1$, and by construction $|\xi_{n,k_n+1}| = 1/n \to 0$. The next step is to prove

(6)  $d_n = \Bigl| S_n \circ \tau_n\Bigl(\sum_{i=1}^{k_n} \xi_{n,i}^2\Bigr) - S_n \circ \tau_n(\zeta) \Bigr| \overset{P}{\to} 0$ as $n \to \infty$.
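Remark (an illustrative computation, not part of the paper; the function names and the toy row below are ours). The time-change is easy to experiment with numerically, and the following sketch checks the identity (5) on a simulated row extended by $\pm 1/n$ steps:

```python
import numpy as np

def tau_n(xi, t):
    """Natural time-scale: the first index k (1-based) with
    sum_{i<=k} xi_i^2 > t, i.e. tau_n(t) = inf{k: sum xi^2 > t}."""
    csum = np.cumsum(xi ** 2)
    return int(np.searchsorted(csum, t, side="right")) + 1

def S_n(xi, k):
    """Partial-sum process S_n(k) = sum_{i<=k} xi_i."""
    return float(xi[:k].sum())

# Illustration of (5): extend a finite row by +-1/n steps as in the proof.
rng = np.random.default_rng(0)
n, k_n = 50, 200
row = rng.choice([-1.0, 1.0], size=k_n) / np.sqrt(k_n)  # toy m.d.a. row
tail = rng.choice([-1.0, 1.0], size=1000) / n           # the +-1/n extension
xi = np.concatenate([row, tail])

t = float(np.cumsum(xi ** 2)[k_n - 1])  # t = sum_{i<=k_n} xi_{n,i}^2
k = tau_n(xi, t)                        # = k_n + 1, since xi^2 > 0 in the tail
# (5): sum_{i<=k_n} xi_{n,i} = S_n(tau_n(t)) - xi_{n,k_n+1}
print(k == k_n + 1, np.isclose(S_n(xi, k_n), S_n(xi, k) - xi[k - 1]))
```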
For $\varepsilon > 0$ choose $K$ such that $P(\{\zeta > K\}) < \varepsilon$ and observe that for any $\delta > 0$

$$P(\{d_n > \varepsilon\}) \le P\Bigl(\{d_n > \varepsilon\} \cap \Bigl\{\Bigl|\sum_{i=1}^{k_n} \xi_{n,i}^2 - \zeta\Bigr| \le \delta\Bigr\} \cap \{\zeta \le K\}\Bigr) + P\Bigl(\Bigl\{\Bigl|\sum_{i=1}^{k_n} \xi_{n,i}^2 - \zeta\Bigr| > \delta\Bigr\}\Bigr) + \varepsilon.$$

Here

$$P\Bigl(\{d_n > \varepsilon\} \cap \Bigl\{\Bigl|\sum_{i=1}^{k_n} \xi_{n,i}^2 - \zeta\Bigr| \le \delta\Bigr\} \cap \{\zeta \le K\}\Bigr) \le P\Bigl(\Bigl\{\sup_{\substack{|s-t| \le \delta \\ 0 \le s,t \le K+\delta}} |S_n \circ \tau_n(t) - S_n \circ \tau_n(s)| > \varepsilon\Bigr\}\Bigr) = p_n, \text{ say.}$$

Since $\max_{1 \le i \le \tau_n(K+1)} |\xi_{n,i}| \le \max_{1 \le i} |\xi_{n,i}| \overset{P}{\to} 0$ as $n \to \infty$, it follows from Theorem 1 of Rootzén (1975) that $\{S_n \circ \tau_n(t);\ t \in [0, K+1]\}_{n=1}^{\infty}$, considered as random variables in $D(0, K+1)$ endowed with the Skorokhod topology, converge in distribution to a Brownian motion on $[0, K+1]$. Hence, by Theorem 15.2 of Billingsley (1968) we can make $\limsup_n p_n < \varepsilon$ by choosing $\delta$ small enough. Furthermore, by (3), $P(\{|\sum_{i=1}^{k_n} \xi_{n,i}^2 - \zeta| > \delta\}) \to 0$ as $n \to \infty$, so

$$\limsup_{n \to \infty} P(\{d_n > \varepsilon\}) \le 2\varepsilon,$$

which, since $\varepsilon$ is arbitrary, proves (6).
We now need the following lemma, the proof of which is given below.

LEMMA 2. If for some integer $k_\varepsilon$ the random variable $\zeta_\varepsilon \ge 0$ is measurable $\mathcal{B}_{n,k_\varepsilon}$ for all large $n$, then

(7)  $S_n \circ \tau_n(\zeta_\varepsilon) \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF_\varepsilon(x)$ as $n \to \infty$,

where $F_\varepsilon$ is the distribution of $\zeta_\varepsilon$.
." "" Since we have
assqm~d"
(1) it is for "E >
k£ and a random variable r,
~
E
Q
possible to find an integer
0, which is measurable with respect to Bn,k
£
'
5
for all large n, such that
P(I~-SEI>£)
s E (see e.g. Doob (1953), p. 607).
From Le111111a 2 we obtain that (7) holds and since
F£
~
F
as
£ -+ 0 also
(8)
j'breover, by argt.nnents precisely analogous to the proof of (6) above, we have
for
0>0
that
1:iJn lim sup P({ISnoTn(~£)-SnoTnu';)I>o}) =0.
(9)
£-+0
n-+<:o
By Theorem 4.2 of Billingsley (1968) it follows from (7), (8), and (9) that
o
which by (5) and (6) proves (4) and thus the theorem.
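For the reader's convenience, the approximation scheme just invoked may be summarized as follows (our paraphrase of Theorem 4.2 of Billingsley (1968), with $Z_{n,\varepsilon} = S_n \circ \tau_n(\zeta_\varepsilon)$ and $Z_n = S_n \circ \tau_n(\zeta)$): if

$$Z_{n,\varepsilon} \overset{d}{\to} Z_\varepsilon \ (n \to \infty), \qquad Z_\varepsilon \overset{d}{\to} Z \ (\varepsilon \to 0), \qquad \lim_{\varepsilon \to 0} \limsup_{n \to \infty} P(|Z_{n,\varepsilon} - Z_n| > \delta) = 0 \text{ for each } \delta > 0,$$

then $Z_n \overset{d}{\to} Z$. Here (7) supplies the first hypothesis, (8) the second, and (9) the third.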
PROOF OF LEMMA 2. The lemma can be reduced to the corollary on p. 561 of Eagleson (1975), but as it doesn't involve more work to give a direct proof we will do that instead. Put $\xi'_{n,i} = \xi_{n,i}$ if $i > k_\varepsilon$ and $\xi'_{n,i} = 0$ otherwise, let $S'_n(k) = \sum_{i=1}^{k} \xi'_{n,i}$, and let $P_\varepsilon(\cdot) = P(\cdot \,\|\, \zeta_\varepsilon)$ be a regular conditional probability given $\zeta_\varepsilon$. Since $\zeta_\varepsilon \in \mathcal{B}_{n,k_\varepsilon}$ for all large $n$, the rows of $\{\xi'_{n,i}\}$ are martingale differences for such $n$ also under $P_\varepsilon$ (a.s.), cf. [4]. Moreover, for $t \ge 0$

(10)  $\Bigl|\sum_{i=1}^{\tau_n(t)} \xi'^2_{n,i} - t\Bigr| \le \sum_{i=1}^{k_\varepsilon} \xi_{n,i}^2 + \xi_{n,\tau_n(t)}^2 \le (k_\varepsilon + 1)\bigl(\max_{1 \le i} |\xi_{n,i}|\bigr)^2.$

As $\max_{1 \le i} |\xi_{n,i}| \overset{P}{\to} 0$, we can for every subsequence of $\{1, 2, \ldots\}$ find a further (infinite) subsequence $\{n'\}$ such that $\max_{1 \le i} |\xi_{n',i}| \overset{a.s.}{\to} 0$. Thus also under $P_\varepsilon$ we have a.s. that $\max_{1 \le i} |\xi'_{n',i}| \overset{a.s.}{\to} 0$ and, by (10), $\sum_{i=1}^{\tau_{n'}(t)} \xi'^2_{n',i} \overset{a.s.}{\to} t$. By Lemma 3 of [5] this proves $S'_{n'} \circ \tau_{n'}(t) \overset{d}{\to} \Phi(\cdot/\sqrt{t})$ under $P_\varepsilon$ (a.s.). Since $\zeta_\varepsilon$ is a constant under $P_\varepsilon$ we have in particular that

$$S'_{n'} \circ \tau_{n'}(\zeta_\varepsilon) \overset{d}{\to} \Phi(\cdot/\sqrt{\zeta_\varepsilon}) \quad \text{under } P_\varepsilon \text{ (a.s.)},$$

and thus by taking expectations that

(11)  $S'_{n'} \circ \tau_{n'}(\zeta_\varepsilon) \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF_\varepsilon(x).$

Since a subsequence $\{n'\}$ satisfying (11) can be extracted from any infinite sequence of integers, it follows that $S'_n \circ \tau_n(\zeta_\varepsilon) \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF_\varepsilon(x)$ as $n \to \infty$, which, since $|S_n \circ \tau_n(\zeta_\varepsilon) - S'_n \circ \tau_n(\zeta_\varepsilon)| \le k_\varepsilon \max_{1 \le i \le k_\varepsilon} |\xi_{n,i}| \overset{P}{\to} 0$, proves (7). □
COROLLARY 3. Assume $\{X_{n,i}\}$ is a m.d.a. and that it satisfies (1). If $\max_{1 \le i \le k_n} |X_{n,i}| \overset{P}{\to} 0$, if $\sum_{i=1}^{k_n} E\bigl(X_{n,i}^2 I(|X_{n,i}| > 1) \mid \mathcal{B}_{n,i-1}\bigr) \overset{P}{\to} 0$, and if furthermore $\sum_{i=1}^{k_n} X_{n,i}^2 \overset{P}{\to} \zeta$ for some random variable $\zeta$ with distribution $F$, then $\sum_{i=1}^{k_n} X_{n,i} \overset{d}{\to} \int \Phi(\cdot/\sqrt{x})\,dF(x)$ as $n \to \infty$.
PROOF. We only have to show that (2) and (3) hold. Recall $X'_{n,i} = X_{n,i} I(|X_{n,i}| \le 1)$ and put $X''_{n,i} = X_{n,i} - X'_{n,i}$. Since $E(X_{n,i} \mid \mathcal{B}_{n,i-1}) = 0$,

$$|\varepsilon_{n,i}| = |E(X'_{n,i} \mid \mathcal{B}_{n,i-1})| = |E(X''_{n,i} \mid \mathcal{B}_{n,i-1})| \le E\bigl(X_{n,i}^2 I(|X_{n,i}| > 1) \mid \mathcal{B}_{n,i-1}\bigr),$$

and summing over $i$, it follows immediately that (2) holds. Moreover, by definition

$$\sum_{i=1}^{k_n} \xi_{n,i}^2 = \sum_{i=1}^{k_n} X_{n,i}^2 + \sum_{i=1}^{k_n} \{X''_{n,i} + \varepsilon_{n,i}\}^2 - 2 \sum_{i=1}^{k_n} X_{n,i}\{X''_{n,i} + \varepsilon_{n,i}\} = \sum_{i=1}^{k_n} X_{n,i}^2 + r_n - 2 r'_n, \text{ say.}$$

Here $r_n \overset{P}{\to} 0$ as $n \to \infty$ (on the event $\{\max_{1 \le i \le k_n} |X_{n,i}| \le 1\}$, whose probability tends to one, $X''_{n,i} = 0$ for all $i$, so that $r_n = \sum_{i=1}^{k_n} \varepsilon_{n,i}^2 \le \sum_{i=1}^{k_n} |\varepsilon_{n,i}| \overset{P}{\to} 0$), and using Cauchy's inequality we obtain $|r'_n| \le \{r_n \sum_{i=1}^{k_n} X_{n,i}^2\}^{1/2} \overset{P}{\to} 0$ as $n \to \infty$, so (3) follows. □
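Remark (an illustrative simulation, not part of the original; the choice $\zeta \sim \mathrm{Exp}(1)$ and all names below are ours). Taking $X_{n,i} = \sqrt{\zeta/n}\,\eta_i$ with $\eta_i = \pm 1$ i.i.d. and independent of $\zeta$, the sums of squares equal $\zeta$ exactly and the corollary's conditions hold, so the row sums should be close in distribution to the scale mixture $\sqrt{\zeta}\,Z$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 2000, 20000

# One row per replication: X_{n,i} = sqrt(zeta/n)*eta_i with eta_i = +-1,
# so sum_i X_{n,i}^2 = zeta exactly and the corollary's conditions hold.
zeta = rng.exponential(1.0, size=reps)          # mixing variable, F = Exp(1)
# sum of n i.i.d. +-1 variables = 2*Binomial(n, 1/2) - n
eta_sum = 2.0 * rng.binomial(n, 0.5, size=reps) - n
row_sums = np.sqrt(zeta / n) * eta_sum          # sum_{i<=k_n} X_{n,i}

# Limit law: the scale mixture sqrt(zeta')*Z with zeta' ~ F, Z ~ N(0,1).
mix = np.sqrt(rng.exponential(1.0, size=reps)) * rng.standard_normal(reps)

qs = [0.05, 0.25, 0.5, 0.75, 0.95]
print(np.quantile(row_sums, qs).round(2))
print(np.quantile(mix, qs).round(2))            # the two rows should agree
```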
Finally, it should perhaps be mentioned that the results of this note hold also when the $\{k_n\}$ are stopping times.
REFERENCES

[1] Billingsley, P.: Convergence of Probability Measures. New York: Wiley 1968.

[2] Doob, J.L.: Stochastic Processes. New York: Wiley 1953.

[3] Dvoretsky, A.: Central limit theorems for dependent random variables. Proc. Sixth Berkeley Symp. Math. Statist. Probability. Berkeley: Univ. of Calif. Press 1972.

[4] Eagleson, G.K.: Martingale convergence to mixtures of infinitely divisible laws. Ann. Probability 3, 557-562 (1975).

[5] Rootzén, H.: On the functional central limit theorem for martingales. Institute of Statistics Mimeo Series #1043, University of North Carolina at Chapel Hill, 1975.