Gallant, A. R. (1975). "A uniform central limit theorem useful in nonlinear time series regression." Jan. 1978.

A UNIFORM CENTRAL LIMIT THEOREM
USEFUL IN NONLINEAR TIME SERIES REGRESSION
by
A. R. GALLANT
Institute of Statistics
Mimeograph Series No. 973
Raleigh - January 1975
The purpose of this report is to prove in detail a central limit theorem useful in nonlinear time series regression. The main ideas of the proof are due to Goebel (1974). The contribution here is to obtain a stronger conclusion than his Theorem 3 and correct some minor errors in his proof.
Lemma 1. Let $Z_{kn}(\theta)$ and $S_n(\theta)$ $(k = 1, 2, \ldots;\ n = 1, 2, \ldots)$ be random variables indexed by a parameter $\theta$. Assume that for every $\delta > 0$
$$\lim_{k\to\infty} \limsup_{n\to\infty} P[\,|S_n(\theta) - Z_{kn}(\theta)| > \delta\,] = 0 \quad \text{uniformly in } \theta .$$
Assume
$$\lim_{n\to\infty} P[Z_{kn}(\theta) \le z] = N(z;\, 0,\, \sigma_k^2(\theta)) \quad \text{uniformly in } \theta$$
and
$$\lim_{k\to\infty} \sigma_k^2(\theta) = \tau^2(\theta) \quad \text{uniformly in } \theta ,$$
where $0 < \tau_0 \le \tau^2(\theta) \le \mu < \infty$. Then
$$\lim_{n\to\infty} P[S_n(\theta) \le z] = N(z;\, 0,\, \tau^2(\theta)) \quad \text{uniformly in } \theta .$$

Proof: Given $\varepsilon > 0$ there is a $\delta > 0$ depending on $\varepsilon$ but not on $\theta$ such that
$$N(z + \delta;\, 0,\, \tau^2(\theta)) < N(z;\, 0,\, \tau^2(\theta)) + \varepsilon \quad \text{uniformly in } \theta$$
because $\tau^2(\theta)$ is suitably bounded from above and below, $0 < \tau_0 \le \tau^2(\theta) \le \mu < \infty$. There is a $k$ which depends on $\delta$ and $\varepsilon$ but not on $\theta$ or $n$ such that
$$N(z + \delta;\, 0,\, \sigma_k^2(\theta)) < N(z + \delta;\, 0,\, \tau^2(\theta)) + \varepsilon$$
by the uniform convergence of $\sigma_k^2(\theta)$ to $\tau^2(\theta)$. By hypothesis there is an $n^*$ depending on $\varepsilon$, $k$, and $\delta$ but not on $\theta$ such that for all $n > n^*$
$$P[S_n(\theta) \le z] \le P[Z_{kn}(\theta) \le z + \delta] + P[\,|S_n(\theta) - Z_{kn}(\theta)| > \delta\,] < N(z + \delta;\, 0,\, \sigma_k^2(\theta)) + 2\varepsilon .$$
Consequently, given $\varepsilon > 0$ there is an $n^*$ which does not depend on $\theta$ such that for all $n > n^*$
$$P[S_n(\theta) \le z] < N(z + \delta;\, 0,\, \sigma_k^2(\theta)) + 2\varepsilon < N(z + \delta;\, 0,\, \tau^2(\theta)) + 3\varepsilon < N(z;\, 0,\, \tau^2(\theta)) + 4\varepsilon .$$
Similar arguments can be used to show that $P[S_n(\theta) \le z] > N(z;\, 0,\, \tau^2(\theta)) - 4\varepsilon$ for $n > n_1$ where $n_1$ does not depend on $\theta$. $\Box$
Lemma 2. If $\lim_{n\to\infty} n^{-1} \sum_{t=1}^{n} c_t^2(\theta) = c(\theta)$ uniformly in $\theta$, where $0 < \tau_0 \le c(\theta) \le \mu < \infty$, and $\sup_\theta c_t^2(\theta) < \infty$ for each $t$, then
$$\lim_{n\to\infty} n^{-1} \sup_\theta \max_{1 \le t \le n} c_t^2(\theta) = 0 .$$

Proof: There is a sequence $m_n$ such that $\sup_\theta \max_{1\le t\le n} c_t^2(\theta) = \sup_\theta c_{m_n}^2(\theta)$. The lemma would be true trivially if $m_n$ were bounded for all $n$, so we assume the contrary. Given $\varepsilon > 0$ there is an $n^*$ which does not depend on $\theta$ such that for $n > n^*$ we have
$$n^{-1}\sup_\theta c_{m_n}^2(\theta) = \sup_\theta\Big[\Big(\frac{m_n}{n}\Big)\frac{1}{m_n}\sum_{t=1}^{m_n} c_t^2(\theta) - \Big(\frac{m_n - 1}{n}\Big)\frac{1}{m_n - 1}\sum_{t=1}^{m_n - 1} c_t^2(\theta)\Big]$$
$$< \sup_\theta\Big[\Big(\frac{m_n}{n}\Big)\big(c(\theta) + \varepsilon\big) - \Big(\frac{m_n - 1}{n}\Big)\big(c(\theta) - \varepsilon\big)\Big] \le \mu/n + 2\varepsilon - \varepsilon/n . \qquad \Box$$
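Numerically, the force of Lemma 2 is that Cesàro convergence of $c_t^2(\theta)$ rules out any single term of order $n$, even when the $c_t$ themselves are unbounded. A small sketch with an illustrative spiky sequence (my own example, not from the paper; $\theta$ is suppressed):

```python
import math

def c_sq(t):
    # Illustrative sequence (theta suppressed): c_t^2 = 1 except for sparse
    # spikes of height sqrt(t) at t = 4, 16, 64, ...  The sequence c_t is
    # unbounded, yet (1/n) * sum_{t<=n} c_t^2 still converges (to 1).
    is_pow4 = t & (t - 1) == 0 and (t.bit_length() - 1) % 2 == 0
    return math.sqrt(t) if (is_pow4 and t >= 4) else 1.0

def cesaro_and_max(n):
    # Returns the Cesaro mean (1/n) sum c_t^2 and the scaled max (1/n) max c_t^2.
    vals = [c_sq(t) for t in range(1, n + 1)]
    return sum(vals) / n, max(vals) / n

for n in (100, 10000, 1000000):
    mean_sq, max_over_n = cesaro_and_max(n)
    print(f"n={n:>7}  (1/n) sum c_t^2 = {mean_sq:.4f}  (1/n) max c_t^2 = {max_over_n:.6f}")
```

The Cesàro mean settles near 1 while $n^{-1}\max_{t\le n} c_t^2$ shrinks to zero, even though $\max_{t\le n} c_t^2$ itself grows without bound.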
Theorem. Let $\{Z_t\}$ be the generalized linear process
$$Z_t = \sum_{j=-\infty}^{\infty} a_j e_{t-j} \qquad (t = 0, \pm 1, \ldots)$$
where $\sum_{j=-\infty}^{\infty} |a_j| < \infty$ and the $e_t$ are independently and identically distributed with mean zero and finite variance $\sigma^2 > 0$. Let $\{c_t(\theta)\}$ be a sequence for which $\sup_\theta c_t^2(\theta) < \infty$ for each $t$, for which
$$\lim_{n\to\infty} n^{-1} \sum_{t=1}^{n-|h|} c_t(\theta)\, c_{t+|h|}(\theta) = c(h, \theta) \quad \text{uniformly in } \theta$$
for all $h = 0, \pm 1, \ldots$ with $|c(h, \theta)| \le \mu < \infty$, and for which
$$0 < \tau_0 \le \sigma^2 \sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty} a_i a_j\, c(i - j, \theta) \le \mu < \infty .$$
Then $S_n(\theta) = n^{-1/2} \sum_{t=1}^{n} c_t(\theta) Z_t$ converges in distribution to the normal distribution with mean zero and variance
$$\tau^2(\theta) = \sigma^2 \sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty} a_i a_j\, c(i - j, \theta) = \sum_{k=-\infty}^{\infty} c(k, \theta)\, \gamma(k) ,$$
uniformly in $\theta$, where $\gamma(k) = \sigma^2 \sum_{j=-\infty}^{\infty} a_j a_{j+|k|}$ is the autocovariance function of $\{Z_t\}$.
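Before turning to the proof, the conclusion can be checked by Monte Carlo in a concrete special case. The MA(1)-type weights $a_0 = 1$, $a_1 = 0.5$ (all other $a_j = 0$), $\sigma^2 = 1$, and regressors $c_t(\theta) = \cos(\theta t)$, for which $c(h, \theta) = \tfrac12\cos(\theta h)$, are illustrative choices of mine, not taken from the paper:

```python
import math
import random

def tau_sq(theta, a=(1.0, 0.5), sigma2=1.0):
    # tau^2(theta) = sigma^2 * sum_i sum_j a_i a_j c(i-j, theta),
    # with c(h, theta) = (1/2) cos(theta * h) for c_t(theta) = cos(theta * t).
    return sigma2 * sum(a[i] * a[j] * 0.5 * math.cos(theta * (i - j))
                        for i in range(len(a)) for j in range(len(a)))

def simulate_S_n(theta, n, rng, a=(1.0, 0.5)):
    # One draw of S_n(theta) = n^{-1/2} sum_{t=1}^n c_t(theta) Z_t,
    # where Z_t = a_0 e_t + a_1 e_{t-1} and the e_t are iid N(0, 1).
    e = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]   # e_0, ..., e_n
    s = 0.0
    for t in range(1, n + 1):
        z_t = a[0] * e[t] + a[1] * e[t - 1]
        s += math.cos(theta * t) * z_t
    return s / math.sqrt(n)

rng = random.Random(0)
theta, n, reps = 1.0, 800, 1500
draws = [simulate_S_n(theta, n, rng) for _ in range(reps)]
var_hat = sum(d * d for d in draws) / reps
print(f"sample variance {var_hat:.3f} vs tau^2(theta) {tau_sq(theta):.3f}")
```

With these choices $\tau^2(\theta) = \tfrac12(a_0^2 + a_1^2 + 2 a_0 a_1 \cos\theta)$, and the sample variance of the replicated $S_n(\theta)$ should sit close to it.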
Proof: The proof consists of verifying the assumptions of Lemma 1; we suppress the argument $\theta$ where convenient for simplicity. Let $\delta > 0$ be given and set
$$Z_{kn}(\theta) = n^{-1/2} \sum_{t=1}^{n} c_t(\theta) \sum_{j=-k}^{k} a_j e_{t-j} , \qquad x_{kn} = S_n(\theta) - Z_{kn}(\theta) .$$
Since $P[\,|x_{kn}| > \delta\,] \le \mathrm{Var}(x_{kn})/\delta^2$ by Chebyshev's inequality, the first assumption of Lemma 1 may be verified by showing $\lim_{k\to\infty} \limsup_{n\to\infty} \mathrm{Var}(x_{kn}) = 0$ uniformly in $\theta$. For given $\varepsilon > 0$ there is a $k_0$ depending only on $\varepsilon$ such that
$$\Big(\sum_{|i| > k} |a_i|\Big)^2 + 2\Big(\sum_{i=-\infty}^{\infty} |a_i|\Big)\Big(\sum_{|i| > k} |a_i|\Big) < \varepsilon$$
for all $k > k_0$. Now
$$\mathrm{Var}(x_{kn}) = \frac{\sigma^2}{n} \sum_{|i| > k}\, \sum_{|j| > k} a_i a_j \sum_{t \in T} c_t(\theta)\, c_{t-j+i}(\theta) ,$$
where $T = \{t : 1 \le t \le n,\ 1 \le t - j + i \le n\}$, so that $\mathrm{Var}(x_{kn}) \le \sigma^2 \mu \big(\sum_{|i| > k} |a_i|\big)^2 < \sigma^2 \mu\, \varepsilon$ for all $k > k_0$, uniformly in $\theta$ and $n$. This establishes the first condition of Lemma 1.
For $n$ larger than $2k + 1$ we may split $Z_{kn}(\theta)$ as $Z_{kn}(\theta) = U_{kn}(\theta) + V_{kn}(\theta)$, where $U_{kn}(\theta)$ collects the terms involving $e_s$ for $k + 1 \le s \le n - k$ and $V_{kn}(\theta)$ the remaining edge terms (see Figure 1).

Figure 1. [Array not reproduced: for $k = 3$ and $n = 13$ the entries $a_j$ are laid out against the $e_{t-j}$ (vertical) and $c_t$ (horizontal) axes.] The sum $\sum_{t=1}^{13} \sum_{j=-3}^{3} c_t a_j e_{t-j}$ is obtained by multiplying each $a_j$ by the corresponding terms on the horizontal and vertical axes and summing. The split is accomplished by segregating the terms associated with those $a_j$ between the horizontal lines.
The variance of $V_{kn}$ is bounded by
$$\mathrm{Var}(V_{kn}) \le 4k\, \Big[\sigma^2 \Big(\sum_{j=-k}^{k} |a_j|\Big)^2\Big]\, n^{-1} \max_{1 \le t \le n} c_t^2(\theta) .$$
The term in square brackets does not vary with $n$, so we have $\mathrm{Var}(V_{kn}) \to 0$ as $n$ tends to infinity uniformly in $\theta$ by Lemma 2. Consequently, if we show
$$\lim_{n\to\infty} P[U_{kn} \le z] = N(z;\, 0,\, \sigma_k^2(\theta)) \quad \text{uniformly in } \theta ,$$
where $\sigma_k^2(\theta) = \sigma^2 \sum_{i=-k}^{k}\sum_{j=-k}^{k} a_i a_j\, c(i - j, \theta)$, it follows that
$$\lim_{n\to\infty} P[Z_{kn} \le z] = N(z;\, 0,\, \sigma_k^2(\theta)) \quad \text{uniformly in } \theta .$$
Set $d_t(\theta) = \sum_{i=-k}^{k} a_i c_{t+i}(\theta)$, so that $U_{kn} = n^{-1/2} \sum_{t=k+1}^{n-k} d_t(\theta)\, e_t$. By Theorem 1 of Hertz (1969)
$$\sup_z \big|\, P[\sqrt{n}\, U_{kn}/s_n \le z] - N(z;\, 0,\, 1)\,\big| \le \Delta_{kn}(\theta)$$
where
$$\Delta_{kn}(\theta) = K \int_0^1 s_n^{-2} \sum_{t=k+1}^{n-k} d_t^2 \int_{d_t^2 e^2 > v^2 s_n^2} e^2\, dF(e)\, dv ,$$
$K$ is a finite constant, $s_n^2 = \sigma^2 \sum_{t=k+1}^{n-k} d_t^2$, and $F$ denotes the common distribution function of the $e_t$. Now
$$\Delta_{kn}(\theta) \le K \int_0^1 s_n^{-2} \sum_{t=k+1}^{n-k} d_t^2 \int_{e^2 > v^2 \inf_{k+1 \le t \le n-k}(s_n^2/d_t^2)} e^2\, dF(e)\, dv = \frac{K}{\sigma^2} \int_0^1 \int_{e^2 > v^2 \inf_{k+1 \le t \le n-k}(s_n^2/d_t^2)} e^2\, dF(e)\, dv .$$
Thus, if we show that $\inf_{k+1 \le t \le n-k} (s_n^2/d_t^2) \to \infty$ as $n \to \infty$ uniformly in $\theta$, we will have $\lim_{n\to\infty} \Delta_{kn}(\theta) = 0$ uniformly in $\theta$ by the dominated convergence theorem and the fact that $\int e^2\, dF(e) = \sigma^2 < \infty$. Now
$$\frac{s_n^2}{n} = \frac{\sigma^2}{n} \sum_{t=k+1}^{n-k} d_t^2 = \sigma^2 \sum_{i=-k}^{k}\sum_{j=-k}^{k} a_i a_j\, \frac{1}{n} \sum_{p=k+1}^{n-k} c_{p+i}(\theta)\, c_{p+j}(\theta) \longrightarrow \sigma^2 \sum_{i=-k}^{k}\sum_{j=-k}^{k} a_i a_j\, c(i - j, \theta) = \sigma_k^2(\theta)$$
uniformly in $\theta$. Moreover,
$$n^{-1} \max_{k+1 \le t \le n-k} d_t^2(\theta) \le \Big(\sum_{i=-k}^{k} |a_i|\Big)^2\, n^{-1} \max_{1 \le t \le n} c_t^2(\theta) \to 0$$
uniformly in $\theta$ by Lemma 2, and $\sigma_k^2(\theta)$ is bounded from below by a positive constant independent of $\theta$ for all $k$ sufficiently large. Whence, for $0 < \varepsilon < \sigma_k^2(\theta)$ there is an $n_0$ independent of $\theta$ such that for $n > n_0$
$$\inf_{k+1 \le t \le n-k} \frac{s_n^2}{d_t^2} \ge \frac{n\,\big(\sigma_k^2(\theta) - \varepsilon\big)}{\max_{k+1 \le t \le n-k} d_t^2(\theta)} \to \infty .$$
Thus
$$\sup_z \big|\, P[\sqrt{n}\, U_{kn}/s_n \le \sqrt{n}\, z/s_n] - N(\sqrt{n}\, z/s_n;\, 0,\, 1)\,\big| \to 0$$
uniformly in $\theta$ as $n$ tends to infinity; and since $s_n^2/n \to \sigma_k^2(\theta)$ uniformly in $\theta$, $N(\sqrt{n}\, z/s_n;\, 0,\, 1) = N(z;\, 0,\, s_n^2/n) \to N(z;\, 0,\, \sigma_k^2(\theta))$ uniformly in $\theta$. This establishes the second condition of Lemma 1.
Lastly, we verify that $\lim_{k\to\infty} \sigma_k^2(\theta) = \tau^2(\theta)$ uniformly in $\theta$. Now
$$|\tau^2(\theta) - \sigma_k^2(\theta)| = \sigma^2 \Big|\, \mathop{\textstyle\sum\sum}_{|i| > k \text{ or } |j| > k} a_i a_j\, c(i - j, \theta) \Big| \le \sigma^2 \mathop{\textstyle\sum\sum}_{|i| > k \text{ or } |j| > k} |a_i|\, |a_j|\, |c(i - j, \theta)| \le 2 \sigma^2 \mu \Big(\sum_{j=-\infty}^{\infty} |a_j|\Big)\Big(\sum_{|i| > k} |a_i|\Big)$$
because $|c(i - j, \theta)| \le \mu$. The last term on the right does not depend on $\theta$ and may be made arbitrarily small by increasing $k$. $\Box$
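The final step of the proof, convergence of the truncated variances $\sigma_k^2(\theta)$ to $\tau^2(\theta)$ with error controlled by the tails of $\sum |a_j|$, can be traced numerically. The geometric weights $a_j = 0.5^{|j|}$, $\sigma^2 = 1$, and $c(h) = \tfrac12\cos h$ below are illustrative choices of mine, not from the paper:

```python
import math

def a(j):
    # Illustrative geometric weights a_j = 0.5^{|j|} (assumed example).
    return 0.5 ** abs(j)

def c(h):
    # c(h, theta) at theta = 1 for c_t(theta) = cos(theta * t); |c(h)| <= mu = 0.5.
    return 0.5 * math.cos(h)

def sigma_k_sq(k):
    # sigma_k^2 = sigma^2 * sum_{|i|<=k} sum_{|j|<=k} a_i a_j c(i-j), sigma^2 = 1.
    return sum(a(i) * a(j) * c(i - j)
               for i in range(-k, k + 1) for j in range(-k, k + 1))

tau_sq = sigma_k_sq(60)   # k = 60 is effectively the untruncated double sum here
for k in (1, 2, 5, 10, 20):
    err = abs(tau_sq - sigma_k_sq(k))
    # tail bound 2 * mu * (sum_j |a_j|) * (sum_{|i|>k} |a_i|) = 2 * 0.5 * 3 * 2 * 0.5^k
    tail_bound = 6 * 0.5 ** k
    print(f"k={k:>2}  |tau^2 - sigma_k^2| = {err:.2e}  <=  bound {tail_bound:.2e}")
```

The error shrinks geometrically, always inside the tail bound $2\mu\big(\sum_j |a_j|\big)\big(\sum_{|i|>k}|a_i|\big)$ used in the proof.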
The result which finds application in nonlinear time series regression is the following corollary.
Corollary. Let $\{Z_t\}$ be the generalized linear process
$$Z_t = \sum_{j=-\infty}^{\infty} a_j e_{t-j} \qquad (t = 0, \pm 1, \ldots)$$
where $\sum_{j=-\infty}^{\infty} |a_j| < \infty$ and the $e_t$ are independently and identically distributed with mean zero and finite variance $\sigma^2 > 0$. Let $\{c_t\}_{t=1}^{\infty}$ be a sequence of $p$-vectors for which the limit
$$C(h) = \lim_{n\to\infty} n^{-1} \sum_{t=1}^{n-|h|} c_t\, c_{t+|h|}'$$
exists for all $h = 0, \pm 1, \ldots$. Assume that for each non-zero $p$-vector $\lambda$ there are finite constants $\tau_0$ and $\mu$ which do not depend on $h$ such that $|\lambda' C(h) \lambda| \le \mu$ and
$$0 < \tau_0 \le \sigma^2 \sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty} a_i a_j\, \lambda' C(i - j) \lambda .$$
Then $S_n = n^{-1/2} \sum_{t=1}^{n} c_t Z_t$ converges in distribution to the $p$-variate normal with mean vector zero and variance-covariance matrix
$$V = \sigma^2 \sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty} a_i a_j\, \tfrac{1}{2}\big[C(i - j) + C'(i - j)\big] = \tfrac{1}{2} \sum_{h=-\infty}^{\infty} \gamma(h)\big[C(h) + C'(h)\big] ,$$
where $\gamma(h) = \sigma^2 \sum_{j=-\infty}^{\infty} a_j a_{j+|h|}$.

Proof: We apply 2c.4 of Rao (1965, p. 103). By the Theorem, $\lambda' S_n$ converges in distribution to a normal with mean zero and variance
$$\sum_{h=-\infty}^{\infty} \gamma(h) \lim_{n\to\infty} n^{-1} \sum_{t=1}^{n-|h|} \lambda' c_t\, c_{t+|h|}' \lambda = \sum_{h=-\infty}^{\infty} \gamma(h) \lim_{n\to\infty} n^{-1} \sum_{t=1}^{n-|h|} \tfrac{1}{2}\, \lambda' \big[c_t c_{t+|h|}' + c_{t+|h|} c_t'\big] \lambda = \tfrac{1}{2} \sum_{h=-\infty}^{\infty} \gamma(h)\, \lambda'\big[C(h) + C'(h)\big]\lambda = \lambda' V \lambda$$
for every non-zero $\lambda$. Let $X \sim N_p(0, V)$. Thus $\lambda' S_n$ converges in distribution to $\lambda' X$ for every non-zero $\lambda$, whence $S_n$ converges in distribution to $X$. (The matrix $\sum_{h=-\infty}^{\infty} \gamma(h) C(h)$ is positive definite by assumption but it is not symmetric. This is the reason for the term $\tfrac{1}{2}[C(h) + C'(h)]$ in $V$.) $\Box$
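The assembly of $V$ from $\gamma(h)$ and $C(h)$ can be sketched numerically for $p = 2$. The choices below (MA(1) weights $a_0 = 1$, $a_1 = 0.5$, $\sigma^2 = 1$, and $c_t = (1, \cos\omega t)'$ with $\omega = 1$) are mine for illustration, not from the paper; $C(h)$ is approximated by a long finite average:

```python
import math

w = 1.0
a = {0: 1.0, 1: 0.5}   # assumed MA(1) weights; all other a_j = 0

def gamma(h):
    # Autocovariance of Z_t = sum_j a_j e_{t-j} with sigma^2 = 1.
    return sum(a.get(j, 0.0) * a.get(j + abs(h), 0.0) for j in a)

def c(t):
    # Assumed p = 2 regressor vector c_t = (1, cos(w t))'.
    return (1.0, math.cos(w * t))

def C(h, n=100000):
    # C(h) ~ n^{-1} sum_{t=1}^{n-|h|} c_t c'_{t+|h|}  (2x2 matrix, finite-n average).
    m = [[0.0, 0.0], [0.0, 0.0]]
    for t in range(1, n - abs(h) + 1):
        u, v = c(t), c(t + abs(h))
        for i in range(2):
            for j in range(2):
                m[i][j] += u[i] * v[j]
    return [[x / n for x in row] for row in m]

# V = (1/2) sum_h gamma(h) [C(h) + C(h)'] ; only h = -1, 0, 1 contribute here.
V = [[0.0, 0.0], [0.0, 0.0]]
for h in (-1, 0, 1):
    Ch = C(h)
    for i in range(2):
        for j in range(2):
            V[i][j] += 0.5 * gamma(h) * (Ch[i][j] + Ch[j][i])
print(V)
```

With these choices $V$ comes out (approximately) diagonal: $V_{11} = \gamma(0) + 2\gamma(1) = 2.25$ and $V_{22} = \tfrac12\gamma(0) + \gamma(1)\cos\omega$.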
REFERENCES
Goebel, J. J. (1974). "Nonlinear regression in the presence of autocorrelated errors." Ph.D. dissertation, Iowa State University.

Hertz, E. S. (1969). "On convergence rates in the central limit theorem." The Annals of Mathematical Statistics, 40, 475-479.

Rao, C. R. (1965). Linear Statistical Inference and Its Applications. New York: John Wiley and Sons, Inc.