GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES

by

Balram S. Rajput¹
Stamatis Cambanis²

Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 705

August, 1970

This research was partially supported by the National Science Foundation under Grant GU-2059.
¹ Department of Mathematics and Department of Statistics.
² Department of Statistics.
GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES*

Balram S. Rajput
Department of Mathematics and Department of Statistics
University of North Carolina
Chapel Hill, North Carolina 27514

and

Stamatis Cambanis
Department of Statistics
University of North Carolina
Chapel Hill, North Carolina 27514

*This research was partially supported by the National Science Foundation under Grant GU-2059.
1. INTRODUCTION

Gaussian stochastic processes are used in connection with problems such as estimation, detection, mutual information, etc. These problems are often effectively formulated in terms of Gaussian measures on appropriate Banach or Hilbert spaces of functions. Even though both concepts, the Gaussian stochastic process and the Gaussian measure, have been extensively studied, it seems that the connection between them has not been adequately explained. Two important questions arising in this context are the following:

(Q1) Given a Gaussian stochastic process with sample paths in a Banach function space, is there a Gaussian measure on the Banach space which is induced by the given stochastic process?

(Q2) Given a Gaussian measure on a Banach function space, is there a Gaussian stochastic process with sample paths in the Banach space which induces the given measure?
The purpose of this paper is to explore questions Q1 and Q2 in the commonly encountered spaces C[0,1] and L₂[0,1]. Section 2 provides affirmative answers to both questions Q1 and Q2 for the space C[0,1], with no restrictive assumptions. The space L₂[0,1] is considered in Section 3. Under some mild conditions, a stochastic process induces in a natural way a probability measure on L₂[0,1]. If in addition the process is Gaussian, a necessary and sufficient condition for the induced measure to be Gaussian is derived in Theorem 3.1 and Corollary 3.1. The question Q2 is answered in the affirmative in Theorem 3.3. Section 4 includes some generalizations and extensions of the results of Sections 2 and 3. The most important result is Theorem 4.1, which provides an affirmative answer to question Q1 for L₂(T), T any Borel measurable subset of the real line, under conditions considerably weaker than those of Theorem 3.1.
It should be mentioned that, throughout the literature, whenever the need arises to have a Gaussian measure induced on an appropriate function space by a Gaussian process, then either (i) it is assumed that the Gaussian process is mean square continuous and that the index set is a compact interval [5, 7, 9], and thus unnecessary assumptions are made on the process, or (ii) the term "Gaussian process" is used to mean a generalized Gaussian process, i.e., a Gaussian measure, or a measurable map which induces a Gaussian measure [1, 2], and thus the problem of inducing the Gaussian measure from a Gaussian process is not considered.

Throughout this paper real Banach spaces and real stochastic processes are considered. The basic notation, definitions and properties that are consistently used in subsequent sections are given in the following.
Let X be a real, separable Banach space of functions on [0,1] and denote by B(X) the smallest σ-algebra of subsets of X which contains all the open subsets of X. Let X(t,ω), t ∈ [0,1], be a real stochastic process defined on the probability space (Ω,F,P) and such that X(·,ω) ∈ X almost surely (a.s.). If the map T: (Ω,F) → (X,B(X)), defined by

(1.1)    Tω = X(·,ω),

is measurable, then the probability measure μ_X on (X,B(X)) induced by X(t,ω) is defined by

(1.2)    μ_X(B) = P{T⁻¹(B)} = P{ω ∈ Ω: X(·,ω) ∈ B},  for all B ∈ B(X).

A stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) is said to be Gaussian if for every finite n and t₁,...,tₙ ∈ [0,1], the random variables X(t₁,ω),...,X(tₙ,ω) are jointly Gaussian.
Some definitions and properties related to Gaussian measures are summarized here. Let X be any separable Banach space and B(X) the σ-algebra generated by the open subsets of X. It is well known that B(X) is also the smallest σ-algebra of subsets of X with respect to which all continuous functionals on X are measurable. A probability measure μ on (X,B(X)) is said to be Gaussian if every continuous linear functional F on X is a Gaussian random variable. It is easily seen that μ is Gaussian if and only if any finite number of continuous linear functionals {F₁,...,Fₙ} on X are jointly Gaussian random variables.
Let now X = H, a separable Hilbert space with inner product and norm denoted respectively by ⟨·,·⟩ and ‖·‖. If μ is a Gaussian measure on (H,B(H)), its mean and its covariance operator are defined [8] as the element u₀ ∈ H and the bounded, linear, nonnegative, self-adjoint and trace class operator S on H which satisfy

(1.3)    E[⟨u,v⟩] = ⟨u₀,v⟩,  for all v ∈ H,

(1.4)    E[⟨u − u₀,v⟩⟨u − u₀,w⟩] = ⟨Sv,w⟩,  for all v,w ∈ H.

The support of a Gaussian measure μ on (H,B(H)) is the set u₀ + s̄p{φₙ}, where s̄p{φₙ} is the closure in H of the linear manifold generated by the eigenfunctions {φₙ} of the covariance operator S which correspond to its nonzero eigenvalues {λₙ} [4]. Also, if μ is a Gaussian measure on (H,B(H)), then [8]

(1.5)    E[‖u‖²] = ∫_H ‖u‖² dμ(u) = ‖u₀‖² + Σₙ λₙ < +∞.

Note that if a (not necessarily Gaussian) probability measure μ on (H,B(H)) satisfies (1.5), then its mean u₀ and its covariance operator S are well defined by (1.3) and (1.4) respectively.
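The trace class condition appearing in (1.5) can be illustrated numerically in finite dimensions. The sketch below is an illustration, not part of the paper: the grid size, the particular kernel min(t,s), and all variable names are choices of this note. It discretizes a covariance kernel on [0,1] and checks that the trace of the resulting operator equals the sum of its eigenvalues and that the eigenvalues are nonnegative.

```python
import numpy as np

# Finite-dimensional sketch of a covariance operator S on H = R^n:
# S is nonnegative, self-adjoint, and its trace equals the sum of its
# eigenvalues (the trace class condition of (1.5) in the infinite-
# dimensional case). Grid and kernel are illustrative choices.

n = 200
t = (np.arange(n) + 0.5) / n            # midpoint grid on [0,1]
h = 1.0 / n
S = np.minimum.outer(t, t) * h          # discretization of the kernel min(t,s)
lam = np.linalg.eigvalsh(S)             # eigenvalues of the symmetric matrix S

trace_S = float(np.trace(S))            # = h * sum of min(t,t) ~ integral of t dt = 1/2
sum_lam = float(lam.sum())
min_lam = float(lam.min())
```

For this kernel the trace approximates ∫₀¹ t dt = 1/2, matching the sum of the eigenvalues, and all eigenvalues are nonnegative, as required of a covariance operator.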
2. GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES ON C[0,1]

In this section it is shown that both questions raised in the introduction have an affirmative answer for the space X = C[0,1] of all real valued continuous functions on [0,1] with the supremum norm. Let (Ω,F,P) be a probability space. It is shown in Proposition 2.1 that the map T defined by (1.1) is measurable, and in Theorem 2.1 that the measure μ_X induced by a Gaussian process X(t,ω) with a.s. continuous sample paths is Gaussian.

Proposition 2.1. (i) If (Ω,F,P; X(t,ω), t ∈ [0,1]) is a stochastic process with a.s. continuous sample paths, then the map T: (Ω,F) → (X = C[0,1], B(X)) defined by (1.1) is measurable.

(ii) Conversely, if a map K: (Ω,F) → (X = C[0,1], B(X)) is measurable, then X(t,ω) = (Kω)(t), t ∈ [0,1], is a stochastic process defined on the probability space (Ω,F,P).
Proof. (i) It suffices to show that the inverse image under T of a closed sphere in X belongs to F. Let B = {u ∈ X: ‖u − v‖ ≤ ε} be a closed sphere in X with center v ∈ X and radius ε > 0. Then

T⁻¹(B) = {ω ∈ Ω: sup_{t∈[0,1]} |X(t,ω) − v(t)| ≤ ε} = ∩_r {ω ∈ Ω: |X(r,ω) − v(r)| ≤ ε},

where the intersection is taken over all rationals r in [0,1]. Since X(t,ω) is a stochastic process, {ω ∈ Ω: v(r) − ε ≤ X(r,ω) ≤ v(r) + ε} ∈ F for every rational r, and hence T⁻¹(B) ∈ F.

(ii) For every fixed t ∈ [0,1] we have {ω ∈ Ω: X(t,ω) ≤ a} = K⁻¹{u ∈ X: u(t) ≤ a}. Since F_t(u) = u(t), u ∈ X, is a continuous linear functional on X, {u ∈ X: u(t) ≤ a} ∈ B(X) (see Section 1), and by the measurability of K, {ω ∈ Ω: X(t,ω) ≤ a} ∈ F. ∎

Theorem 2.1. Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a Gaussian stochastic process with a.s. continuous sample paths. Then the probability measure μ_X induced by X(t,ω) on (X = C[0,1], B(X)) is Gaussian.
Proof. In order to show that μ_X is Gaussian it suffices to show that every continuous linear functional on X is a Gaussian random variable on the probability space (X,B(X),μ_X). Let F be a continuous linear functional on X. Then there exists a real valued function Φ of bounded variation on [0,1] such that for all u ∈ X,

F(u) = ∫₀¹ u(t) dΦ(t),

where the integral is Riemann–Stieltjes. Then by (1.2)

μ_X{u ∈ X: F(u) ≤ a} = P{ω ∈ Ω: ∫₀¹ X(t,ω) dΦ(t) ≤ a}.

Hence, in order to show that F(u) is a Gaussian random variable on the probability space (X,B(X),μ_X), it suffices to show that

ξ(ω) = ∫₀¹ X(t,ω) dΦ(t)   a.s.

is a Gaussian random variable on (Ω,F,P). It is known that for continuous functions the Riemann–Stieltjes integral can be approximated as follows. Consider the partitions {t_{k,n} = k/n, k = 0,1,...,n−1,n} of [0,1], n = 1,2,..., and define

ξₙ(ω) = Σ_{k=1}^{n} X(t_{k,n},ω)[Φ(t_{k,n}) − Φ(t_{k−1,n})].

Then lim_{n→+∞} ξₙ(ω) = ξ(ω) a.s., and the sequence of random variables {ξₙ} is a Gaussian family. Since ξ(ω) is finite a.s., it follows by Lemma 5.1 that it is Gaussian. ∎
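The partition sums used in the proof above can be sketched numerically. The particular integrand u, integrator Φ, and partition size below are illustrative choices of this note, not from the paper.

```python
# Sketch of the approximation in the proof of Theorem 2.1: for continuous u
# and Phi of bounded variation, the Riemann-Stieltjes integral of u dPhi is
# the limit of the partition sums
#   xi_n = sum_k u(t_{k,n}) [Phi(t_{k,n}) - Phi(t_{k-1,n})],  t_{k,n} = k/n.

def stieltjes_sum(u, Phi, n):
    """Partition sum over {k/n} approximating the RS integral of u dPhi."""
    return sum(u(k / n) * (Phi(k / n) - Phi((k - 1) / n)) for k in range(1, n + 1))

u = lambda t: t            # a continuous integrand
Phi = lambda t: t * t      # increasing, hence of bounded variation
# Exact value: integral of t dPhi(t) = integral of 2 t^2 dt over [0,1] = 2/3.
approx = stieltjes_sum(u, Phi, 20000)
```

With n = 20000 the partition sum is already within about 10⁻⁴ of the exact value 2/3, consistent with the a.s. convergence ξₙ → ξ invoked in the proof.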
We now turn to question Q2 raised in the introduction and give an affirmative answer in the following theorem.

Theorem 2.2. Given any Gaussian measure μ on (X = C[0,1], B(X)), there exists a Gaussian stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) with a.s. continuous sample paths which induces μ on (X,B(X)).
Proof. Define the probability space (Ω,F,P) by Ω = X, F = B(X), P = μ, and denote the identity map between Ω and X by I: Iω = u, where u = ω. Clearly I: (Ω,F) → (X,B(X)) is a measurable map. Define X(t,ω) by

X(t,ω) = (Iω)(t) = u(t),  t ∈ [0,1], ω ∈ Ω.

Then, by Proposition 2.1(ii), X(t,ω), t ∈ [0,1], is a stochastic process on (Ω,F,P). It is clear by (1.1), (1.2) and the definition of X(t,ω) that X(t,ω) induces μ on (X,B(X)). In order to show that X(t,ω), t ∈ [0,1], is a Gaussian process it suffices to show that for every n and t₁,...,tₙ ∈ [0,1], the random variables X(t_k,ω), k = 1,...,n, defined on (Ω,F,P) are jointly Gaussian; or equivalently, that the random variables F_{t_k}(u) = (Iω)(t_k) = u(t_k), k = 1,...,n, defined on (X,B(X),μ) are jointly Gaussian. But this follows from the fact that each F_{t_k} is a continuous linear functional on X and μ is a Gaussian measure on (X,B(X)). ∎

3. GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES ON L₂[0,1]
In this section we consider questions Q1 and Q2 for the Hilbert space L₂[0,1] of real valued, square integrable functions with respect to the Lebesgue measure.

First, question Q1 is considered. Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a Gaussian stochastic process. The problem is under what conditions does X(t,ω) induce a Gaussian measure on (X = L₂[0,1], B(X)). Obvious minimal conditions are that X(t,ω) be product (B[0,1] × F) measurable, where B[0,1] is the σ-algebra of Borel measurable subsets of [0,1], and that almost all sample paths of X(t,ω) belong to L₂[0,1]. Denote by (I₁) and (I₂) the following sets of conditions:

(I₁): (i) X(t,ω): ([0,1] × Ω, B[0,1] × F) → (R, B(R)) is measurable, where R is the real line and B(R) is the class of Borel subsets of R. (ii) X(t,ω) is Gaussian with mean m(t), autocorrelation r(t,s) and covariance R(t,s).

(I₂): (i) and (ii) of (I₁), and (iii) ∫₀¹ X²(t,ω) dt < +∞ a.s.
An application of Fubini's theorem proves the following proposition.

Proposition 3.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) satisfies (i) and (iii) of (I₂), then the map T defined by (1.1) is measurable.

Hence, if (i) and (iii) of (I₂) are satisfied, X(t,ω), t ∈ [0,1], induces by (1.2) a probability measure μ_X on (X = L₂[0,1], B(X)). If, in addition, X(t,ω), t ∈ [0,1], is Gaussian, the question under consideration is: under what conditions is μ_X Gaussian? A sufficient condition is given in Theorem 3.1 and is shown to be also necessary in Corollary 3.1. For the proof of Theorem 3.1 the following lemma is needed.
Lemma 3.1. Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a stochastic process satisfying (I₁) and let f(t) be a real valued, Borel measurable function on [0,1]. If

(3.1)    E(∫₀¹ |X(t,ω)f(t)| dt)² < +∞,

then the random variable ξ(ω) defined by

(3.2)    ξ(ω) = ∫₀¹ X(t,ω)f(t) dt   a.s.

is Gaussian with mean ∫₀¹ m(t)f(t) dt and covariance ∫₀¹∫₀¹ R(t,s)f(t)f(s) dt ds.
Remark 3.1. If in Lemma 3.1 the condition (3.1) is replaced by

(II)    E(∫₀¹ X²(t,ω) dt) = ∫₀¹ r(t,t) dt < +∞

or by

(III)    E(∫₀¹ |X(t,ω)| dt)² < +∞,

then the conclusions of the lemma are true for every f ∈ L₂[0,1] and for every real, Borel measurable, a.e. uniformly bounded function f on [0,1], respectively. Note that, as follows by Schwartz's inequality, (II) implies (III).
Proof. In view of (I₁) and (3.1) the random variable ξ(ω) is well defined by (3.2) as an a.s. sample path integral. Let H(X) be the closure in L₂(Ω,F,P) of the linear manifold generated by the random variables X(t,ω), t ∈ [0,1]. Then H(X) ⊆ L₂(Ω,F,P), and clearly every family of random variables in H(X) is jointly Gaussian. The integral

(3.3)    η(ω) = ∫₀¹ X(t,ω)f(t) dt

exists in the quadratic mean sense in H(X) if and only if

(3.4)    σ² = ∫₀¹∫₀¹ r(t,s)f(t)f(s) dt ds < +∞

[6, p. 33]. If the integral exists, η(ω) is Gaussian with E(η²) = σ². It will be shown that the hypotheses of the lemma imply (3.4). Note that r(t,s) = R(t,s) + m(t)m(s) implies

(3.5)    ∫₀¹∫₀¹ |r(t,s)f(t)f(s)| dt ds ≤ ∫₀¹∫₀¹ |R(t,s)f(t)f(s)| dt ds + (∫₀¹ |m(t)f(t)| dt)².

It follows by Tonelli's theorem and by (3.1) that

(3.6)    ∫₀¹∫₀¹ |R(t,s)f(t)f(s)| dt ds ≤ ∫₀¹∫₀¹ E|X(t,ω)X(s,ω)| |f(t)f(s)| dt ds = E(∫₀¹ |X(t,ω)f(t)| dt)² < +∞.

Also, by Tonelli's theorem and (3.1), we obtain

(3.7)    ∫₀¹ |m(t)f(t)| dt ≤ E(∫₀¹ |X(t,ω)f(t)| dt) < +∞.

Equations (3.5), (3.6) and (3.7) imply (3.4).
It is next shown that ξ(ω) = η(ω) a.s. Tonelli's theorem and (3.1) imply that

(3.8)    ∫_Ω ∫₀¹∫₀¹ |X(t,ω)X(s,ω)f(t)f(s)| ds dt dP(ω) < +∞.

Hence, by Fubini's theorem and (3.4), it follows that

(3.9)    E(ξ²) = σ² < +∞.

By applying a property of the quadratic mean integral (3.3) [6, p. 30] and (3.8) we obtain

(3.10)    E(ξη) = ∫₀¹ E[η(ω)X(t,ω)f(t)] dt = ∫₀¹ {∫_Ω (∫₀¹ X(s,ω)f(s)X(t,ω)f(t) ds) dP(ω)} dt = σ².

By (3.9), (3.10) and the fact that E(η²) = σ², it follows that E(ξ − η)² = 0. Hence ξ = η in L₂(Ω,F,P); i.e., ξ(ω) = η(ω) a.s., and since η is Gaussian, so is ξ.

Since by (3.1), ∫₀¹ |X(t,ω)f(t)| dt ∈ L₁(Ω,F,P), we have X(t,ω)f(t) ∈ L₁([0,1] × Ω, B[0,1] × F, dt × P) and therefore

E(ξ) = ∫₀¹ m(t)f(t) dt.

Since E(ξ²) = E(η²) = σ², it follows by (3.1) that

Var(ξ) = E(ξ²) − E²(ξ) = ∫₀¹∫₀¹ R(t,s)f(t)f(s) dt ds. ∎
Theorem 3.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) satisfies (I₁) and (II), then the probability measure μ_X induced on (X = L₂[0,1], B(X)) by X(t,ω) is Gaussian with mean m ∈ X and covariance operator S generated by the kernel R(t,s).

Proof. Note that (II) implies (iii) of (I₂), so that by Proposition 3.1 μ_X is well defined by (1.2). In order to show that μ_X is Gaussian it suffices to show that every continuous linear functional on X is a Gaussian random variable on the probability space (X,B(X),μ_X). Let F be a continuous linear functional on X. Then there exists an f ∈ X such that

F(u) = ⟨u,f⟩ = ∫₀¹ u(t)f(t) dt,  for all u ∈ X.

It follows by (1.2) that

μ_X{u ∈ X: F(u) ≤ a} = P{ω ∈ Ω: ∫₀¹ X(t,ω)f(t) dt ≤ a}.

Hence the random variables F(u) on (X,B(X),μ_X) and

ξ_f(ω) = ∫₀¹ X(t,ω)f(t) dt   a.s.

on (Ω,F,P) are identically distributed. Since ξ_f(ω) is Gaussian by Remark 3.1, so is F(u).
If u₀ ∈ X is the mean of μ_X, it follows from (1.3) and Remark 3.1 that

∫₀¹ u₀(t)f(t) dt = E[⟨u,f⟩] = E[ξ_f(ω)] = ∫₀¹ m(t)f(t) dt

for all f ∈ X. Hence u₀(t) = m(t) a.e. [Leb.] on [0,1]; i.e., m = u₀ in X.

If S is the covariance operator of μ_X and ξ_g(ω) = ∫₀¹ X(t,ω)g(t) dt, g ∈ X, it follows by (1.4) and the fact that the random variables ⟨u,f⟩, ⟨u,g⟩ and ξ_f(ω), ξ_g(ω) are identically distributed that

⟨Sf,g⟩ = E[⟨u − u₀,f⟩⟨u − u₀,g⟩] = E[ξ_f(ω)ξ_g(ω)] − ⟨u₀,f⟩⟨u₀,g⟩

for all f,g ∈ X. As in (3.8) we obtain

∫_Ω ∫₀¹∫₀¹ |X(t,ω)X(s,ω)f(t)g(s)| ds dt dP(ω) < +∞,

and by Fubini's theorem we have

⟨Sf,g⟩ = ∫₀¹∫₀¹ r(t,s)f(t)g(s) dt ds − ⟨m,f⟩⟨m,g⟩ = ∫₀¹∫₀¹ R(t,s)f(t)g(s) dt ds = ⟨Rf,g⟩

for all f,g ∈ X, where R is the trace class (because of (II)) operator on X with kernel R(t,s). Hence S = R. ∎
It should be noted that condition (II) is satisfied if the stochastic process X(t,ω), t ∈ [0,1], has any one of the following properties: mean square continuity, wide sense stationarity, uniformly bounded autocorrelation function.

Theorem 3.1 shows that condition (II) is sufficient for the induced measure μ_X to be Gaussian. It is shown in the following corollary that it is also necessary.
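Condition (II) is a simple integral condition on the diagonal of the autocorrelation, and it can be checked numerically for concrete processes. The sketch below is illustrative (the grid size and the particular kernels are choices of this note): it evaluates ∫₀¹ r(t,t) dt for the Wiener process, where r(t,t) = t, and for a wide sense stationary process on [0,1] with r(t,t) ≡ 1.

```python
# Condition (II): E(int_0^1 X^2(t,w) dt) = int_0^1 r(t,t) dt < +inf.

def integral_r_diag(r_diag, n=100000):
    """Midpoint Riemann sum of r(t,t) over [0,1]."""
    return sum(r_diag((k + 0.5) / n) for k in range(n)) / n

wiener = integral_r_diag(lambda t: t)          # Wiener process: integral = 1/2
stationary = integral_r_diag(lambda t: 1.0)    # wide sense stationary: integral = 1
```

Both integrals are finite, so (II) holds in both cases, in agreement with the list of sufficient properties above.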
Corollary 3.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) satisfies (I₂), then the probability measure μ_X induced on (X = L₂[0,1], B(X)) by X(t,ω) is Gaussian if and only if (II) is satisfied.

Proof. The "if" part is shown in Theorem 3.1. The "only if" part follows from (1.5) and the fact that the random variables ‖u‖² on (X,B(X),μ_X) and ‖X(·,ω)‖² on (Ω,F,P) are identically distributed. ∎

One may wonder whether μ_X can be shown to be a Gaussian measure if the stochastic process X(t,ω), t ∈ [0,1], satisfies conditions (I₂) and (III), which are in general weaker than (I₁) and (II). The answer, in the affirmative, is provided by the following theorem.
Theorem 3.2. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) satisfies (I₂) and (III), then the probability measure μ_X induced on (X = L₂[0,1], B(X)) by X(t,ω) is Gaussian with mean m(t) and covariance operator S with kernel R(t,s), and (II) is satisfied.

Proof. Let {φₙ(t)} be a complete orthonormal set in L₂[0,1] such that each φₙ(t) is continuous on [0,1]. Then X(·,ω) ∈ L₂[0,1] implies

(3.11)    X(t,ω) = Σₙ ξₙ(ω)φₙ(t)   in L₂[0,1] a.s.,  where  ξₙ(ω) = ∫₀¹ X(t,ω)φₙ(t) dt   a.s.
It follows by Remark 3.1 and the proof of Theorem 3.1 that the random variables {ξₙ(ω)} are jointly Gaussian. In order to show that μ_X is Gaussian it suffices to show that every continuous linear functional on X is Gaussian; i.e., that the random variable F(u) = ⟨u,f⟩ on (X,B(X),μ_X) is Gaussian for every f ∈ X. It is seen in the proof of Theorem 3.1 that the random variable F(u) is identically distributed with the random variable ξ(ω) on (Ω,F,P) given by

ξ(ω) = ∫₀¹ X(t,ω)f(t) dt   a.s.

It follows from (3.11) that ξ(ω) = limₙ Σ_{k=1}^{n} ξ_k(ω)⟨f,φ_k⟩ a.s., and by Lemma 5.1 it follows that ξ(ω) is Gaussian. The validity of (II) follows from Corollary 3.1, and the claims about the mean and covariance of μ_X from Theorem 3.1. ∎
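The expansion of a sample path along a uniformly bounded, continuous, complete orthonormal set can be sketched numerically. The basis below (the cosine system φ₀ = 1, φₙ(t) = √2 cos(nπt)), the test path x(t) = t, and the truncation level are illustrative choices of this note; the check is Parseval's relation, the sum of squared coefficients approaching ∫₀¹ x²(t) dt = 1/3.

```python
import math

# Sketch of the expansion (3.11): coefficients xi_n = int_0^1 x(t) phi_n(t) dt
# along the (uniformly bounded) cosine basis on [0,1].

def phi(n, t):
    return 1.0 if n == 0 else math.sqrt(2.0) * math.cos(n * math.pi * t)

def coeff(x, n, m=20000):
    """Midpoint Riemann sum for xi_n = int_0^1 x(t) phi_n(t) dt."""
    return sum(x((k + 0.5) / m) * phi(n, (k + 0.5) / m) for k in range(m)) / m

x = lambda t: t
xi = [coeff(x, n) for n in range(50)]
energy = sum(c * c for c in xi)   # Parseval: approaches int_0^1 t^2 dt = 1/3
```

The first coefficient is ξ₀ = ∫₀¹ t dt = 1/2, and the truncated energy is already within about 10⁻⁴ of 1/3, reflecting the L₂ convergence used in (3.11).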
Remark 3.2. It follows from Theorem 3.2 that if condition (I₂) is satisfied, then (II) and (III) are equivalent.

The affirmative answer to question Q2 is given in the following theorem.
Theorem 3.3. Given any Gaussian measure μ on (X = L₂[0,1], B(X)), there exists a Gaussian stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) which satisfies (I₂) and (II) and induces μ on (X,B(X)).

In order to prove Theorem 3.3 the following lemma is needed.

Lemma 3.2. Let μ be a Gaussian measure on (X = L₂[0,1], B(X)), let u₀ and S be the mean and the covariance operator of μ, and let {φₙ}_{n=1}^∞ be the eigenfunctions of S corresponding to its nonzero eigenvalues {λₙ}_{n=1}^∞. Then:

(1) There exists a set A₁ ∈ B[0,1] with Leb(A₁) = 0 such that (i) for all t ∈ A₁ᶜ the series Σ_{n=1}^∞ ⟨φₙ, u − u₀⟩φₙ(t) converges in L₂(X,B(X),μ), and (ii) for all t ∈ A₁ᶜ there exists a set N_t ∈ B(X) with μ(N_t) = 0 such that the series converges for all u ∈ N_tᶜ, where N_tᶜ denotes the complement of N_t. Define Y(t,u) equal to the limit of the series on A₁ᶜ × X and equal to zero on A₁ × X.

(2) The series Σ_{n=1}^∞ ⟨φₙ, u − u₀⟩φₙ(t) converges in L₂([0,1] × X, B[0,1] × B(X), Leb × μ). Denote the limit by Y′(t,u).

(3) There exists a set A ∈ B[0,1] such that Leb(A) = 0 and Y(t,u) = Y′(t,u) a.e. [Leb × μ] on [0,1] × X.

Proof of Lemma 3.2. Proof of (1). Let f(t) = Σ_{n=1}^∞ λₙφₙ²(t). Since all the functions in this series are nonnegative, real valued and measurable, it follows that f(t) is an extended, real valued, measurable function. Since, by (1.5),

∫₀¹ f(t) dt = Σ_{n=1}^∞ λₙ ∫₀¹ φₙ²(t) dt = Σ_{n=1}^∞ λₙ < +∞,

we conclude that there exists a set A₁ ∈ B[0,1] with Leb(A₁) = 0 such that f(t) < +∞ on A₁ᶜ. Let t ∈ A₁ᶜ be fixed. Then the random variables {ψₙ(t,u) = ⟨φₙ, u − u₀⟩φₙ(t)}_{n=1}^∞ on (X,B(X),μ) are jointly Gaussian with mean zero and

E[ψₙ²(t,u)] = ⟨Sφₙ,φₙ⟩φₙ²(t) = λₙφₙ²(t),

E[ψₙ(t,u)ψ_k(t,u)] = ⟨Sφₙ,φ_k⟩φₙ(t)φ_k(t) = δ_{nk}λₙφₙ²(t),

where δ_{nk} is Kronecker's delta. Hence {ψₙ(t,u)}_{n=1}^∞ is a sequence of independent, zero mean, Gaussian random variables with Σ_{n=1}^∞ E[ψₙ²(t,u)] = Σ_{n=1}^∞ λₙφₙ²(t) = f(t) < +∞. This implies (i), and also, by [3, p. 197], for all t ∈ A₁ᶜ the series Σₙ ψₙ(t,u) converges a.e. [μ], which proves (ii). Clearly the two limits, in (i) and (ii), are equal a.e. [μ].
Proof of (2). Let B[0,1] × B(X) be the product σ-algebra of subsets of [0,1] × X. Then each ψₙ(t,u) is a measurable function, and by Tonelli's theorem

∫₀¹ ∫_X ψₙ²(t,u) dμ(u) dt = ⟨Sφₙ,φₙ⟩ = λₙ.

Hence ψₙ ∈ L₂([0,1] × X, B[0,1] × B(X), Leb × μ) and ψₙψ_k ∈ L₁([0,1] × X) for all n and k. It follows by Fubini's theorem that

∫₀¹ ∫_X ψₙ(t,u)ψ_k(t,u) dμ(u) dt = ⟨Sφₙ,φ_k⟩⟨φₙ,φ_k⟩ = δ_{nk}λ_k.

Hence the {ψₙ(t,u)}_{n=1}^∞ are orthogonal in L₂([0,1] × X), and Σ_{n=1}^∞ ∫₀¹∫_X ψₙ²(t,u) dμ(u) dt = Σ_{n=1}^∞ λₙ < +∞ by (1.5). It follows that the series Σₙ ψₙ(t,u) converges in L₂([0,1] × X).
Proof of (3). Since by (2), Σ_{n=1}^∞ ψₙ(t,u) = Y′(t,u) in L₂([0,1] × X), there exists a subsequence such that

lim_{k→+∞} Σ_{n=1}^{N_k} ψₙ(t,u) = Y′(t,u)   a.e. [Leb × μ] on [0,1] × X.

Let B = {(t,u) ∈ [0,1] × X: Σ_{n=1}^{N_k} ψₙ(t,u) does not converge to Y′(t,u)}. Then (Leb × μ)(B) = 0 and by Fubini's theorem,

0 = (Leb × μ)(B) = ∫₀¹ (∫_{B_t} dμ(u)) dt = ∫₀¹ μ(B_t) dt,

where B_t = {u ∈ X: (t,u) ∈ B} ∈ B(X). Hence μ(B_t) = 0 a.e. [Leb.], and thus there exists a set A₂ ∈ B[0,1] such that Leb(A₂) = 0 and μ(B_t) = 0 for all t ∈ A₂ᶜ. It follows that for every t ∈ A₂ᶜ,

lim_{k→∞} Σ_{n=1}^{N_k} ψₙ(t,u) = Y′(t,u)   for all u ∈ B_tᶜ.

Let A = A₁ ∪ A₂. Then for every t ∈ Aᶜ there exists B_t such that μ(B_t) = 0 and lim_k Σ_{n=1}^{N_k} ψₙ(t,u) = Y′(t,u) for all u ∈ B_tᶜ. By part (1)(ii) it follows that for every t ∈ Aᶜ there exists N_t ∈ B(X) such that μ(N_t) = 0 and lim_{N→∞} Σ_{n=1}^{N} ψₙ(t,u) = Y(t,u) for all u ∈ N_tᶜ. Since {Σ_{n=1}^{N_k} ψₙ(t,u)}_{k=1}^∞ is a subsequence of {Σ_{n=1}^{N} ψₙ(t,u)}_{N=1}^∞, it follows that for every t ∈ Aᶜ and all u ∈ N_tᶜ ∩ B_tᶜ,

Y(t,u) = Y′(t,u).

It now follows from Leb(A) = 0 and (Leb × μ)(B) = 0 that Y(t,u) = Y′(t,u) a.e. [Leb × μ] on [0,1] × X. ∎
Proof of Theorem 3.3. Define the probability space (Ω,F,P) by Ω = X, F = B(X), P = μ, and denote the identity map between Ω and X by I: Iω = u, where u = ω. Define X(t,ω) by

(3.12)    X(t,ω) = u₀(t) + Y(t,Iω)  on Aᶜ × Ω,  and  X(t,ω) = 0  on A × Ω,

where A and Y are the same as in Lemma 3.2. Since u₀(t) and Y(t,u) are B[0,1] × B(X) measurable, it follows that X(t,ω) is B[0,1] × F measurable, and hence (Ω,F,P; X(t,ω), t ∈ [0,1]) is a product measurable stochastic process. It also follows by parts (1) and (3) of Lemma 3.2 that

(3.13)    X(t,ω) = u₀(t) + Σ_{n=1}^∞ ⟨φₙ, Iω − u₀⟩φₙ(t),

the series converging a.s. and also in L₂(Ω,F,P). Since the random variables {⟨φₙ, u − u₀⟩}_{n=1}^∞ on (X,B(X),μ) are jointly Gaussian, so are the random variables {⟨φₙ, Iω − u₀⟩}_{n=1}^∞ on (Ω,F,P). Hence the convergence in L₂(Ω,F,P) of the series (3.13), along with (3.12), imply that X(t,ω), t ∈ [0,1], is a Gaussian stochastic process.

Now u₀(t) ∈ L₂[0,1] implies u₀(t) ∈ L₂([0,1] × Ω). Also Y(t,u) ∈ L₂([0,1] × X) implies Y(t,Iω) ∈ L₂([0,1] × Ω). It follows that X(t,ω) ∈ L₂([0,1] × Ω) and by Fubini's theorem we obtain

∫_Ω ∫₀¹ X²(t,ω) dt dP(ω) = E(∫₀¹ X²(t,ω) dt) < +∞;

i.e., X(t,ω) satisfies (II). Thus X(t,ω), t ∈ [0,1], satisfies both (I₂) and (II), and by Theorem 3.1 it induces a Gaussian measure μ_X on (X,B(X)). It will be shown that μ_X = μ. For this it suffices to show that μ_X(B) = μ(B) for any open sphere B with radius ε > 0 and center any w ∈ X. Let B = {u ∈ X: ‖u − w‖ < ε}. Then we have by (1.2), (3.12) and (3.13)

μ_X(B) = P{ω ∈ Ω: ‖X(·,ω) − w(·)‖ < ε} = μ{u ∈ X: ‖u₀ + Σ_{n=1}^∞ ⟨φₙ, u − u₀⟩φₙ − w‖ < ε}.

Since the support of μ is Sup(μ) = u₀ + s̄p{φₙ}, we have u₀ + Σₙ ⟨φₙ, u − u₀⟩φₙ = u for u ∈ Sup(μ), and hence

μ_X(B) = μ{u ∈ Sup(μ): ‖u − w‖ < ε} = μ{u ∈ X: ‖u − w‖ < ε} = μ(B),

which completes the proof. ∎
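The series construction behind (3.13) can be sketched numerically through the kernel identity R(t,s) = Σₙ λₙφₙ(t)φₙ(s) built from the eigenpairs of S. The sketch below uses the standard eigenpairs of the Wiener kernel min(t,s) on [0,1], namely λₙ = 1/((n − ½)²π²) and φₙ(t) = √2 sin((n − ½)πt); the evaluation point and truncation level are illustrative choices of this note.

```python
import math

# Reconstruct the Wiener covariance kernel min(t,s) from the eigenpairs of
# its covariance operator, truncating the series at a finite number of terms.

def kl_kernel(t, s, terms=20000):
    acc = 0.0
    for n in range(1, terms + 1):
        w = (n - 0.5) * math.pi
        acc += (2.0 / (w * w)) * math.sin(w * t) * math.sin(w * s)
    return acc

approx = kl_kernel(0.3, 0.7)   # should be close to min(0.3, 0.7) = 0.3
```

The truncated sum reproduces min(0.3, 0.7) to within roughly 10⁻⁵, illustrating how the eigenfunction series of Lemma 3.2 carries the full covariance structure of the measure.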
4. SOME GENERALIZATIONS AND EXTENSIONS

In this section the following generalizations and extensions of the results presented in Sections 2 and 3 are considered.

4.1 From [0,1] to any Borel measurable subset T of the real line.

All the results presented in Sections 2 and 3 for the case where the parameter t takes values on T = [0,1] clearly hold for any compact subset T of the real line. Moreover, it is easily seen that all the results of Section 3 also hold for every Borel measurable subset T of the real line. The only use of the compactness of the interval [0,1] is made in the proof of Theorem 3.2, in concluding that the continuous, orthonormal and complete functions φₙ(t) in L₂[0,1] are uniformly bounded. However, what is essential in the proof is clearly the uniform boundedness on T of each function in a complete orthonormal set in L₂(T,B(T),Leb), and this can always be satisfied by an appropriate choice of a complete orthonormal set even if T is a set of infinite Lebesgue measure.
4.2 On the induction of a Gaussian measure on an appropriate space by a Gaussian stochastic process.

The two questions raised in the introduction have been answered in the affirmative for the function spaces C[0,1] and L₂[0,1] in Sections 2 and 3 respectively. It should be remarked that the only case where an affirmative answer is obtained under restrictive assumptions, and by no means in general, is the important case of inducing a Gaussian measure on L₂[0,1] from a Gaussian process X(t,ω), t ∈ [0,1], the restrictive assumption being condition (II). We now turn our attention to this case and ask whether assumption (II) is essential to inducing a Gaussian measure or not. It turns out that the need for assumption (II) is solely due to the specific way in which we attempted to provide an answer, and that an affirmative answer can indeed be given in the general case with no restrictive assumptions whatever. Specifically, it is presently shown that every product measurable, Gaussian stochastic process X(t,ω), t ∈ T, where T is any Borel measurable set on the real line, induces a Gaussian measure on an appropriate Hilbert space of square integrable functions on T.

Let (Ω,F,P) be a probability space. In this section by (I₁), (I₂) and (II) we mean conditions (I₁), (I₂) and (II) with the interval [0,1] replaced by a Borel measurable set T of the real line. It follows by Theorem 3.1 and Section 4.1 that a stochastic process X(t,ω), t ∈ T, satisfying (I₁) induces a Gaussian measure on L₂(T,B(T),Leb.) if (II) is satisfied, i.e., if ∫_T r(t,t) dt < +∞.
Consider a measure ν on (T,B(T)) such that

(IV)    ∫_T r(t,t) dν(t) < +∞.

That such measures exist follows from the following particular choice. Define ν₀ on (T,B(T)) by [dν₀/dLeb](t) = f(t)g(t), where g(t) ≥ 0 with ∫_T g(t) dt < +∞, and

f(t) = 1  for 0 ≤ r(t,t) < 1,    f(t) = 1/r(t,t)  for 1 ≤ r(t,t).

It is clear that ν₀ satisfies (IV) and is a finite measure. The following theorem can be proved in the same way as Theorem 3.1.
Theorem 4.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ T) satisfies (I₁), then for every measure ν on (T,B(T)) satisfying (IV), X(t,ω), t ∈ T, induces a Gaussian measure μ_X on (X = L₂(T,B(T),ν), B(X)) with mean m ∈ X and covariance operator S generated by the kernel R(t,s).
Hence, even though a product measurable Gaussian process does not necessarily induce a Gaussian measure on (H_Leb = L₂(T,B(T),Leb.), B(H_Leb)), the necessary and sufficient condition for the latter being (II), it always induces a Gaussian measure on every (H_ν = L₂(T,B(T),ν), B(H_ν)) with ν satisfying (IV). For instance, a wide sense stationary process X(t,ω), t ∈ (−∞,+∞) = R, satisfying (I₁) does not induce a measure on L₂(R,B(R),Leb.), but it induces a Gaussian measure on L₂(R,B(R),ν) for every finite measure ν. Also a harmonizable Gaussian process X(t,ω), t ∈ R, does not necessarily induce a Gaussian measure on L₂(R,B(R),Leb.), but it induces a Gaussian measure on L₂(R,B(R),ν) for every finite measure ν. For a more concrete example consider the Wiener process W(t,ω), t ∈ [0,+∞) = R⁺; then r(t,s) = min(t,s), and even though W(t,ω) does not induce a measure on L₂(R⁺,B(R⁺),Leb.), it induces a Gaussian measure for example on L₂(R⁺,B(R⁺),ν), where ν is defined by [dν/dLeb](t) = e⁻ᵗ.
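The Wiener example can be checked numerically: on T = [0,+∞) the diagonal r(t,t) = t is not Lebesgue integrable, but it is integrable against the weight e⁻ᵗ, with ∫₀^∞ t e⁻ᵗ dt = 1. The sketch below is illustrative (the truncation point and grid size are choices of this note).

```python
import math

# Condition (IV) for the Wiener process on [0, +inf): r(t,t) = t.
# Against Lebesgue measure the integral diverges; against dnu = e^{-t} dt
# it equals Gamma(2) = 1.

def integral(f, a, b, n=200000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

r_diag = lambda t: t
weighted = integral(lambda t: r_diag(t) * math.exp(-t), 0.0, 40.0)  # ~ 1
leb_partial = integral(r_diag, 0.0, 40.0)   # = 800 on [0,40]; grows without bound
```

The partial Lebesgue integral grows like b²/2 with the cutoff b, while the weighted integral has already converged to 1, in line with the claim that ν with density e⁻ᵗ satisfies (IV) where Lebesgue measure does not.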
Remark 4.1. Denote by N the set of measures ν on (T,B(T)) which satisfy (IV). A meaningful choice of ν in N assigns positive measure to all Lebesgue measurable subsets, with positive Lebesgue measure, of the set T₀ = {t ∈ T: r(t,t) ≠ 0}. That such measures ν exist is demonstrated by the measure ν₀. The construction of ν₀ makes also clear that there exist ν ∈ N which are absolutely continuous with respect to the Lebesgue measure with, moreover, [dν/dLeb](t) ≠ 0 a.e. [Leb.] on T₀; i.e., there exist ν ∈ N which are equivalent to the Lebesgue measure on (T₀,B(T₀)).
Remark 4.2. Theorem 4.1 states that for every ν ∈ N, a product measurable, Gaussian stochastic process X(t,ω), t ∈ T, induces a Gaussian measure μ_ν on H_ν = L₂(T,B(T),ν). Let S_ν be the covariance operator of μ_ν and {φ_{ν,n}}ₙ its eigenfunctions corresponding to its nonzero eigenvalues. The support of μ_ν is Sup(μ_ν) = m + s̄p{φ_{ν,n}} = m + R̄(S_ν), where R̄(S_ν) is the closure in H_ν of the range R(S_ν) of the operator S_ν. Hence the Hilbert spaces H_ν, the induced Gaussian measures μ_ν, and the supports m + R̄(S_ν) of μ_ν depend on the choice of ν in N. Some interesting questions arise in this connection. First, whether the equivalence or singularity of the Gaussian measures induced by two product measurable, Gaussian stochastic processes depends on ν in N; this problem will not be considered here. Secondly, how does Sup(μ_ν), or R̄(S_ν), depend on ν in N? Note that X(·,ω) − m(·) ∈ R̄(S_ν) a.s. for all ν ∈ N, and therefore it would be interesting to know whether there exists a minimal R̄(S_ν) for some ν ∈ N. Even though an affirmative answer to the latter question does not seem in general plausible, the following remarks can be made.
(i) Let ν_i ∈ N, i = 1,2, be such that ν_i ≪ Leb., let [dν_i/dLeb](t) = f_i(t), and suppose f₁(t) ≤ f₂(t) a.e. [Leb.] on T. Then H_{ν₂} ⊆ H_{ν₁}. However, an inclusion relationship between R̄(S_{ν₁}) and R̄(S_{ν₂}) does not seem to hold in general, except if S_{ν₁} and S_{ν₂} are strictly positive definite operators, in which case R̄(S_{ν_i}) = H_{ν_i}, i = 1,2, and R̄(S_{ν₂}) ⊆ R̄(S_{ν₁}).
(ii) If Leb. ∈ N, then for every ν ∈ N such that ν ≪ Leb. and [dν/dLeb](t) ≤ c < +∞ on T₀, we have H_Leb ⊆ H_ν and R̄(S_ν) ⊆ R̄(S_Leb). However, there may exist ν ∈ N such that R̄(S_ν) ⊊ R̄(S_Leb). For instance consider the Wiener process W(t,ω), t ∈ [0,1); then r(t,t) = t and Leb. ∈ N, and if ν is defined by [dν/dLeb](t) = 1/tᵖ, 1 ≤ p < 2, then ν ∈ N and, by (i), R̄(S_ν) = H_ν ⊊ H_Leb = R̄(S_Leb.).

(iii) If Leb. ∉ N then (i) is still applicable. However, a measure in N corresponding to a minimal R̄(S_ν) does not seem in general to exist. For instance consider the Wiener process W(t,ω), t ∈ [0,+∞), and the measures ν_k, k = 1,2,..., defined on (R⁺,B(R⁺)) by [dν_k/dLeb](t) equal to 1 on [0,k) and to e^{−(t−k)²/2k} on [k,+∞). Then Leb. ∉ N, ν_k ∈ N for all k, and, by (i), R̄(S_{ν_{k+1}}) ⊆ R̄(S_{ν_k}) for all k. Note that ⋂_k H_{ν_k} = H_Leb and Leb. ∉ N, and hence there is no ν ∈ N such that R̄(S_ν) ⊆ R̄(S_{ν_k}) for all k.
4.3. Gaussian stochastic processes and Gaussian measures on Lp[O,l).
It is shown in Theorem 3.2 that if the stochastic process X(t,ω), t ∈ [0,1], satisfies (I_2) and (III), then it induces a Gaussian measure on (H_2 = L_2[0,1], B(H_2)). Note that (III) implies only integrability, and not square integrability, of almost all sample paths of X(t,ω). Hence the question arises as to whether a Gaussian measure is induced on (H_1 = L_1[0,1], B(H_1)) by X(t,ω) if (I_1) and (III) are satisfied. The answer to this question is shown to be affirmative in Theorem 4.2.
This naturally raises the question of inducing a Gaussian measure on L_p[0,1], 1 ≤ p < +∞. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) is product (B[0,1] × F) measurable and if ∫_0^1 |X(t,ω)|^p dt < +∞ a.s. for 1 ≤ p < +∞, then the map T defined by (1.1) with X = L_p[0,1] and X(t,ω), t ∈ [0,1], is measurable and induces on (X, B(X)) a probability measure μ_X defined by (1.2). The following theorem can be proved as Theorem 3.1.
Theorem 4.2. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) satisfies (I_1) and (III) for p = 1, or

(V)  E(∫_0^1 |X(t,ω)|^p dt)^{2/p} < +∞  for 1 < p < +∞,

then the probability measure μ_X induced on (H_p = L_p[0,1], B(H_p)) by X(t,ω) is Gaussian.
Theorem 4.2 answers question Q_1 for the spaces L_p[0,1], 1 ≤ p < +∞. Theorem 3.1 is thus obtained from Theorem 4.2 for p = 2. Theorem 4.2 continues to hold if the set [0,1] is replaced by a Borel measurable set T on the real line.
The study of question Q_2 in the general L_p[0,1] space, 1 ≤ p < +∞, appears to be considerably more complicated than in the case p = 2. The results reported recently in [7] and [10] seem to provide the appropriate structure to approach this question.
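That the induced measure μ_X is Gaussian means in particular that every continuous linear functional x ↦ ∫_0^1 x(t) g(t) dt is a Gaussian random variable with variance ∫_0^1 ∫_0^1 g(s) r(s,t) g(t) ds dt. A minimal numerical sketch of this variance formula for the Wiener covariance r(s,t) = min(s,t) and g ≡ 1 (our choice of illustration, not an example from the paper):

```python
def functional_variance(n=400):
    """Variance of int_0^1 W(t) dt for the Wiener process, approximated
    by a midpoint double Riemann sum of its covariance r(s,t) = min(s,t)."""
    h = 1.0 / n
    pts = [(i + 0.5) * h for i in range(n)]
    return sum(min(s, t) for s in pts for t in pts) * h * h

# Exact value: int_0^1 int_0^1 min(s,t) ds dt = 1/3.
```

The sum converges to 1/3, the variance of the Gaussian random variable ∫_0^1 W(t) dt.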
5. APPENDIX
We prove the following lemma which is used in the proofs of Theorems
2.1 and 3.2.
Lemma. If the real random variables ξ_n, n = 1,2,…, are jointly Gaussian and lim_{n→+∞} ξ_n = ξ a.s., where ξ is an a.s. finite random variable, then ξ is Gaussian.

Proof. Since the sequence of random variables ξ_n converges a.s. to the a.s. finite random variable ξ, it follows that the sequence of distribution functions of the ξ_n's converges completely to the distribution function of ξ. Hence, if we denote by f_n(t) = e^{iu_n t − ½σ_n² t²} the characteristic function of ξ_n, n = 1,2,…, and by f(t) the characteristic function of ξ, we have

(5.1)  lim_{n→+∞} f_n(t) = f(t)  for all t ∈ (−∞,+∞).
Eq. (5.1) implies that

(5.2)  arg f_n(1) = u_n → arg f(1) ≡ u_∞, with |u_∞| < +∞,

and that

|f_n(1/√2)| = e^{−¼σ_n²} → |f(1/√2)|.

Note that the limit |f(1/√2)| is strictly positive; for if |f(1/√2)| = 0, then |f(t)| = lim_{n→+∞} |f_n(t)| would equal 1 for t = 0 and 0 for t ≠ 0, which contradicts the fact that f(t) is continuous in t. Hence

(5.3)  σ_n² → σ_∞² < +∞.

It follows from (5.1), (5.2) and (5.3) that

f(t) = e^{iu_∞ t − ½σ_∞² t²}  for all t ∈ (−∞,+∞),

and thus ξ is Gaussian. ‖
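The mechanics of the proof can be traced on a concrete sequence: take ξ_n = (1 + 1/n)ξ with ξ standard Gaussian, so the ξ_n are jointly Gaussian, converge a.s. to ξ, and have u_n = 0 and σ_n = 1 + 1/n. The following editorial sketch (function names are ours) checks the convergence (5.1) and the strict positivity of |f(1/√2)| that the proof relies on:

```python
import math

def f_n(n, t):
    """Characteristic function of xi_n = (1 + 1/n) * xi, xi ~ N(0,1):
    u_n = 0 and sigma_n = 1 + 1/n, so f_n(t) = exp(-sigma_n**2 * t**2 / 2)."""
    s = 1.0 + 1.0 / n
    return math.exp(-0.5 * (s * t) ** 2)

def f(t):
    """Characteristic function of the a.s. limit xi ~ N(0,1)."""
    return math.exp(-0.5 * t ** 2)
```

Here |f_n(1/√2)| = e^{−¼σ_n²} converges to |f(1/√2)| = e^{−¼} > 0, so σ_n² can be recovered from f_n(1/√2) as in step (5.3).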
REFERENCES

[1] C. R. Baker, Mutual information for Gaussian processes, SIAM J. Appl. Math., 19 (1970), 451-458.

[2] I. M. Gel'fand and A. M. Yaglom, Calculation of the amount of information about a random function, contained in another such function, Amer. Math. Soc. Transl. (2), 12 (1959), 199-246.

[3] P. R. Halmos, "Measure Theory," Van Nostrand, Princeton, N.J., 1950.

[4] K. Ito, The topological support of Gauss measure on Hilbert space, Nagoya Math. J., 38 (1970), 181-183.

[5] G. Kallianpur and H. Oodaira, The equivalence and singularity of Gaussian measures, in M. Rosenblatt (ed.), "Proc. of Symp. on Time Series Analysis," 279-291, Wiley, New York, 1962.

[6] K. Karhunen, Über lineare Methoden in der Wahrscheinlichkeitsrechnung, Ann. Acad. Scient. Fennicae, Ser. A I, No. 37 (1947), 1-79.

[7] J. Kuelbs, Gaussian measures on Banach space, J. Funct. Anal., 5 (1970), 354-367.

[8] E. Mourier, Eléments aléatoires dans un espace de Banach, Ann. Inst. H. Poincaré, 13 (1953), 161-244.

[9] W. L. Root, Singular Gaussian measures in detection theory, in M. Rosenblatt (ed.), "Proc. of Symp. on Time Series Analysis," 292-315, Wiley, New York, 1962.

[10] H. Sato, Gaussian measure on Banach space and abstract Wiener measure, Nagoya Math. J., 36 (1969), 65-81.