Note: Throughout this paper $\{\epsilon_t\}$ is a sequence of uncorrelated random variables (white noise) having zero mean and variance $\sigma_\epsilon^2$, unless stated otherwise. The term "stationary" will always be taken to mean second-order stationary. All processes are real-valued unless stated otherwise. The sample interval is unity unless stated otherwise.
1. (a) What is meant by saying that a stochastic process is stationary?
(b) Determine whether the following process is stationary, giving your reasons:
$$X_t + \tfrac{1}{12} X_{t-1} = \tfrac{1}{24} X_{t-2} + \epsilon_t.$$
(c) Define a real-valued deterministic sequence $\{y_t\}$ by
$$y_t = \begin{cases} +1, & \text{if } t = 0, -1, -2, \ldots, \\ -1, & \text{if } t = 1, 2, 3, \ldots. \end{cases}$$
Now define a stochastic process by $X_t = y_t I$, where $I$ is a random variable taking on the values $+1$ and $-1$ with probability $1/2$ each.
Find the mean, variance and autocovariance of $\{X_t\}$ and determine, with justification, whether this process is stationary.
(d) A complex-valued time series $Z_t$ is given by $Z_t = C e^{i(2\pi f_0 t + \theta)}$, where $f_0$ and $C$ are finite real-valued constants and $\theta$ is uniformly distributed over $[-\pi, \pi]$.
Determine, with justification, whether this process is stationary.
[The autocovariance for a complex-valued time series is given by $\mathrm{cov}\{Z_t, Z_{t+\tau}\} = E\{Z_t Z^*_{t+\tau}\} - E\{Z_t\}E\{Z^*_{t+\tau}\}$, where $*$ denotes complex conjugate.]
M3S8/M4S8 Time Series (2007)
2. (a) Suppose $\{X_t\}$ is an MA($q$) process with zero mean, i.e., $X_t$ can be expressed in the form
$$X_t = -\theta_{0,q}\epsilon_t - \theta_{1,q}\epsilon_{t-1} - \cdots - \theta_{q,q}\epsilon_{t-q},$$
where the $\theta_{j,q}$'s are constants ($\theta_{0,q} \equiv -1$, $\theta_{q,q} \neq 0$). Show that its autocovariance sequence is given by
$$s_\tau = \begin{cases} \sigma_\epsilon^2 \sum_{j=0}^{q-|\tau|} \theta_{j,q}\theta_{j+|\tau|,q}, & \text{if } |\tau| \le q, \\ 0, & \text{if } |\tau| > q. \end{cases}$$
(b) Consider the non-invertible MA(2) process
$$X_t = \epsilon_t - \tfrac{9}{4}\epsilon_{t-2},$$
with $\theta_{1,2} = 0$.
(i) Calculate the autocorrelation sequence of this process.
(ii) Find an invertible MA(2) process having the same autocorrelation sequence, fully justifying your result.
(c) Suppose that $\{X_t\}$ is the MA(2) process
$$X_t = \epsilon_t - \theta_{2,2}\epsilon_{t-2},$$
with $\theta_{1,2} = 0$.
Now let $Y_t = X_{2t}$, $t \in \mathbb{Z}$, i.e., the process $\{Y_t\}$ is formed by subsampling every other random variable from the process $\{X_t\}$, and hence $\{Y_t\}$ has a sampling interval of 2.
Given that $s_{Y,\tau} = s_{X,2\tau}$, show that $S_Y(f) = 2 S_X(f)$ for $|f| \le 1/4$.
Hint: A stationary process with autocovariance sequence $\{s_\tau\}$ and sample interval $\Delta t$ has spectral density function
$$S(f) = \Delta t \sum_{\tau=-\infty}^{\infty} s_\tau e^{-i2\pi f \tau \Delta t}.$$
3. (a) Use the fact that
$$(1 - z)\sum_{t=-(N-1)}^{N-1} z^t = z^{-N+1} - z^N$$
to show that
$$\sum_{t=-(N-1)}^{N-1} e^{i2\pi f t} = (2N-1)\mathcal{D}_{2N-1}(f),$$
where $\mathcal{D}_{2N-1}(f)$ is a form of Dirichlet's kernel, defined as
$$\mathcal{D}_{2N-1}(f) = \frac{\sin[(2N-1)\pi f]}{(2N-1)\sin(\pi f)}.$$
(b) Consider the following autocovariance sequence,
$$s_\tau = \begin{cases} 1, & \text{if } |\tau| \le K, \\ 0, & \text{if } |\tau| > K, \end{cases}$$
where $K \ge 2$. Is $\{s_\tau\}$ the autocovariance sequence for some discrete stationary process with spectral density function $S(f)$?
(c) Specify the three conditions which must be satisfied by a linear time-invariant (LTI) digital filter.
(d) Let $\{X_t\}$ be a discrete stationary process with a spectral density function $S_X(f)$. Let
$$Y_t = X_t - \frac{1}{2K+1}\sum_{j=-K}^{K} X_{t+j},$$
where $K \ge 1$.
Find the spectral density function, $S_Y(f)$, for $\{Y_t\}$ when $\{X_t\}$ is white noise with variance unity.
4. (a) What is meant by saying two discrete time stochastic processes $\{X_t\}$ and $\{Y_t\}$ are jointly stationary stochastic processes?
(b) Suppose $\{X_t\}$ and $\{Y_t\}$ are zero mean jointly stationary processes given by
$$X_t = \epsilon_t + \theta\epsilon_{t-1}; \qquad Y_t = \epsilon_t - \theta\epsilon_{t-1},$$
with $|\theta| < 1$.
(i) By first finding the cross-covariance sequence $\{s_{XY,\tau}\}$, or otherwise, show that the cross-spectrum $S_{XY}(f)$ is given by
$$S_{XY}(f) = \sigma_\epsilon^2\left[(1-\theta^2) + 2i\theta\sin 2\pi f\right].$$
(ii) Find the value of the magnitude squared coherence, $\gamma^2_{XY}(f)$.
(iii) Now consider $\{X_t\}$ to be the input to a linear filter with frequency response function $G(f)$ and $\{Y_t\}$ to be the output.
Find $|G(f)|^2$ and hence identify the polynomials $\Phi(B)$ and $\Theta(B)$ in the stationary and invertible ARMA representation
$$\Phi(B)Y_t = \Theta(B)X_t.$$
Here, as usual, $B$ is the backshift operator.
(iv) Explain the magnitude squared coherence value obtained in (ii) in terms of the result in (iii).
5. Assume that $\{X_t\}$ can be written as a one-sided linear process, so that
$$X_t = \sum_{k=0}^{\infty} \psi_k \epsilon_{t-k} = \Psi(B)\epsilon_t.$$
We wish to construct the $l$-step ahead forecast
$$X_t(l) = \sum_{k=0}^{\infty} \delta_k \epsilon_{t-k} = \Delta(B)\epsilon_t.$$
(a) Show that the $l$-step prediction variance $\sigma^2(l) = E\{(X_{t+l} - X_t(l))^2\}$ is minimized by setting $\delta_k = \psi_{k+l}$, $k \ge 0$.
(b) Consider the stationary AR(2) process $X_t = \phi_{2,2} X_{t-2} + \epsilon_t$, where $\phi_{1,2} = 0$. Show that
$$X_t(l) = \begin{cases} \phi_{2,2}^{l/2} X_t, & \text{if } l \text{ even}, \\ \phi_{2,2}^{(l+1)/2} X_{t-1}, & \text{if } l \text{ odd}. \end{cases}$$
(c) It was stated in the course notes that for a general AR($p$) process,
$$X_t = \phi_{1,p} X_{t-1} + \cdots + \phi_{p,p} X_{t-p} + \epsilon_t,$$
$X_t(l)$ depends only on the last $p$ observed values of $\{X_t\}$ and may be obtained by solving the AR($p$) difference equation with the future innovations set to zero; in particular
$$X_t(1) = \phi_{1,p} X_t + \cdots + \phi_{p,p} X_{t-p+1},$$
which is $X_{t+1}$ with the future innovation set to zero.
(i) Show that, for $p \ge 2$,
$$X_{t+2} = \phi_{1,p}\left[X_t(1) + \epsilon_{t+1}\right] + \sum_{j=2}^{p} \phi_{j,p} X_{t+2-j} + \epsilon_{t+2}$$
and hence that
$$X_{t+2} = \sum_{j=1}^{p} \left[\phi_{1,p}\phi_{j,p} + \phi_{j+1,p}\right] X_{t+1-j} + \phi_{1,p}\epsilon_{t+1} + \epsilon_{t+2},$$
where $\phi_{j,p} = 0$ if $j > p$.
(ii) What does the result in (i) suggest for $X_t(2)$? Show that this $X_t(2)$ agrees with the result in (b).
1. (a) $\{X_t\}$ is second-order stationary if $E\{X_t\}$ is a finite constant for all $t$, $\mathrm{var}\{X_t\}$ is a finite constant for all $t$, and $\mathrm{cov}\{X_t, X_{t+\tau}\}$ is a finite quantity depending only on $\tau$ and not on $t$.
(b) The corresponding characteristic polynomial is
$$\Phi(z) = 1 + \tfrac{1}{12}z - \tfrac{1}{24}z^2,$$
which can be factorized as $(1 - \tfrac{1}{6}z)(1 + \tfrac{1}{4}z)$, so that the roots are $6$ and $-4$, which are both outside the unit circle, and therefore this AR(2) process is stationary.
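The root check can be confirmed numerically; a minimal sketch (the coefficient ordering follows `numpy.roots`, highest degree first):

```python
# Check the roots of Phi(z) = 1 + z/12 - z^2/24 for the AR(2) model
# X_t + (1/12) X_{t-1} - (1/24) X_{t-2} = eps_t.
import numpy as np

coeffs = [-1 / 24, 1 / 12, 1]          # highest degree first for numpy.roots
roots = np.roots(coeffs)
print(np.sort(roots.real))             # roots are -4 and 6
print(all(abs(r) > 1 for r in roots))  # True: outside unit circle, stationary
```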
(c) Note first that $E\{I\} = \tfrac12 \times 1 + \tfrac12 \times (-1) = 0$. We thus have
$$E\{X_t\} = E\{y_t I\} = y_t E\{I\} = 0.$$
For the variance, note that $\mathrm{var}\{I\} = E\{I^2\} = \tfrac12 \times 1^2 + \tfrac12 \times (-1)^2 = 1$. We thus have $\mathrm{var}\{X_t\} = \mathrm{var}\{y_t I\} = y_t^2 E\{I^2\} = 1$, since $y_t^2 = 1$ for all $t$.
For the autocovariance, we have $\mathrm{cov}\{X_t, X_{t+\tau}\} = E\{X_t X_{t+\tau}\} = y_t y_{t+\tau} E\{I^2\} = y_t y_{t+\tau}$. Now, if either (a) $t + \tau \le 0$ and $t \le 0$ or (b) $t + \tau > 0$ and $t > 0$, then $y_t y_{t+\tau} = 1$; otherwise, $y_t y_{t+\tau} = -1$.
A requirement of stationarity is that $\mathrm{cov}\{X_t, X_{t+\tau}\}$ be a finite number independent of $t$. This is not true for this stochastic process. For example, if $\tau = 5$ and $t = 0$, then $\mathrm{cov}\{X_t, X_{t+\tau}\} = -1$; on the other hand, if $\tau = 5$ and $t = 1$, then $\mathrm{cov}\{X_t, X_{t+\tau}\} = 1$. We conclude that $\{X_t\}$ is not a stationary process.
(d) Firstly,
$$E\{Z_t\} = \frac{1}{2\pi}\int_{-\pi}^{\pi} C e^{i(2\pi f_0 t + \theta)}\, d\theta = C e^{i2\pi f_0 t}\left[\frac{e^{i\pi} - e^{-i\pi}}{i2\pi}\right] = 0.$$
So,
$$\mathrm{cov}\{Z_t, Z_{t+\tau}\} = E\{C e^{i(2\pi f_0 t+\theta)} \cdot C e^{-i(2\pi f_0[t+\tau]+\theta)}\} = C^2 e^{-i2\pi f_0 \tau},$$
which is finite and dependent on $\tau$ and not $t$. Hence, the process is stationary.
M3S8/M4S8 Time Series (SOLUTIONS) (2007)
2. (a) Since $E\{\epsilon_t \epsilon_{t+\tau}\} = 0$ for all $\tau \neq 0$, we have for $\tau \ge 0$
$$s_\tau = \mathrm{cov}\{X_t, X_{t+\tau}\} = \sum_{j=0}^{q}\sum_{k=0}^{q} \theta_{j,q}\theta_{k,q}\, E\{\epsilon_{t-j}\epsilon_{t+\tau-k}\} = \sigma_\epsilon^2 \sum_{j=0}^{q-\tau} \theta_{j,q}\theta_{j+\tau,q} \qquad (k = j + \tau).$$
Since an MA($q$) is stationary, $s_\tau = s_{-\tau}$, and so the autocovariance sequence is given by
$$s_\tau = \begin{cases} \sigma_\epsilon^2 \sum_{j=0}^{q-|\tau|}\theta_{j,q}\theta_{j+|\tau|,q}, & \text{if } |\tau| \le q, \\ 0, & \text{if } |\tau| > q. \end{cases}$$
(b) (i) We have $\theta_{0,2} = -1$, $\theta_{1,2} = 0$, $\theta_{2,2} = 9/4$. So
$$s_\tau = \begin{cases} \left[1 + \tfrac{81}{16}\right]\sigma_\epsilon^2, & \text{if } \tau = 0, \\ 0, & \text{if } |\tau| = 1, \\ -\tfrac{9}{4}\sigma_\epsilon^2, & \text{if } |\tau| = 2, \\ 0, & \text{if } |\tau| > 2, \end{cases}
\qquad\text{so}\qquad
\rho_\tau = \begin{cases} 1, & \text{if } \tau = 0, \\ 0, & \text{if } |\tau| = 1, \\ -\tfrac{36}{97}, & \text{if } |\tau| = 2, \\ 0, & \text{if } |\tau| > 2. \end{cases}$$
(ii) We note that the only non-zero term (apart from $\rho_0 = 1$) is for lag $\pm 2$ and is a ratio of the form $-x/(1+x^2)$ (with $x = \theta_{2,2}$), which is equal to $-(1/x)/[1 + (1/x^2)]$. (This is of the same form as for an MA(1) and lag $\pm 1$, which the students have met, and know to invert the root.) So replace the roots of the characteristic polynomial by their inverses: $(-2/3, 2/3) \to (-3/2, 3/2)$, which is the same as inverting $\theta_{2,2}$, and so the desired process is
$$X_t = \epsilon_t - \tfrac{4}{9}\epsilon_{t-2}.$$
The roots are outside the unit circle so that the process is invertible.
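The equality of the two autocorrelation sequences can be checked directly from the MA autocovariance formula; a short sketch (with $\sigma_\epsilon^2 = 1$, and the MA impulse response written as $c = (1, 0, -\theta_{2,2})$):

```python
# Confirm theta_2 = 9/4 and its reciprocal 4/9 give MA(2) processes with
# identical autocorrelation sequences: rho_2 = -36/97 in both cases.
import numpy as np

def ma_acvs(impulse, tau):
    """s_tau = sum_j c_j c_{j+|tau|} for an MA process with impulse response c."""
    c = np.asarray(impulse, dtype=float)
    tau = abs(tau)
    return float(np.sum(c[: len(c) - tau] * c[tau:])) if tau < len(c) else 0.0

for theta2 in (9 / 4, 4 / 9):
    c = [1.0, 0.0, -theta2]              # X_t = eps_t - theta2 * eps_{t-2}
    rho2 = ma_acvs(c, 2) / ma_acvs(c, 0)
    print(theta2, rho2)                  # rho_2 = -36/97 for both
```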
(c) From part (b) (writing $\theta$ for $\theta_{2,2}$),
$$s_{X,\tau} = \begin{cases} \left[1+\theta^2\right]\sigma_\epsilon^2, & \text{if } \tau = 0, \\ 0, & \text{if } |\tau| = 1, \\ -\theta\sigma_\epsilon^2, & \text{if } |\tau| = 2, \\ 0, & \text{if } |\tau| > 2, \end{cases}
\qquad\text{and so}\qquad
s_{Y,\tau} = \begin{cases} \left[1+\theta^2\right]\sigma_\epsilon^2, & \text{if } \tau = 0, \\ -\theta\sigma_\epsilon^2, & \text{if } |\tau| = 1, \\ 0, & \text{if } |\tau| > 1. \end{cases}$$
From the formula given in the hint,
$$S_Y(f) = 2\sum_{\tau=-\infty}^{\infty} s_{Y,\tau}\, e^{-i2\pi f\tau\cdot 2} = 2\sigma_\epsilon^2\left[(1+\theta^2) - \theta e^{+i4\pi f} - \theta e^{-i4\pi f}\right] = 2\sigma_\epsilon^2\left[(1+\theta^2) - 2\theta\cos 4\pi f\right].$$
And
$$S_X(f) = \sum_{\tau=-\infty}^{\infty} s_{X,\tau}\, e^{-i2\pi f\tau} = \sigma_\epsilon^2\left[(1+\theta^2) - \theta e^{+i4\pi f} - \theta e^{-i4\pi f}\right] = \sigma_\epsilon^2\left[(1+\theta^2) - 2\theta\cos 4\pi f\right].$$
Since the Nyquist frequency is $1/(2\Delta t)$, it is $1/4$ for $S_Y(f)$. Hence, $S_Y(f) = 2S_X(f)$ for $|f| \le 1/4$, as required.
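The two spectra can be compared numerically on a frequency grid; a quick sketch ($\theta = 0.7$ and $\sigma_\epsilon^2 = 1$ are illustrative values):

```python
# Numerical check that S_Y(f) = 2 S_X(f) on |f| <= 1/4, using the two ACVSs:
# {s_X,0, s_X,+-2} with sample interval 1, and {s_Y,0, s_Y,+-1} with interval 2.
import numpy as np

theta = 0.7
f = np.linspace(-0.25, 0.25, 101)

SX = (1 + theta**2) - 2 * theta * np.cos(2 * np.pi * f * 2)       # lag 2, dt=1
SY = 2 * ((1 + theta**2) - 2 * theta * np.cos(2 * np.pi * f * 2)) # lag 1, dt=2

print(np.allclose(SY, 2 * SX))  # True
```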
3. (a) Put $z = e^{i2\pi f}$ so that
$$(1 - e^{i2\pi f})\sum_{t=-(N-1)}^{N-1} e^{i2\pi f t} = e^{-i2\pi f(N-1)} - e^{i2\pi f N}.$$
Hence,
$$\sum_{t=-(N-1)}^{N-1} e^{i2\pi f t} = \frac{e^{i\pi f}\left(e^{-i(2N-1)\pi f} - e^{i(2N-1)\pi f}\right)}{e^{i\pi f}\left(e^{-i\pi f} - e^{i\pi f}\right)} = \frac{\sin[(2N-1)\pi f]}{\sin(\pi f)} = (2N-1)\mathcal{D}_{2N-1}(f).$$
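The identity can be spot-checked numerically at an arbitrary non-integer frequency; a minimal sketch ($N$ and $f$ are illustrative):

```python
# Check sum_{t=-(N-1)}^{N-1} e^{i 2 pi f t} = sin((2N-1) pi f) / sin(pi f).
import numpy as np

N, f = 8, 0.123
t = np.arange(-(N - 1), N)
lhs = np.sum(np.exp(1j * 2 * np.pi * f * t))
rhs = np.sin((2 * N - 1) * np.pi * f) / np.sin(np.pi * f)
print(abs(lhs.imag) < 1e-9, abs(lhs.real - rhs) < 1e-9)  # True True
```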
(b) The Fourier transform of $\{s_\tau\}$ exists since the sequence is obviously square-summable. So put $K = N - 1$ in part (a) to get
$$\sum_{t=-K}^{K} 1\cdot e^{i2\pi f t} = (2K+1)\mathcal{D}_{2K+1}(f) = \frac{\sin[(2K+1)\pi f]}{\sin(\pi f)}.$$
But a requirement of a spectral density function is that it should be non-negative. For example, $\sin[(2K+1)\pi f_0] = \sin(3\pi/2) = -1$ when $f_0 = 3/(4K+2) < 1/2$, since $K \ge 2$. But $\sin(\pi f_0) > 0$, so that the ratio is negative, and hence $\{s_\tau\}$ is not the autocovariance sequence for some discrete stationary process with spectral density function $S(f)$.
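The negative value at $f_0$ is easy to exhibit numerically; a brief sketch (taking $K = 2$):

```python
# The candidate "SDF" sum_{t=-K}^{K} e^{i 2 pi f t} is negative at
# f0 = 3/(4K+2), confirming {s_tau} is not a valid ACVS.
import numpy as np

K = 2
f0 = 3 / (4 * K + 2)                     # 0.3, inside (0, 1/2)
S = np.sum(np.exp(1j * 2 * np.pi * f0 * np.arange(-K, K + 1))).real
print(S)                                 # negative (about -1.236)
```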
(c) A digital filter $L$ that transforms an input sequence $\{X_t\}$ into an output sequence $\{Y_t\}$ is called a linear time-invariant digital filter if it has the following three properties:
(1.) Scale preservation: $L\{\alpha X_t\} = \alpha L\{X_t\}$.
(2.) Superposition: $L\{X_{t,1} + X_{t,2}\} = L\{X_{t,1}\} + L\{X_{t,2}\}$.
(3.) Time invariance: if $L\{X_t\} = \{Y_t\}$, then $L\{X_{t+\tau}\} = \{Y_{t+\tau}\}$, where $\tau$ is integer-valued and the notation $\{X_{t+\tau}\}$ refers to the sequence whose $t$th element is $X_{t+\tau}$.
(d) Input $e^{i2\pi f t}$ in place of $X_t$; the output is of the form $e^{i2\pi f t} G(f)$:
$$e^{i2\pi f t} - \frac{1}{2K+1}\sum_{j=-K}^{K} e^{i2\pi f(t+j)} = e^{i2\pi f t}\left[1 - \frac{1}{2K+1}\sum_{j=-K}^{K} e^{i2\pi f j}\right] = e^{i2\pi f t} G(f),$$
where $G(f)$ is the frequency response function. From part (a) we see that
$$G(f) = 1 - \mathcal{D}_{2K+1}(f),$$
and since $\{X_t\}$ is white noise with variance unity, $S_X(f) = 1$. Hence
$$S_Y(f) = |G(f)|^2 S_X(f) = |1 - \mathcal{D}_{2K+1}(f)|^2.$$
4. (a) Two real-valued discrete time stochastic processes $\{X_t\}$ and $\{Y_t\}$ are said to be jointly stationary stochastic processes if $\{X_t\}$ and $\{Y_t\}$ are each, separately, second-order stationary processes, and $\mathrm{cov}\{X_t, Y_{t+\tau}\}$ is a function of $\tau$ only.
(b) (i)
$$E\{X_t Y_{t+\tau}\} = E\{(\epsilon_t + \theta\epsilon_{t-1})(\epsilon_{t+\tau} - \theta\epsilon_{t+\tau-1})\} = E\{\epsilon_t\epsilon_{t+\tau}\} - \theta E\{\epsilon_t\epsilon_{t+\tau-1}\} + \theta E\{\epsilon_{t-1}\epsilon_{t+\tau}\} - \theta^2 E\{\epsilon_{t-1}\epsilon_{t+\tau-1}\}$$
$$= \begin{cases} \sigma_\epsilon^2(1-\theta^2), & \text{if } \tau = 0, \\ -\theta\sigma_\epsilon^2, & \text{if } \tau = 1, \\ \theta\sigma_\epsilon^2, & \text{if } \tau = -1, \\ 0, & \text{if } |\tau| \ge 2. \end{cases}$$
So
$$S_{XY}(f) = \sum_{\tau=-\infty}^{\infty} s_{XY,\tau}\, e^{-i2\pi f\tau} = \sigma_\epsilon^2\left[(1-\theta^2) + \theta(e^{i2\pi f} - e^{-i2\pi f})\right] = \sigma_\epsilon^2\left[(1-\theta^2) + 2i\theta\sin 2\pi f\right].$$
(ii) Now
$$|S_{XY}(f)|^2 = \sigma_\epsilon^4\left[(1-\theta^2)^2 + 4\theta^2\sin^2 2\pi f\right] = \sigma_\epsilon^4\left[1 + \theta^4 - 2\theta^2(\cos^2 2\pi f + \sin^2 2\pi f) + 4\theta^2\sin^2 2\pi f\right] = \sigma_\epsilon^4\left[1 + \theta^4 - 2\theta^2\cos 4\pi f\right].$$
Also
$$S_X(f) = [1+\theta e^{-i2\pi f}][1+\theta e^{i2\pi f}]\sigma_\epsilon^2, \qquad S_Y(f) = [1-\theta e^{-i2\pi f}][1-\theta e^{i2\pi f}]\sigma_\epsilon^2.$$
So
$$S_X(f)S_Y(f) = \sigma_\epsilon^4\left[(1-\theta^2 e^{i4\pi f})(1-\theta^2 e^{-i4\pi f})\right] = \sigma_\epsilon^4\left[1+\theta^4-2\theta^2\cos 4\pi f\right].$$
So the magnitude squared coherence is given by
$$\gamma^2_{XY}(f) = \frac{|S_{XY}(f)|^2}{S_X(f)S_Y(f)} = 1.$$
(iii) Let
$$|G_X(f)|^2 = |1+\theta e^{-i2\pi f}|^2; \qquad |G_Y(f)|^2 = |1-\theta e^{-i2\pi f}|^2.$$
Then
$$S_Y(f) = |G_Y(f)|^2\sigma_\epsilon^2 = \frac{|G_Y(f)|^2}{|G_X(f)|^2}|G_X(f)|^2\sigma_\epsilon^2 = \frac{|G_Y(f)|^2}{|G_X(f)|^2}S_X(f).$$
So
$$|G(f)|^2 = \left|\frac{1-\theta e^{-i2\pi f}}{1+\theta e^{-i2\pi f}}\right|^2$$
is the magnitude squared of the frequency response function for the filtering of $\{X_t\}$ to give $\{Y_t\}$. This is the magnitude squared frequency response function for the stationary and invertible ARMA(1,1) model
$$(1+\theta B)Y_t = (1-\theta B)X_t.$$
We note that
$$Y_t + \theta Y_{t-1} = \epsilon_t - \theta^2\epsilon_{t-2} = X_t - \theta X_{t-1}.$$
(iv) The result in (iii) tells us that $\{X_t\}$ and $\{Y_t\}$ are perfectly linearly related, so that we would expect the magnitude squared coherence to be unity, as in (ii).
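The ARMA(1,1) relation between the two series holds path by path and can be verified on simulated data; a sketch ($\theta = 0.6$ is an illustrative value, and both series are driven by the same innovations):

```python
# Check on simulated data that Y_t + theta*Y_{t-1} = X_t - theta*X_{t-1},
# i.e. the relation (1 + theta B) Y_t = (1 - theta B) X_t.
import numpy as np

rng = np.random.default_rng(0)
theta = 0.6
eps = rng.standard_normal(1000)
X = eps.copy(); X[1:] += theta * eps[:-1]     # X_t = eps_t + theta eps_{t-1}
Y = eps.copy(); Y[1:] -= theta * eps[:-1]     # Y_t = eps_t - theta eps_{t-1}
print(np.allclose(Y[2:] + theta * Y[1:-1], X[2:] - theta * X[1:-1]))  # True
```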
5. (a) We want to minimize
$$E\{(X_{t+l} - X_t(l))^2\} = E\left\{\left(\sum_{k=0}^{\infty}\psi_k\epsilon_{t+l-k} - \sum_{k=0}^{\infty}\delta_k\epsilon_{t-k}\right)^2\right\} = E\left\{\left(\sum_{k=0}^{l-1}\psi_k\epsilon_{t+l-k} + \sum_{k=0}^{\infty}\left[\psi_{k+l}-\delta_k\right]\epsilon_{t-k}\right)^2\right\}$$
$$= \sigma_\epsilon^2\left(\sum_{k=0}^{l-1}\psi_k^2 + \sum_{k=0}^{\infty}(\psi_{k+l}-\delta_k)^2\right).$$
The first term is independent of the choice of the $\{\delta_k\}$ and the second term is clearly minimized by choosing $\delta_k = \psi_{k+l}$, $k = 0, 1, 2, \ldots$.
(b) While we have used $\phi_{2,2}$ to emphasize it is the second of two parameters (the first being zero), henceforth we shall just use $\phi$.
$$X_t = (1-\phi B^2)^{-1}\epsilon_t = \epsilon_t + \phi\epsilon_{t-2} + \phi^2\epsilon_{t-4} + \phi^3\epsilon_{t-6} + \cdots$$
But
$$X_t = \sum_{k=0}^{\infty}\psi_k\epsilon_{t-k} = \psi_0\epsilon_t + \psi_1\epsilon_{t-1} + \cdots + \psi_4\epsilon_{t-4} + \cdots,$$
so
$$\psi_k = \begin{cases} \phi^{k/2}, & \text{if } k \text{ even}, \\ 0, & \text{if } k \text{ odd}. \end{cases}$$
For $l$ even let $l = 2j$, $j = 1, 2, \ldots$:
$$X_t(l) = \sum_{k=0}^{\infty}\psi_{k+2j}\epsilon_{t-k} = \sum_{n=0}^{\infty}\psi_{2n+2j}\epsilon_{t-2n} = \sum_{n=0}^{\infty}\phi^{n+j}\epsilon_{t-2n} = \phi^j(\epsilon_t + \phi\epsilon_{t-2} + \phi^2\epsilon_{t-4} + \cdots) = \phi^j X_t = \phi^{l/2}X_t.$$
For $l$ odd let $l = 2j+1$, $j = 0, 1, \ldots$:
$$X_t(l) = \sum_{k=0}^{\infty}\psi_{k+2j+1}\epsilon_{t-k} = \sum_{n=0}^{\infty}\psi_{2n+2j+2}\epsilon_{t-2n-1} = \sum_{n=0}^{\infty}\phi^{n+j+1}\epsilon_{t-2n-1} = \phi^{j+1}(\epsilon_{t-1} + \phi\epsilon_{t-3} + \cdots) = \phi^{j+1}X_{t-1} = \phi^{(l+1)/2}X_{t-1}.$$
Hence,
$$X_t(l) = \begin{cases} \phi^{l/2}X_t, & \text{if } l \text{ even}, \\ \phi^{(l+1)/2}X_{t-1}, & \text{if } l \text{ odd}. \end{cases}$$
[Since the indexing is a bit tricky it would be acceptable to write out $X_t(l)$ explicitly for the cases $l = 1, 2, 3, 4$, say, and conclude the form is as stated.]
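The closed form can also be checked against direct iteration of the difference equation with future innovations set to zero; a minimal sketch (the values of $\phi$, $X_t$ and $X_{t-1}$ are illustrative):

```python
# Check the closed-form AR(2) forecast phi^{l/2} X_t (l even) /
# phi^{(l+1)/2} X_{t-1} (l odd) against direct iteration.
def forecast(phi, x_t, x_tm1, l):
    a, b = x_tm1, x_t                    # (X_{t-1}, X_t)
    for _ in range(l):
        a, b = b, phi * a                # X_{t+k} = phi * X_{t+k-2}, eps -> 0
    return b

phi, x_t, x_tm1 = 0.5, 2.0, -1.0
for l in range(1, 7):
    closed = phi**(l // 2) * x_t if l % 2 == 0 else phi**((l + 1) // 2) * x_tm1
    assert abs(forecast(phi, x_t, x_tm1, l) - closed) < 1e-12
print("closed-form forecasts agree for l = 1..6")
```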
(c) (i)
$$X_{t+2} = \phi_{1,p}X_{t+1} + \cdots + \phi_{p,p}X_{t-p+2} + \epsilon_{t+2} = \phi_{1,p}\left[\sum_{j=1}^{p}\phi_{j,p}X_{t+1-j} + \epsilon_{t+1}\right] + \sum_{j=2}^{p}\phi_{j,p}X_{t+2-j} + \epsilon_{t+2}$$
$$= \phi_{1,p}\left[X_t(1) + \epsilon_{t+1}\right] + \sum_{j=2}^{p}\phi_{j,p}X_{t+2-j} + \epsilon_{t+2}.$$
So
$$X_{t+2} = \sum_{j=1}^{p}\phi_{1,p}\phi_{j,p}X_{t+1-j} + \phi_{1,p}\epsilon_{t+1} + \sum_{j=2}^{p}\phi_{j,p}X_{t+2-j} + \epsilon_{t+2} = \sum_{j=1}^{p}\left[\phi_{1,p}\phi_{j,p} + \phi_{j+1,p}\right]X_{t+1-j} + \phi_{1,p}\epsilon_{t+1} + \epsilon_{t+2},$$
where $\phi_{j,p} = 0$ if $j > p$.
(ii) Set future innovations to zero:
$$X_t(2) = \sum_{j=1}^{p}\left[\phi_{1,p}\phi_{j,p} + \phi_{j+1,p}\right]X_{t+1-j}.$$
For our particular AR(2) model this becomes
$$X_t(2) = \sum_{j=1}^{2}\left[\phi_{1,2}\phi_{j,2} + \phi_{j+1,2}\right]X_{t+1-j} = \phi_{2,2}X_t,$$
agreeing with the result in part (b).
Note: Throughout this paper $\{\epsilon_t\}$ is a sequence of uncorrelated random variables (white noise) having zero mean and variance $\sigma_\epsilon^2$, unless stated otherwise. The unqualified term "stationary" will always be taken to mean second-order stationary. All processes are real-valued unless stated otherwise. The sample interval is unity unless stated otherwise.
1. (a) (i) What is meant by saying that a stochastic process is stationary?
(ii) Let $W$ be a real-valued random variable with mean $\mu$ and finite nonzero variance $\sigma^2$. Define a discrete-time stochastic process $\{X_t\}$ by letting $X_t = W$ for all integers $t$. Determine whether $\{X_t\}$ is a stationary process or not.
(b) Suppose that the real-valued random variables $X_1$, $X_2$ and $X_3$ have zero means and a covariance matrix given by
$$\begin{pmatrix} 1.00 & 0.69 & 0.43 \\ 0.69 & 1.00 & 0.30 \\ 0.43 & 0.30 & 1.00 \end{pmatrix}.$$
Could these random variables be part of a stationary process $\{X_t\}$? Justify your answer.
(c) Let $\{\epsilon_t\}$ be a white noise process with a sampling interval of $\Delta t = 1$ second between adjacent observations. Let $X_t = \epsilon_{2t}$, so that $\{X_t\}$ has a sampling interval of $\Delta t = 2$ seconds.
(i) What is the Nyquist frequency for $\{X_t\}$?
(ii) What is the autocovariance sequence for $\{X_t\}$?
(iii) What is the spectral density function (SDF) for $\{X_t\}$?
(d) (i) What is meant by saying two discrete time stochastic processes $\{Y_t\}$ and $\{Z_t\}$ are jointly stationary stochastic processes?
(ii) The magnitude squared coherence is defined as $\gamma^2_{YZ}(f) = |S_{YZ}(f)|^2/[S_Y(f)S_Z(f)]$, where $S_{YZ}(f)$ is the cross-spectrum of $\{Y_t\}$ and $\{Z_t\}$, and $S_Y(f)$ and $S_Z(f)$ are the power spectra of $\{Y_t\}$ and $\{Z_t\}$, respectively. What does it measure?
(e) Suppose the stationary process $\{X_t\}$ can be written as a one-sided linear process, $X_t = \sum_{k=0}^{\infty}\psi_k\epsilon_{t-k}$. We wish to construct the $l$-step ahead forecast
$$X_t(l) = \sum_{k=0}^{\infty}\delta_k\epsilon_{t-k}.$$
Show that the $l$-step prediction variance $\sigma^2(l) = E\{(X_{t+l} - X_t(l))^2\}$ is minimized by setting $\delta_k = \psi_{k+l}$, $k \ge 0$.
M3S8/M4S8 Time Series (2009)
2. (a) Consider the stationary first-order autoregressive process, or AR(1) process,
$$X_t = \phi X_{t-1} + \epsilon_t.$$
(i) Using the method of induction, show that this process can be expressed as
$$X_t = \phi^q X_{t-q} + \sum_{j=0}^{q-1}\phi^j\epsilon_{t-j}$$
for any positive integer $q$, and hence specify, with justification, an infinite-order moving average, i.e., MA($\infty$), representation for $\{X_t\}$.
(ii) Use the MA($\infty$) representation to show that the autocovariance sequence for $\{X_t\}$ is given by
$$s_\tau = \frac{\sigma_\epsilon^2\,\phi^{|\tau|}}{1-\phi^2}.$$
(iii) Find the frequency response function $G(f)$ associated with the LTI filter
$$L\{X_t\} = X_t - \phi X_{t-1},$$
and hence find the spectral density function (SDF) for the AR(1) process.
(iv) Use the SDF to derive the autocovariance, $s_\tau$, for $\{X_t\}$ and show it agrees with that in (a)(ii).
[You may use the fact that
$$\int_0^\pi \frac{\cos(nx)}{1+a^2-2a\cos x}\,dx = \frac{\pi a^n}{1-a^2},$$
for $|a| < 1$ and integer $n$.]
(b) The characteristic function for a bivariate AR(1) process takes the form
$$\Phi(B) = I - \Phi B,$$
where $I$ is the $(2\times 2)$ identity matrix, and $\Phi$ is a $(2\times 2)$ matrix of parameters. For
$$\Phi = \begin{pmatrix} 1/2 & 1/10 \\ 1/2 & 1/2 \end{pmatrix},$$
calculate the determinantal polynomial, $|\Phi(z)|$, and hence determine whether the corresponding bivariate AR(1) process is stationary.
3. (a) Let $a$ be a real-valued nonzero constant, and suppose that $\{a, 0, -a\}$ is a realization of length $N = 3$ of a portion $X_1, X_2, X_3$ of a stationary process with mean of zero, spectral density function (SDF) $S(f)$ and autocovariance sequence (ACVS) $\{s_\tau\}$.
(i) Show that the biased estimator $\{\hat{s}^{(p)}_\tau\}$ of the ACVS for $\{X_t\}$ is given by
$$\hat{s}^{(p)}_\tau = \begin{cases} 2a^2/3, & \tau = 0; \\ 0, & |\tau| = 1; \\ -a^2/3, & |\tau| = 2; \\ 0, & |\tau| > 2. \end{cases}$$
Also determine the 'unbiased' estimator $\{\hat{s}^{(u)}_\tau\}$ of the ACVS.
(ii) Is $\{\hat{s}^{(u)}_\tau\}$ necessarily a valid ACVS for some stationary process? Explain your reasoning.
(iii) Use the fact that $\{\hat{s}^{(p)}_\tau\}$ and the periodogram $\hat{S}^{(p)}(\cdot)$ are a Fourier transform pair to show that
$$\hat{S}^{(p)}(f) = \frac{2a^2}{3}\left[1 - \cos(4\pi f)\right], \qquad |f| \le 1/2.$$
An equivalent way of obtaining the periodogram is via
$$\hat{S}^{(p)}(f) = \frac{1}{N}\left|\sum_{t=1}^{N} X_t e^{-i2\pi f t}\right|^2.$$
Verify that computing the periodogram in this alternative manner gives the same result as in part (a)(ii).
(b) Let $S(f)$ denote the spectral density function (SDF) of a stationary process, and let $\hat{S}^{(p)}(f)$ denote the periodogram derived from a finite realization of the process. The quantity
$$b(f) \equiv E\{\hat{S}^{(p)}(f)\} - S(f)$$
is the bias in the periodogram at frequency $f$.
Use the fact that
$$E\{\hat{S}^{(p)}(f)\} = \int_{-1/2}^{1/2}\mathcal{F}(f - f')S(f')\,df',$$
where $\mathcal{F}$ is Fejér's kernel, defined by
$$\mathcal{F}(f) = \frac{1}{N}\left|\sum_{t=1}^{N} e^{-i2\pi f t}\right|^2 = \frac{\sin^2(N\pi f)}{N\sin^2(\pi f)},$$
to show that the average value of the bias in the periodogram over the interval $[-1/2, 1/2]$ is zero.
NOTE: If $g(\cdot)$ is a function defined over the interval $[a, b]$, then, by definition, $[1/(b-a)]\int_a^b g(x)\,dx$ is the average value of $g(\cdot)$ over $[a, b]$.
4. (a) Let $\{Y_t\}$ and $\{Z_t\}$ be two real-valued jointly stationary stochastic processes, for which
$$Y_t = X_t + \alpha\eta_t \qquad\text{and}\qquad Z_t = X_t + \alpha\nu_t,$$
where $\{X_t\}$, $\{\eta_t\}$ and $\{\nu_t\}$ are all white noise processes, independent of each other, each with a zero mean and variance of unity, and $\alpha$ is a non-zero constant.
(i) Show that
$$s_{YZ,\tau} = s_{X,\tau}$$
and find the form of $s_{Y,\tau}$ and $s_{Z,\tau}$. Here $\{s_{YZ,\tau}\}$ is the cross-covariance sequence of $\{Y_t\}$ and $\{Z_t\}$, and $\{s_{X,\tau}\}$, $\{s_{Y,\tau}\}$ and $\{s_{Z,\tau}\}$ are the autocovariance sequences of $\{X_t\}$, $\{Y_t\}$ and $\{Z_t\}$, respectively.
(ii) Hence find the magnitude squared coherence $\gamma^2_{YZ}(f)$, defined in Question 1(d)(ii).
(b) Suppose that we have observed $L$ time series, all of which can be regarded as independent realizations from the same stationary process with SDF $S(f)$. Suppose we use the $l$th series to form a periodogram spectral estimate $\hat{S}^{(p)}_l(f)$ at some frequency $f$ such that $|f| < 1/2$. We then combine the $L$ different direct spectral estimates together to form
$$\hat{S}(f) = \sum_{l=0}^{L-1}\alpha_l\hat{S}^{(p)}_l(f),$$
where $\alpha_l > 0$ for all $l$.
Assume that each of the independent periodogram estimators $\hat{S}^{(p)}_l(f)$ has the same distribution as the random variable $S(f)\chi^2_2/2$, and $\hat{S}(f)$ has the same distribution as the random variable $a\chi^2_\nu/\nu$, where $\chi^2_\nu$ denotes a chi-squared random variable with $\nu$ degrees of freedom for which $E\{\chi^2_\nu\} = \nu$ and $\mathrm{var}\{\chi^2_\nu\} = 2\nu$, and $a$ is a constant.
(i) What condition do we need to impose on the weights $\alpha_l$ so that $\hat{S}(f)$ is an unbiased estimator of $S(f)$?
(ii) Assuming that the $\alpha_l$'s are chosen so that $\hat{S}(f)$ is unbiased, determine the degrees of freedom $\nu$ for $\hat{S}(f)$ in terms of the $\alpha_l$'s.
1. (a) (i) $\{X_t\}$ is second-order stationary if $E\{X_t\}$ is a finite constant for all $t$, $\mathrm{var}\{X_t\}$ is a finite constant for all $t$, and $\mathrm{cov}\{X_t, X_{t+\tau}\}$ is a finite quantity depending only on $\tau$ and not on $t$.
(ii) Since $E\{X_t\} = E\{W\} = \mu$ (a constant independent of $t$), and since $\mathrm{cov}\{X_t, X_{t+\tau}\} = \mathrm{cov}\{W, W\} = \mathrm{var}\{W\} = \sigma^2$ (a constant independent of $t$ for all $\tau$), it follows that $\{X_t\}$ is a stationary process with ACVS given by $s_\tau = \sigma^2$ for all $\tau$.
(b) The variance-covariance matrix for a portion of a real-valued stationary process must be a symmetric Toeplitz matrix, which implies that elements along any given diagonal must be the same. This matrix does not have this structure along the diagonals above and below the main diagonal, since $\mathrm{cov}\{X_1, X_2\} = 0.69$ while $\mathrm{cov}\{X_2, X_3\} = 0.30$. Because $\mathrm{cov}\{X_1, X_2\} \neq \mathrm{cov}\{X_2, X_3\}$, a basic requirement of stationarity is violated, namely, that these covariances be the same because the lag between $X_1$ and $X_2$ is the same as the lag between $X_2$ and $X_3$.
(c) (i) The Nyquist frequency is $1/(2\Delta t) = 1/4$ Hz.
(ii) Subsampling an uncorrelated sequence gives an uncorrelated sequence with the same properties, so $\mathrm{var}\{X_t\} = s_{X,0} = \sigma_\epsilon^2$ and $s_{X,\tau} = 0$ for $|\tau| > 0$.
(iii) Since $\{X_t\}$ is white also, it has a flat spectrum over $[-1/4, 1/4]$ which must integrate to its variance of $\sigma_\epsilon^2$, so $S_X(f) = 2\sigma_\epsilon^2$ for $f \in [-1/4, 1/4]$.
(d) (i) Two real-valued discrete time stochastic processes $\{Y_t\}$ and $\{Z_t\}$ are said to be jointly stationary stochastic processes if $\{Y_t\}$ and $\{Z_t\}$ are each, separately, second-order stationary processes, and $\mathrm{cov}\{Y_t, Z_{t+\tau}\}$ is a function of $\tau$ only.
(ii) The magnitude squared coherence (msc) measures the linear correlation between the components of $\{Y_t\}$ and $\{Z_t\}$ at frequency $f$.
(e) We want to minimize
$$E\{(X_{t+l} - X_t(l))^2\} = E\left\{\left(\sum_{k=0}^{\infty}\psi_k\epsilon_{t+l-k} - \sum_{k=0}^{\infty}\delta_k\epsilon_{t-k}\right)^2\right\} = E\left\{\left(\sum_{k=0}^{l-1}\psi_k\epsilon_{t+l-k} + \sum_{k=0}^{\infty}\left[\psi_{k+l}-\delta_k\right]\epsilon_{t-k}\right)^2\right\}$$
$$= \sigma_\epsilon^2\left(\sum_{k=0}^{l-1}\psi_k^2 + \sum_{k=0}^{\infty}(\psi_{k+l}-\delta_k)^2\right).$$
The first term is independent of the choice of the $\{\delta_k\}$ and the second term is clearly minimized by choosing $\delta_k = \psi_{k+l}$, $k = 0, 1, 2, \ldots$.
M3S8/M4S8 Time Series (SOLUTIONS) (2009)
2. (a) (i) It holds for $q = 2$ since
$$X_t = \phi(\phi X_{t-2} + \epsilon_{t-1}) + \epsilon_t = \phi^2 X_{t-2} + \phi\epsilon_{t-1} + \epsilon_t.$$
Suppose now that it holds for $q - 1$ with $q \ge 3$:
$$X_t = \phi^{q-1}X_{t-q+1} + \sum_{j=0}^{q-2}\phi^j\epsilon_{t-j}.$$
To see that it holds for $q$, note that, since $X_{t-q+1} = \phi X_{t-q} + \epsilon_{t-q+1}$,
$$X_t = \phi^{q-1}(\phi X_{t-q} + \epsilon_{t-q+1}) + \sum_{j=0}^{q-2}\phi^j\epsilon_{t-j} = \phi^q X_{t-q} + \sum_{j=0}^{q-1}\phi^j\epsilon_{t-j},$$
which establishes the result.
Since $|\phi| < 1$ for stationarity, as $q \to \infty$ we obtain an infinite moving average representation
$$X_t = \sum_{j=0}^{\infty}\phi^j\epsilon_{t-j}.$$
(ii) Since
$$E\{X_t\} = E\left\{\sum_{j=0}^{\infty}\phi^j\epsilon_{t-j}\right\} = \sum_{j=0}^{\infty}\phi^j E\{\epsilon_{t-j}\} = 0,$$
the ACVS for $\{X_t\}$ is given by
$$s_\tau = \mathrm{cov}\{X_t, X_{t+\tau}\} = \mathrm{cov}\{X_t, X_{t-|\tau|}\} = E\left\{\left(\sum_{j=0}^{\infty}\phi^j\epsilon_{t-j}\right)\left(\sum_{k=0}^{\infty}\phi^k\epsilon_{t-|\tau|-k}\right)\right\} = \sum_{j=0}^{\infty}\sum_{k=0}^{\infty}\phi^j\phi^k E\{\epsilon_{t-j}\epsilon_{t-|\tau|-k}\} = \sigma_\epsilon^2\sum_{k=0}^{\infty}\phi^{k+|\tau|}\phi^k,$$
because $E\{\epsilon_{t-j}\epsilon_{t-|\tau|-k}\} = 0$ unless $t - |\tau| - k = t - j$, i.e., $j = k + |\tau|$, in which case $E\{\epsilon_{t-j}\epsilon_{t-|\tau|-k}\} = \sigma_\epsilon^2$. Using the fact that $\sum_{k=0}^{\infty}x^k = 1/(1-x)$ for $|x| < 1$, we obtain
$$\sum_{k=0}^{\infty}\phi^k\phi^{k+|\tau|} = \phi^{|\tau|}\sum_{k=0}^{\infty}\phi^{2k} = \frac{\phi^{|\tau|}}{1-\phi^2},$$
and hence
$$s_\tau = \frac{\sigma_\epsilon^2\,\phi^{|\tau|}}{1-\phi^2},$$
as required.
(iii)
$$L\{e^{i2\pi f t}\} = e^{i2\pi f t}(1 - \phi e^{-i2\pi f}) \;\Rightarrow\; G(f) = 1 - \phi e^{-i2\pi f} \;\Rightarrow\; |G(f)|^2 = 1 + \phi^2 - 2\phi\cos(2\pi f).$$
Hence, $S_\epsilon(f) = |G(f)|^2 S_X(f)$, so that
$$S_X(f) = \frac{\sigma_\epsilon^2}{|1 - \phi e^{-i2\pi f}|^2} = \frac{\sigma_\epsilon^2}{1 + \phi^2 - 2\phi\cos(2\pi f)}.$$
(iv) For $s_\tau$:
$$s_\tau = \int_{-1/2}^{1/2} S_X(f)e^{i2\pi f\tau}\,df = \frac{\sigma_\epsilon^2}{2\pi}\int_{-\pi}^{\pi}\frac{e^{ix\tau}}{1+\phi^2-2\phi\cos x}\,dx = \frac{\sigma_\epsilon^2}{2\pi}\int_{-\pi}^{\pi}\frac{\cos(x\tau)}{1+\phi^2-2\phi\cos x}\,dx$$
(since the left-hand side is real, or the sine part is odd)
$$= \frac{\sigma_\epsilon^2}{\pi}\int_0^{\pi}\frac{\cos(x\tau)}{1+\phi^2-2\phi\cos x}\,dx = \frac{\sigma_\epsilon^2}{\pi}\cdot\frac{\pi\phi^\tau}{1-\phi^2} = \frac{\sigma_\epsilon^2\,\phi^\tau}{1-\phi^2},$$
and so, since $s_\tau$ is symmetric, we get
$$s_\tau = \frac{\sigma_\epsilon^2\,\phi^{|\tau|}}{1-\phi^2},$$
as required for equality with (a)(ii).
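The closed-form ACVS can be cross-checked against a direct sum over the (truncated) MA($\infty$) weights $\psi_k = \phi^k$; a quick sketch ($\phi$ and $\sigma_\epsilon^2$ are illustrative values):

```python
# Check s_tau = sigma^2 phi^|tau| / (1 - phi^2) against a direct sum over
# the truncated geometric weights psi_k = phi^k.
import numpy as np

phi, sig2 = 0.8, 2.0
psi = phi ** np.arange(2000)             # geometric weights, truncated
for tau in range(5):
    direct = sig2 * np.sum(psi[: len(psi) - tau] * psi[tau:])
    closed = sig2 * phi**tau / (1 - phi**2)
    assert abs(direct - closed) < 1e-6
print("ACVS formula confirmed for lags 0..4")
```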
(b) The polynomial is
$$|\Phi(z)| = \det\left\{\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} - z\begin{pmatrix} 1/2 & 1/10 \\ 1/2 & 1/2 \end{pmatrix}\right\} = \det\begin{pmatrix} 1-(z/2) & -(z/10) \\ -(z/2) & 1-(z/2) \end{pmatrix} = 1 - z + \frac{z^2}{5}.$$
Roots are $\left[1 \pm \sqrt{1/5}\right]/[2/5] = \frac{5}{2} \pm \frac{\sqrt{5}}{2}$. Clearly $\frac{5}{2} + \frac{\sqrt{5}}{2} > 1$. Also $\frac{5}{2} - \frac{\sqrt{5}}{2} = \frac{1}{2}\left[\sqrt{5}(\sqrt{5}-1)\right] > \sqrt{5} - 1 > 1$. So both roots are outside the unit circle and the process is stationary.
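The roots of the determinantal polynomial can be confirmed numerically; a minimal sketch:

```python
# Roots of the determinantal polynomial |Phi(z)| = 1 - z + z^2/5.
import numpy as np

roots = np.roots([1 / 5, -1, 1])         # highest degree first
print(np.sort(roots.real))               # 5/2 -+ sqrt(5)/2, i.e. about 1.382 and 3.618
print(all(abs(r) > 1 for r in roots))    # True: stationary
```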
3. (a) (i) Using
$$\hat{s}^{(p)}_\tau = \frac{1}{N}\sum_{t=1}^{N-|\tau|}X_t X_{t+|\tau|}, \qquad \tau = 0, \pm 1, \ldots, \pm(N-1),$$
and recalling that $\hat{s}^{(p)}_\tau \equiv 0$ for $|\tau| \ge N$, we obtain
$$\hat{s}^{(p)}_0 = \tfrac{1}{3}\left(a^2 + 0^2 + a^2\right) = \frac{2a^2}{3}, \qquad \hat{s}^{(p)}_1 = \tfrac{1}{3}\left(a\cdot 0 + 0\cdot(-a)\right) = 0 = \hat{s}^{(p)}_{-1}, \qquad \hat{s}^{(p)}_2 = \tfrac{1}{3}\,a\cdot(-a) = -\frac{a^2}{3} = \hat{s}^{(p)}_{-2},$$
and $\hat{s}^{(p)}_\tau = 0$ when $|\tau| \ge 3$.
Since $\hat{s}^{(u)}_\tau = \hat{s}^{(p)}_\tau N/(N - |\tau|)$ for $0 \le |\tau| \le N - 1$, we find that
$$\hat{s}^{(u)}_0 = \hat{s}^{(p)}_0 = \frac{2a^2}{3}, \qquad \hat{s}^{(u)}_1 = \tfrac{3}{2}\hat{s}^{(p)}_1 = 0, \qquad \hat{s}^{(u)}_2 = 3\hat{s}^{(p)}_2 = -a^2.$$
(ii) Because $|\hat{s}^{(u)}_2| > \hat{s}^{(u)}_0$, this unbiased ACVS estimate does not correspond to an ACVS for any stationary process.
(iii) Since $N - 1 = 2$,
$$\hat{S}^{(p)}(f) = \sum_{\tau=-2}^{2}\hat{s}^{(p)}_\tau e^{-i2\pi f\tau} = \hat{s}^{(p)}_0 + 2\sum_{\tau=1}^{2}\hat{s}^{(p)}_\tau\cos(2\pi f\tau),$$
and substituting the results of part (a)(i) yields
$$\hat{S}^{(p)}(f) = \frac{2a^2}{3} + 2\cdot 0\cdot\cos(2\pi f) + 2\cdot\left(-\frac{a^2}{3}\right)\cos(4\pi f) = \frac{2a^2}{3}\left[1 - \cos(4\pi f)\right],$$
where we have made use of the result that, since $\hat{s}^{(p)}_{-\tau} = \hat{s}^{(p)}_\tau$,
$$\hat{s}^{(p)}_\tau e^{-i2\pi f\tau} + \hat{s}^{(p)}_{-\tau}e^{i2\pi f\tau} = \hat{s}^{(p)}_\tau\left(e^{-i2\pi f\tau} + e^{i2\pi f\tau}\right) = 2\hat{s}^{(p)}_\tau\cos(2\pi f\tau).$$
We have
$$\sum_{t=1}^{N}X_t e^{-i2\pi f t} = a e^{-i2\pi f} - a e^{-i6\pi f} = a e^{-i2\pi f}\left[1 - e^{-i4\pi f}\right].$$
Then, with $N = 3$,
$$\left|\sum_{t=1}^{N}X_t e^{-i2\pi f t}\right|^2 = a^2\left(1 - e^{-i4\pi f}\right)\left(1 - e^{i4\pi f}\right) = a^2\left[2 - e^{i4\pi f} - e^{-i4\pi f}\right] = 2a^2\left[1 - \cos(4\pi f)\right].$$
Hence
$$\hat{S}^{(p)}(f) = \frac{1}{N}\left|\sum_{t=1}^{N}X_t e^{-i2\pi f t}\right|^2 = \frac{2a^2}{3}\left[1 - \cos(4\pi f)\right],$$
which agrees with the result obtained above.
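The agreement of the two periodogram routes can be checked on a frequency grid; a quick sketch (the value of $a$ is illustrative):

```python
# Compare the ACVS-route periodogram (2a^2/3)[1 - cos(4 pi f)] with the
# direct route |sum_t X_t e^{-i 2 pi f t}|^2 / N for the realization {a, 0, -a}.
import numpy as np

a, N = 1.5, 3
X = np.array([a, 0.0, -a])
f = np.linspace(-0.5, 0.5, 201)
t = np.arange(1, N + 1)
direct = np.abs(np.exp(-1j * 2 * np.pi * np.outer(f, t)) @ X) ** 2 / N
closed = (2 * a**2 / 3) * (1 - np.cos(4 * np.pi * f))
print(np.allclose(direct, closed))  # True
```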
(b) Given that
$$E\{\hat{S}^{(p)}(f)\} = \int_{-1/2}^{1/2}\mathcal{F}(f - f')S(f')\,df',$$
then
$$\int_{-1/2}^{1/2} b(f)\,df = \int_{-1/2}^{1/2} E\{\hat{S}^{(p)}(f)\}\,df - \int_{-1/2}^{1/2} S(f)\,df$$
$$= \int_{-1/2}^{1/2}\int_{-1/2}^{1/2}\mathcal{F}(f - f')S(f')\,df'\,df - \int_{-1/2}^{1/2}S(f)\,df$$
$$= \int_{-1/2}^{1/2}S(f')\left[\int_{-1/2}^{1/2}\mathcal{F}(f - f')\,df\right]df' - \int_{-1/2}^{1/2}S(f)\,df$$
$$= \int_{-1/2}^{1/2}S(f')\,df' - \int_{-1/2}^{1/2}S(f)\,df = 0,$$
where we have made use of the fact that, because $\mathcal{F}(\cdot)$ is a periodic function with a period of unity, $\int_{-1/2}^{1/2}\mathcal{F}(f - f')\,df = 1$ for any $f'$, because $\int_{-1/2}^{1/2}\mathcal{F}(f)\,df = 1$.
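The unit-integral property of Fejér's kernel used above can be checked with a simple Riemann sum; a sketch ($N = 16$ is an illustrative choice):

```python
# Fejer's kernel sin^2(N pi f) / (N sin^2(pi f)) integrates to one over [-1/2, 1/2].
import numpy as np

N = 16
f = np.linspace(-0.5, 0.5, 20001)
F = np.empty_like(f)
nz = np.sin(np.pi * f) != 0
F[nz] = np.sin(N * np.pi * f[nz]) ** 2 / (N * np.sin(np.pi * f[nz]) ** 2)
F[~nz] = N                               # limiting value at f = 0
print(abs(np.sum(F) * (f[1] - f[0]) - 1.0) < 1e-3)  # True
```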
4. (a) (i) Since $E\{Y_t\} = E\{Z_t\} = 0$,
$$s_{YZ,\tau} = E\{Y_t Z_{t+\tau}\} = E\{(X_t + \alpha\eta_t)(X_{t+\tau} + \alpha\nu_{t+\tau})\} = E\{X_t X_{t+\tau}\} = s_{X,\tau} = \begin{cases} 1, & \text{if } \tau = 0; \\ 0, & \text{otherwise}, \end{cases}$$
by the independence of the processes and the fact that $\{X_t\}$ is white noise with unity variance.
Also,
$$s_{Y,\tau} = E\{Y_t Y_{t+\tau}\} = E\{(X_t + \alpha\eta_t)(X_{t+\tau} + \alpha\eta_{t+\tau})\} = E\{X_t X_{t+\tau}\} + \alpha^2 E\{\eta_t\eta_{t+\tau}\} = s_{X,\tau} + \alpha^2 s_{\eta,\tau} = \begin{cases} 1+\alpha^2, & \text{if } \tau = 0; \\ 0, & \text{otherwise}. \end{cases}$$
Likewise,
$$s_{Z,\tau} = \begin{cases} 1+\alpha^2, & \text{if } \tau = 0; \\ 0, & \text{otherwise}. \end{cases}$$
(ii) So, since $S_{YZ}(f) = \sum_\tau s_{YZ,\tau}e^{-i2\pi f\tau}$, etc.,
$$S_{YZ}(f) = 1, \qquad S_Y(f) = 1+\alpha^2, \qquad S_Z(f) = 1+\alpha^2, \qquad\text{for } |f| \le 1/2.$$
Then
$$\gamma^2_{YZ}(f) = \frac{|S_{YZ}(f)|^2}{S_Y(f)S_Z(f)} = \frac{1}{(1+\alpha^2)^2}, \qquad |f| \le 1/2.$$
(b) (i) Now $E\{\hat{S}^{(p)}_l(f)\} = \frac{S(f)}{2}E\{\chi^2_2\} = S(f)$ and $\mathrm{var}\{\hat{S}^{(p)}_l(f)\} = \frac{S^2(f)}{4}\mathrm{var}\{\chi^2_2\} = S^2(f)$. Since
$$E\{\hat{S}(f)\} = \sum_{l=0}^{L-1}\alpha_l E\{\hat{S}^{(p)}_l(f)\} = \sum_{l=0}^{L-1}\alpha_l S(f) = S(f)\sum_{l=0}^{L-1}\alpha_l,$$
the condition $\sum_{l=0}^{L-1}\alpha_l = 1$ yields an unbiased estimator.
(ii) Now $E\{\hat{S}(f)\} = \frac{a}{\nu}E\{\chi^2_\nu\} = a$ and $\mathrm{var}\{\hat{S}(f)\} = \frac{a^2}{\nu^2}\mathrm{var}\{\chi^2_\nu\} = 2a^2/\nu$. Then
$$\mathrm{var}\{\hat{S}(f)\} = \sum_{l=0}^{L-1}\alpha_l^2\,\mathrm{var}\{\hat{S}^{(p)}_l(f)\} = S^2(f)\sum_{l=0}^{L-1}\alpha_l^2.$$
But since $\hat{S}(f)$ is unbiased, $a = S(f)$, so
$$\mathrm{var}\{\hat{S}(f)\} = \frac{2a^2}{\nu} = \frac{2S^2(f)}{\nu} = S^2(f)\sum_{l=0}^{L-1}\alpha_l^2,$$
and hence
$$\nu = \frac{2}{\sum_{l=0}^{L-1}\alpha_l^2}.$$
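The equivalent-degrees-of-freedom result can be checked by Monte Carlo; a sketch (equal weights $\alpha_l = 1/L$, so $\nu = 2L$; $L$, $S(f)$ and the sample size are illustrative):

```python
# Monte Carlo check of nu = 2 / sum(alpha_l^2): the weighted average of
# independent S * chi2_2 / 2 periodograms has variance 2 S^2 / nu.
import numpy as np

rng = np.random.default_rng(2)
L, S = 10, 4.0
alphas = np.full(L, 1 / L)                         # weights sum to one -> unbiased
per = S * rng.chisquare(2, size=(200_000, L)) / 2  # periodograms ~ S chi2_2 / 2
est = per @ alphas
nu = 2 / np.sum(alphas**2)                         # = 2L = 20
print(est.mean(), est.var(), 2 * S**2 / nu)        # mean ~ S, both variances ~ 1.6
```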