AMS586.01
Problem Set #1
Spring 2017
1. For the time series
$X_t = Z_t + \theta Z_{t-1}$,
where $\{Z_t\}$ is a series of white noise with mean 0 and variance 1, please derive:
(a). Is this series stationary?
(b). Is this series invertible?
(c). The mean of the series.
(d). The variance of the series.
(e). The autocovariance functions and the autocorrelation functions of the series.
2. For the AR(2) process
$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + Z_t$,
where $\{Z_t\}$ is a series of white noise with mean 0 and variance $\sigma^2$, please derive:
(a). Is this series stationary?
(b). Is this series invertible?
(c). The mean of the series.
(d). The variance of the series.
(e). The autocovariance functions and the autocorrelation functions of the series.
3. Please derive the Yule-Walker equation for a stationary AR(2).
4. Consider the stationary and invertible ARMA(1,1) process
$X_t = \phi X_{t-1} + Z_t + \theta Z_{t-1}$,
where $\{Z_t\}$ is a series of white noise with mean 0 and variance $\sigma^2$.
(a) For what values of $\phi$ and $\theta$ is this process stationary and invertible? Please show detailed derivations.
(b) Please derive its mean, variance, auto-covariances and auto-correlations.
Solution:
1.
(c) This is an MA(1) series. The MA series is always (weakly) stationary, therefore we have:
$E(X_t) = E(Z_t + \theta Z_{t-1}) = E(Z_t) + \theta\,E(Z_{t-1}) = 0.$
(d) Now we derive the variance.
$\mathrm{Var}(X_t) = \mathrm{Var}(Z_t + \theta Z_{t-1}) = \mathrm{Var}(Z_t) + \theta^2\,\mathrm{Var}(Z_{t-1}) = 1 + \theta^2 = \gamma(0),$
where the cross term vanishes because $Z_t$ and $Z_{t-1}$ are uncorrelated.
(e) Now we derive the autocovariances of lag $h$ ($h \geq 1$).
$\gamma(1) = \mathrm{Cov}(X_t, X_{t-1}) = \mathrm{Cov}(Z_t + \theta Z_{t-1},\; Z_{t-1} + \theta Z_{t-2}) = \theta\,\mathrm{Var}(Z_{t-1}) = \theta.$
Similarly, we found that $\gamma(h) = 0$, for all $|h| \geq 2$.
Thus the autocorrelation functions are:
$\rho(0) = 1, \qquad \rho(1) = \frac{\gamma(1)}{\gamma(0)} = \frac{\theta}{1 + \theta^2}, \qquad \rho(h) = 0 \text{ for all } |h| \geq 2.$
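As a quick numerical check of the MA(1) results above, the following sketch simulates a long MA(1) series and compares the sample variance and autocorrelations with the theoretical values; the coefficient $\theta = 0.6$ is an assumed value for illustration only, not taken from the problem.

```python
import numpy as np

# Illustrative MA(1) check: X_t = Z_t + theta * Z_{t-1}, Var(Z_t) = 1.
# theta = 0.6 is an assumed value for demonstration only.
theta, n = 0.6, 200_000
rng = np.random.default_rng(0)
z = rng.standard_normal(n + 1)
x = z[1:] + theta * z[:-1]

# Theoretical values from the derivation above.
gamma0 = 1 + theta**2            # variance
rho1 = theta / (1 + theta**2)    # lag-1 autocorrelation

print("sample var :", x.var(), " theory:", gamma0)
print("sample rho1:", np.corrcoef(x[1:], x[:-1])[0, 1], " theory:", rho1)
print("sample rho2:", np.corrcoef(x[2:], x[:-2])[0, 1], " theory: 0")
```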
2.
(c) Now we have:
$E(X_t) = \phi_1 E(X_{t-1}) + \phi_2 E(X_{t-2}) + E(Z_t),$
that is, $\mu = \phi_1\mu + \phi_2\mu + 0$, so $\mu\,(1 - \phi_1 - \phi_2) = 0.$
Therefore $E(X_t) = \mu = 0$, since stationarity requires $1 - \phi_1 - \phi_2 \neq 0$.
(d) Now we derive the variance. Multiplying both sides of the model by $X_t$ and taking expectations (note that $E(Z_t X_t) = \sigma^2$), we have:
$E(X_t^2) = \phi_1 E(X_{t-1} X_t) + \phi_2 E(X_{t-2} X_t) + E(Z_t X_t).$
This yields:
$\gamma(0) = \phi_1\,\gamma(1) + \phi_2\,\gamma(2) + \sigma^2.$
On the other hand, multiplying by $X_{t-1}$ and $X_{t-2}$ and taking expectations, we have:
$E(X_t X_{t-1}) = \phi_1 E(X_{t-1}^2) + \phi_2 E(X_{t-2} X_{t-1}), \qquad E(X_t X_{t-2}) = \phi_1 E(X_{t-1} X_{t-2}) + \phi_2 E(X_{t-2}^2).$
This yields:
$\gamma(1) = \phi_1\,\gamma(0) + \phi_2\,\gamma(1), \qquad \gamma(2) = \phi_1\,\gamma(1) + \phi_2\,\gamma(0).$
Solving these equations we obtain:
$\gamma(0) = \frac{(1 - \phi_2)\,\sigma^2}{(1 + \phi_2)\left[(1 - \phi_2)^2 - \phi_1^2\right]}, \qquad \gamma(1) = \frac{\phi_1}{1 - \phi_2}\,\gamma(0), \qquad \gamma(2) = \left(\phi_2 + \frac{\phi_1^2}{1 - \phi_2}\right)\gamma(0).$
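As a sanity check of the closed-form variance, this sketch solves the three moment equations above as a linear system and compares the result with the closed-form expressions; the coefficients $\phi_1 = 0.5$, $\phi_2 = 0.3$ and $\sigma^2 = 1$ are assumed illustrative values, not the problem's.

```python
import numpy as np

# Assumed illustrative AR(2) coefficients (not from the problem statement).
phi1, phi2, sigma2 = 0.5, 0.3, 1.0

# Linear system in (gamma0, gamma1, gamma2) from the three moment equations:
#   gamma0 = phi1*gamma1 + phi2*gamma2 + sigma2
#   gamma1 = phi1*gamma0 + phi2*gamma1
#   gamma2 = phi1*gamma1 + phi2*gamma0
A = np.array([[1.0, -phi1, -phi2],
              [-phi1, 1.0 - phi2, 0.0],
              [-phi2, -phi1, 1.0]])
b = np.array([sigma2, 0.0, 0.0])
gamma0, gamma1, gamma2 = np.linalg.solve(A, b)

# Closed forms derived above.
gamma0_closed = (1 - phi2) * sigma2 / ((1 + phi2) * ((1 - phi2)**2 - phi1**2))
print(gamma0, gamma0_closed)                        # should agree
print(gamma1, phi1 / (1 - phi2) * gamma0)           # should agree
print(gamma2, (phi2 + phi1**2 / (1 - phi2)) * gamma0)
```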
(e) Now we derive the autocovariances and autocorrelations of lag $h$ ($h \geq 2$).
Note that we can derive the autocovariance directly as before, by computing
$E(X_t X_{t-2}) = \phi_1 E(X_{t-1} X_{t-2}) + \phi_2 E(X_{t-2}^2) + E(Z_t X_{t-2}).$
Thus
$\gamma(2) = \phi_1\,\gamma(1) + \phi_2\,\gamma(0), \qquad \rho(2) = \phi_1\,\rho(1) + \phi_2.$
And so on for higher-order autocovariances and autocorrelations.
In general, for $h \geq 2$,
$E(X_t X_{t-h}) = \phi_1 E(X_{t-1} X_{t-h}) + \phi_2 E(X_{t-2} X_{t-h}) + E(Z_t X_{t-h}),$
so that
$\gamma(h) = \phi_1\,\gamma(h-1) + \phi_2\,\gamma(h-2).$
Thus
$\rho(h) = \phi_1\,\rho(h-1) + \phi_2\,\rho(h-2).$
For example,
$\rho(1) = \frac{\phi_1}{1 - \phi_2}, \qquad \rho(2) = \phi_1\,\rho(1) + \phi_2, \qquad \rho(3) = \phi_1\,\rho(2) + \phi_2\,\rho(1).$
Alternatively, we can utilize the general format of the AR(2) autocorrelation functions as
follows.
The auxiliary equation is:
$1 - \phi_1 B - \phi_2 B^2 = 0.$
Its roots are the inverses of the roots of the equation
$x^2 - \phi_1 x - \phi_2 = 0.$
That is,
$x^2 - \phi_1 x - \phi_2 = (x - G_1)(x - G_2) = 0, \qquad G_{1,2} = \frac{\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{2}.$
We denote the roots as $G_1$ and $G_2$.
The autocorrelations ($h \geq 0$) are:
$\rho(h) = c_1 G_1^{\,h} + c_2 G_2^{\,h}.$
We can compute the values of $c_1$ and $c_2$ from:
$\rho(0) = c_1 + c_2 = 1 \quad\text{and}\quad \rho(1) = c_1 G_1 + c_2 G_2 = \frac{\phi_1}{1 - \phi_2}.$
The solutions are:
$c_1 = \frac{\rho(1) - G_2}{G_1 - G_2}, \qquad c_2 = \frac{G_1 - \rho(1)}{G_1 - G_2}.$
For example,
$\rho(2) = c_1 G_1^{\,2} + c_2 G_2^{\,2}, \qquad \rho(3) = c_1 G_1^{\,3} + c_2 G_2^{\,3}.$
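The recursion and the root-based closed form can be checked against each other numerically; the sketch below does this for assumed illustrative coefficients $\phi_1 = 0.5$, $\phi_2 = 0.3$ (not the problem's values).

```python
import numpy as np

# Assumed illustrative coefficients (not the problem's values).
phi1, phi2 = 0.5, 0.3

# rho(h) via the recursion rho(h) = phi1*rho(h-1) + phi2*rho(h-2).
rho = [1.0, phi1 / (1 - phi2)]
for h in range(2, 11):
    rho.append(phi1 * rho[h - 1] + phi2 * rho[h - 2])

# rho(h) via the roots G1, G2 of x^2 - phi1*x - phi2 = 0.
G1, G2 = np.roots([1.0, -phi1, -phi2])
c1 = (rho[1] - G2) / (G1 - G2)
c2 = 1.0 - c1
rho_roots = [float(c1 * G1**h + c2 * G2**h) for h in range(11)]

print(np.allclose(rho, rho_roots))  # True: the two forms agree
```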
3. For the following stationary AR(2) model:
$X_t = \phi_0 + \phi_1 X_{t-1} + \phi_2 X_{t-2} + Z_t, \qquad (1)$
the mean $\mu = E[X_t]$ should satisfy:
$E[X_t] = \phi_0 + \phi_1 E[X_{t-1}] + \phi_2 E[X_{t-2}], \quad\text{i.e.}\quad \mu = \frac{\phi_0}{1 - \phi_1 - \phi_2}.$
Approach 1.
Multiplying (1) by $X_{t-k}$ ($k \geq 1$) and taking expectations, we have:
$E[X_t X_{t-k}] = \phi_0 E[X_{t-k}] + \phi_1 E[X_{t-1} X_{t-k}] + \phi_2 E[X_{t-2} X_{t-k}] + E[Z_t X_{t-k}],$
where $E[Z_t X_{t-k}] = 0$ for $k \geq 1$, since $X_{t-k}$ depends only on the white noise up to time $t-k$.
Thus, we can derive the following formula:
$\gamma(k) = E[X_t X_{t-k}] - E[X_t]E[X_{t-k}]$
$= \phi_0 E[X_{t-k}] + \phi_1 E[X_{t-1} X_{t-k}] + \phi_2 E[X_{t-2} X_{t-k}] - \left(\phi_0 + \phi_1 E[X_{t-1}] + \phi_2 E[X_{t-2}]\right) E[X_{t-k}]$
$= \phi_1\left(E[X_{t-1} X_{t-k}] - E[X_{t-1}]E[X_{t-k}]\right) + \phi_2\left(E[X_{t-2} X_{t-k}] - E[X_{t-2}]E[X_{t-k}]\right)$
$= \phi_1\,\gamma(k-1) + \phi_2\,\gamma(k-2).$
Since $\gamma(0)$ is a constant, we can divide the formula for $\gamma(k)$ by $\gamma(0)$ to derive the a.c.f. of $\{X_t\}$:
$\rho(k) = \phi_1\,\rho(k-1) + \phi_2\,\rho(k-2), \qquad k \geq 1.$
These are the Yule-Walker equations for the stationary AR(2).
Approach 2.
Alternatively, one can center the AR series to obtain a new series with mean zero, and hence with no intercept term, as follows:
$X_t - \mu = \phi_1\,(X_{t-1} - \mu) + \phi_2\,(X_{t-2} - \mu) + Z_t.$
Let $Y_t = X_t - \mu$; we have:
$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + Z_t. \qquad (2)$
Multiplying (2) by $Y_{t-k}$ ($k \geq 1$) and taking expectations, we have:
$E[Y_t Y_{t-k}] = \phi_1 E[Y_{t-1} Y_{t-k}] + \phi_2 E[Y_{t-2} Y_{t-k}] + E[Z_t Y_{t-k}],$
where $E[Z_t Y_{t-k}] = 0$ for $k \geq 1$, and $E[Y_t Y_{t-k}] = \gamma(k)$ because $E[Y_t] = 0$.
Thus, we can derive the following formula:
$\gamma(k) = \phi_1\,\gamma(k-1) + \phi_2\,\gamma(k-2), \qquad k \geq 1.$
Since $\gamma(0)$ is a constant, we can divide the formula for $\gamma(k)$ by $\gamma(0)$ to derive the a.c.f. of $\{X_t\}$:
$\rho(k) = \phi_1\,\rho(k-1) + \phi_2\,\rho(k-2), \qquad k \geq 1.$
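A brief simulation check of the Yule-Walker relation $\rho(k) = \phi_1\rho(k-1) + \phi_2\rho(k-2)$: the sketch below simulates an AR(2) with an intercept, centers it as in Approach 2, and compares sample autocorrelations with the recursion; the values $\phi_0 = 2$, $\phi_1 = 0.5$, $\phi_2 = 0.3$ are assumptions chosen only for illustration.

```python
import numpy as np

# Assumed illustrative AR(2) with intercept (values not from the problem).
phi0, phi1, phi2, n = 2.0, 0.5, 0.3, 200_000
rng = np.random.default_rng(1)
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi0 + phi1 * x[t - 1] + phi2 * x[t - 2] + rng.standard_normal()

x = x[1000:]        # drop burn-in
xc = x - x.mean()   # center the series, as in Approach 2

def rho(k):
    """Sample autocorrelation at lag k."""
    return np.dot(xc[k:], xc[:-k]) / np.dot(xc, xc) if k else 1.0

# Yule-Walker: rho(k) should be close to phi1*rho(k-1) + phi2*rho(k-2).
for k in (2, 3, 4):
    print(k, rho(k), phi1 * rho(k - 1) + phi2 * rho(k - 2))
```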
4. Solution:
(a) Write the model as $(1 - \phi B)X_t = (1 + \theta B)Z_t$, where $B$ is the backshift operator. The process is stationary if and only if the root of the AR polynomial $1 - \phi z = 0$ lies outside the unit circle, that is,
$|\phi| < 1,$
and it is invertible if and only if the root of the MA polynomial $1 + \theta z = 0$ lies outside the unit circle, that is,
$|\theta| < 1.$
(b) Taking expectations of both sides of $X_t = \phi X_{t-1} + Z_t + \theta Z_{t-1}$ gives $\mu = \phi\mu$, so
$E(X_t) = \mu = 0.$
Multiplying the model by $Z_t$ and $Z_{t-1}$ and taking expectations gives $E(Z_t X_t) = \sigma^2$ and $E(Z_{t-1} X_t) = (\phi + \theta)\,\sigma^2$. Multiplying the model by $X_t$, $X_{t-1}$ and $X_{t-h}$ ($h \geq 2$) in turn and taking expectations, we have
$\begin{cases} \gamma(0) = \phi\,\gamma(1) + \sigma^2 + \theta(\phi + \theta)\,\sigma^2, \\ \gamma(1) = \phi\,\gamma(0) + \theta\,\sigma^2, \\ \gamma(h) = \phi\,\gamma(h-1), \quad h \geq 2. \end{cases}$
We get
$\gamma(0) = \frac{1 + 2\phi\theta + \theta^2}{1 - \phi^2}\,\sigma^2, \qquad \gamma(1) = \frac{(1 + \phi\theta)(\phi + \theta)}{1 - \phi^2}\,\sigma^2, \qquad \gamma(h) = \phi^{\,h-1}\,\gamma(1), \quad h \geq 2.$
Thus the autocorrelation functions are
$\rho(0) = 1, \qquad \rho(1) = \frac{(1 + \phi\theta)(\phi + \theta)}{1 + 2\phi\theta + \theta^2}, \qquad \rho(h) = \phi\,\rho(h-1) = \phi^{\,h-1}\,\rho(1), \quad h \geq 2.$
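As a numerical illustration of the ARMA(1,1) results, the following sketch checks the stationarity and invertibility conditions and compares the derived $\gamma(0)$, $\rho(1)$, $\rho(2)$ with sample estimates from a simulated series; the parameters $\phi = 0.7$, $\theta = 0.4$, $\sigma^2 = 1$ are assumed for demonstration and are not the problem's values.

```python
import numpy as np

# Assumed illustrative ARMA(1,1) parameters (not from the problem).
phi, theta, sigma2, n = 0.7, 0.4, 1.0, 200_000

# (a) Stationarity and invertibility conditions.
print("stationary:", abs(phi) < 1, " invertible:", abs(theta) < 1)

# Simulate X_t = phi*X_{t-1} + Z_t + theta*Z_{t-1}.
rng = np.random.default_rng(2)
z = rng.standard_normal(n) * np.sqrt(sigma2)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t] + theta * z[t - 1]
x = x[1000:]  # drop burn-in

# (b) Theoretical moments derived above.
gamma0 = (1 + 2 * phi * theta + theta**2) / (1 - phi**2) * sigma2
rho1 = (1 + phi * theta) * (phi + theta) / (1 + 2 * phi * theta + theta**2)

print("var :", x.var(), " theory:", gamma0)
print("rho1:", np.corrcoef(x[1:], x[:-1])[0, 1], " theory:", rho1)
print("rho2:", np.corrcoef(x[2:], x[:-2])[0, 1], " theory:", phi * rho1)
```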