The Spectral Representation of
Stationary Time Series
Stationary time series satisfy the properties:
1. Constant mean: E(x_t) = μ
2. Constant variance: Var(x_t) = σ²
3. The correlation between two observations (x_t, x_{t+h}) depends only on the lag h.
These properties ensure that the probabilistic behaviour of a stationary time series does not change over time, and give such series their characteristic quasi-periodic appearance.
[Figure: sample path of a simulated stationary time series, t = 0 to 100]
Recall that
x_t = Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)]
is a stationary time series, where λ_1, λ_2, …, λ_k are k values in (0, π), and X_1, X_2, …, X_k and Y_1, Y_2, …, Y_k are independent random variables with
E(X_i) = E(Y_i) = 0 and E(X_i²) = E(Y_i²) = σ_i².
With this time series:
1. E(x_t) = 0. (We can give it a non-zero mean, μ, by adding μ to the equation.)
2. σ(h) = Σ_{i=1}^k σ_i² cos(λ_i h)
and
3. ρ(h) = σ(h)/σ(0) = Σ_{i=1}^k w_i cos(λ_i h), where w_i = σ_i² / Σ_{j=1}^k σ_j².
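As a quick numerical check, the two properties above can be verified by Monte Carlo over many independent draws of the X_i and Y_i (a sketch using NumPy; the frequencies and variances below are illustrative values, not taken from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch)
lams = np.array([0.5, 2.0])     # frequencies lambda_i in (0, pi)
sig2 = np.array([1.0, 0.25])    # sigma_i^2 = Var(X_i) = Var(Y_i)

n_rep, t, h = 200_000, 7, 3
X = rng.normal(0.0, np.sqrt(sig2), size=(n_rep, 2))
Y = rng.normal(0.0, np.sqrt(sig2), size=(n_rep, 2))

def x(tt):
    """One value x_tt of the harmonic series, for every replication."""
    return (X * np.cos(lams * tt) + Y * np.sin(lams * tt)).sum(axis=1)

cov_mc = np.mean(x(t + h) * x(t))          # Monte Carlo E[x_{t+h} x_t]
cov_th = np.sum(sig2 * np.cos(lams * h))   # sigma(h) = sum sigma_i^2 cos(lam_i h)
```

The Monte Carlo covariance agrees with σ(h) to within sampling error, and the sample mean of x_t is near 0.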
We now extend this example to a wider class of time series, which turns out to be the complete set of weakly stationary time series. In this case the collection of frequencies may even vary over a continuous range [0, π].
The Riemann integral:
∫_a^b g(x) dx = lim_{Δx→0} Σ_{i=1}^n g(c_i)(x_i − x_{i−1}),
where a = x_0 < x_1 < … < x_n = b and c_i ∈ [x_{i−1}, x_i].
The Riemann–Stieltjes integral:
∫_a^b g(x) dF(x) = lim_{Δx→0} Σ_{i=1}^n g(c_i)[F(x_i) − F(x_{i−1})]
If F is continuous with derivative f, then:
∫_a^b g(x) dF(x) = ∫_a^b g(x) f(x) dx
If F is a step function with jumps p_i at x_i, then:
∫_a^b g(x) dF(x) = Σ_i p_i g(x_i)
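A short numeric illustration of the step-function case (the jump points and sizes below are assumed example values): the Riemann–Stieltjes sum over a fine partition recovers Σ_i p_i g(x_i).

```python
import numpy as np

# Illustrative step distribution: jump ps[i] at xs[i]
xs = np.array([0.5, 1.5, 2.5])
ps = np.array([0.2, 0.5, 0.3])
g = lambda x: x ** 2

def F(x):
    """Step function: sum of the jumps at points <= x."""
    return ps[xs <= x].sum()

# Riemann-Stieltjes sum over a fine partition of [0, 3]
grid = np.linspace(0.0, 3.0, 10001)
Fv = np.array([F(x) for x in grid])
rs_sum = np.sum(g(grid[1:]) * np.diff(Fv))   # g evaluated at right endpoints

exact = np.sum(ps * g(xs))                   # = sum_i p_i g(x_i)
```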
First, we are going to develop the concept of integration with respect to a stochastic process.
Let {U(λ): λ ∈ [0, π]} denote a stochastic process with mean 0 and independent increments; that is,
E{[U(λ_2) − U(λ_1)][U(λ_4) − U(λ_3)]} = 0
for 0 ≤ λ_1 < λ_2 ≤ λ_3 < λ_4 ≤ π,
and
E[U(λ)] = 0 for 0 ≤ λ ≤ π.
In addition let
G(λ) = E[U²(λ)] for 0 ≤ λ ≤ π, and assume G(0) = 0.
It is easy to show that G(λ) is monotonically non-decreasing, i.e. G(λ_1) ≤ G(λ_2) for λ_1 < λ_2.
Now let us define:
∫_0^π g(λ) dU(λ),
analogous to the Riemann–Stieltjes integral
∫_0^π g(λ) dF(λ).
Let 0 = λ_0 < λ_1 < λ_2 < … < λ_n = π be any partition of the interval, and let Δ_n = max_i (λ_i − λ_{i−1}). Let λ*_i denote any value in the interval [λ_{i−1}, λ_i].
Consider:
V_n = Σ_{i=1}^n g(λ*_i)[U(λ_i) − U(λ_{i−1})]
Suppose that lim_{n→∞} Δ_n = 0 and there exists a random variable V such that
lim_{n→∞} E[(V_n − V)²] = 0.
Then V is denoted by:
∫_0^π g(λ) dU(λ).
Properties:
1. E[∫_0^π g(λ) dU(λ)] = 0
2. E[(∫_0^π g(λ) dU(λ))²] = ∫_0^π g²(λ) dG(λ)
3. E[∫_0^π g_1(λ) dU(λ) · ∫_0^π g_2(λ) dU(λ)] = ∫_0^π g_1(λ) g_2(λ) dG(λ)
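For intuition, here is a simulation sketch of properties 1 and 2, assuming U is a standard Brownian motion on [0, π] (so G(λ) = λ; this choice of U is an assumption for illustration, not part of the notes). The discretized sum V_n has mean ≈ 0 and variance ≈ ∫ g² dG:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assume U is standard Brownian motion on [0, pi], so G(lam) = lam
n, n_rep = 1000, 5000
lam = np.linspace(0.0, np.pi, n + 1)
dlam = np.diff(lam)
g = np.cos(lam[:-1])                    # g evaluated at the left endpoints

# Independent increments U(lam_i) - U(lam_{i-1}) ~ N(0, dlam_i)
dU = rng.normal(0.0, np.sqrt(dlam), size=(n_rep, n))
V = (g * dU).sum(axis=1)                # V_n, approximating int_0^pi g dU

mean_V = V.mean()
var_V = V.var()
var_th = np.sum(g ** 2 * dlam)          # discretized int_0^pi g^2 dG (= pi/2)
```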
The Spectral Representation of
Stationary Time Series
Let {X(λ): λ ∈ [0, π]} and {Y(λ): λ ∈ [0, π]} denote uncorrelated stochastic processes with mean 0 and independent increments.
Also let
F(λ) = E[X²(λ)] = E[Y²(λ)] for 0 ≤ λ ≤ π, with F(0) = 0.
Now define the time series {x_t : t ∈ T} as follows:
x_t = ∫_0^π cos(tλ) dX(λ) + ∫_0^π sin(tλ) dY(λ)
Then
E[x_t] = E[∫_0^π cos(tλ) dX(λ) + ∫_0^π sin(tλ) dY(λ)]
= E[∫_0^π cos(tλ) dX(λ)] + E[∫_0^π sin(tλ) dY(λ)]
= 0
Also
σ(h) = E[x_{t+h} x_t]
= E{[∫_0^π cos((t+h)λ) dX(λ) + ∫_0^π sin((t+h)λ) dY(λ)]
  × [∫_0^π cos(tλ) dX(λ) + ∫_0^π sin(tλ) dY(λ)]}
= E[∫_0^π cos((t+h)λ) dX(λ) · ∫_0^π cos(tλ) dX(λ)]
+ E[∫_0^π sin((t+h)λ) dY(λ) · ∫_0^π sin(tλ) dY(λ)]
+ E[∫_0^π cos((t+h)λ) dX(λ) · ∫_0^π sin(tλ) dY(λ)]
+ E[∫_0^π sin((t+h)λ) dY(λ) · ∫_0^π cos(tλ) dX(λ)]
= ∫_0^π cos((t+h)λ) cos(tλ) dF(λ) + ∫_0^π sin((t+h)λ) sin(tλ) dF(λ) + 0 + 0
= ∫_0^π [cos((t+h)λ) cos(tλ) + sin((t+h)λ) sin(tλ)] dF(λ)
= ∫_0^π cos(hλ) dF(λ)
Thus the time series {x_t : t ∈ T} defined by
x_t = ∫_0^π cos(tλ) dX(λ) + ∫_0^π sin(tλ) dY(λ)
is a stationary time series with:
μ = E[x_t] = 0
and σ(h) = ∫_0^π cos(hλ) dF(λ).
F(λ) is called the spectral distribution function.
If f(λ) = F′(λ) exists, then f(λ) is called the spectral density function:
σ(h) = ∫_0^π cos(hλ) dF(λ) = ∫_0^π cos(hλ) f(λ) dλ
Note:
Var(x_t) = σ(0) = ∫_0^π dF(λ) = ∫_0^π f(λ) dλ
The spectral distribution function, F(λ), and the spectral density function, f(λ), describe how the variance of x_t is distributed over the frequencies in the interval [0, π].
The autocovariance function, σ(h), can be computed from the spectral density function, f(λ), as follows:
σ(h) = ∫_0^π cos(hλ) f(λ) dλ = Re ∫_0^π e^{ihλ} f(λ) dλ,
using e^{iλ} = cos(λ) + i sin(λ).
Also, the spectral density function, f(λ), can be computed from the autocovariance function, σ(h), as follows:
f(λ) = (1/2π) σ(0) + (1/π) Σ_{h=1}^∞ σ(h) cos(hλ)
Example:
Suppose X_1, X_2, …, X_k and Y_1, Y_2, …, Y_k are independent random variables with
E(X_i) = E(Y_i) = 0 and E(X_i²) = E(Y_i²) = σ_i².
Let λ_1, λ_2, …, λ_k denote k values in (0, π) and let
x_t = Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)].
Then
σ(h) = Σ_{i=1}^k σ_i² cos(λ_i h).
If we define {X(λ): λ ∈ [0, π]} and {Y(λ): λ ∈ [0, π]} by
X(λ) = Σ_{i: λ_i ≤ λ} X_i and Y(λ) = Σ_{i: λ_i ≤ λ} Y_i,
then
F(λ) = E[X²(λ)] = Σ_{i: λ_i ≤ λ} Var(X_i) = Σ_{i: λ_i ≤ λ} σ_i² = E[Y²(λ)].
Note: X(λ) and Y(λ) are "random" step functions and F(λ) is a step function.
Then
x_t = ∫_0^π cos(tλ) dX(λ) + ∫_0^π sin(tλ) dY(λ)
= Σ_{i=1}^k [X_i cos(λ_i t) + Y_i sin(λ_i t)]
and
σ(h) = ∫_0^π cos(hλ) dF(λ) = Σ_{i=1}^k σ_i² cos(λ_i h).
Note:
σ(0) = Σ_{i=1}^k σ_i²
Another important comment: in the case when F(λ) is continuous with F′(λ) = f(λ), then
σ(h) = ∫_0^π cos(hλ) dF(λ) = ∫_0^π cos(hλ) f(λ) dλ.
Sometimes the spectral density function, f(λ), is extended to the interval [−π, π] and is assumed symmetric about 0 (i.e. f_s(λ) = f_s(−λ) = f(λ)/2). In this case
σ(h) = ∫_0^π cos(hλ) f(λ) dλ = ∫_{−π}^π cos(hλ) f_s(λ) dλ
It can be shown that
f_s(λ) = (1/2π) σ(0) + (1/π) Σ_{h=1}^∞ σ(h) cos(hλ)
and f(λ) = 2 f_s(λ) = (1/π) σ(0) + (2/π) Σ_{h=1}^∞ σ(h) cos(hλ).
From now on we will use the symmetric spectral density function and denote it by f(λ).
Hence
σ(h) = ∫_{−π}^π cos(hλ) f(λ) dλ = ∫_{−π}^π e^{ihλ} f(λ) dλ
(using e^{iλ} = cos(λ) + i sin(λ); the sine term integrates to 0 since f is even),
and f(λ) = (1/2π) σ(0) + (1/π) Σ_{h=1}^∞ σ(h) cos(hλ).
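This pair of formulas can be checked numerically. The sketch below assumes MA(1)-type autocovariances (σ(0) = 1, σ(1) = 0.4, σ(h) = 0 otherwise; illustrative values): it builds f(λ) from the inversion formula, then recovers each σ(h) by integrating cos(hλ) f(λ) over [−π, π]:

```python
import numpy as np

# Assumed autocovariances (MA(1)-type): sigma(0)=1, sigma(1)=0.4, else 0
def sigma(h):
    return {0: 1.0, 1: 0.4}.get(abs(h), 0.0)

# Symmetric spectral density from the inversion formula
def f(lam):
    return sigma(0) / (2 * np.pi) + (1 / np.pi) * sum(
        sigma(h) * np.cos(h * lam) for h in range(1, 5))

# Recover sigma(h) = int_{-pi}^{pi} cos(h lam) f(lam) d lam (trapezoid rule)
lam = np.linspace(-np.pi, np.pi, 20001)
def recover(h):
    y = np.cos(h * lam) * f(lam)
    return np.sum((y[:-1] + y[1:]) / 2 * np.diff(lam))
```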
Example: Let {u_t : t ∈ T} be identically distributed and uncorrelated with mean zero (a white noise series). Thus
σ(h) = σ² if h = 0, and σ(h) = 0 if h ≠ 0,
and
f(λ) = (1/2π) σ(0) + (1/π) Σ_{h=1}^∞ σ(h) cos(hλ) = σ²/(2π) for −π ≤ λ ≤ π.
Graph: the spectral density is constant (flat) over [−π, π].
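The flat density can be seen empirically: individual periodogram ordinates of a simulated white noise series are noisy, but they average to σ²/(2π). A NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n, s2 = 4096, 1.0
u = rng.normal(0.0, np.sqrt(s2), n)

# Periodogram at the Fourier frequencies lam_j = 2*pi*j/n, j = 1..n/2-1
d = np.fft.rfft(u)
j = np.arange(1, n // 2)
I = np.abs(d[j]) ** 2 / (2 * np.pi * n)

flat = s2 / (2 * np.pi)   # theoretical spectral density of white noise
```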
Linear Filters
Let {x_t : t ∈ T} be any time series and suppose that the time series {y_t : t ∈ T} is constructed as follows:
y_t = Σ_s a_s x_{t−s}
The time series {y_t : t ∈ T} is said to be constructed from {x_t : t ∈ T} by means of a linear filter.
input x_t → Linear Filter (a_s) → output y_t
Let σ_x(h) denote the autocovariance function of {x_t : t ∈ T} and σ_y(h) the autocovariance function of {y_t : t ∈ T}. Assume also that E[x_t] = E[y_t] = 0. Then:
σ_y(h) = E[y_{t+h} y_t]
= E[(Σ_s a_s x_{t+h−s})(Σ_k a_k x_{t−k})]
= E[Σ_s Σ_k a_s a_k x_{t+h−s} x_{t−k}]
= Σ_s Σ_k a_s a_k E[x_{t+h−s} x_{t−k}]
= Σ_s Σ_k a_s a_k σ_x(h − s + k)
= Σ_s Σ_k a_s a_k ∫_{−π}^π cos((h − s + k)λ) f_x(λ) dλ
= Re{Σ_s Σ_k a_s a_k ∫_{−π}^π e^{i(h−s+k)λ} f_x(λ) dλ}
= Re{∫_{−π}^π e^{ihλ} (Σ_s a_s e^{−isλ})(Σ_k a_k e^{ikλ}) f_x(λ) dλ}
= Re{∫_{−π}^π e^{ihλ} |Σ_s a_s e^{−isλ}|² f_x(λ) dλ}
= ∫_{−π}^π cos(hλ) |Σ_s a_s e^{−isλ}|² f_x(λ) dλ
Hence
σ_y(h) = ∫_{−π}^π cos(hλ) |A(λ)|² f_x(λ) dλ,
where
A(λ) = Σ_s a_s e^{−isλ} is the transfer function of the linear filter.
Note:
σ_y(h) = ∫_{−π}^π cos(hλ) |Σ_s a_s e^{−isλ}|² f_x(λ) dλ
= ∫_{−π}^π cos(hλ) |A(λ)|² f_x(λ) dλ
= ∫_{−π}^π cos(hλ) f_y(λ) dλ,
hence
f_y(λ) = |A(λ)|² f_x(λ) = |Σ_s a_s e^{−isλ}|² f_x(λ)
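The relation f_y = |A(λ)|² f_x can be verified numerically for a simple case (assumed here: the two-point filter a_0 = a_1 = 1 applied to white noise). Integrating cos(hλ) f_y(λ) should give σ_y(0) = 2σ² and σ_y(1) = σ², which matches a direct calculation for y_t = x_t + x_{t−1}:

```python
import numpy as np

a = np.array([1.0, 1.0])   # illustrative filter weights a_0, a_1
s2 = 1.0                   # white-noise input variance, so f_x = s2/(2*pi)

lam = np.linspace(-np.pi, np.pi, 20001)
A = sum(a[s] * np.exp(-1j * s * lam) for s in range(len(a)))  # transfer function
f_y = np.abs(A) ** 2 * s2 / (2 * np.pi)                       # f_y = |A|^2 f_x

def sigma_y(h):
    """sigma_y(h) = int_{-pi}^{pi} cos(h lam) f_y(lam) d lam (trapezoid rule)."""
    y = np.cos(h * lam) * f_y
    return np.sum((y[:-1] + y[1:]) / 2 * np.diff(lam))
```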
Spectral density function of a Moving Average time series of order q, MA(q)
Let β_0 = 1, β_1, β_2, …, β_q denote q + 1 numbers.
Let {u_t : t ∈ T} denote a white noise time series with variance σ².
Let {x_t : t ∈ T} denote an MA(q) time series with μ = 0:
x_t = u_t + β_1 u_{t−1} + β_2 u_{t−2} + … + β_q u_{t−q}
Note: {x_t : t ∈ T} is obtained from {u_t : t ∈ T} by a linear filter.
Now
f_u(λ) = σ²/(2π).
Hence
f_x(λ) = |A(λ)|² f_u(λ) = (σ²/2π) |Σ_{s=0}^q β_s e^{−isλ}|²
Example: q = 1
f_x(λ) = (σ²/2π) |Σ_{s=0}^1 β_s e^{−isλ}|²
= (σ²/2π) |1 + β_1 e^{−iλ}|²
= (σ²/2π) (1 + β_1 e^{−iλ})(1 + β_1 e^{iλ})
= (σ²/2π) [1 + β_1² + β_1(e^{−iλ} + e^{iλ})]
= (σ²/2π) (1 + β_1² + 2β_1 cos λ)
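A quick check (with an illustrative value β_1 = 0.7) that the closed form agrees with (σ²/2π)|A(λ)|² at every frequency:

```python
import numpy as np

b1, s2 = 0.7, 1.0                      # illustrative MA(1) coefficient and variance
lam = np.linspace(-np.pi, np.pi, 1001)

# Via the transfer function A(lam) = 1 + b1 e^{-i lam}
f_transfer = s2 / (2 * np.pi) * np.abs(1 + b1 * np.exp(-1j * lam)) ** 2
# Via the closed form derived above
f_closed = s2 / (2 * np.pi) * (1 + b1 ** 2 + 2 * b1 * np.cos(lam))
```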
Example: q = 2
f_x(λ) = (σ²/2π) |Σ_{s=0}^2 β_s e^{−isλ}|²
= (σ²/2π) |1 + β_1 e^{−iλ} + β_2 e^{−2iλ}|²
= (σ²/2π) (1 + β_1 e^{−iλ} + β_2 e^{−2iλ})(1 + β_1 e^{iλ} + β_2 e^{2iλ})
= (σ²/2π) [1 + β_1² + β_2² + β_1(1 + β_2)(e^{−iλ} + e^{iλ}) + β_2(e^{−2iλ} + e^{2iλ})]
= (σ²/2π) [1 + β_1² + β_2² + 2β_1(1 + β_2) cos λ + 2β_2 cos 2λ]
[Figure: spectral density functions of MA(1) series]
[Figure: spectral density function of an MA(2) series with β_1 = 0.70, β_2 = −0.20]
Spectral density function of an Autoregressive time series of order p, AR(p)
Let φ_1, φ_2, …, φ_p denote p numbers.
Let {u_t : t ∈ T} denote a white noise time series with variance σ².
Let {x_t : t ∈ T} denote an AR(p) time series with δ = 0:
x_t = φ_1 x_{t−1} + φ_2 x_{t−2} + … + φ_p x_{t−p} + u_t
or x_t − φ_1 x_{t−1} − φ_2 x_{t−2} − … − φ_p x_{t−p} = u_t
Note: {u_t : t ∈ T} is obtained from {x_t : t ∈ T} by a linear filter.
Now
f_u(λ) = σ²/(2π).
Hence
f_u(λ) = |A(λ)|² f_x(λ) = |1 − Σ_{s=1}^p φ_s e^{−isλ}|² f_x(λ)
or σ²/(2π) = |1 − Σ_{s=1}^p φ_s e^{−isλ}|² f_x(λ)
or f_x(λ) = σ² / (2π |1 − Σ_{s=1}^p φ_s e^{−isλ}|²)
Example: p = 1
f_x(λ) = σ² / (2π |1 − φ_1 e^{−iλ}|²)
= σ² / [2π (1 − φ_1 e^{−iλ})(1 − φ_1 e^{iλ})]
= σ² / [2π (1 + φ_1² − φ_1(e^{−iλ} + e^{iλ}))]
= σ² / [2π (1 + φ_1² − 2φ_1 cos λ)]
Example: p = 2
f_x(λ) = σ² / [2π (1 + φ_1² + φ_2² − 2φ_1(1 − φ_2) cos λ − 2φ_2 cos 2λ)]
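An AR(2) series with complex characteristic roots has a spectral peak at an interior frequency, which is the source of its pseudo-periodic behaviour. A sketch locating the peak numerically (the coefficients φ_1 = 1.0, φ_2 = −0.5 are illustrative values chosen to give a stationary series):

```python
import numpy as np

phi1, phi2, s2 = 1.0, -0.5, 1.0        # illustrative stationary AR(2) coefficients
lam = np.linspace(0.0, np.pi, 10001)

A = 1 - phi1 * np.exp(-1j * lam) - phi2 * np.exp(-2j * lam)
f_x = s2 / (2 * np.pi * np.abs(A) ** 2)   # AR(p) spectral density formula

lam_peak = lam[np.argmax(f_x)]            # frequency of the spectral peak
period = 2 * np.pi / lam_peak             # corresponding period
```

With these values the peak is near λ ≈ 0.72, i.e. a period of about 8.7 time units.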
Example: Sunspot Numbers (1770-1869)

Year  No.   Year  No.   Year  No.   Year  No.
1770  101   1795   21   1820   16   1845   40
1771   82   1796   16   1821    7   1846   64
1772   66   1797    6   1822    4   1847   98
1773   35   1798    4   1823    2   1848  124
1774   31   1799    7   1824    8   1849   96
1775    7   1800   14   1825   17   1850   66
1776   20   1801   34   1826   36   1851   64
1777   92   1802   45   1827   50   1852   54
1778  154   1803   43   1828   62   1853   39
1779  125   1804   48   1829   67   1854   21
1780   85   1805   42   1830   71   1855    7
1781   68   1806   28   1831   48   1856    4
1782   38   1807   10   1832   28   1857   23
1783   23   1808    8   1833    8   1858   55
1784   10   1809    2   1834   13   1859   94
1785   24   1810    0   1835   57   1860   96
1786   83   1811    1   1836  122   1861   77
1787  132   1812    5   1837  138   1862   59
1788  131   1813   12   1838  103   1863   44
1789  118   1814   14   1839   86   1864   47
1790   90   1815   35   1840   63   1865   30
1791   67   1816   46   1841   37   1866   16
1792   60   1817   41   1842   24   1867    7
1793   47   1818   30   1843   11   1868   37
1794   41   1819   24   1844   15   1869   74
Example B: Annual Sunspot Numbers (1770-1869)
[Figure: time plot of the annual sunspot numbers]
Autocorrelation function and partial autocorrelation function
[Figure: sample autocorrelation function r_h and partial autocorrelation function Φ_kk of the sunspot series, for lags up to 40]
Spectral density estimate
[Figure: smoothed spectral estimator (bandwidth = 0.11); the peak corresponds to a period of 10 years]
Assuming an AR(2) model
[Figure: spectral density of the sunspot data under a fitted AR(2) model; the peak corresponds to a period of 8.733 years]
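As a sketch, the dominant period can also be read directly from the raw periodogram of the 100 annual values in the table (computed here with NumPy; the strongest ordinate corresponds to the familiar sunspot cycle of roughly 10-11 years):

```python
import numpy as np

# Annual sunspot numbers, 1770-1869, from the table above
x = np.array([
    101, 82, 66, 35, 31, 7, 20, 92, 154, 125,
    85, 68, 38, 23, 10, 24, 83, 132, 131, 118,
    90, 67, 60, 47, 41, 21, 16, 6, 4, 7,
    14, 34, 45, 43, 48, 42, 28, 10, 8, 2,
    0, 1, 5, 12, 14, 35, 46, 41, 30, 24,
    16, 7, 4, 2, 8, 17, 36, 50, 62, 67,
    71, 48, 28, 8, 13, 57, 122, 138, 103, 86,
    63, 37, 24, 11, 15, 40, 64, 98, 124, 96,
    66, 64, 54, 39, 21, 7, 4, 23, 55, 94,
    96, 77, 59, 44, 47, 30, 16, 7, 37, 74], dtype=float)

n = len(x)
d = np.fft.rfft(x - x.mean())              # remove the mean first
j = np.arange(1, n // 2 + 1)
I = np.abs(d[j]) ** 2 / (2 * np.pi * n)    # periodogram at lam_j = 2*pi*j/n

j_peak = j[np.argmax(I)]
period = n / j_peak                        # dominant period in years
```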
A linear discrete time series: Moving Average time series of infinite order
Let ψ_0 = 1, ψ_1, ψ_2, … denote an infinite sequence of numbers.
Let {u_t : t ∈ T} denote a white noise time series with variance σ²:
- independent
- mean 0, variance σ².
Let {x_t : t ∈ T} be defined by the equation
x_t = μ + u_t + ψ_1 u_{t−1} + ψ_2 u_{t−2} + ψ_3 u_{t−3} + …
Then {x_t : t ∈ T} is called a linear discrete time series.
Comment: a linear discrete time series is a moving average time series of infinite order.
The AR(1) Time series
Let {x_t : t ∈ T} be defined by the equation
x_t = φ_1 x_{t−1} + δ + u_t.
Then, substituting repeatedly,
x_t = φ_1(φ_1 x_{t−2} + δ + u_{t−1}) + δ + u_t
= φ_1² x_{t−2} + (1 + φ_1)δ + u_t + φ_1 u_{t−1}
= φ_1²(φ_1 x_{t−3} + δ + u_{t−2}) + (1 + φ_1)δ + u_t + φ_1 u_{t−1}
= φ_1³ x_{t−3} + (1 + φ_1 + φ_1²)δ + u_t + φ_1 u_{t−1} + φ_1² u_{t−2}
= …
= μ + u_t + ψ_1 u_{t−1} + ψ_2 u_{t−2} + …
where ψ_i = φ_1^i and μ = (1 + φ_1 + φ_1² + φ_1³ + …)δ = δ/(1 − φ_1).
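The weights ψ_i = φ_1^i can be confirmed by feeding a unit shock through the AR(1) recursion (a sketch with illustrative values φ_1 = 0.6, δ = 1):

```python
import numpy as np

phi1, delta = 0.6, 1.0    # illustrative AR(1) parameters, |phi1| < 1

# Impulse response: with delta = 0, a unit shock at t = 0 propagates as psi_i
psi = np.zeros(6)
u = np.zeros(6); u[0] = 1.0
prev = 0.0
for t in range(6):
    prev = phi1 * prev + u[t]   # x_t = phi1 x_{t-1} + u_t
    psi[t] = prev

mu = delta / (1 - phi1)         # mean from mu = delta (1 + phi1 + phi1^2 + ...)
```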
An alternative approach uses the backshift operator, B.
The equation
x_t = φ_1 x_{t−1} + δ + u_t
can be written
(I − φ_1 B) x_t = δ + u_t.
Now
(I − φ_1 B)^{−1} = I + φ_1 B + φ_1² B² + φ_1³ B³ + …
since
(I − φ_1 B)(I + φ_1 B + φ_1² B² + φ_1³ B³ + …) = I.
The equation
(I − φ_1 B) x_t = δ + u_t
therefore has the equivalent form:
x_t = (I − φ_1 B)^{−1} δ + (I − φ_1 B)^{−1} u_t
= (I + φ_1 B + φ_1² B² + φ_1³ B³ + …) δ + (I + φ_1 B + φ_1² B² + φ_1³ B³ + …) u_t
= (1 + φ_1 + φ_1² + φ_1³ + …) δ + u_t + φ_1 u_{t−1} + φ_1² u_{t−2} + φ_1³ u_{t−3} + …
= δ/(1 − φ_1) + u_t + φ_1 u_{t−1} + φ_1² u_{t−2} + …
For the general AR(p) time series
φ(B) x_t = δ + u_t,
where
φ(B) = I − φ_1 B − φ_2 B² − … − φ_p B^p,
the time series {x_t : t ∈ T} can be written as a linear discrete time series:
x_t = [φ(B)]^{−1} δ + [φ(B)]^{−1} u_t,
where
[φ(B)]^{−1} = I + ψ_1 B + ψ_2 B² + ψ_3 B³ + …
[φ(B)]^{−1} can be found by carrying out the multiplication
φ(B)(I + ψ_1 B + ψ_2 B² + ψ_3 B³ + …) = I.
Thus the AR(p) time series φ(B) x_t = δ + u_t can be written
x_t = [φ(B)]^{−1} δ + ψ(B) u_t = δ/(1 − φ_1 − φ_2 − … − φ_p) + ψ(B) u_t
= μ + u_t + ψ_1 u_{t−1} + ψ_2 u_{t−2} + …
This is called the Random Shock form of the series.
The Random Shock form of an ARMA(p,q) time series:
An ARMA(p,q) time series {x_t : t ∈ T} satisfies the equation
φ(B) x_t = δ + β(B) u_t,
where
φ(B) = I − φ_1 B − φ_2 B² − … − φ_p B^p
and
β(B) = I + β_1 B + β_2 B² + … + β_q B^q.
Again the time series {x_t : t ∈ T} can be written as a linear discrete time series, namely
x_t = [φ(B)]^{−1} δ + [φ(B)]^{−1} β(B) u_t = [φ(1)]^{−1} δ + ψ(B) u_t,
where
ψ(B) = [φ(B)]^{−1} β(B) = I + ψ_1 B + ψ_2 B² + ψ_3 B³ + …
ψ(B) can be found by carrying out the multiplication
φ(B)(I + ψ_1 B + ψ_2 B² + ψ_3 B³ + …) = β(B).
Thus an ARMA(p,q) time series can be written
x_t = μ + u_t + ψ_1 u_{t−1} + ψ_2 u_{t−2} + …,
where
ψ(B) = [φ(B)]^{−1} β(B) = I + ψ_1 B + ψ_2 B² + ψ_3 B³ + …
and μ = δ/(1 − φ_1 − φ_2 − … − φ_p).
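Matching coefficients in φ(B) ψ(B) = β(B) gives the recursion ψ_j = β_j + Σ_{i=1}^{min(j,p)} φ_i ψ_{j−i} (with β_0 = 1 and β_j = 0 for j > q), which is easy to code. A sketch for an illustrative ARMA(1,1) with φ_1 = 0.5, β_1 = 0.3, where the weights should satisfy ψ_j = φ_1^{j−1}(φ_1 + β_1) for j ≥ 1:

```python
def psi_weights(phi, beta, n):
    """First n psi-weights from phi(B) psi(B) = beta(B).

    phi  -- [phi_1, ..., phi_p]
    beta -- [1.0, beta_1, ..., beta_q]  (beta_0 = 1)
    """
    psi = []
    for j in range(n):
        b = beta[j] if j < len(beta) else 0.0
        s = sum(phi[i - 1] * psi[j - i] for i in range(1, min(j, len(phi)) + 1))
        psi.append(b + s)
    return psi

# Illustrative ARMA(1,1): phi_1 = 0.5, beta_1 = 0.3
w = psi_weights([0.5], [1.0, 0.3], 6)
```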
The inverted form of a stationary time series: Autoregressive time series of infinite order
An ARMA(p,q) time series {x_t : t ∈ T} satisfies the equation
φ(B) x_t = δ + β(B) u_t,
where
φ(B) = I − φ_1 B − φ_2 B² − … − φ_p B^p
and
β(B) = I + β_1 B + β_2 B² + … + β_q B^q.
Suppose that [β(B)]^{−1} exists. This will be true if the roots of the polynomial
β(x) = 1 + β_1 x + β_2 x² + … + β_q x^q
all exceed 1 in absolute value. The time series {x_t : t ∈ T} in this case is called invertible.
Then
[β(B)]^{−1} φ(B) x_t = [β(B)]^{−1} δ + u_t
or
π(B) x_t = δ* + u_t,
where
π(B) = [β(B)]^{−1} φ(B) = I − π_1 B − π_2 B² − π_3 B³ − …
and δ* = [β(1)]^{−1} δ = δ/(1 + β_1 + β_2 + … + β_q).
Thus an ARMA(p,q) time series can be written
x_t = π_1 x_{t−1} + π_2 x_{t−2} + … + δ* + u_t,
where
π(B) = [β(B)]^{−1} φ(B) = I − π_1 B − π_2 B² − π_3 B³ − …
and δ* = δ/(1 + β_1 + β_2 + … + β_q).
This is called the inverted form of the time series. It expresses the time series as an autoregressive time series of infinite order.
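The same kind of recursion gives the π-weights, by expanding β(B)(I − π_1 B − π_2 B² − …) = φ(B) term by term. A sketch for the same illustrative ARMA(1,1) (φ_1 = 0.5, β_1 = 0.3), where the weights should satisfy π_j = (φ_1 + β_1)(−β_1)^{j−1}:

```python
def pi_weights(phi, beta, n):
    """Coefficients [1, -pi_1, -pi_2, ...] of [beta(B)]^{-1} phi(B).

    phi  -- [phi_1, ..., phi_p]
    beta -- [1.0, beta_1, ..., beta_q]  (beta_0 = 1)
    """
    c = []
    for j in range(n):
        # Coefficient of B^j in phi(B): 1 at j=0, -phi_j for 1 <= j <= p, else 0
        a = 1.0 if j == 0 else (-phi[j - 1] if j <= len(phi) else 0.0)
        s = sum(beta[i] * c[j - i] for i in range(1, min(j, len(beta) - 1) + 1))
        c.append(a - s)
    return c

# Illustrative ARMA(1,1): phi_1 = 0.5, beta_1 = 0.3
c = pi_weights([0.5], [1.0, 0.3], 4)
pi = [-cj for cj in c[1:]]          # pi_1, pi_2, pi_3
```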