System Identification and Data Analysis Homework # 2

Consider the stochastic process y(t) generated as in Figure 1
[Figure 1: feedback block diagram with input r and output y; the forward path contains the blocks C and G, and K sits in the (negative) feedback path.]
where
G(z) = (z − 3/4) / (z − 1/2),   H(z) = 1/z,
K is a real constant and u(t) is a Gaussian white process with zero mean and variance
equal to 8.
1. Find the values of K for which y(t) is asymptotically stationary.
2. Find the spectrum Φy (z) of y(t) and write the equation of y(t) in the form of a MA,
AR or ARMA process.
3. Set K = 1 and simulate a realization of process y(t), for t = 1, . . . , N , with N = 100.
4. By using the realization obtained at point 3, compute the 2-step-ahead FIR predictor
of order 1
ŷ(t + 2) = g0∗ y(t) + g1∗ y(t − 1)
and the corresponding sample mean square error.
5. By using the realization obtained at point 3, compute the 2-step-ahead LMSE
prediction ŷ(t + 2|t) and the corresponding sample mean square error.
6. Compare the results obtained in the last two points. Find a meaningful way to
evaluate the true mean square error of the two predictors.
Solution of homework # 2
1. The transfer function from u(t) to y(t) (denoted W(z) here, to avoid confusion with the block G(z)) is equal to

W(z) = K G(z) / (1 − K G(z) H(z)) = K z (z − 3/4) / (z² − (K + 1/2) z + (3/4) K).
The poles of the system are

p₁,₂ = [ (K + 1/2) ± √(K² − 2K + 1/4) ] / 2

and by imposing |p₁,₂| < 1 one gets: −6/7 < K < 4/3.
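The stability interval can be cross-checked numerically. The sketch below is in Python (the original accompanying code is MATLAB; the function name and the test values of K are illustrative): it computes the roots of the closed-loop denominator z² − (K + 1/2)z + (3/4)K and checks whether they lie inside the unit circle.

```python
import numpy as np

def poles_stable(K):
    # Roots of the closed-loop denominator z^2 - (K + 1/2) z + (3/4) K
    roots = np.roots([1.0, -(K + 0.5), 0.75 * K])
    return np.max(np.abs(roots)) < 1.0

# Values of K inside (-6/7, 4/3) should give stable poles ...
for K in (-0.8, 0.0, 1.0, 1.3):
    assert poles_stable(K)

# ... and values just outside the interval should not.
for K in (-6/7 - 0.01, 4/3 + 0.01):
    assert not poles_stable(K)
```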
2. The spectrum of y(t) is given by

Φy(z) = W(z) W(z⁻¹) σu²
      = 8 · [K z (z − 3/4)] / [z² − (K + 1/2) z + (3/4) K] · [K z⁻¹ (z⁻¹ − 3/4)] / [z⁻² − (K + 1/2) z⁻¹ + (3/4) K]
      = 8 K² · [(1 − (3/4) z⁻¹)(1 − (3/4) z)] / [(1 − (K + 1/2) z⁻¹ + (3/4) K z⁻²)(1 − (K + 1/2) z + (3/4) K z²)],

where W(z) denotes the transfer function from u(t) to y(t) computed at point 1.
Therefore, y(t) is an ARMA(2,1) process, whose equation can be written as

y(t) − (K + 1/2) y(t − 1) + (3/4) K y(t − 2) = w(t) − (3/4) w(t − 1),

where w(t) is a white process with zero mean and variance equal to 8K².
3.-5. See file homework2 solution.m
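For reference, the simulation of point 3 can be sketched as follows in Python (the original solution file is MATLAB; the seed and variable names here are illustrative, and zero initial conditions are assumed, which is harmless since the process is asymptotically stationary for K = 1):

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed, for reproducibility only
N = 100

# For K = 1 the ARMA(2,1) equation is
#   y(t) = (3/2) y(t-1) - (3/4) y(t-2) + w(t) - (3/4) w(t-1),
# with w(t) white, zero mean, variance 8 K^2 = 8.
w = rng.normal(0.0, np.sqrt(8.0), N)

y = np.zeros(N)
y[0] = w[0]                                  # zero initial conditions
y[1] = 1.5 * y[0] + w[1] - 0.75 * w[0]
for t in range(2, N):
    y[t] = 1.5 * y[t - 1] - 0.75 * y[t - 2] + w[t] - 0.75 * w[t - 1]
```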
6. From the theory, one has that the mean square error for the FIR predictor of order m is given by

MSE_FIRm = Ry(0) − Σᵢ₌₀..m gᵢ∗ Ry(l + i)

where l is the prediction horizon. Here, l = 2, m = 1 and MSE_FIR1 = 14.7781.
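Point 4 can be sketched in Python as follows (a hedged illustration, not the original MATLAB code: the seed is arbitrary, so the numerical result will differ from the 14.7781 obtained with the realization of the solution file). The FIR coefficients solve the normal equations built from sample covariances, and the sample MSE follows the formula above with l = 2, m = 1:

```python
import numpy as np

# --- generate a realization for K = 1, as in point 3 (illustrative seed) ---
rng = np.random.default_rng(0)
N = 100
w = rng.normal(0.0, np.sqrt(8.0), N)
y = np.zeros(N)
y[0] = w[0]
y[1] = 1.5 * y[0] + w[1] - 0.75 * w[0]
for t in range(2, N):
    y[t] = 1.5 * y[t - 1] - 0.75 * y[t - 2] + w[t] - 0.75 * w[t - 1]

# --- sample covariance (biased estimator, divisor N) ---
def rhat(y, k):
    return float(np.sum(y[k:] * y[:len(y) - k]) / len(y))

# Normal equations for yhat(t+2) = g0 y(t) + g1 y(t-1):
#   [Ry(0) Ry(1); Ry(1) Ry(0)] [g0; g1] = [Ry(2); Ry(3)]
R = np.array([[rhat(y, 0), rhat(y, 1)],
              [rhat(y, 1), rhat(y, 0)]])
b = np.array([rhat(y, 2), rhat(y, 3)])
g = np.linalg.solve(R, b)                     # [g0*, g1*]

# Sample MSE:  Ry(0) - g0* Ry(2) - g1* Ry(3)
mse_fir = rhat(y, 0) - g[0] * rhat(y, 2) - g[1] * rhat(y, 3)
```

With the biased covariance estimator the resulting MSE is guaranteed non-negative, which is why the divisor N (rather than N − k) is used.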
For the Wiener filter, set K = 1, so that y(t) = W(z) w(t) with

W(z) = (1 − (3/4) z⁻¹) / (1 − (3/2) z⁻¹ + (3/4) z⁻²),

and apply the long division of the numerator by the denominator:

1 − (3/4) z⁻¹ = (1 − (3/2) z⁻¹ + (3/4) z⁻²) (1 + (3/4) z⁻¹) + (3/8) z⁻² − (9/16) z⁻³,

that is,

W(z) = 1 + (3/4) z⁻¹ + z⁻² · [(3/8) − (9/16) z⁻¹] / (1 − (3/2) z⁻¹ + (3/4) z⁻²),
which yields the Wiener predictor
ŷ(t + 2|t) = [(3/8) − (9/16) z⁻¹] / (1 − (3/4) z⁻¹) · y(t),
whose MSE is given by

MSE_Wiener = σw² (1 + q₁²) = 8 (1 + 9/16) = 12.5,

where σw² = 8 is the variance of w(t) for K = 1 and q₁ = 3/4 is the coefficient of z⁻¹ in the quotient of the long division.
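In recursive form the predictor reads ŷ(t + 2|t) = (3/4) ŷ(t + 1|t − 1) + (3/8) y(t) − (9/16) y(t − 1). A Python sketch with zero initial conditions (an illustration, not the original MATLAB implementation):

```python
import numpy as np

def wiener_2step(y):
    """Apply yhat = [(3/8 - (9/16) z^-1) / (1 - (3/4) z^-1)] y with zero
    initial conditions; yhat[t] is the prediction of y[t + 2] made at time t."""
    yhat = np.zeros(len(y))
    for t in range(len(y)):
        prev_hat = yhat[t - 1] if t >= 1 else 0.0
        prev_y = y[t - 1] if t >= 1 else 0.0
        yhat[t] = 0.75 * prev_hat + 0.375 * y[t] - 0.5625 * prev_y
    return yhat

# Impulse response of the predictor filter: 3/8, -9/32, -27/128, -81/512, ...
h = wiener_2step(np.array([1.0, 0.0, 0.0, 0.0]))
```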
A possible alternative to the theoretical approach is to repeat steps 3.-5. for a large number of realizations (say, 1000) and then average the sample MSE values obtained by the two predictors. The resulting averages approach the theoretical MSE values as the number of realizations increases.
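This averaging idea can be sketched as follows (Python, shown for the Wiener predictor only; the number of realizations, the seed, and the burn-in handling are illustrative choices). The average of the per-realization sample MSEs should land near the theoretical value 12.5:

```python
import numpy as np

rng = np.random.default_rng(1)   # arbitrary seed
N, n_runs = 100, 1000

mse_runs = []
for _ in range(n_runs):
    # realization of the ARMA(2,1) process for K = 1
    w = rng.normal(0.0, np.sqrt(8.0), N)
    y = np.zeros(N)
    for t in range(N):
        y[t] = (w[t]
                + (1.5 * y[t - 1] - 0.75 * w[t - 1] if t >= 1 else 0.0)
                - (0.75 * y[t - 2] if t >= 2 else 0.0))
    # 2-step Wiener predictor: yhat[t] predicts y[t + 2]
    yhat = np.zeros(N)
    for t in range(N):
        yhat[t] = ((0.75 * yhat[t - 1] if t >= 1 else 0.0)
                   + 0.375 * y[t]
                   - (0.5625 * y[t - 1] if t >= 1 else 0.0))
    e = y[2:] - yhat[:-2]
    mse_runs.append(np.mean(e ** 2))

avg_mse = np.mean(mse_runs)   # should approach the theoretical 12.5
```

The zero-initial-condition transient biases the estimate slightly upward on the first few samples of each run, but the average still settles close to 12.5 as the number of realizations grows.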