Overview of Linear Prediction

• Introduction
• Nonstationary case
• Stationary case
• Forward linear prediction
• Backward linear prediction
• Stationary processes
• Exchange matrices
• Examples
• Properties

Introduction

• Terms and definitions
• Important for more applications than just prediction
• Prominent role in spectral estimation, Kalman filtering, fast algorithms, etc.
• Prediction is equivalent to whitening! (more later)
• Clearly many practical applications as well
Nonstationary Problem Definition

Given a segment of a signal {x(n), x(n − 1), . . . , x(n − M)} of a stochastic process, estimate x(n − i) for 0 ≤ i ≤ M using the remaining portion of the signal:

    \hat{x}(n-i) = -\sum_{k=0,\, k\neq i}^{M} c_k^*(n)\, x(n-k)

The corresponding estimation error is

    e^{(i)}(n) \triangleq x(n-i) - \hat{x}(n-i) = \sum_{k=0}^{M} c_k^*(n)\, x(n-k), \qquad c_i(n) \triangleq 1

Change in Notation

• Note that this is inconsistent with the notation used earlier,

    \hat{y}_o(n) = \sum_{k=0}^{M-1} h_o(k)\, x(n-k) = \sum_{k=0}^{M-1} c_{k+1}^*\, x(n-k)

• Also the sums have M + 1 terms in them, rather than M terms as before
• Presumably motivated by a simple expression for the error e^{(i)}(n), where c_i(n) ≜ 1
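The following MATLAB fragment is not part of the original slides; it is a minimal sketch of this estimator for a stationary process whose autocorrelation sequence is assumed known (an AR(1)-style sequence rx = 0.6.^(0:M) is used only to make it runnable). The indexing mirrors the Example 1 code later in these notes.

% Minimal sketch (not from the slides): MMSE estimate of x(n-i) from the
% remaining samples, assuming a stationary process with known autocorrelation.
M    = 25;
i    = 13;                          % index of the sample being estimated (0 <= i <= M)
rx   = 0.6.^(0:M);                  % assumed autocorrelation r_x(0..M) (AR(1)-like)
Re   = toeplitz(rx);                % extended (M+1)x(M+1) correlation matrix
keep = [1:i, i+2:M+1];              % drop the row/column of x(n-i)
R    = Re(keep,keep);               % correlation matrix of the remaining samples
d    = Re(keep,i+1);                % cross-correlation with x(n-i)
co   = -R\d;                        % optimum coefficients c_k(n), k ~= i
Po   = Re(i+1,i+1) + d'*co;         % minimum MSE for this value of i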
Types of “Prediction”

    \hat{x}(n-i) = -\sum_{k=0,\, k\neq i}^{M} c_k^*(n)\, x(n-k)
    \qquad
    e^{(i)}(n) = \sum_{k=0}^{M} c_k^*(n)\, x(n-k)

• Forward Linear Prediction: i = 0
• Backward Linear Prediction: i = M
  – Misnomer, but terminology is rooted in the literature
• Symmetric Linear Smoother: i = M/2

Linear “Prediction” Notation and Partitions

    \mathbf{x}(n) \triangleq [\, x(n)\;\; x(n-1)\;\; \ldots\;\; x(n-M+1) \,]^T
    \bar{\mathbf{x}}(n) \triangleq [\, x(n)\;\; x(n-1)\;\; \ldots\;\; x(n-M) \,]^T
                        = [\, x(n)\;\; \mathbf{x}^T(n-1) \,]^T
                        = [\, \mathbf{x}^T(n)\;\; x(n-M) \,]^T

    \mathbf{R}(n) = E[\mathbf{x}(n)\mathbf{x}^H(n)]
    \qquad
    \bar{\mathbf{R}}(n) = E[\bar{\mathbf{x}}(n)\bar{\mathbf{x}}^H(n)]

Forward and backward linear prediction use specific partitions of the “extended” autocorrelation matrix:

    \bar{\mathbf{R}}(n) = \begin{bmatrix} P_x(n) & \mathbf{r}_f^H(n) \\ \mathbf{r}_f(n) & \mathbf{R}(n-1) \end{bmatrix}
                        = \begin{bmatrix} \mathbf{R}(n) & \mathbf{r}_b(n) \\ \mathbf{r}_b^H(n) & P_x(n-M) \end{bmatrix}

where

    \mathbf{r}_f(n) = E[\mathbf{x}(n-1)\,x^*(n)]
    \qquad
    \mathbf{r}_b(n) = E[\mathbf{x}(n)\,x^*(n-M)]
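Not part of the original slides: a small MATLAB sketch of these two partitions, using an assumed stationary autocorrelation so that R̄ is simply a Toeplitz matrix; the point is only the indexing of the blocks.

% Minimal sketch (not from the slides): the two partitions of the extended
% correlation matrix Rbar = E[xbar(n) xbar(n)^H].
M    = 4;
rx   = 0.6.^(0:M);                  % assumed autocorrelation r_x(0..M)
Rbar = toeplitz(rx);                % extended (M+1)x(M+1) matrix

% Partition 1: Rbar = [Px(n) rf^H(n); rf(n) R(n-1)]
Px  = Rbar(1,1);
rf  = Rbar(2:end,1);                % rf(n) = E[x(n-1) x*(n)]
R1  = Rbar(2:end,2:end);            % R(n-1)

% Partition 2: Rbar = [R(n) rb(n); rb^H(n) Px(n-M)]
R0  = Rbar(1:end-1,1:end-1);        % R(n)
rb  = Rbar(1:end-1,end);            % rb(n) = E[x(n) x*(n-M)]
PxM = Rbar(end,end);                % Px(n-M)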
Forward and Backward Linear Prediction Estimator and Error

    \hat{x}_f(n) = -\sum_{k=1}^{M} a_k^*(n)\, x(n-k) = -\mathbf{a}^H(n)\,\mathbf{x}(n-1)
    \qquad
    \hat{x}_b(n) = -\sum_{k=0}^{M-1} b_k^*(n)\, x(n-k) = -\mathbf{b}^H(n)\,\mathbf{x}(n)

    e_f = x(n) + \sum_{k=1}^{M} a_k^*(n)\, x(n-k) = x(n) + \mathbf{a}^H(n)\,\mathbf{x}(n-1)
    \qquad
    e_b = \sum_{k=0}^{M-1} b_k^*(n)\, x(n-k) + x(n-M) = \mathbf{b}^H(n)\,\mathbf{x}(n) + x(n-M)

• Again, new notation compared to the FIR linear estimation case
• I use subscripts for the f instead of superscripts like the text

Forward and Backward Linear Prediction Solution

The solution is the same as before, but watch the minus signs:

    \mathbf{R}(n-1)\,\mathbf{a}_o(n) = -\mathbf{r}_f(n) \qquad P_{f,o}(n) = P_x(n) + \mathbf{r}_f^H(n)\,\mathbf{a}_o(n)
    \mathbf{R}(n)\,\mathbf{b}_o(n) = -\mathbf{r}_b(n) \qquad\;\; P_{b,o}(n) = P_x(n-M) + \mathbf{r}_b^H(n)\,\mathbf{b}_o(n)
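Not from the slides: a minimal MATLAB sketch that solves the two sets of normal equations above for a stationary example (an assumed autocorrelation sequence rx) and evaluates the optimal prediction error powers.

% Minimal sketch (not from the slides): forward/backward normal equations.
M    = 4;
rx   = 0.6.^(0:M);                  % assumed autocorrelation r_x(0..M)
Rbar = toeplitz(rx);
Px   = Rbar(1,1);
rf   = Rbar(2:end,1);     R1 = Rbar(2:end,2:end);     % forward partition
rb   = Rbar(1:end-1,end); R0 = Rbar(1:end-1,1:end-1); % backward partition

ao  = -R1\rf;                       % R(n-1) a_o(n) = -rf(n)
bo  = -R0\rb;                       % R(n)   b_o(n) = -rb(n)
Pfo = Px + rf'*ao;                  % P_f,o(n) = Px(n)   + rf^H(n) a_o(n)
Pbo = Px + rb'*bo;                  % P_b,o(n) = Px(n-M) + rb^H(n) b_o(n)
                                    % (Px(n-M) = rx(0) here because the example is stationary)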
Stationary Case: Autocorrelation Matrix

When the process is stationary, something surprising happens!

    \bar{\mathbf{R}}_{(M+1)\times(M+1)} =
    \begin{bmatrix}
      r_x(0)     & r_x(1)     & r_x(2) & \cdots & r_x(M)   \\
      r_x^*(1)   & r_x(0)     & r_x(1) & \cdots & r_x(M-1) \\
      \vdots     & \vdots     & \ddots & \ddots & \vdots   \\
      r_x^*(M-1) & r_x^*(M-2) & \cdots & \cdots & r_x(1)   \\
      r_x^*(M)   & r_x^*(M-1) & \cdots & \cdots & r_x(0)
    \end{bmatrix}
    = \begin{bmatrix} r_x(0) & \mathbf{r}_f^H \\ \mathbf{r}_f & \mathbf{R} \end{bmatrix}
    = \begin{bmatrix} \mathbf{R} & \mathbf{r}_b \\ \mathbf{r}_b^H & r_x(0) \end{bmatrix}

Stationary Case: Cross-correlation Vector

Define the M × 1 vector

    \mathbf{r} \triangleq [\, r_x(1)\;\; r_x(2)\;\; \ldots\;\; r_x(M) \,]^T

Clearly, P_x(n) = P_x(n − M) = r_x(0) and R(n) = R(n − 1), so the partitions

    \bar{\mathbf{R}}(n) = \begin{bmatrix} P_x(n) & \mathbf{r}_f^H(n) \\ \mathbf{r}_f(n) & \mathbf{R}(n-1) \end{bmatrix}
                        = \begin{bmatrix} \mathbf{R}(n) & \mathbf{r}_b(n) \\ \mathbf{r}_b^H(n) & P_x(n-M) \end{bmatrix}

become

    \bar{\mathbf{R}} = \begin{bmatrix} r_x(0) & \mathbf{r}^T \\ \mathbf{r}^* & \mathbf{R} \end{bmatrix}
                     = \begin{bmatrix} \mathbf{R} & \mathbf{J}\mathbf{r} \\ \mathbf{r}^H\mathbf{J} & r_x(0) \end{bmatrix}

with

    \mathbf{r}_f = E[\mathbf{x}(n-1)\,x^*(n)] = \mathbf{r}^*
    \qquad
    \mathbf{r}_b = E[\mathbf{x}(n)\,x^*(n-M)] = \mathbf{J}\mathbf{r}

where J is the exchange matrix.
Exchange Matrix

    \mathbf{J} \triangleq
    \begin{bmatrix}
      0      & \cdots & 0 & 1 \\
      \vdots &        & 1 & 0 \\
      0      & 1      &   & \vdots \\
      1      & 0      & \cdots & 0
    \end{bmatrix}
    \qquad \mathbf{J}^H\mathbf{J} = \mathbf{J}\mathbf{J}^H = \mathbf{I}

• Counterpart to the identity matrix
• When multiplied on the left, flips a vector upside down
• When multiplied on the right, flips a vector sideways
• Don’t do this in MATLAB: many wasted multiplications by zeros
• See fliplr and flipud

Forward/Backward Prediction Relationship

Starting from the two sets of normal equations,

    \mathbf{R}\,\mathbf{a}_o = -\mathbf{r}_f = -\mathbf{r}^*
    \qquad
    \mathbf{R}\,\mathbf{b}_o = -\mathbf{r}_b = -\mathbf{J}\mathbf{r}

conjugating the first equation and multiplying by J on the left gives

    \mathbf{J}\mathbf{R}^*\mathbf{a}_o^* = -\mathbf{J}\mathbf{r} = -\mathbf{r}_b

and since \mathbf{J}\mathbf{R}^* = \mathbf{R}\mathbf{J},

    \mathbf{R}(\mathbf{J}\mathbf{a}_o^*) = -\mathbf{r}_b
    \quad\Rightarrow\quad
    \mathbf{b}_o = \mathbf{J}\mathbf{a}_o^*

• The BLP parameter vector is the flipped and conjugated FLP parameter vector!
• Useful for estimation: can solve for both and combine them to reduce variance
• Further, the prediction errors are the same!

    P_{f,o} = P_{b,o} = r_x(0) + \mathbf{r}^H\mathbf{a}_o = r_x(0) + \mathbf{r}^H\mathbf{J}\mathbf{b}_o
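Not in the original slides: a MATLAB sketch that checks b_o = J a_o* numerically for a stationary example, using flipud instead of an explicit multiplication by J (as the bullet above recommends), and confirms that the two error powers agree. The autocorrelation sequence rx is an assumption made only so the sketch runs.

% Minimal sketch (not from the slides): forward/backward relationship check.
M   = 4;
rx  = 0.6.^(0:M).';                 % assumed autocorrelation r_x(0..M)
R   = toeplitz(rx(1:M));            % R = R(n) = R(n-1) (Toeplitz in the stationary case)
r   = rx(2:end);                    % r = [r_x(1) ... r_x(M)]^T
rf  = conj(r);                      % rf = r^*
rb  = flipud(r);                    % rb = J r (use flipud, not an explicit J)

ao  = -R\rf;                        % R a_o = -rf
bo  = -R\rb;                        % R b_o = -rb
norm(bo - flipud(conj(ao)))         % ~0: b_o = J a_o^* (flipped, conjugated FLP vector)
Pfo = rx(1) + rf'*ao;               % P_f,o
Pbo = rx(1) + rb'*bo;               % P_b,o -- equals P_f,o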
Example 1: MA Process

Create a synthetic MA process in MATLAB. Plot the pole-zero diagram and transfer function of the system. Plot the MMSE versus the point being estimated,

    \hat{x}(n-i) = -\sum_{k=0,\, k\neq i}^{M} c_k^*(n)\, x(n-k),

for M = 25.

Example 1: MMSE Versus Prediction Index i

[Figure: estimated minimum NMSE and MNMSE versus prediction index i (samples), 0 ≤ i ≤ 25.]
Example 1: Prediction Example

[Figure: x(n) and x̂(n + 0) (signal + estimate, scaled) versus sample time n = 0–100; M:25, i:0, NMSE:0.388.]

Example 1: Prediction Example

[Figure: x(n) and x̂(n + 13) (signal + estimate, scaled) versus sample time n = 0–100; M:25, i:13, NMSE:0.168.]
Example 1: Prediction Example

[Figure: x(n) and x̂(n + 25) (signal + estimate, scaled) versus sample time n = 0–100; M:25, i:25, NMSE:0.388.]

Example 1: MATLAB Code

clear all;
close all;

N    = 5000;                        % Number of samples
M    = 25;                          % Size of filter
b    = poly([0.99 0.98*j -0.98*j 0.98*exp(j*0.8*pi) 0.98*exp(-j*0.8*pi)]); % Locations of zeros
nz   = length(b)-1;                 % Number of zeros
Mmax = M+1;                         % Maximum value to consider

%================================================
% Calculate the Auto- and Cross-Correlation
%================================================
rx  = conv(b,fliplr(b));            % Quick calculation of rx
k   = -nz:nz;
rx  = rx(nz+1:end);                 % Trim off negative lags
ryx = rx(2:end);                    % Cross-correlation one-step ahead

%================================================
% Generate Example
%================================================
w = randn(N,1);
x = filter(b,1,w);

%================================================
% Build extended R
%================================================
Re = zeros(M+1,M+1);
for c1=1:M+1,
  for c2=1:M+1,
    id = abs(c1-c2);
    if id<=nz,
      Re(c1,c2) = rx(id+1);
    end;
  end;
end;

Po  = zeros(M+1,1);
X   = zeros(N,M+1);
Poh = zeros(M+1,1);
for id=0:M,
  R  = [Re(1:id    ,1:id),Re(1:id    ,id+2:end);...
        Re(id+2:end,1:id),Re(id+2:end,id+2:end)];
  d  = [Re(1:id,id+1);Re(id+2:end,id+1)]; % Extract i'th column sans the ith row
  Px = Re(id+1,id+1);
  co = -inv(R)*d;
  Po(id+1) = Px + d'*co;
  X(:,id+1) = filter(-[co(1:id);0;co(id+1:end)],1,x);
  k  = 1:N-(id+1);
  Poh(id+1) = mean((x(k)-X(k+id,id+1)).^2);
end;

%================================================
% Plot Estimates
%================================================
id = [0,round(M/2),M];
for c1=1:length(id),
  k  = 1:N-(id(c1)+1);
  xh = X(k+id(c1),id(c1)+1);
  figure;
  FigureSet(2,'LTX');
  h = plot(k,x(k),'b',k,xh(k),'g');
  set(h,'LineWidth',0.8);
  set(h,'Marker','.');
  set(h,'MarkerSize',8);
  xlim([0 100]);
  ylim(std(x)*[-3 3]);
  AxisLines;
  xlabel('Sample Time (n)');
  ylabel('Signal + Estimate (scaled)');
  set(get(gca,'Title'),'Interpreter','LaTeX');
  title(sprintf('M:%d i:%d $\\widehat{\\mathrm{NMSE}}$:%5.3f',M,id(c1),mean((x(k)-xh).^2)/var(x(k))));
  box off;
  AxisSet(8);
  hl = legend(h,'$x(n)$',sprintf('$\\hat{x}(n+%d)$',id(c1)));
  set(hl,'Interpreter','LaTeX');
  print(sprintf('MAExamplePlot%02d',id(c1)),'-depsc');
end;

%================================================
% Plot MMSE Versus Order
%================================================
figure;
FigureSet(1,'LTX');
h1 = plot3(0:M,Poh/var(x),-1*ones(M+1,1),'ro');
set(h1,'MarkerFaceColor','r');
set(h1,'MarkerSize',6);
view(0,90);
hold on;
h2 = stem(0:M,Po/Px,'k');
set(h2,'MarkerFaceColor','k');
set(h2,'MarkerSize',4);
hold off;
xlabel('Prediction Index (i, Samples)');
ylabel('MNMSE');
box off;
xlim([-0.5 M+0.5]);
ylim([0 1.05]);
AxisLines;
AxisSet(8);
legend([h1;h2],'Minimum NMSE Estimated','MNMSE');
print('MAExampleMNMSEvi','-depsc');
Selected Properties

If x(n) is stationary,

• The linear smoother has linear phase
• The forward prediction error filter (PEF) is minimum phase
• The backward PEF is maximum phase
• The forward and backward prediction errors can be expressed as (a numerical check follows below)

    P_{f,o}(n) = \frac{\det \bar{\mathbf{R}}(n)}{\det \mathbf{R}(n-1)}
    \qquad
    P_{b,o}(n) = \frac{\det \bar{\mathbf{R}}(n)}{\det \mathbf{R}(n)}

Other properties and proofs are given in the book.
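Not from the slides: a brief MATLAB check of the determinant expressions above, using an assumed AR(1)-style autocorrelation; R(n) = R(n − 1) here because the example is stationary.

% Minimal sketch (not from the slides): determinant form of the error powers.
M    = 4;
rx   = 0.6.^(0:M).';                % assumed autocorrelation r_x(0..M)
Rbar = toeplitz(rx);                % extended (M+1)x(M+1) matrix
R    = Rbar(1:end-1,1:end-1);       % R(n) = R(n-1) for this stationary example
r    = rx(2:end);
ao   = -R\conj(r);                  % forward predictor: R a_o = -r^*
Pfo  = rx(1) + r.'*ao;              % P_f,o = r_x(0) + rf^H a_o
[Pfo, det(Rbar)/det(R)]             % the two values agree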
Example 2: Tremor Signal Prediction

Try predicting a tremor signal acquired from an accelerometer attached to the wrist of a patient with Parkinson’s disease.
Example 2: Data Details

The files tremor.park.gz and tremor.phys.gz contain measurements of the acceleration of the outstretched hand, which is supported and fixed at the wrist.

The units are arbitrary. The sampling frequency is 300 Hz. tremor.phys.gz shows the tremor of a healthy person, tremor.park.gz that of a person suffering from Parkinson’s disease. The real amplitude of the latter is of course much larger than that of the former.

For further information see
http://www.fdm.uni-freiburg.de/User/lauk/tremor/tremor.html
or mail to: [email protected] (Jens Timmer).

Example 2: Prediction Example

[Figure: acceleration (scaled) of the Parkinsonian tremor recording versus time (5–30 s); N:10240.]
Example 2: Prediction Example

[Figure: autocorrelation ρ versus lag (0–4.5 s).]

Example 2: Prediction Example

[Figure: partial autocorrelation ρ versus lag (0–0.45 s).]

Example 2: Prediction Example

[Figure: Blackman-Tukey estimated PSD (scaled, ×10^5) versus frequency (0–150 Hz).]

Example 2: Prediction Example

[Figure: spectrogram (frequency 0–30 Hz versus time, 5–30 s) with the signal shown below.]
Example 2: Prediction Example

[Figure: signal and one-step prediction (acceleration, scaled) versus time (0.05–0.5 s); NMSE:0.542, P:1, M:25.]

Example 2: Prediction Example

[Figure: NMSE versus filter order M (samples, 10–100) for P:1.]
Example 2: Prediction Example

[Figure: NMSE versus prediction horizon P (s, 0–1) for M:25; the dotted line marks NMSE = 1.]

Example 2: MATLAB Code

clear;
close all;

%================================================
% Load the Data
%================================================
x  = load('R:/Tremor/Parkins.dat');
x  = x - mean(x);
fs = 300;                           % Sample rate (Hz)
nx = length(x);                     % Length of data
k  = 1:nx;                          % Sample index
t  = (k-0.5)/fs;                    % Sample times

%================================================
% Plot the Data
%================================================
figure;
FigureSet(1,'LTX');
h = plot(t(1:10:end),x(1:10:end),'r');
set(h,'LineWidth',0.8);
AxisSet(8);
xlabel('Time (s)');
ylabel('Acceleration (scaled)');
title(sprintf('N:%d',nx));
xlim([t(1) t(end)]);
ylim(prctile(x,[0.5 99.5]));
box off;
print('RESignal','-depsc');

%================================================
% Plot the Autocorrelation
%================================================
Autocorrelation(x,fs,5);
FigureSet(1,'LTX');
AxisSet(8);
print('REAutocorrelation','-depsc');

%================================================
% Plot the Partial Autocorrelation
%================================================
PartialAutocorrelation(x,fs,.5);
FigureSet(1,'LTX');
AxisSet(8);
print('REPartialAutocorrelation','-depsc');

%================================================
% Plot the Power Spectral Density
%================================================
BlackmanTukey(x,fs,10);
FigureSet(1,'LTX');
AxisSet(8);
print('REBlackmanTukey','-depsc');

%================================================
% Plot the Spectrogram
%================================================
NonparametricSpectrogram(decimate(x,5),fs/5,2);
FigureSet(1,'LTX');
AxisSet(8);
print('RESpectrogram','-depsc');

%================================================
% Calculate the Auto- and Cross-Correlation
%================================================
rx = Autocorrelation(x,fs,2);
m  = 25;                            % Filter length
p  = 1;                             % Number of steps ahead to predict

R = zeros(m,m);
for c1=1:m,
  for c2=1:m,
    R(c1,c2) = rx(abs(c1-c2)+1);
  end;
end;

d = zeros(m,1);
for c1=1:m,
  d(c1) = rx(c1+p);
end;

co   = inv(R)*d;
xh   = filter(co,1,[zeros(p,1);x]);
xh   = xh(1:nx);
NMSE = mean((x-xh).^2)/mean(x.^2);

%================================================
% Plot Segment of Signal and Predicted
%================================================
figure;
FigureSet(1,'LTX');
h = plot(t,x,'r',t,xh,'g');
set(h(1),'LineWidth',0.8);
set(h(2),'LineWidth',1.2);
AxisSet(8);
xlabel('Time (s)');
ylabel('Acceleration (scaled)');
title(sprintf('NMSE:%5.3f P:%d M:%d',NMSE,p,m));
legend(h,'Signal','Predicted');
xlim([t(1) 0.5]);
ylim(prctile(x,[0.5 99.5]));
box off;
print('RESignalPredicted','-depsc');

%================================================
% Sweep M for P=1
%================================================
p    = 1;                           % Number of steps ahead to predict
M    = 1:100;
nM   = length(M);
NMSE = zeros(nM,1);
for c0=1:nM,
  m = M(c0);                        % Filter length
  R = zeros(m,m);
  for c1=1:m,
    for c2=1:m,
      R(c1,c2) = rx(abs(c1-c2)+1);
    end;
  end;
  d = zeros(m,1);
  for c1=1:m,
    d(c1) = rx(c1+p);
  end;
  co = inv(R)*d;
  xh = filter(co,1,[zeros(p,1);x]);
  xh = xh(1:nx);
  NMSE(c0) = mean((x-xh).^2)/mean(x.^2);
end;

figure;
FigureSet(1,'LTX');
h = plot(M,NMSE,'b');
set(h(1),'LineWidth',0.8);
AxisSet(8);
xlabel('M (samples)');
ylabel('NMSE');
title(sprintf('P:%d',p));
xlim([0.5 M(end)+0.5]);
ylim([0 1.05]);
box off;
print('RESweepM','-depsc');

%================================================
% Sweep P for M=25
%================================================
P    = 1:300;                       % Number of steps ahead to predict
m    = 25;                          % Filter length
nP   = length(P);
NMSE = zeros(nP,1);
R    = zeros(m,m);
for c1=1:m,
  for c2=1:m,
    R(c1,c2) = rx(abs(c1-c2)+1);
  end;
end;
for c0=1:nP,
  p = P(c0);
  d = zeros(m,1);
  for c1=1:m,
    d(c1) = rx(c1+p);
  end;
  co = inv(R)*d;
  xh = filter(co,1,[zeros(p,1);x]);
  xh = xh(1:nx);
  NMSE(c0) = mean((x-xh).^2)/mean(x.^2);
end;

figure;
FigureSet(1,'LTX');
h = plot(P/fs,NMSE,'b');
set(h,'LineWidth',0.8);
set(h,'Marker','.');
set(h,'MarkerSize',5);
hold on;
hb = plot([0 P(end)/fs],[1 1],'r:');
hold off;
AxisSet(8);
xlabel('P (s)');
ylabel('NMSE');
title(sprintf('M:%d',m));
xlim([0 P(end)/fs]);
ylim([0 1.05]);
box off;
print('RESweepP','-depsc');