Adaptive Estimation Using State-Space Recursive Least Squares
EE-869 Adaptive Filters
College of Electrical & Mechanical Engineering
National University of Sciences & Technology (NUST)
Introduction

A recursive algorithm

Built around state-space model of an unforced system

Based on least squares approach

Does not require process or observation noise statistics

Works for time-invariant and time-variant environments alike

Can handle scalar and vector observations
State-Space Model
Unforced system (block diagram: system matrix $A[k]$, unit delay $z^{-1}I$, observation matrix $C[k]$, additive observation noise $v[k]$):

$$x[k+1] = A[k]\,x[k]$$
$$y[k] = C[k]\,x[k] + v[k]$$

where

$x[k] \in \mathbb{R}^{n}$ : process states
$y[k] \in \mathbb{R}^{m}$ : output signal
$v[k] \in \mathbb{R}^{m}$ : observation noise
$A[k] \in \mathbb{R}^{n \times n}$ : system matrix (full rank)
$C[k] \in \mathbb{R}^{m \times n}$ : observation matrix (full rank)
$(A[k], C[k])$ : L-step observable
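To make the model concrete, here is a minimal simulation sketch of an unforced state-space signal with noisy observations; the particular $A$ (a rotation matrix), $C$, sample period, and noise level are assumptions chosen only for illustration, not values from the slides.

```python
import numpy as np

# Minimal sketch: generate y[k] = C x[k] + v[k] from an unforced system x[k+1] = A x[k].
# The rotation-matrix A, the C, T, omega, and the noise level are illustrative assumptions.
T, omega = 0.1, 1.0
A = np.array([[np.cos(omega * T),  np.sin(omega * T)],
              [-np.sin(omega * T), np.cos(omega * T)]])   # time-invariant system matrix (n = 2)
C = np.array([[1.0, 0.0]])                                # observation matrix (m = 1)

rng = np.random.default_rng(0)
x = np.array([0.0, 1.0])                                  # initial state (assumption)
sigma_v = 0.03                                            # observation-noise std (assumption)

y = np.empty(200)
for k in range(y.size):
    y[k] = (C @ x)[0] + sigma_v * rng.standard_normal()   # y[k] = C x[k] + v[k]
    x = A @ x                                             # x[k+1] = A x[k]  (unforced)
```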
Batch Processed Least Squares Approach

Batch of Observations

For the time-invariant case, $x[k-i] = A^{-i}x[k]$, so the most recent $p$ observations can be stacked as

$$y_p[k] = \begin{bmatrix} y[k-p+1] \\ y[k-p+2] \\ \vdots \\ y[k-2] \\ y[k-1] \\ y[k] \end{bmatrix}
= \begin{bmatrix} Cx[k-p+1] \\ Cx[k-p+2] \\ \vdots \\ Cx[k-2] \\ Cx[k-1] \\ Cx[k] \end{bmatrix} + v_p[k]
= \begin{bmatrix} CA^{-(p-1)} \\ CA^{-(p-2)} \\ \vdots \\ CA^{-2} \\ CA^{-1} \\ C \end{bmatrix} x[k] + v_p[k]$$

with the noise vector

$$v_p[k] = \begin{bmatrix} v[k-p+1] & v[k-p+2] & \cdots & v[k-1] & v[k] \end{bmatrix}^{T}$$

so that

$$y_p[k] = H_p\,x[k] + v_p[k]$$
Least Squares Solution

$$H_p = \begin{bmatrix} CA^{-(p-1)} \\ CA^{-(p-2)} \\ \vdots \\ CA^{-2} \\ CA^{-1} \\ C \end{bmatrix} \qquad \text{full rank for } p \ge L$$

Weighting matrix:

$$W = \begin{bmatrix}
\lambda^{p-1} I_m & 0 & \cdots & 0 & 0 \\
0 & \lambda^{p-2} I_m & \cdots & 0 & 0 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \cdots & \lambda I_m & 0 \\
0 & 0 & \cdots & 0 & I_m
\end{bmatrix}$$

Batch processed least squares solution:

$$\hat{x}[k] = \bigl(H_p^{T} H_p\bigr)^{-1} H_p^{T}\, y_p[k]$$

Batch processed weighted least squares solution:

$$\hat{x}[k] = \bigl(H_p^{T} W H_p\bigr)^{-1} H_p^{T} W\, y_p[k]$$
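The batch solution above can be written down directly in a few lines. The sketch below builds $H_p$ and $W$ for a time-invariant $(A, C)$ and forgetting factor $\lambda$ and solves the weighted normal equations; the function name and array conventions are mine, not from the slides.

```python
import numpy as np

def batch_wls_estimate(A, C, y_batch, lam):
    """Batch processed weighted LS estimate of x[k] (a sketch, not a library routine).

    y_batch has shape (p, m) with rows ordered oldest, y[k-p+1], to newest, y[k].
    Implements x_hat[k] = (H_p^T W H_p)^{-1} H_p^T W y_p[k].
    """
    p, m = y_batch.shape
    A_inv = np.linalg.inv(A)

    # H_p = [C A^{-(p-1)}; ...; C A^{-1}; C],  W = diag(lam^{p-1} I_m, ..., lam I_m, I_m)
    rows, weights = [], []
    for i in range(p - 1, -1, -1):                    # i = p-1 (oldest) ... 0 (newest)
        rows.append(C @ np.linalg.matrix_power(A_inv, i))
        weights.extend([lam ** i] * m)
    H = np.vstack(rows)                               # shape (p*m, n)
    W = np.diag(weights)                              # shape (p*m, p*m)
    y_p = y_batch.reshape(-1, 1)                      # stacked observation vector

    return np.linalg.solve(H.T @ W @ H, H.T @ W @ y_p).ravel()
```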
Recursive Algorithm

Predict and Correct

$$\bar{x}[k] = A\,\hat{x}[k-1] \qquad \text{Predicted states}$$
$$\bar{y}[k] = C\,\bar{x}[k] = CA\,\hat{x}[k-1] \qquad \text{Predicted signal}$$
$$\varepsilon[k] = y[k] - \bar{y}[k] \qquad \text{Prediction error}$$
$$\hat{x}[k] = \bar{x}[k] + K[k]\,\varepsilon[k] \qquad \text{Predictor-corrector form; } K[k] \text{ is the estimator gain}$$
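In code, one predict-and-correct iteration is just these four lines; the gain $K[k]$ is taken as given here because its recursive computation is derived on the following slides (a sketch with a hypothetical function name):

```python
import numpy as np

def ssrls_predict_correct(A, C, K, x_hat_prev, y_k):
    """One SSRLS predict-and-correct step; the gain K[k] is assumed precomputed."""
    x_bar = A @ x_hat_prev          # predicted states:   x_bar[k] = A x_hat[k-1]
    y_bar = C @ x_bar               # predicted signal:   y_bar[k] = C x_bar[k]
    eps = y_k - y_bar               # prediction error:   eps[k] = y[k] - y_bar[k]
    x_hat = x_bar + K @ eps         # corrected estimate: x_hat[k] = x_bar[k] + K[k] eps[k]
    return x_hat, eps
```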
Recursive Solution

Based on $k+1$ observations:

$$H[k] = \begin{bmatrix} CA^{-k} \\ CA^{-(k-1)} \\ \vdots \\ CA^{-1} \\ C \end{bmatrix} \qquad
y[k] = \begin{bmatrix} y[0] & y[1] & \cdots & y[k-1] & y[k] \end{bmatrix}^{T}$$

Weighting matrix ($k+1$ observations):

$$W[k] = \begin{bmatrix}
\lambda^{k} I_m & 0 & \cdots & 0 & 0 \\
0 & \lambda^{k-1} I_m & \cdots & 0 & 0 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \cdots & \lambda I_m & 0 \\
0 & 0 & \cdots & 0 & I_m
\end{bmatrix}$$
Recursive Solution (contd.)

Defined variables:

$$\Phi[k] = H^{T}[k]\,W[k]\,H[k]$$
$$\psi[k] = H^{T}[k]\,W[k]\,y[k]$$

Direct form of SSRLS:

$$\Phi[k]\,\hat{x}[k] = \psi[k] \qquad\Longrightarrow\qquad \hat{x}[k] = \Phi^{-1}[k]\,\psi[k]$$
Recursive Update of $\Phi[k]$

$$\Phi[k] = H^{T}[k]\,W[k]\,H[k]
= \lambda^{k} A^{-kT}C^{T}CA^{-k} + \lambda^{k-1} A^{-(k-1)T}C^{T}CA^{-(k-1)} + \cdots + \lambda A^{-T}C^{T}CA^{-1} + C^{T}C$$

$$= \lambda A^{-T}\Bigl(\lambda^{k-1} A^{-(k-1)T}C^{T}CA^{-(k-1)} + \cdots + \lambda A^{-T}C^{T}CA^{-1} + C^{T}C\Bigr)A^{-1} + C^{T}C$$

$$\Phi[k] = \lambda\,A^{-T}\,\Phi[k-1]\,A^{-1} + C^{T}C \qquad \text{Difference Lyapunov Equation}$$
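A quick numerical sanity check of this recursion (the model, forgetting factor, and horizon below are assumptions): the difference Lyapunov equation reproduces $\Phi[k] = H^{T}[k]W[k]H[k]$ computed directly from its definition.

```python
import numpy as np

lam = 0.95                                     # forgetting factor (assumption)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])                     # constant-velocity model, T = 0.1 (assumption)
C = np.array([[1.0, 0.0]])
A_inv = np.linalg.inv(A)
k_steps = 30

# Direct form: Phi[k] = sum_{i=0}^{k} lam^i A^{-iT} C^T C A^{-i}
Phi_direct = sum(lam**i * np.linalg.matrix_power(A_inv, i).T @ (C.T @ C)
                 @ np.linalg.matrix_power(A_inv, i) for i in range(k_steps + 1))

# Difference Lyapunov equation, started from Phi[0] = C^T C
Phi = C.T @ C
for _ in range(k_steps):
    Phi = lam * A_inv.T @ Phi @ A_inv + C.T @ C

print(np.allclose(Phi, Phi_direct))            # expected: True
```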
Matrix Inversion Lemma

$$E_{n\times n} = F^{-1}_{\,n\times n} + G^{T}_{\,n\times m}\,D^{-1}_{\,m\times m}\,G_{m\times n}$$
$$E^{-1} = F - FG^{T}\bigl(D + GFG^{T}\bigr)^{-1}GF$$
1
Recursive Update of  [k ]
E  AT [k ] A
Define
F  [k  1]
G  CA
DI
1
 A [k ] A   1 1[k  1]   2 1[k  1] AT C T 
T
1
 I   CA [k  1] A C  CA 1[k  1]
1
1
T
T
 1[k ]   1 A 1[k  1] AT   2 A 1[k  1] AT C T 
1
 I   CA [k  1] A C  CA [k  1] A
1
Recursive Algorithm
1
T
T
1
T
Riccati Equation
for SSRLS
14
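The Riccati form is what one would actually implement, since it propagates $P[k] = \Phi^{-1}[k]$ while only requiring the solution of an $m \times m$ system. A minimal sketch (the example model, $\lambda$, and regularized start are assumptions), checked against inverting the Lyapunov recursion:

```python
import numpy as np

def riccati_update(P_prev, A, C, lam):
    """One Riccati update of P[k] = Phi^{-1}[k] (sketch of the equation above)."""
    APA = A @ P_prev @ A.T / lam                     # lam^{-1} A P[k-1] A^T
    S = np.eye(C.shape[0]) + C @ APA @ C.T           # I + lam^{-1} C A P[k-1] A^T C^T
    return APA - APA @ C.T @ np.linalg.solve(S, C @ APA)

lam = 0.95                                           # assumption
A = np.array([[1.0, 0.1], [0.0, 1.0]])               # constant-velocity model (assumption)
C = np.array([[1.0, 0.0]])
A_inv = np.linalg.inv(A)

Phi = 0.02 * np.eye(2) + C.T @ C                     # regularized initialization, delta = 0.02
P = np.linalg.inv(Phi)
for _ in range(50):
    Phi = lam * A_inv.T @ Phi @ A_inv + C.T @ C      # Lyapunov recursion for Phi[k]
    P = riccati_update(P, A, C, lam)                 # Riccati recursion for Phi^{-1}[k]

print(np.allclose(P, np.linalg.inv(Phi)))            # expected: True
```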
Recursive Update of $\psi[k]$

$$\psi[k] = H^{T}[k]\,W[k]\,y[k]
= \lambda^{k} A^{-kT}C^{T}y[0] + \lambda^{k-1} A^{-(k-1)T}C^{T}y[1] + \cdots + \lambda A^{-T}C^{T}y[k-1] + C^{T}y[k]$$

$$= \lambda A^{-T}\Bigl(\lambda^{k-1} A^{-(k-1)T}C^{T}y[0] + \cdots + \lambda A^{-T}C^{T}y[k-2] + C^{T}y[k-1]\Bigr) + C^{T}y[k]$$

$$\psi[k] = \lambda\,A^{-T}\,\psi[k-1] + C^{T}y[k] \qquad \text{Recursive solution}$$
Observer Gain $K[k]$

The estimator gain is defined as

$$K[k] = \Phi^{-1}[k]\,C^{T}$$
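Combining the predictor-corrector form, the Riccati update of $\Phi^{-1}[k]$, and the gain $K[k] = \Phi^{-1}[k]C^{T}$ gives the complete per-sample SSRLS iteration. The sketch below is one way to arrange it; the function name and argument order are mine, not from the slides.

```python
import numpy as np

def ssrls_step(x_hat_prev, P_prev, y_k, A, C, lam):
    """One full SSRLS iteration. P is the inverse information matrix Phi^{-1}[k]."""
    # Riccati update of P[k] = Phi^{-1}[k]
    APA = A @ P_prev @ A.T / lam
    S = np.eye(C.shape[0]) + C @ APA @ C.T
    P = APA - APA @ C.T @ np.linalg.solve(S, C @ APA)

    # Gain, prediction, and correction
    K = P @ C.T                        # K[k] = Phi^{-1}[k] C^T
    x_bar = A @ x_hat_prev             # predicted states
    eps = y_k - C @ x_bar              # prediction error
    x_hat = x_bar + K @ eps            # corrected estimate
    return x_hat, P
```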
State-Space Representation of SSRLS

Define $w[k] = \psi[k-1]$. Therefore, from $\psi[k] = \lambda A^{-T}\psi[k-1] + C^{T}y[k]$,

$$w[k+1] = \lambda\,A^{-T}w[k] + C^{T}y[k]$$

Similarly, from $\hat{x}[k] = \Phi^{-1}[k]\,\psi[k]$,

$$\hat{x}[k] = \lambda\,\Phi^{-1}[k]A^{-T}w[k] + K[k]\,y[k]$$

so the recursive solution has the state-space matrices

$$\bigl(\lambda A^{-T},\; C^{T},\; \lambda\,\Phi^{-1}[k]A^{-T},\; K[k]\bigr)$$
Initializing SSRLS

$\Phi[0], \Phi[1], \ldots, \Phi[L-2]$ are rank deficient.

1) Initializing using a regularization term:
$$\Phi[0] = \delta I \quad \text{or} \quad \Phi[0] = \delta I + C^{T}C, \qquad \delta > 0, \qquad \bar{x}[0] = 0$$

2) Initialization using the batch processing approach leads to a delayed recursion, but offers better initialization.
Steady-State SSRLS

Steady-State Solution of SSRLS

$$\Phi = \lambda\,A^{-T}\,\Phi\,A^{-1} + C^{T}C$$

Can be written as

$$\lim_{k\to\infty}\Phi[k] = \Phi = \sum_{i=0}^{\infty}\lambda^{i}\bigl(A^{-i}\bigr)^{T}C^{T}C\,A^{-i}$$

which converges if $\lambda$ is smaller than the squared magnitude of the smallest eigenvalue of $A$, i.e. $\lambda < 1$ for neutrally stable systems.
Direct Form of Steady-State SSRLS

$$\hat{x}[k] = \Phi^{-1}[k]\,\psi[k] \quad\longrightarrow\quad \hat{x}[k] = \Phi^{-1}\,\psi[k]$$
Observer Gain for Steady-State SSRLS

$$K = \Phi^{-1}\,C^{T}$$
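Numerically, the steady-state $\Phi$ (and hence $K$) can be obtained simply by iterating the difference Lyapunov equation to convergence. The model, $\lambda$, and iteration count below are assumptions for illustration.

```python
import numpy as np

def steady_state_phi(A, C, lam, iters=2000):
    """Approximate steady-state Phi by iterating Phi <- lam A^{-T} Phi A^{-1} + C^T C."""
    A_inv = np.linalg.inv(A)
    Phi = C.T @ C
    for _ in range(iters):
        Phi = lam * A_inv.T @ Phi @ A_inv + C.T @ C
    return Phi

# Example: neutrally stable constant-velocity model with lam < 1 (assumed values)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Phi_ss = steady_state_phi(A, C, lam=0.95)
K_ss = np.linalg.inv(Phi_ss) @ C.T          # steady-state gain K = Phi^{-1} C^T
print(K_ss.ravel())
```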
Transfer Function Representation

$$H(z) = \lambda\,\Phi^{-1}A^{-T}\bigl(zI - \lambda A^{-T}\bigr)^{-1}C^{T} + K
= \Phi^{-1}\Bigl[\lambda A^{-T}\bigl(zI - \lambda A^{-T}\bigr)^{-1} + I\Bigr]C^{T}$$
Initialization of Steady-State SSRLS

Initialize only $\bar{x}[0]$. The choice $\bar{x}[0] = 0$ is preferable if no other estimate is available.
Memory Length

Filter memory (asymptotic result):

$$M \approx \frac{1}{1-\lambda}$$
Special Cases

Constant Model

System matrices:

$$A = 1, \qquad C = 1$$

Transfer function:

$$H(z) = \frac{(1-\lambda)\,z}{z-\lambda}$$
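For the constant model the steady-state quantities can be checked by hand: $\Phi = \sum_i \lambda^{i} = 1/(1-\lambda)$, so $K = 1-\lambda$ and the filter reduces to the exponentially weighted smoother $\hat{x}[k] = \lambda\hat{x}[k-1] + (1-\lambda)y[k]$, whose transfer function is the $H(z)$ above. A tiny numerical confirmation (the value of $\lambda$ is an assumption):

```python
import numpy as np

lam = 0.9                                        # assumed forgetting factor
Phi = sum(lam**i for i in range(10_000))         # ~ 1 / (1 - lam)
K = 1.0 / Phi                                    # K = Phi^{-1} C^T with A = C = 1
print(np.isclose(K, 1 - lam, atol=1e-4))         # expected: True

# H(z) = (1 - lam) z / (z - lam) has unit DC gain, so a constant is tracked without bias
print(np.isclose((1 - lam) * 1.0 / (1.0 - lam), 1.0))   # expected: True
```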
Constant Velocity Model

System matrices:

$$A = \begin{bmatrix} 1 & T \\ 0 & 1 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 \end{bmatrix}$$

Transfer function:

$$H(z) = \begin{bmatrix}
\dfrac{(1-\lambda)\,z\,\bigl((1+\lambda)z - 2\lambda\bigr)}{(z-\lambda)^{2}} \\[2ex]
\dfrac{(1-\lambda)^{2}\,z\,(z-1)}{T\,(z-\lambda)^{2}}
\end{bmatrix}$$
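A numerical cross-check of this special case (the values of $\lambda$ and $T$ are assumptions): computing the steady-state gain from the series for $\Phi$ gives $K = [\,1-\lambda^{2},\ (1-\lambda)^{2}/T\,]^{T}$, which is the $z\to\infty$ limit of the transfer function above and coincides with the gains of a classical fading-memory alpha-beta tracker; that correspondence is my observation, not a statement from the slides.

```python
import numpy as np

lam, T = 0.95, 0.1                               # assumed example values
A = np.array([[1.0, T], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
A_inv = np.linalg.inv(A)

# Steady-state Phi = sum_i lam^i A^{-iT} C^T C A^{-i}, truncated numerically
Phi = sum(lam**i * np.linalg.matrix_power(A_inv, i).T @ (C.T @ C)
          @ np.linalg.matrix_power(A_inv, i) for i in range(3000))
K = np.linalg.inv(Phi) @ C.T

print(np.allclose(K.ravel(), [1 - lam**2, (1 - lam)**2 / T]))   # expected: True
```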
Constant Acceleration Model

System matrices:

$$A = \begin{bmatrix} 1 & T & T^{2}/2 \\ 0 & 1 & T \\ 0 & 0 & 1 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix}$$

Transfer function:

$$H(z) = \begin{bmatrix}
\dfrac{z\,\bigl((1-\lambda^{3})z^{2} - 3\lambda(1-\lambda^{2})z + 3\lambda^{2}(1-\lambda)\bigr)}{(z-\lambda)^{3}} \\[2ex]
\dfrac{(1-\lambda)^{2}\,z\,(z-1)\,\bigl(3(1+\lambda)z - (1+5\lambda)\bigr)}{2T\,(z-\lambda)^{3}} \\[2ex]
\dfrac{(1-\lambda)^{3}\,z\,(z-1)^{2}}{T^{2}\,(z-\lambda)^{3}}
\end{bmatrix}$$
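The same check for the constant acceleration model (assumed $\lambda$ and $T$): the steady-state gain obtained numerically matches the $z\to\infty$ limit of the transfer function above, $K = [\,1-\lambda^{3},\ \tfrac{3}{2}(1+\lambda)(1-\lambda)^{2}/T,\ (1-\lambda)^{3}/T^{2}\,]^{T}$, i.e. the classical fading-memory g-h-k gains (again my observation, not from the slides).

```python
import numpy as np

lam, T = 0.95, 0.1                               # assumed example values
A = np.array([[1.0, T, T**2 / 2],
              [0.0, 1.0, T],
              [0.0, 0.0, 1.0]])
C = np.array([[1.0, 0.0, 0.0]])
A_inv = np.linalg.inv(A)

Phi = sum(lam**i * np.linalg.matrix_power(A_inv, i).T @ (C.T @ C)
          @ np.linalg.matrix_power(A_inv, i) for i in range(4000))
K = np.linalg.inv(Phi) @ C.T

K_expected = [1 - lam**3, 1.5 * (1 + lam) * (1 - lam)**2 / T, (1 - lam)**3 / T**2]
print(np.allclose(K.ravel(), K_expected))        # expected: True
```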
Sinusoidal Signal

System matrices:

$$A = \begin{bmatrix} \cos\omega T & \sin\omega T \\ -\sin\omega T & \cos\omega T \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 \end{bmatrix}$$

Transfer function:

$$H(z) = \begin{bmatrix}
\dfrac{(1-\lambda)\,z\,\bigl((1+\lambda)z - 2\lambda\cos\omega T\bigr)}{z^{2} - 2\lambda z\cos\omega T + \lambda^{2}} \\[2ex]
\dfrac{(1-\lambda)\,z\,\bigl((1-\lambda)z\cos\omega T + \lambda\cos 2\omega T - 1\bigr)}{\sin\omega T\,\bigl(z^{2} - 2\lambda z\cos\omega T + \lambda^{2}\bigr)}
\end{bmatrix}$$
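One property of this model worth verifying numerically (my added check, consistent with the $H(z)$ above): because the signal frequency is part of the model, the steady-state filter reproduces the modeled sinusoid exactly, i.e. $H(e^{j\omega T}) = [\,1,\ j\,]^{T}$ (unit gain on the in-phase state and an exact quadrature on the second state). The sketch computes the steady-state gain from the series and evaluates the closed-loop response at $z = e^{j\omega T}$; the values of $\lambda$, $\omega$, and $T$ are assumptions.

```python
import numpy as np

lam, omega, T = 0.95, 1.0, 0.1                    # assumed example values
th = omega * T
A = np.array([[np.cos(th),  np.sin(th)],
              [-np.sin(th), np.cos(th)]])
C = np.array([[1.0, 0.0]])
A_inv = np.linalg.inv(A)

# Steady-state gain from Phi = sum_i lam^i A^{-iT} C^T C A^{-i}
Phi = sum(lam**i * np.linalg.matrix_power(A_inv, i).T @ (C.T @ C)
          @ np.linalg.matrix_power(A_inv, i) for i in range(2000))
K = np.linalg.inv(Phi) @ C.T

# Closed-loop response H(z) = z (zI - (I - KC)A)^{-1} K evaluated at z = e^{j omega T}
F = (np.eye(2) - K @ C) @ A
z = np.exp(1j * th)
H = z * np.linalg.solve(z * np.eye(2) - F, K)

print(np.allclose(H.ravel(), [1.0, 1.0j]))        # expected: True
```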
Computational Complexity

(Table of computational complexities.)
Computer Simulation

Sinusoidal Signal

$$r(t) = a\,\sin(\omega t + \phi)$$

Signal model:

$$x[k+1] = \begin{bmatrix} \cos\omega T & \sin\omega T \\ -\sin\omega T & \cos\omega T \end{bmatrix}x[k], \qquad
x[0] = \begin{bmatrix} \sin\phi \\ \cos\phi \end{bmatrix} \quad \text{(initial value)}$$

$$y[k] = \begin{bmatrix} a & 0 \end{bmatrix}x[k] + v[k]$$

Simulation parameters: $\omega = 0.1$, $\phi = \pi/3$, $a = 1$, $T = 0.1$, $\sigma_v^{2} = 0.001$.
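A minimal end-to-end sketch of this experiment, using the slide parameters ($\omega = 0.1$, $\phi = \pi/3$, $a = 1$, $T = 0.1$, $\sigma_v^{2} = 0.001$) together with $\lambda = 0.98$ and the regularized start $\Phi[0] = \delta I$, $\delta = 0.02$, $\bar{x}[0] = 0$ from the results slide. The code structure is an illustrative implementation, not the authors' original simulation code.

```python
import numpy as np

# Parameters from the slides
omega, phi, a, T = 0.1, np.pi / 3, 1.0, 0.1
sigma_v = np.sqrt(0.001)
lam, delta = 0.98, 0.02

# Signal model
A = np.array([[np.cos(omega * T),  np.sin(omega * T)],
              [-np.sin(omega * T), np.cos(omega * T)]])
C = np.array([[a, 0.0]])
x = np.array([np.sin(phi), np.cos(phi)])              # x[0] = [sin(phi), cos(phi)]

# SSRLS initialization: x_bar[0] = 0, Phi[0] = delta * I
rng = np.random.default_rng(0)
x_hat = np.zeros(2)
P = np.linalg.inv(delta * np.eye(2))                  # P = Phi^{-1}

N = 1000
err = np.empty(N)
for k in range(N):
    y_k = C @ x + sigma_v * rng.standard_normal(1)    # noisy observation

    # SSRLS iteration: Riccati update, gain, predict, correct
    APA = A @ P @ A.T / lam
    S = np.eye(1) + C @ APA @ C.T
    P = APA - APA @ C.T @ np.linalg.solve(S, C @ APA)
    K = P @ C.T
    x_bar = A @ x_hat
    x_hat = x_bar + K @ (y_k - C @ x_bar)

    err[k] = np.linalg.norm(x_hat - x)                # state estimation error
    x = A @ x                                         # true state propagation

print(f"mean state error over last 100 samples: {err[-100:].mean():.4f}")
```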
Results

Simulation parameters: $\omega = 0.1$, $\phi = \pi/3$, $a = 1$, $T = 0.1$, $\sigma_v^{2} = 0.001$, $\lambda = 0.98$, $\delta = 0.02$.