
Nonlinear State Estimation:
Particle, Sigma-Points Filters
Robert Stengel
Optimal Control and Estimation, MAE 546
Princeton University, 2017

• Particle filter
• Sigma-Points (Unscented Kalman) filter
• Transformation of uncertainty
• Propagation of mean and variance
• Helicopter, HIV state estimation examples
• Additional nonlinear filters
Copyright 2017 by Robert Stengel. All rights reserved. For educational use only.
http://www.princeton.edu/~stengel/MAE546.html
http://www.princeton.edu/~stengel/OptConEst.html
1
Criticisms of the Basic Extended Kalman Filter*
*Julier and Uhlmann, 1997; Wan and van der Merwe, 2001

• State estimate prediction is deterministic, i.e., not based on an expectation
  – (Not true; the expectation is the same as the deterministic prediction)
• State estimate update is linear
• Jacobians must be evaluated to calculate the covariance prediction and update
• Not all comments apply to iterated, quasilinear, or adaptive extended Kalman filters
2
Transformation of Uncertainty
Nonlinear transformation of a random variable

$x$: random variable with mean $\bar{x}$ and covariance $P_{xx}$

$y = f[x]$

Estimate the mean and covariance of the transformation's output:

$\bar{y}\left(\bar{x}, P_{xx}\right)$ and $P_{yy}\left(\bar{x}, P_{xx}\right)$

The transformation is said to be unscented* if its probability distribution is Consistent, Efficient, and Unbiased
3
* Julier and Uhlmann, 1997
Consistent Estimate of a Dynamic State

Let $x_k \rightarrow x$, $x_{k+1} \rightarrow y$:

$\bar{x}_{k+1}\left(\bar{x}_k, P_{x_k x_k}\right) = \bar{y}\left(\bar{x}, P_{xx}\right)$ and $P_{x_{k+1} x_{k+1}}\left(\bar{x}_k, P_{x_k x_k}\right) = P_{yy}\left(\bar{x}, P_{xx}\right)$

Consistent state estimate converges in the limit:

$\left\{ P_{x_{k+1} x_{k+1}} - E\left[ \left(x_{k+1} - \hat{x}_{k+1}\right) \left(x_{k+1} - \hat{x}_{k+1}\right)^T \right] \right\} \rightarrow 0$

{Estimated Covariance – Actual Covariance} → 0

Lesson: In filtering, add sufficient process noise to the filter gain computation to prevent filter divergence
4
Adding Process Noise Improves Consistency

$\left\{ P_{x_{k+1} x_{k+1}} - E\left[ \left(x_{k+1} - \hat{x}_{k+1}\right) \left(x_{k+1} - \hat{x}_{k+1}\right)^T \right] \right\} \rightarrow 0$

{Estimated Covariance – Actual Covariance} → 0

• Satellite orbit determination
  – Aerodynamic drag produced unmodeled bias
  – Optimal filter did not estimate bias
• Process noise increased for filter design
  – Divergence is contained

[Figures: filter's estimate of RMS state error vs. actual state estimate error, with and without added process noise (Fitzgerald, 1971)]
5
Efficient and Unbiased Estimate of a Dynamic State

Efficient state estimator converges more quickly than an inefficient estimator:

$\min_{\text{Added Process Noise}} \left\{ \text{Estimated Covariance} - \text{Actual Covariance} \right\}$

Add just enough process noise

Unbiased estimate:

$\bar{x}_{k+1} = E\left(x_{k+1}\right)$  [Estimated Mean = Actual Mean]
6
Empirical Determination of a Probability Distribution

• Monte Carlo evaluation
  – Propagate random noise through nonlinearity for given prior distribution (e.g., Gaussian)
  – Generate histogram of output
• Repeated evaluation (N trials) of nonlinear system propagation
• Use histogram as numerical representation of distribution, or
• Identify associated theoretical distribution using functional minimization

• Particle Filter
  – Propagate mean, E(x), using the empirical probability distribution
  – As N → ∞, estimate error → 0 (see the sketch after this slide)
7
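A minimal Python sketch of the Monte Carlo propagation described above. The quadratic nonlinearity, prior statistics, and sample count are illustrative assumptions, not values from the lecture; any nonlinearity f can be substituted.

```python
import numpy as np

def f(x):
    """Assumed illustrative nonlinearity (not from the lecture)."""
    return x**2

rng = np.random.default_rng(0)
x_mean, x_var = 1.0, 0.5      # assumed prior mean and variance
N = 100_000                   # number of Monte Carlo trials

x_samples = rng.normal(x_mean, np.sqrt(x_var), N)   # draw from the Gaussian prior
y_samples = f(x_samples)                            # propagate through the nonlinearity

# Empirical (particle-style) estimates of the transformed mean and variance
y_mean = y_samples.mean()
y_var = y_samples.var(ddof=1)

# Histogram as a numerical representation of the output distribution
hist, edges = np.histogram(y_samples, bins=50, density=True)

print(f"E[f(x)] ~ {y_mean:.3f}  vs. naive f(E[x]) = {f(x_mean):.3f}")
print(f"Var[f(x)] ~ {y_var:.3f}")
```

For this choice of f, the sample mean approaches 1.5 while f of the prior mean is 1.0, illustrating why propagating only the mean through a nonlinearity is biased.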
Empirical Determination of Mean and Variance

• Sample mean for N data points, x1, x2, ..., xN:

$\bar{x} = \frac{\sum_{i=1}^{N} x_i}{N}$

• Sample variance for the same data set:

$\sigma_x^2 = \frac{\sum_{i=1}^{N} \left(x_i - \bar{x}\right)^2}{N - 1}$

  – Divisor is (N – 1) rather than N to produce an unbiased estimate

• Sigma-Points Filter
  – Propagate mean, E(x), using the sample mean and variance
8
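A short sketch of the two sample statistics above, with an assumed data set; the library equivalents are noted in the comments.

```python
import numpy as np

x = np.array([2.1, 1.9, 2.4, 2.0, 1.8])        # assumed sample data
N = len(x)

x_bar = x.sum() / N                            # sample mean
s2 = ((x - x_bar) ** 2).sum() / (N - 1)        # unbiased sample variance, divisor N - 1

# Equivalent library calls: np.mean(x) and np.var(x, ddof=1)
print(x_bar, s2)
```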
Sigma Points of Pxx and Mean
2-D Example
9
Sigma Points of Pxx and Mean
3-D Example
10
Sigma Points of Estimate Uncertainty, Pxx

State covariance matrix

$P_{xx}$: symmetric, positive-definite covariance matrix

Eigenvalues are real and positive:

$\left| sI_n - P_{xx} \right| = \left(s - \lambda_1\right)\left(s - \lambda_2\right)\cdots\left(s - \lambda_n\right)$

Eigenvectors:

$\left(\lambda_i I_n - P_{xx}\right) e_i = 0, \quad i = 1, n$

Modal matrix:

$E = \left[\begin{array}{cccc} e_1 & e_2 & \cdots & e_n \end{array}\right]$
11
Sigma Points of Pxx

Diagonalized covariance matrix; eigenvalues are the variances:

$\Lambda = E^{-1} P_{xx} E = E^T P_{xx} E = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} = \begin{bmatrix} \sigma_1^2 & 0 & \cdots & 0 \\ 0 & \sigma_2^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_n^2 \end{bmatrix}$

1) Principal axes of the covariance matrix are defined by the modal matrix, E

2) Location of the 2n one-sigma points in state space is given by

$\left[\begin{array}{cccc} \pm\Delta x\left(\sigma_1\right) & \pm\Delta x\left(\sigma_2\right) & \cdots & \pm\Delta x\left(\sigma_n\right) \end{array}\right] = E \begin{bmatrix} \pm\sigma_1 & 0 & \cdots & 0 \\ 0 & \pm\sigma_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \pm\sigma_n \end{bmatrix}$
12
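A sketch of the construction above for an assumed 2×2 covariance and mean: the eigendecomposition gives the modal matrix E and variances, and the 2n one-sigma points are the mean offset by the columns of ±E·diag(σ_i).

```python
import numpy as np

P_xx = np.array([[4.0, 1.0],
                 [1.0, 2.0]])       # assumed symmetric, positive-definite covariance
x_mean = np.array([1.0, -1.0])      # assumed mean

# Eigendecomposition: eigenvalues are the variances along the principal axes
lam, E = np.linalg.eigh(P_xx)       # E is the modal matrix (columns are eigenvectors)
sigma = np.sqrt(lam)                # one-sigma distances along each principal axis

# Locations of the 2n one-sigma points: mean +/- E * diag(sigma)
delta_x = E @ np.diag(sigma)        # columns are delta_x(sigma_i)
sigma_points = np.hstack([x_mean[:, None] - delta_x,
                          x_mean[:, None] + delta_x])   # n x 2n array

print(sigma_points)
```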
Propagation of the Mean Value and Covariance Matrix
13
Propagation of the Mean Value and the Sigma Points

• Mean value at tk:

$\bar{x}\left(t_k\right) = \bar{x}_k$

• Sigma points (relative to the mean value):

$\chi_{i_k} \triangleq \begin{cases} \bar{x}_k - \Delta x_k\left(\sigma_i\right), & i = 1, n \\ \bar{x}_k + \Delta x_k\left(\sigma_i\right), & i = (n+1), 2n \end{cases}$

• Projection from the prior mean:

$\bar{x}_{k+1} = \bar{x}_k + \int_{t_k}^{t_{k+1}} f\left[\bar{x}(t), u(t), \bar{w}(t), t\right] dt$

• Nonlinear projection from each prior sigma point:

$\chi_{i_{k+1}} = \chi_{i_k} + \int_{t_k}^{t_{k+1}} f\left[\chi_i(t), u(t), \bar{w}(t), t\right] dt, \quad i = 1, 2n$
14
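A sketch of the projection step above. The lecture leaves the dynamics, control, and disturbance generic, so the pendulum-like dynamics, the Euler integrator, and all numerical values below are assumptions for illustration.

```python
import numpy as np

def f(x, u, w, t):
    """Assumed illustrative nonlinear dynamics (not the lecture's example)."""
    return np.array([x[1], -np.sin(x[0]) - 0.1 * x[1] + u + w])

def propagate(x0, u, w, t_k, t_k1, dt=1e-3):
    """Euler integration of x_dot = f(x, u, w, t) from t_k to t_k1."""
    x, t = x0.copy(), t_k
    while t < t_k1:
        x = x + dt * f(x, u, w, t)
        t += dt
    return x

x_mean_k = np.array([0.2, 0.0])        # assumed prior mean
P_xx = np.diag([0.01, 0.04])           # assumed prior covariance
u, w_mean = 0.0, 0.0                   # assumed control and mean disturbance

# Sigma points relative to the mean, from the eigendecomposition of P_xx
lam, E = np.linalg.eigh(P_xx)
delta = E @ np.diag(np.sqrt(lam))
chi_k = np.hstack([x_mean_k[:, None] - delta, x_mean_k[:, None] + delta])  # 2n points

# Project the prior mean and each sigma point through the nonlinear dynamics
x_mean_k1 = propagate(x_mean_k, u, w_mean, 0.0, 0.1)
chi_k1 = np.column_stack([propagate(chi_k[:, i], u, w_mean, 0.0, 0.1)
                          for i in range(chi_k.shape[1])])

print(x_mean_k1, chi_k1, sep="\n")
```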
Estimation of the Propagated Mean Value

• Assumptions:
  – To 2nd order, the propagated probability distribution is symmetric about its mean
  – The new mean is estimated as the average or weighted average of the projected points (arbitrary choice by the user)

Ensemble average for the mean value:

$\hat{x}_{k+1} = \frac{\bar{x}_{k+1} + \sum_{i=1}^{2n} \chi_{i_{k+1}}}{2n + 1}$

Weighted ensemble average for the mean value, where α is the relative weighting of the sigma points:

$\hat{x}_{k+1} = \frac{\bar{x}_{k+1} + \alpha \sum_{i=1}^{2n} \chi_{i_{k+1}}}{2\alpha n + 1}$
15
Projected Covariance Matrix

Unbiased ensemble estimate of the covariance matrix:

$P_{x_{k+1} x_{k+1}} = \frac{1}{(2n+1) - 1} \left[ \left(\bar{x}_{k+1} - \hat{x}_{k+1}\right)\left(\bar{x}_{k+1} - \hat{x}_{k+1}\right)^T + \sum_{i=1}^{2n} \left(\chi_{i_{k+1}} - \hat{x}_{k+1}\right)\left(\chi_{i_{k+1}} - \hat{x}_{k+1}\right)^T \right]$

This estimate neglects effects of disturbance uncertainty during the state propagation from tk to tk+1

Does not require calculation of Jacobian matrices
16
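A sketch of the ensemble mean and unbiased covariance estimates above. The projected mean and sigma points are stand-in random values here; in practice they come from the nonlinear propagation on the previous slides.

```python
import numpy as np

n = 2
rng = np.random.default_rng(1)

# Stand-ins for the projected prior mean and the 2n projected sigma points
x_bar_k1 = rng.normal(size=n)                                     # projected mean
chi_k1 = x_bar_k1[:, None] + 0.1 * rng.normal(size=(n, 2 * n))    # projected sigma points

# Ensemble-average estimate of the propagated mean (equal weights)
x_hat_k1 = (x_bar_k1 + chi_k1.sum(axis=1)) / (2 * n + 1)

# Unbiased ensemble estimate of the propagated covariance, divisor (2n + 1) - 1
dev_mean = (x_bar_k1 - x_hat_k1)[:, None]
dev_sig = chi_k1 - x_hat_k1[:, None]
P_k1 = (dev_mean @ dev_mean.T + dev_sig @ dev_sig.T) / ((2 * n + 1) - 1)

print(x_hat_k1, P_k1, sep="\n")
```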
Sigma Points of Disturbance Uncertainty, Q

$Q$: (s × s) symmetric, positive-definite covariance matrix

Eigenvalues [s = Laplace operator]:

$\left| sI_s - Q \right| = \left(s - \lambda_1\right)\left(s - \lambda_2\right)\cdots\left(s - \lambda_s\right)$

Eigenvectors:

$\left(\lambda_i I_s - Q\right) e_i = 0, \quad i = 1, s$

Modal matrix:

$E_Q = \left[\begin{array}{cccc} e_1 & e_2 & \cdots & e_s \end{array}\right]_Q$

Diagonalized covariance matrix:

$\Lambda_Q = E_Q^T Q E_Q = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_s \end{bmatrix}_Q = \begin{bmatrix} \sigma_1^2 & 0 & \cdots & 0 \\ 0 & \sigma_2^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_s^2 \end{bmatrix}_Q$

Transformation to one-sigma disturbance points:

$\left[\begin{array}{cccc} \pm\Delta w\left(\sigma_1\right) & \pm\Delta w\left(\sigma_2\right) & \cdots & \pm\Delta w\left(\sigma_s\right) \end{array}\right] = E_Q \begin{bmatrix} \pm\sigma_1 & 0 & \cdots & 0 \\ 0 & \pm\sigma_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \pm\sigma_s \end{bmatrix}_Q$
17
Propagation of the Disturbed Mean Value

Sigma points of the disturbance (relative to the mean value):

$\omega_{i_k} \triangleq \begin{cases} \bar{w}_k + \Delta w_k\left(\sigma_i\right), & i = 1, s \\ \bar{w}_k - \Delta w_k\left(\sigma_i\right), & i = (s+1), 2s \end{cases}$

Incorporation of the effects of disturbance uncertainty on the state propagation:

$\left(x_{\omega_i}\right)_{k+1} = \bar{x}_k + \int_{t_k}^{t_{k+1}} f\left[\bar{x}(t), u(t), \omega_i(t), t\right] dt, \quad i = 1, 2s$
18
Estimation of the Propagated Mean Value with Disturbance Uncertainty

The estimate now includes the effect of disturbance uncertainty; the estimate of the mean is the average or weighted average of the projected points

Ensemble average for the mean value:

$\hat{x}_{k+1} = \frac{\bar{x}_{k+1} + \sum_{i=1}^{2n} \chi_{i_{k+1}} + \sum_{i=1}^{2s} \left(x_{\omega_i}\right)_{k+1}}{2(n + s) + 1}$

Weighted ensemble average for the mean value:

$\hat{x}_{k+1} = \frac{\bar{x}_{k+1} + \alpha \left[ \sum_{i=1}^{2n} \chi_{i_{k+1}} + \sum_{i=1}^{2s} \left(x_{\omega_i}\right)_{k+1} \right]}{2\alpha(n + s) + 1}$
19
Covariance Propagation with Disturbance Uncertainty

Unbiased sampled estimate of the covariance matrix:

$P_{x_{k+1} x_{k+1}} = \frac{1}{\left[2(n + s) + 1\right] - 1} \left( P_{mean} + P_{sigma} + P_{disturbance} \right)$

$P_{mean} = \left(\bar{x}_{k+1} - \hat{x}_{k+1}\right)\left(\bar{x}_{k+1} - \hat{x}_{k+1}\right)^T$

$P_{sigma} = \sum_{i=1}^{2n} \left(\chi_{i_{k+1}} - \hat{x}_{k+1}\right)\left(\chi_{i_{k+1}} - \hat{x}_{k+1}\right)^T$

$P_{disturbance} = \sum_{i=1}^{2s} \left[\left(x_{\omega_i}\right)_{k+1} - \hat{x}_{k+1}\right] \left[\left(x_{\omega_i}\right)_{k+1} - \hat{x}_{k+1}\right]^T$
20
Sigma-Points ("Unscented Kalman") Filter
21
System Vector Notation

System vector:

$v \triangleq \begin{bmatrix} x \\ w \\ n \end{bmatrix}, \quad \dim(v) = (n + r + s) \times 1$

Expected value of the system vector:

$\hat{v}_0 = \begin{bmatrix} \hat{x}_0 \\ \hat{w}_0 \\ \hat{n}_0 \end{bmatrix} = E \begin{bmatrix} x_0 \\ w_0 \\ n_0 \end{bmatrix} \quad \rightarrow \quad \chi_0 = \begin{bmatrix} \chi_0^x \\ \chi_0^w \\ \chi_0^n \end{bmatrix}$

Propagation of the mean:

$\chi_{k+1}^x = \chi_k^x + \int_{t_k}^{t_{k+1}} f\left[\chi^x(t), u(t), \chi^w(t), t\right] dt$

Measurement vector, corrupted by noise:

$\mathcal{Y} = h\left[\chi^x, \chi^n\right]$, analogous to $z(t) = Hx(t) + n(t)$
22
Matrix Array of System and Sigma-Point Vectors

Expected value of the system vector:

$\chi_0 = \hat{v}_0 = \begin{bmatrix} \hat{x}_0 \\ \hat{w}_0 \\ \hat{n}_0 \end{bmatrix}; \quad \dim\left(\chi_0\right) = (n + r + s) \times 1 \triangleq L \times 1$

Weighted sigma points for the system vector (2L + 1 vectors, including the mean):

$\chi_i = \begin{cases} \hat{v} + \gamma\left(S\right)_i, & i = 1, L \\ \hat{v} - \gamma\left(S\right)_i, & i = L+1, 2L \end{cases}$

$S$: square root of $P$; $\left(S\right)_i \triangleq i$th column of $S$; $\gamma$: scaling factor

Matrix of mean and sigma-point vectors:

$\chi \triangleq \left[\begin{array}{cccc} \chi_0 & \chi_1 & \cdots & \chi_{2L} \end{array}\right]; \quad \dim(\chi) = L \times (2L + 1)$
23
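A sketch of building the (2L + 1)-column array of mean and sigma-point vectors. The slide does not prescribe which matrix square root or scaling factor to use, so the Cholesky factor, the value of γ, the dimensions, and the covariance below are assumptions.

```python
import numpy as np

n, r, s = 2, 1, 1                         # assumed state, noise, disturbance dimensions
L = n + r + s                             # augmented system dimension

v_hat = np.zeros(L)                        # expected value of the system vector [x; w; n]
P_v = np.diag([0.1, 0.2, 0.05, 0.01])      # assumed block-diagonal system covariance
gamma = np.sqrt(L + 1.0)                   # assumed scaling factor (choice left open)

S = np.linalg.cholesky(P_v)                # one common choice of matrix square root of P_v

# Columns: chi_0 = v_hat, then v_hat +/- gamma * (S)_i for i = 1..L
chi = np.empty((L, 2 * L + 1))
chi[:, 0] = v_hat
chi[:, 1:L + 1] = v_hat[:, None] + gamma * S
chi[:, L + 1:] = v_hat[:, None] - gamma * S

print(chi.shape)    # (L, 2L + 1)
```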
Initialize Filter

State and covariance estimates:

$\hat{x}_0 = E\left[x(0)\right] = \bar{x}(0)$

$P_x(0) = E\left\{ \left[x(0) - \hat{x}(0)\right] \left[x(0) - \hat{x}(0)\right]^T \right\}$

Covariance matrix of the system vector:

$P_v(0) = E\left\{ \left[v(0) - \hat{v}(0)\right] \left[v(0) - \hat{v}(0)\right]^T \right\} = \begin{bmatrix} P_x(0) & 0 & 0 \\ 0 & Q_w(0) & 0 \\ 0 & 0 & R_n(0) \end{bmatrix}$
24
Propagate State Mean and Covariance

Incorporate the disturbance sigma points:

$\left(\chi_i^x\right)_{k+1} = \left(\chi_i^x\right)_k + \int_{t_k}^{t_{k+1}} f\left[\left(\chi_i^x\right)(t), u(t), \left(\chi_i^w\right)(t), t\right] dt$

Ensemble-average estimates of the mean and covariance, with weighting factors $\beta_i$, typically

$\beta_i = \begin{cases} 1/(L + 1), & i = 0 \\ 1/\left[2(L + 1)\right], & i = 1, 2L \end{cases}$

$\hat{x}_{k+1}(-) = \sum_{i=0}^{2L} \beta_i \, \chi_{i_{k+1}}^x$

$P_{k+1}^x(-) = \sum_{i=0}^{2L} \beta_i \left[\chi_i^x(-) - \hat{x}(-)\right]_{k+1} \left[\chi_i^x(-) - \hat{x}(-)\right]_{k+1}^T$
25
after Wan and van der Merwe, 2001
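A sketch of the weighted ensemble estimates above, using the weights shown on the slide; the propagated sigma points are stand-in random values, and the dimension L is an assumption.

```python
import numpy as np

L = 4
rng = np.random.default_rng(2)

# Stand-in for the 2L + 1 propagated state sigma points (columns), i = 0..2L
chi_x = rng.normal(size=(2, 2 * L + 1))

# Weights from the slide: beta_0 = 1/(L+1), beta_i = 1/[2(L+1)] otherwise
beta = np.full(2 * L + 1, 1.0 / (2 * (L + 1)))
beta[0] = 1.0 / (L + 1)
assert np.isclose(beta.sum(), 1.0)      # weights sum to one

# Weighted ensemble estimates of the propagated mean and covariance
x_hat = chi_x @ beta
dev = chi_x - x_hat[:, None]
P_x = (beta * dev) @ dev.T              # sum_i beta_i * dev_i * dev_i^T

print(x_hat, P_x, sep="\n")
```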
Incorporate Measurement Error in Output

Mean/sigma-point projections of the measurement:

$\left(\mathcal{Y}_i\right)_{k+1} = h\left[\left(\chi_i^x\right)_{k+1}, \left(\chi_i^n\right)_{k+1}\right], \quad i = 0, 2L$

Weighted estimate of the measurement projection:

$\hat{y}_{k+1}(-) = \sum_{i=0}^{2L} \beta_i \left(\mathcal{Y}_i\right)_{k+1}$
26
Incorporate Measurement Error in Covariance

Prior estimate of the measurement covariance:

$P_{k+1}^y(-) = \sum_{i=0}^{2L} \beta_i \left[\mathcal{Y}_i(-) - \hat{y}(-)\right]_{k+1} \left[\mathcal{Y}_i(-) - \hat{y}(-)\right]_{k+1}^T$

Prior estimate of the state/measurement cross-covariance:

$P_{k+1}^{xy}(-) = \sum_{i=0}^{2L} \beta_i \left[\chi_i^x(-) - \hat{x}(-)\right]_{k+1} \left[\mathcal{Y}_i(-) - \hat{y}(-)\right]_{k+1}^T$
after Wan and van der Merwe, 2001
Compute Kalman Filter Gain
Original formula (eq. 3, Lecture 18):

$K_k = P_k(-) H_k^T \left[H_k P_k(-) H_k^T + R_k\right]^{-1}$

Sigma-points version, with index change:

$K_{k+1} = P_{k+1}^{xy}(-) \left[P_{k+1}^y(-)\right]^{-1}$
28
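A small sketch of the sigma-points gain computation above, with placeholder covariances; the inverse is applied via a linear solve rather than forming it explicitly, which is a common numerical choice rather than something prescribed by the lecture.

```python
import numpy as np

# Placeholder prior covariances (in practice from the sigma-point sums above)
P_xy = np.array([[0.30, 0.05],
                 [0.10, 0.20]])     # state/measurement cross-covariance
P_y = np.array([[0.50, 0.02],
                [0.02, 0.40]])      # measurement (innovation) covariance

# K = P_xy @ inv(P_y), computed by solving P_y^T K^T = P_xy^T
K = np.linalg.solve(P_y.T, P_xy.T).T

print(K)
```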
Post-Measurement State and Covariance Estimate

State estimate update:

$\hat{x}_{k+1}(+) = \hat{x}_{k+1}(-) + K_{k+1} \left[z_{k+1} - \hat{y}_{k+1}(-)\right]$

Covariance estimate update:

$P_{k+1}^x(+) = P_{k+1}^x(-) - K_{k+1} P_{k+1}^y(-) K_{k+1}^T$
29
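A sketch of the update equations on this slide, with placeholder numbers standing in for the prior estimate, gain, predicted measurement, and measurement produced by the earlier steps.

```python
import numpy as np

# Placeholder prior quantities (from the propagation and gain steps above)
x_hat_minus = np.array([1.0, -0.5])
P_x_minus = np.diag([0.2, 0.1])
P_y_minus = np.array([[0.5, 0.0], [0.0, 0.4]])
K = np.array([[0.6, 0.0], [0.0, 0.25]])
z = np.array([1.2, -0.3])               # measurement
y_hat_minus = np.array([1.1, -0.4])     # predicted measurement

# State estimate update: x_hat(+) = x_hat(-) + K [z - y_hat(-)]
x_hat_plus = x_hat_minus + K @ (z - y_hat_minus)

# Covariance estimate update: P(+) = P(-) - K P_y(-) K^T
P_x_plus = P_x_minus - K @ P_y_minus @ K.T

print(x_hat_plus, P_x_plus, sep="\n")
```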
Example: Simulated Helicopter UAV Flight
van der Merwe and Wan, 2004
30
Comparison of Pitch, Roll, and Yaw Errors
31
Comparison of RMS Errors for EKF, UKF, and Gauss-Hermite Filter (GHF)

[Figure: RMS estimation errors for CD4+ cells, HIV virions, and effector cells under the EKF, UKF, and GHF]

Banks et al., "A Comparison of Nonlinear Filtering Approaches in the Context of an HIV Model," 2010
32
Comparison of Various Filters for the "Blind Tricyclist Problem" (Psiaki, 2013)

• Extended Kalman filter
• Unscented Kalman filter
• Particle filter
• Batch least-squares filter
• Backward-smoothing EKF
33
Observations

• Jacobians vs. no Jacobians
• Number of nonlinear propagation steps
• Gaussian vs. approximate non-Gaussian distributions
• Best choice of averaging weights is problem-dependent
• Comparison of filters is problem-dependent
• Are these filters better than a quasi-linear filter? (TBD)
34
More on Nonlinear Estimators

• Gordon, N., Salmond, D., and Smith, A., "Novel Approach to Nonlinear/Non-Gaussian Bayesian State Estimation," IEE Proc.-F, 140 (2), Apr 1993, pp. 107-113.
• Banks, H., Hu, S., Kenz, Z., and Tran, H., "A Comparison of Nonlinear Filtering Approaches in the Context of an HIV Model," Math. Biosci. Eng., 7 (2), Apr 2010, pp. 213-236.
• Psiaki, M., "Backward-Smoothing Extended Kalman Filter," Journal of Guidance, Control, and Dynamics, 28 (5), Sep 2005, pp. 885-894.
• Psiaki, M., "The Blind Tricyclist Problem and a Comparison of Nonlinear Filters," IEEE Control Systems Magazine, June 2013, pp. 48-54.
• Psiaki, M., Schoenberg, J., and Miller, I., "Gaussian Sum Reapproximation for Use in a Nonlinear Filter," Journal of Guidance, Control, and Dynamics, 38 (2), Feb 2015, pp. 292-303.
35
Next Time:
Adaptive State Estimation
36