
Digest of TISE Seminar 2007, editor Pertti Koivisto, pages 1-5.
Gaussian Mixture Filter in Hybrid Navigation
Simo Ali-Löytty
Institute of Mathematics
Tampere University of Technology
[email protected]
http://math.tut.fi/posgroup/
1 Introduction
The Gaussian Mixture Filter (GMF) is an approximation of the Bayesian filter (see Section 2) in which the prior and posterior densities are Gaussian mixtures, i.e. convex combinations of normal density functions. The Kalman Filter, the Extended Kalman Filter (EKF), the Second Order Extended Kalman Filter and the Bank of Kalman Filters are special cases of the GMF [2, 4].

Hybrid navigation means that the measurements used in navigation come from many different sources, e.g. a Global Navigation Satellite System (such as GPS), an Inertial Measurement Unit, or local wireless networks such as a cellular network, WLAN, or Bluetooth. Range, pseudorange, deltarange, altitude, restrictive [3] and compass measurements are examples of typical measurements in hybrid navigation. The EKF is very commonly used in satellite-based positioning and has also been applied to hybrid navigation. Unfortunately, the EKF has serious consistency problems in highly nonlinear situations, which means that it does not work correctly [4].
2 Bayesian filter
The Bayesian filtering problem formulation includes three things: the initial state x_0, the state model and the measurement model. The state model tells how the next state x_{k+1} depends on the current state x_k,

x_{k+1} = f(x_k) + w_k    or    p(x_{k+1} | x_k).

The measurement model tells how the measurements depend on the current state,

y_k = h(x_k) + v_k    or    p(y_k | x_k).

Here the states x, the measurements y and the error terms w and v are random variables. The aim of Bayesian filtering is to solve the state conditional probability density function (cpdf)

p(x_k | y_{1:k}),

where y_{1:k} \triangleq \{y_1, \ldots, y_k\} are the past and current measurements. In the hybrid navigation case the cpdf cannot be determined analytically. Because of this, there are many approximative solutions, for example the Particle Filter, the Grid Based Method and the GMF, which is the topic of the next section [6, 7].
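As a rough illustration of how such an approximative solution operates on the state and measurement models above, the following Python sketch implements one prediction and update step of a bootstrap particle filter. It is only a sketch under our own assumptions: the arguments f, h, w_cov and v_cov stand for the state and measurement models with additive zero-mean Gaussian noise, and all names are illustrative, not taken from the paper.

import numpy as np

def particle_filter_step(particles, weights, y, f, h, w_cov, v_cov, rng):
    # Prediction: propagate each particle through the state model x_{k+1} = f(x_k) + w_k.
    n, dim_x = particles.shape
    particles = np.array([f(x) for x in particles])
    particles = particles + rng.multivariate_normal(np.zeros(dim_x), w_cov, size=n)
    # Update: reweight each particle by the measurement likelihood p(y_k | x_k),
    # here a Gaussian density with covariance v_cov evaluated at the residual y_k - h(x_k).
    v_inv = np.linalg.inv(v_cov)
    residuals = np.array([y - h(x) for x in particles])
    log_lik = -0.5 * np.einsum('ni,ij,nj->n', residuals, v_inv, residuals)
    weights = weights * np.exp(log_lik - log_lik.max())
    return particles, weights / weights.sum()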
3 Gaussian Mixture Filter
3.1 Basics of GMF
The idea of the GMF [5] is that both the prior density p(x_k | y_{1:k-1}) and the posterior density p(x_k | y_{1:k}) are Gaussian mixtures
p(x) = \sum_{i=1}^{p} \alpha_i \, N_{\mu_i}^{\Sigma_i}(x),    (1)

where N_{\mu_i}^{\Sigma_i}(x) is the normal density function with mean \mu_i and covariance matrix \Sigma_i, the weights \alpha_i \geq 0 and \sum_{i=1}^{p} \alpha_i = 1. The mean and covariance of a Gaussian mixture (1) are

\mu = \sum_{i=1}^{p} \alpha_i \mu_i    and    \Sigma = \sum_{i=1}^{p} \alpha_i \left( \Sigma_i + (\mu_i - \mu)(\mu_i - \mu)^T \right).
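As a concrete illustration, here is a minimal Python sketch of these two formulas; the argument names alphas, mus and sigmas are ours, standing for the weights \alpha_i, the means \mu_i and the covariances \Sigma_i.

import numpy as np

def mixture_mean_cov(alphas, mus, sigmas):
    # Mean and covariance of a Gaussian mixture, a direct transcription of the formulas above.
    alphas = np.asarray(alphas)              # shape (p,)
    mus = np.asarray(mus)                    # shape (p, n)
    sigmas = np.asarray(sigmas)              # shape (p, n, n)
    mu = np.einsum('i,ij->j', alphas, mus)
    diffs = mus - mu
    sigma = (np.einsum('i,ijk->jk', alphas, sigmas)
             + np.einsum('i,ij,ik->jk', alphas, diffs, diffs))
    return mu, sigma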
We assume that the prior is p(x) of the form (1) and the likelihood is

p(y|x) = \sum_{j=1}^{m} \beta_j \, N_{H_j x}^{R_j}(y).

So now the posterior, based on Bayes' rule, is

p(x|y) = \frac{p(y|x)\, p(x)}{p(y)} = \frac{\sum_{j=1}^{m} \sum_{i=1}^{p} \alpha_i \beta_j \, N_{H_j \mu_i}^{P_{i,j}}(y) \, N_{\hat{x}_{i,j}}^{\hat{P}_{i,j}}(x)}{\sum_{j=1}^{m} \sum_{i=1}^{p} \alpha_i \beta_j \, N_{H_j \mu_i}^{P_{i,j}}(y)},    (2)

where

P_{i,j} = H_j \Sigma_i H_j^T + R_j,
\hat{x}_{i,j} = \mu_i + \Sigma_i H_j^T P_{i,j}^{-1} (y - H_j \mu_i)    and
\hat{P}_{i,j} = (I - \Sigma_i H_j^T P_{i,j}^{-1} H_j) \Sigma_i.

We see that the posterior is also a Gaussian mixture.
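A minimal Python sketch of this measurement update is given below. It is a transcription of equation (2) under the stated model, not the implementation used in the paper; the argument names are ours (alphas, mus, sigmas for the prior components and betas, Hs, Rs for the likelihood components).

import numpy as np

def gmf_update(y, alphas, mus, sigmas, betas, Hs, Rs):
    # Gaussian mixture measurement update, equation (2): every pair of a prior
    # component i and a likelihood component j yields one posterior component.
    new_w, new_mu, new_cov = [], [], []
    for beta, H, R in zip(betas, Hs, Rs):
        for alpha, mu, Sigma in zip(alphas, mus, sigmas):
            P = H @ Sigma @ H.T + R                      # P_{i,j}
            K = Sigma @ H.T @ np.linalg.inv(P)           # Kalman gain
            resid = y - H @ mu
            # Unnormalized weight alpha_i * beta_j * N(y; H_j mu_i, P_{i,j}).
            lik = (np.exp(-0.5 * resid @ np.linalg.solve(P, resid))
                   / np.sqrt(np.linalg.det(2 * np.pi * P)))
            new_w.append(alpha * beta * lik)
            new_mu.append(mu + K @ resid)                          # x_hat_{i,j}
            new_cov.append((np.eye(len(mu)) - K @ H) @ Sigma)      # P_hat_{i,j}
    new_w = np.array(new_w)
    return new_w / new_w.sum(), new_mu, new_cov   # normalized weights, means, covariances

Note that every update multiplies the number of components, from p prior components to p·m posterior components, which motivates the reduction algorithms of Section 3.3.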
3.2 Where do mixtures come from?
Here are some situations in which the GMF may be preferred over conventional nonlinear Kalman filter extensions, which can be considered special (i.e. one-component) cases of the GMF.
Models Clearly, if our initial state or error models are Gaussian mixtures, then the GMF is an obvious solution. In hybrid navigation, for example, we can create more realistic error models using a Gaussian mixture than using only one Gaussian.
Approximation Even if our models are not Gaussian mixtures, it is possible to approximate our density functions as Gaussian mixtures, because it has been shown that any probability density can be approximated as closely as desired by a Gaussian mixture. In hybrid navigation, for example, if we can compute the likelihood peaks z_j, then we can approximate the likelihood as a Gaussian mixture

p(y|x) \approx \sum_{j=1}^{m} \frac{\exp\left(-\frac{1}{2} \| y - h(z_j) - h'(z_j)(x - z_j) \|^2_{R^{-1}}\right)}{\sqrt{\det(2\pi R)}}.

A small Python sketch of this approximation is given after this list.
Robustifying Sometimes filters do not work correctly, usually because of approximation errors or modeling errors. One way to detect that something is wrong is to check the normalization factor p(y). If p(y) is smaller than a threshold value, then either the prior or the measurement is wrong at some risk level. Then

\gamma p(x) + (1 - \gamma) \sum_{j=1}^{m} \alpha_j \, N_{z_j}^{H_j^{-1} R H_j^{-T}}(x),

where \gamma \in [0, 1] and H_j = h'(z_j), may be a more reasonable posterior than (2); this robustified mixture is also illustrated in the sketch after this list.
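The following Python sketch illustrates the Approximation and Robustifying ideas above. It is only a sketch under our own assumptions: h_jac is a hypothetical function returning the Jacobian H_j = h'(z_j), the prior is passed as a (weights, means, covariances) triple, H_j is assumed square and invertible, and the likelihood-based components get equal weights 1/m, which is only one possible choice.

import numpy as np

def approx_likelihood(x, y, h, h_jac, peaks, R):
    # Gaussian mixture approximation of the likelihood p(y|x): the measurement
    # function h is linearized at each likelihood peak z_j, as in the Approximation item.
    R_inv = np.linalg.inv(R)
    norm = np.sqrt(np.linalg.det(2 * np.pi * R))
    total = 0.0
    for z in peaks:
        H = h_jac(z)                           # H_j = h'(z_j)
        r = y - h(z) - H @ (x - z)             # linearized residual
        total += np.exp(-0.5 * r @ R_inv @ r) / norm
    return total

def robustified_posterior(prior, peaks, h_jac, R, gamma=0.5):
    # Build the mixture gamma * prior + (1 - gamma) * (likelihood-based components),
    # to be used instead of (2) when the normalization factor p(y) falls below a
    # chosen threshold. Each likelihood-based component is centered at a peak z_j
    # with covariance H_j^{-1} R H_j^{-T}, as in the Robustifying item.
    weights, means, covs = prior
    m = len(peaks)
    new_w = [gamma * w for w in weights] + [(1.0 - gamma) / m] * m
    new_mu = list(means) + list(peaks)
    new_cov = list(covs)
    for z in peaks:
        H_inv = np.linalg.inv(h_jac(z))
        new_cov.append(H_inv @ R @ H_inv.T)
    return np.array(new_w), new_mu, new_cov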
3.3 Component reduction

One major challenge in using the GMF efficiently is keeping the number of mixture components as small as possible without losing significant information. There are many ways to do this. We use three different types of mixture reduction algorithms: forgetting, merging and resampling, each of which is illustrated in the Python sketch after the list below.
Forgetting We give zero weight to mixture components whose weights are lower than some threshold value, for example

\min\left(0.001,\; 0.01 \max_i(\alpha_i)\right).

After that, we normalize the weights of the remaining mixture components.
Merging We merge two mixture components into one if the distance between the components is lower than some threshold value. The distance is, for example,

d_{ij} = \frac{\alpha_i \alpha_j}{\alpha_i + \alpha_j} (\mu_i - \mu_j)^T \Sigma^{-1} (\mu_i - \mu_j).

We merge components so that merging preserves the overall mean and covariance. This method, "collapsing by moments", is optimal in the sense of the Kullback-Leibler distance.
Resampling If after forgetting and merging we still have too many mixture components, we can use a resampling algorithm to choose which mixture components we keep. Finally, we normalize the weights of these mixture components. This approach induces less approximation error, in the L1-norm sense, than merging two distant components.
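The sketch below illustrates all three reduction steps in Python. It follows the descriptions above under our own naming conventions (alphas, mus, sigmas for the weights, means and covariances); in particular, the resampling shown here is a simple weighted draw without replacement, which is only one possible choice.

import numpy as np

def forget(alphas, mus, sigmas):
    # Drop components with negligible weight and renormalize (Forgetting).
    alphas = np.asarray(alphas)
    threshold = min(0.001, 0.01 * alphas.max())
    keep = alphas >= threshold
    new_alphas = alphas[keep]
    return (new_alphas / new_alphas.sum(),
            [m for m, k in zip(mus, keep) if k],
            [s for s, k in zip(sigmas, keep) if k])

def merge_pair(a_i, mu_i, cov_i, a_j, mu_j, cov_j):
    # Collapse two components into one so that the overall mean and
    # covariance are preserved ("collapsing by moments", Merging).
    a = a_i + a_j
    w_i, w_j = a_i / a, a_j / a
    mu = w_i * mu_i + w_j * mu_j
    d_i, d_j = mu_i - mu, mu_j - mu
    cov = w_i * (cov_i + np.outer(d_i, d_i)) + w_j * (cov_j + np.outer(d_j, d_j))
    return a, mu, cov

def resample(alphas, mus, sigmas, n_keep, rng=None):
    # Keep n_keep components, drawn with probability proportional to
    # their weights, and renormalize (Resampling).
    rng = np.random.default_rng() if rng is None else rng
    alphas = np.asarray(alphas, dtype=float)
    idx = rng.choice(len(alphas), size=n_keep, replace=False, p=alphas / alphas.sum())
    new_alphas = alphas[idx]
    return (new_alphas / new_alphas.sum(),
            [mus[i] for i in idx], [sigmas[i] for i in idx])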
4 Simulations
Figure 1 gives an example where the GMF, which approximates the likelihood as a Gaussian mixture, gives better results than the EKF. The measurements are range measurements from two base stations. More results and specific parameters will be published in [1].
[Figure 1: Example of an inconsistency problem of the EKF and how the GMF solves the problem. The plot shows the true route, the EKF estimate, the GMF mean and the GMF components, with the start point marked and a 100 m scale bar.]
References
[1] S. Ali-Löytty and N. Sirola. Gaussian mixture filter in hybrid navigation. In Proceedings of the European Navigation Conference GNSS 2007 (to be published).

[2] S. Ali-Löytty. Hybrid positioning algorithms. In P. Koivisto, editor, Digest of TISE Seminar 2006, volume 5, pages 43–46. TISE, 2006.

[3] S. Ali-Löytty and N. Sirola. A modified Kalman filter for hybrid positioning. In Proceedings of ION GNSS 2006, September 2006.

[4] S. Ali-Löytty, N. Sirola, and R. Piché. Consistency of three Kalman filter extensions in hybrid navigation. In Proceedings of the European Navigation Conference GNSS 2005, Munich, Germany, July 2005.

[5] D. L. Alspach and H. W. Sorenson. Nonlinear Bayesian estimation using Gaussian sum approximations. IEEE Transactions on Automatic Control, 17(4):439–448, August 1972.

[6] N. Sirola and S. Ali-Löytty. Moving grid filter in hybrid local positioning. In Proceedings of the ENC 2006, Manchester, May 7–10, 2006.

[7] N. Sirola, S. Ali-Löytty, and R. Piché. Benchmarking nonlinear filters. In Nonlinear Statistical Signal Processing Workshop NSSPW06, Cambridge, September 2006.