Map-aided fusion using evidential grids for mobile perception in urban environment

The 2nd International Conference on Belief Functions
Marek Kurdej, Julien Moras, Véronique Cherfaoui, Philippe Bonnifait
[email protected]
UMR CNRS 7253 Heudiasyc, University of Technology of Compiègne
10 May 2012
Outline

- Introduction
  - Context
  - Perception
  - Heterogeneous data sources
  - Method
- Multi-grid fusion approach
  - Occupancy grids
  - Incorporating prior knowledge
  - Temporal fusion
  - Contextual discounting
- Results
  - Setup
  - Results
- Conclusion and perspectives
  - Conclusion
  - Perspectives
Introduction: Context

- autonomous driving
- increased difficulty in urban environments:
  - poor satellite visibility
  - hard-to-predict vehicle trajectories
  - large number of mobile objects
  - good scene understanding is crucial
- detailed geographic databases are available
Figure: Typical scene.

Figure: Desired perception result.
Heterogeneous data sources

Data:
- proprioceptive sensors: vehicle pose
- exteroceptive sensors: range-scanner point cloud
- vector maps

Figure: Data example.
Proposed method

- 2D occupancy grids model the vehicle environment [Elfes 1989]
- a data-fusion method based on Dempster–Shafer theory with meta-knowledge from a digital map

Related works:
- few works use the theory of evidence for perception [Moras et al. 2011]
- some research on 3D city-model building exists

Elfes, A.: Using Occupancy Grids for Mobile Robot Perception and Navigation. Computer, 22(6), pp. 46–57 (1989)
Moras, J., Cherfaoui, V., Bonnifait, P.: Moving Objects Detection by Conflict Analysis in Evidential Grids. IEEE Intelligent Vehicles Symposium, pp. 1120–1125, Baden-Baden, Germany (2011)
Occupancy grids

Rationale:
- each cell stores a mass function
- tessellated representation of spatial information
- heterogeneous data, homogeneous representation
- easier fusion, no matching needed
- Elfes proposed probabilistic grids

Figure: An example of occupancy grid.

The proposed method uses 3 grids:
- ScanGrid (SG) – instantaneous information
- PriorGrid (PG) – prior knowledge
- MapGrid (MG) – cumulated information
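As a minimal sketch of this representation, each cell can hold a mass function stored as a dictionary from focal sets (frozensets) to masses. The grid layout and helper names below are illustrative assumptions, not the authors' implementation:

```python
def vacuous(omega):
    """Total-ignorance mass function: all mass on the frame Omega."""
    return {frozenset(omega): 1.0}

def make_grid(rows, cols, omega=('F', 'O')):
    """Grid of cells, each initialised with the vacuous mass function."""
    return [[vacuous(omega) for _ in range(cols)] for _ in range(rows)]

grid = make_grid(4, 4)
# A sensed cell: strong evidence of occupancy, some ignorance left.
grid[1][2] = {frozenset({'O'}): 0.7, frozenset({'F', 'O'}): 0.3}
assert abs(sum(grid[1][2].values()) - 1.0) < 1e-9
```

Because every cell carries the same kind of object (a mass function), grids built from different sources can be combined cell by cell without any data association step.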
ScanGrid (SG) construction

Sensor model:
- frame of discernment ΩSG = {F, O} (Free, Occupied)
- each scan provides a grid in polar coordinates
- conversion to Cartesian coordinates

Figure: ScanGrid.

Figure: Evidential sensor model.
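The polar-to-Cartesian step can be sketched as follows; the cell size matches the 0.5 m grid used later in the setup, but the function name and sensor-at-origin convention are illustrative assumptions:

```python
import math

CELL = 0.5  # cell size in metres (assumption, matching the experimental setup)

def hit_cell(angle_rad, range_m, cell=CELL):
    """Cartesian grid index of a beam endpoint, sensor at the origin."""
    x = range_m * math.cos(angle_rad)
    y = range_m * math.sin(angle_rad)
    return int(math.floor(x / cell)), int(math.floor(y / cell))

assert hit_cell(0.0, 2.0) == (4, 0)  # a 2 m return straight ahead
```

In the actual method each beam also marks the traversed cells as free and the cells beyond the hit as unknown, following the evidential sensor model of the figure above.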
PriorGrid (PG)

Context representation:
- PG incorporates prior knowledge about the environment
- same frame of discernment as the MapGrid (next slide)
- constructed by projecting map data (buildings and roads) onto a 2D grid in global coordinates

Figure: PriorGrid.
MapGrid (MG)

Cumulated information:
- stores the fusion result
- enlarged frame of discernment ΩMG = {F, C, N, S, V}:
  - F – free space
  - C – mapped stationary objects (infrastructure)
  - N – non-mapped stationary objects
  - S – stopped movable objects
  - V – moving objects
- no need to store the preceding ScanGrids
Figure: Method overview. The exteroceptive sensor yields a local ScanGrid, which is transformed into global coordinates using the pose from the proprioceptive sensor. The maps are projected into a global PriorGrid, whose prior knowledge is incorporated into the ScanGrid; the resulting grid is fused with the discounted MapGrid, which provides the output.
Incorporating prior knowledge

Combination of PriorGrid and ScanGrid:

1. Unify the frames of discernment used in SG and PG by defining a refining r_SG : 2^{Ω_SG} → 2^{Ω_MG} such that:
   - r_SG({F}) = {F}
   - r_SG({O}) = {C, N, S, V}

2. Refine the ScanGrid:

   m_SG^{Ω_MG}(r_SG(A)) = m_SG^{Ω_SG}(A),  ∀A ⊆ Ω_SG    (1)

3. Apply Dempster's rule to combine the prior information:

   m'_{SG,t}^{Ω_MG} = m_{SG,t}^{Ω_MG} ⊕ m_PG^{Ω_MG}    (2)
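A sketch of these three steps, assuming mass functions stored as dictionaries from focal sets to masses (the values below are illustrative, not from the paper):

```python
# Refining r_SG: carries each SG focal set into Omega_MG (step 1).
R_SG = {frozenset({'F'}): frozenset({'F'}),
        frozenset({'O'}): frozenset({'C', 'N', 'S', 'V'}),
        frozenset({'F', 'O'}): frozenset({'F', 'C', 'N', 'S', 'V'})}

def refine(m_sg):
    """Transport SG masses onto the refined focal sets (eq. 1)."""
    return {R_SG[a]: v for a, v in m_sg.items()}

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination plus normalisation (eq. 2)."""
    out, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                out[inter] = out.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    return {a: v / (1.0 - conflict) for a, v in out.items()}

m_sg = {frozenset({'O'}): 0.6, frozenset({'F', 'O'}): 0.4}
m_pg = {frozenset({'F', 'S', 'V'}): 0.5,               # road cell in the map
        frozenset({'F', 'C', 'N', 'S', 'V'}): 0.5}     # residual ignorance
m = dempster(refine(m_sg), m_pg)
assert abs(sum(m.values()) - 1.0) < 1e-9
```

Note how the map prior sharpens the refined "occupied" mass: on a road cell, occupancy intersected with θ_R = {F, S, V} concentrates evidence on {S, V}, i.e. movable objects.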
Figure: Method overview.
Conflict analysis

Why?
- the static-world hypothesis is no longer valid
- conflict represents a state change
- it gives a method to distinguish moving objects (cells)

How? By distinguishing two types of conflict:
- ∅_FO – a free cell in MG is fused with an occupied cell in SG (an object enters the cell):

   m_{MG,t}(∅_FO) = m_{MG,t−1}(F) · m_{SG,t}(O)

- ∅_OF – an occupied cell in MG is fused with a free cell in SG (an object leaves the cell):

   m_{MG,t}(∅_OF) = m_{MG,t−1}(O) · m_{SG,t}(F)
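For a single cell, the two conflict masses are just products of the contradicting masses; a small numeric sketch (the mass values are illustrative assumptions):

```python
m_mg_prev = {'F': 0.6, 'O': 0.1}   # MapGrid cell at t-1 (simplified to F/O)
m_sg_t    = {'F': 0.2, 'O': 0.7}   # ScanGrid cell at t

m_conflict_fo = m_mg_prev['F'] * m_sg_t['O']   # object entering the cell
m_conflict_of = m_mg_prev['O'] * m_sg_t['F']   # object leaving the cell

assert abs(m_conflict_fo - 0.42) < 1e-9
assert abs(m_conflict_of - 0.02) < 1e-9
```

Here the large ∅_FO mass signals that a previously free cell is now seen occupied, i.e. a candidate moving object.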
Counter of cell occupancy

- we introduce a counter ζ in each cell to capture temporal information about the cell occupancy:

   ζ(t) = min(1, ζ(t−1) + δ_inc)   if m_MG(O) ≥ γ_O and m_MG(∅_FO) + m_MG(∅_OF) ≤ γ_∅    (3)
   ζ(t) = max(0, ζ(t−1) − δ_dec)   if m_MG(∅_FO) + m_MG(∅_OF) > γ_∅                      (4)
   ζ(t) = ζ(t−1)                   otherwise

- parameters: δ_inc ∈ [0, 1], δ_dec ∈ [0, 1], thresholds γ_O and γ_∅
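The update rule above can be sketched directly; the parameter values are assumptions for illustration, since the paper tunes them manually:

```python
D_INC, D_DEC = 0.2, 0.1   # delta_inc, delta_dec in [0, 1] (assumed values)
G_O, G_EMPTY = 0.5, 0.1   # thresholds gamma_O, gamma_empty (assumed values)

def update_counter(zeta, m_occ, m_conf_fo, m_conf_of):
    """One step of eqs. (3)-(4) for a single cell."""
    conflict = m_conf_fo + m_conf_of
    if m_occ >= G_O and conflict <= G_EMPTY:
        return min(1.0, zeta + D_INC)   # persistent occupancy: count up
    if conflict > G_EMPTY:
        return max(0.0, zeta - D_DEC)   # state change detected: count down
    return zeta                         # otherwise unchanged

z = 0.0
for _ in range(3):  # three consecutive occupied, conflict-free scans
    z = update_counter(z, m_occ=0.8, m_conf_fo=0.0, m_conf_of=0.0)
assert abs(z - 0.6) < 1e-9
z = update_counter(z, m_occ=0.1, m_conf_fo=0.3, m_conf_of=0.0)  # object moves
assert abs(z - 0.5) < 1e-9
```

A cell occupied for many scans accumulates a high ζ, which the specialisation step then uses to shift mass away from the "moving" hypothesis.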
MapGrid specialisation

Definition of the specialisation matrix S(·, ·):

   S(A \ {V}, A) = ζ       ∀A ⊆ Ω_MG such that V ∈ A
   S(A, A)       = 1 − ζ   ∀A ⊆ Ω_MG such that V ∈ A
   S(A, A)       = 1       ∀A ⊆ Ω_MG such that V ∉ A
   S(·, ·)       = 0       otherwise                     (5)

Specialisation of the MapGrid:

   m'_{MG,t}(A) = Σ_{B ⊆ Ω_MG} S(A, B) · m_{MG,t}(B)    (6)
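In effect, for every focal set containing V, a fraction ζ of its mass is transferred to the same set without V. A sketch under the same dictionary representation assumed earlier:

```python
def specialise(m, zeta):
    """Apply eqs. (5)-(6): move zeta of the mass of each A containing V
    to A \ {V}, keeping (1 - zeta) on A; other focal sets are untouched."""
    out = {}
    for a, v in m.items():
        if 'V' in a:
            a_no_v = a - {'V'}
            out[a_no_v] = out.get(a_no_v, 0.0) + zeta * v
            out[a] = out.get(a, 0.0) + (1.0 - zeta) * v
        else:
            out[a] = out.get(a, 0.0) + v
    return out

m = {frozenset({'S', 'V'}): 0.8, frozenset({'F'}): 0.2}
ms = specialise(m, zeta=0.5)
assert abs(ms[frozenset({'S'})] - 0.4) < 1e-9        # half moved to 'stopped'
assert abs(ms[frozenset({'S', 'V'})] - 0.4) < 1e-9
assert abs(sum(ms.values()) - 1.0) < 1e-9
```

So a long-occupied cell (high ζ) progressively loses its "moving" interpretation in favour of "stopped", which is exactly the intended role of the counter.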
Fusion operator

- combination of the previous MapGrid (discounted, specialised) and the current ScanGrid (combined with the PriorGrid):

   m_{MG,t} = {}^α m'_{MG,t−1} ⊛ m'_{SG,t}    (7)

- ⊛ is a modified Yager's rule adapted to mobile-object detection
- mass transfers distinguish moving and stopped cells; writing m_∩ for the conjunctive combination of m_1 and m_2:

   (m_1 ⊛ m_2)(A)    = m_∩(A)                 ∀A ⊊ Ω, A ≠ {V}
   (m_1 ⊛ m_2)({V})  = m_∩({V}) + m_∩(∅_FO)
   (m_1 ⊛ m_2)(Ω)    = m_∩(Ω) + m_∩(∅_OF)
   (m_1 ⊛ m_2)(∅_FO) = 0
   (m_1 ⊛ m_2)(∅_OF) = 0                       (8)
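A simplified sketch of this rule: we assume the conjunctive combination tags the two empty-intersection cases as 'FO' (free-then-occupied) or 'OF' (occupied-then-free), which is easy here because conflict only arises between {F} and occupied-side focal sets. The mass values are illustrative:

```python
OMEGA = frozenset({'F', 'C', 'N', 'S', 'V'})
V = frozenset({'V'})

def modified_yager(m_mg, m_sg):
    """Eq. (8): conjunctive combination, then transfer the FO conflict
    to {V} (entering object) and the OF conflict to Omega (ignorance)."""
    out = {'FO': 0.0, 'OF': 0.0}
    for a, va in m_mg.items():
        for b, vb in m_sg.items():
            inter = a & b
            if inter:
                out[inter] = out.get(inter, 0.0) + va * vb
            elif a == frozenset({'F'}):       # free in MG, occupied in SG
                out['FO'] += va * vb
            else:                             # occupied in MG, free in SG
                out['OF'] += va * vb
    out[V] = out.get(V, 0.0) + out.pop('FO')
    out[OMEGA] = out.get(OMEGA, 0.0) + out.pop('OF')
    return out

m_mg = {frozenset({'F'}): 0.9, OMEGA: 0.1}        # cell was free
m_sg = {frozenset({'C', 'N', 'S', 'V'}): 0.8,     # refined 'occupied'
        OMEGA: 0.2}
m = modified_yager(m_mg, m_sg)
assert abs(m[V] - 0.72) < 1e-9   # FO conflict becomes 'moving object'
assert abs(sum(m.values()) - 1.0) < 1e-9
```

Unlike Dempster's rule, no normalisation is applied: the conflict mass is reinterpreted rather than discarded, which is what makes moving objects detectable.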
Figure: Method overview.
Contextual discounting

Objectives:
- update obsolete information quickly: road objects θ_R = {F, S, V}
- preserve pertinent information: infrastructure θ_B = {C, N}

Level of temporal persistence: for each context θ_i, we define a level of persistence 1 − α_i and a corresponding mass function:

   m_i^Ω(θ_i) = α_i        (9)
   m_i^Ω(∅)   = 1 − α_i    (10)

Solution: the discounted mass function is defined by a disjunctive combination (∪):

   {}^α m^Ω = m^Ω ∪ m_B^Ω ∪ m_R^Ω    (11)
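Eqs. (9)–(11) can be sketched with the disjunctive rule, under which mass on ∅ leaves the other operand unchanged while mass on θ_i spreads focal sets towards their union with θ_i. The α values below are illustrative assumptions (a low α means high persistence):

```python
THETA_R = frozenset({'F', 'S', 'V'})   # road objects: forget quickly
THETA_B = frozenset({'C', 'N'})        # infrastructure: persist

def context_mass(theta, alpha):
    """m_i(theta_i) = alpha_i, m_i(empty) = 1 - alpha_i (eqs. 9-10)."""
    return {theta: alpha, frozenset(): 1.0 - alpha}

def disjunctive(m1, m2):
    """Disjunctive rule: masses combine on the union of focal sets."""
    out = {}
    for a, va in m1.items():
        for b, vb in m2.items():
            out[a | b] = out.get(a | b, 0.0) + va * vb
    return out

m = {frozenset({'V'}): 1.0}            # a cell confidently holding a mover
m_disc = disjunctive(disjunctive(m, context_mass(THETA_B, 0.1)),
                     context_mass(THETA_R, 0.6))
# Most of the {V} mass has leaked towards supersets involving theta_R:
# road-related information ages fast, infrastructure hardly at all.
assert abs(m_disc[frozenset({'V'})] - 0.36) < 1e-9
assert abs(sum(m_disc.values()) - 1.0) < 1e-9
```

This is what lets a parked car eventually fade back to "unknown" while a mapped wall stays believed between scans.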
Data

Data set:
- Applanix proprioceptive sensor
- IBEO Alasca XT exteroceptive sensor – a laser range scanner
- vector maps of Paris from the IGN (French National Geographic Institute)
- 3 km overall trajectory length

Assumptions:
- precise and integral positioning
- accurate 3D maps with building models and road surface
Setup

Parameters:
- grid cell size set to 0.5 m
- discount rates α defined empirically; they can be learnt from data [Mercier et al. 2008]
- map confidence factor β calculated by ourselves; it should ideally be provided with the maps
- the other parameters were tuned manually: the counter steps δ and the thresholds γ determine the sensitivity of mobile-object detection

Mercier, D., Quost, B., Denœux, T.: Refined modeling of sensor reliability in the belief function framework using contextual discounting. Information Fusion, 9(2), pp. 246–258 (2008)
Results

Figure: (a) Scene. (b) PriorGrid. (c) MapGrid. (d) MapGrid with prior knowledge. The colour legend shows the classes F, C, N, S and V.
Conclusion

Overview:
- a new mobile-perception scheme based on prior map knowledge
- geographic information is exploited to refine the possible hypotheses
- a modified fusion rule takes into account the existence of mobile objects
- variations in information lifetime are modelled by contextual discounting

Issues:
- map inconsistency
- incomplete data (missing buildings)
- no reference data for quantitative validation
Perspectives

- removing the hypothesis of map accuracy
- differentiating navigable and non-navigable free space
- validating the results against reference data
- choosing the most appropriate fusion rule and learning the algorithm parameters
- fully exploiting the 3D map information, e.g. to predict object movements

Acknowledgements: this work has been supported by the ANR (French National Research Agency) CityVIP project under grant ANR-07_TSFA-013-01.
Thank you for your attention