Detection, Classification and Tracking in a Distributed Wireless Sensor Network

Presenter: Hui Cao
Problem definition

- How can we classify the target (intruder) in the sensor network?
We can get information from one node

- Duration
- Onset time
- Offset time
- CPA (Closest Point of Approach) time and value
Summary

- Sensor networks require decision fusion.
- Majority voting is the most popular decision fusion method; it assumes all votes are equally accurate.
- Not all sensor decisions are equally accurate: those closer to the target or with higher SNR give better results.
Summary (Cont’d)

- If the source location can be estimated, this discrepancy can be exploited to improve decision fusion accuracy.
- We formulate three different methods for combining sensor decisions based on their distance from the target and their SNR, and find encouraging results.
Sensor Network Signal Processing Tasks

- Target Detection (CFAR + region fusion)
- Target Classification (ML + region fusion)
- Target Localization
- Target Tracking (Kalman filter)
[Figure: sensor field layout showing the road and node positions (Nodes 1-6, 15, 41-42, 46-56, 58-61)]

D. Li, K. D. Wong, Y. H. Hu, and A. M. Sayeed, "Detection, Classification, and Tracking of Targets," IEEE Signal Processing Magazine, vol. 19, no. 2, pp. 17-29, 2002.
Some details

- CFAR: Constant False Alarm Rate. The detection threshold is dynamically adjusted according to the noise variance to maintain a constant false alarm rate (see the sketch below).
- ML: Maximum Likelihood
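A minimal cell-averaging CFAR sketch in Python, not taken from the slides: it assumes energy detection with exponentially distributed noise energy, and the function and parameter names are illustrative.

```python
import numpy as np

def cfar_detect(energy, noise_window=32, pfa=1e-3):
    """Cell-averaging CFAR over a sequence of per-frame energy values.

    energy       : 1-D array of measured signal energies
    noise_window : number of preceding frames used to estimate the noise level
    pfa          : desired (constant) false-alarm probability

    The scaling factor follows the standard CA-CFAR result for exponentially
    distributed noise energy; the slides only say the threshold tracks the
    noise variance, so this is one plausible realization, not their exact rule.
    """
    energy = np.asarray(energy, dtype=float)
    detections = np.zeros(len(energy), dtype=bool)
    # Multiplier that keeps the false-alarm rate at pfa for this window size.
    alpha = noise_window * (pfa ** (-1.0 / noise_window) - 1.0)
    for i in range(noise_window, len(energy)):
        noise_est = energy[i - noise_window:i].mean()  # local noise estimate
        detections[i] = energy[i] > alpha * noise_est  # threshold adapts to noise
    return detections
```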
Not all sensors are equal …

- Hypothesis: classification accuracy is a function of target-sensor distance and SNR
  - Each node's classification rate depends on its SNR.
  - SNR is also roughly inversely proportional to the vehicle-node distance because of acoustic energy attenuation (see the sketch below).
- Experiment: determine the classification rate for different levels of distance and SNR
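A toy illustration of the SNR-distance relationship, assuming a simple power-law attenuation model; the level constants and the exponent are placeholders, not values measured from the data.

```python
import numpy as np

def expected_snr_db(distance_m, source_level_db=100.0, noise_level_db=60.0,
                    attenuation_exp=1.0):
    """Toy model of received SNR versus target-node distance.

    Assumes the received signal level falls off as distance**(-attenuation_exp),
    i.e. SNR roughly inversely proportional to distance for attenuation_exp = 1,
    as stated on the slide.  The level constants are placeholders.
    """
    d = np.maximum(np.asarray(distance_m, dtype=float), 1.0)   # avoid d -> 0
    received_db = source_level_db - 10.0 * attenuation_exp * np.log10(d)
    return received_db - noise_level_db

# e.g. expected_snr_db([10.0, 50.0, 100.0]) -> SNR shrinking with distance
```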
[Figure: Distance vs. Classification Rate]

[Figure: SNR vs. Classification Rate]

[Figure: Classification Rate as a Function of SNR and Distance]
Weighted Decision Fusion

- Optimal (linear) decision fusion†: perform weighted voting over the individual results ($e_i(x)$ for node $i$).
- Weights ($w_i$ for node $i$) are proportional to classification rates.

$$T_i(x \in C_k) = \begin{cases} 1 & e_i(x) = k,\ 1 \le i \le N \\ 0 & \text{otherwise} \end{cases}$$

$$T_E(x \in C_k) = \sum_{i=1}^{N} w_i \, T_i(x \in C_k), \qquad \hat{k} = \arg\max_{1 \le k \le K} T_E(x \in C_k)$$

† Z. Chair and P. Varshney, "Optimal Data Fusion in Multiple Sensor Detection Systems," IEEE Trans. Aerospace and Electronic Systems, vol. 22, no. 1, pp. 98-101, Jan. 1986.
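A minimal Python sketch of the fusion rule above; the function name and argument conventions are my own, not from the slides.

```python
import numpy as np

def weighted_fusion(labels, weights, num_classes):
    """Weighted voting over per-node hard decisions.

    labels      : array of decisions e_i(x), each in {0, ..., num_classes - 1}
    weights     : array of per-node weights w_i (e.g. estimated classification rates)
    num_classes : K, the number of target classes

    Computes T_E(x in C_k) = sum_i w_i * 1{e_i(x) = k} and returns argmax_k T_E.
    """
    labels = np.asarray(labels)
    weights = np.asarray(weights, dtype=float)
    scores = np.bincount(labels, weights=weights, minlength=num_classes)  # T_E per class
    return int(np.argmax(scores))

# Majority voting is the special case w_i = 1 for every node:
# weighted_fusion([0, 2, 2, 1], np.ones(4), num_classes=3)  ->  2
```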
Distance Based Decision Fusion (DBDF)

- The current system architecture performs detection and localization before classification, which provides distance and SNR estimates.
- Accurate localization allows the probability of correct classification to be estimated from distance.
- Some events may be rejected by the fusion algorithm when their distance or SNR falls outside the range of the training data; majority voting can then be used as a backup fusion method.
- Measurements: classification rate and acceptance rate.
DBDF Approach 1: Maximum A Posteriori Decision Fusion

- The weighting factor is a function of distance and SNR, determined using CFAR and EBL information.
- We formulate a Maximum A Posteriori (MAP) probability gating network using Bayesian estimation:

$$\hat{P}(x \in C_k) = P(x \in C_k \mid x, d_i, s_i) \propto P(x \mid d_i, s_i)\, P(d_i, s_i)$$

- Parameters: SNR and distance grouping sizes
- $P(x \mid d, s)\,P(d, s)$ is estimated from experimental data (see the sketch below).
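A sketch of how such a weight table might be estimated from training data; the binning follows the 20 m / 5 dB grouping mentioned in the experiments, while the function names and the exact interpretation of $P(x \mid d,s)\,P(d,s)$ are assumptions.

```python
import numpy as np

def fit_map_weights(train_dist, train_snr, train_correct,
                    dist_bin=20.0, snr_bin=5.0):
    """Estimate a weight table from training data, binned by distance and SNR.

    train_dist, train_snr : node-target distance (m) and SNR (dB) per decision
    train_correct         : 1 if that node's decision was correct, else 0
    dist_bin, snr_bin     : grouping sizes (20 m / 5 dB, as in the experiments)

    weight_table[i, j] approximates P(correct | bin i, j) * P(bin i, j), one
    plausible reading of the P(x|d,s)P(d,s) weighting on the slide.
    """
    di = np.maximum((np.asarray(train_dist) // dist_bin).astype(int), 0)
    si = np.maximum((np.asarray(train_snr) // snr_bin).astype(int), 0)
    counts = np.zeros((di.max() + 1, si.max() + 1))
    correct = np.zeros_like(counts)
    for d, s, c in zip(di, si, train_correct):
        counts[d, s] += 1
        correct[d, s] += c
    p_correct = np.divide(correct, counts, out=np.zeros_like(counts), where=counts > 0)
    return p_correct * (counts / counts.sum()), counts

def map_weight(table, counts, dist, snr, dist_bin=20.0, snr_bin=5.0):
    """Weight for a new (distance, SNR) pair; None if the bin was never seen
    in training (the event is rejected and majority voting is used instead)."""
    i, j = max(int(dist // dist_bin), 0), max(int(snr // snr_bin), 0)
    if i >= table.shape[0] or j >= table.shape[1] or counts[i, j] == 0:
        return None
    return table[i, j]
```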
DBDF Approach 2: Distance Truncated Voting

- Simple majority voting is performed among the nodes that are close enough to the target; decisions from the other nodes are discarded (see the sketch below).
- Parameter: maximum (threshold) distance
- Reduces the effect of localization error
- No decision is made when the vehicle is outside the distance threshold of every node
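A short Python sketch of this scheme; the function name, reject convention, and the 50 m default (taken from the experiment setting) are illustrative.

```python
import numpy as np

def distance_truncated_vote(labels, distances, num_classes, d_max=50.0):
    """Distance truncated voting: majority vote among nodes with d_i <= d_max.

    Returns the fused class label, or None when no node lies within d_max
    (the event is rejected).
    """
    labels = np.asarray(labels)
    near = np.asarray(distances, dtype=float) <= d_max   # w_i = 1{d_i <= d_max}
    if not near.any():
        return None                                      # vehicle beyond every node's threshold
    scores = np.bincount(labels[near], minlength=num_classes)
    return int(np.argmax(scores))
```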
DBDF Approach 3: Nearest Neighbor Fusion

- The node closest to the target should also have the highest SNR, and hence the highest probability of a correct decision.
- The region assigns the same label as the one assigned by the node closest to the target (see the sketch below).
- Lowest computational and communication burden
- The accuracy of the node-to-target distance estimate is critical to the decision.
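The corresponding sketch is a one-line lookup; again the name is illustrative, not from the slides.

```python
import numpy as np

def nearest_neighbor_fusion(labels, distances):
    """Nearest neighbor fusion: adopt the decision of the node whose estimated
    distance to the target is smallest (w_i = 1 only for the closest node).
    Ties go to the first minimum."""
    idx = int(np.argmin(np.asarray(distances, dtype=float)))
    return int(np.asarray(labels)[idx])
```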
Weighted Voting Schemes

- These schemes amount to assigning weights:

  1. MAP Fusion: $w_i = P(x \mid d_i, s_i)\, P(d_i, s_i)$
  2. Distance Truncated Voting: $w_i = \begin{cases} 1 & d_i \le d_{\max} \\ 0 & \text{otherwise} \end{cases}$
  3. Nearest Neighbor: $w_i = \begin{cases} 1 & d_i \le d_j \ \ \forall j \ne i \\ 0 & \text{otherwise} \end{cases}$

- Baseline: simple majority voting ($w_i = 1$ for all $i$)
Experiments

- All four methods are compared.
- Data: SensIT SITEX02 experiment, November 2001
- Distance grouped in 20 m bins, SNR grouped in 5 dB bins
- Distance truncated voting with a 50 m threshold
- Nearest neighbor fusion
- Majority voting
- Because of shortcomings in localization, experiments are run with different error levels in the location estimate: σ = 0 m, σ = 12.5 m, σ = 25 m, and σ = 50 m (see the sketch below).
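One way such error levels could be injected is to perturb the location estimate with Gaussian noise before computing node-target distances; this is only a guess at the procedure, shown for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_distances(node_xy, target_xy, sigma):
    """Node-target distances after adding Gaussian error to the location estimate.

    node_xy   : (N, 2) array of node positions in meters
    target_xy : (2,) true target position in meters
    sigma     : simulated localization error level (0, 12.5, 25 or 50 m)
    """
    noisy_target = np.asarray(target_xy, dtype=float) + rng.normal(0.0, sigma, size=2)
    return np.linalg.norm(np.asarray(node_xy, dtype=float) - noisy_target, axis=1)
```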
[Figure: Experiment Results: Classification Rate (legend includes DTV = Distance Truncated Voting)]
Experiment Results

- The closest node gives the highest acceptance and classification rates when the localization estimate is accurate.
- MAP Fusion depends less on localization error than the other methods.
- All DBDF methods outperform simple majority voting because they make use of distance and SNR information.
Further Work

- The MAP classifier allows samples with low classification rates to be excluded (e.g., only samples with $w_i > 0.5$ are kept).
- This will reduce the communication bandwidth used for classification fusion.
- The method can be applied to other signal processing tasks.
- Website: http://www.ece.wisc.edu/~sensit