Robust multi-frequency sparse Bayesian learning

Sparse processing / Compressed sensing
Model: y = Ax + n
  y : N×1 measurements
  A : N×M dictionary
  x : M×1 sparse signal
  n : noise
• Problem: solve for x given y and A (a simulation sketch of the model follows this list)
• Basis pursuit, LASSO (convex objective function)
• Matching pursuit (greedy method)
• Sparse Bayesian learning (non-convex objective function)
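To make the model concrete, here is a minimal NumPy simulation sketch. The dimensions (N = 20, M = 181, K = 3) are borrowed from the beamforming example later in this deck; the random Gaussian dictionary and the noise level are placeholder assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    N, M, K = 20, 181, 3               # sensors, dictionary columns, active sources
    A = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2 * N)

    x = np.zeros(M, dtype=complex)     # M×1 signal, sparse: only K nonzero entries
    support = rng.choice(M, size=K, replace=False)
    x[support] = rng.standard_normal(K) + 1j * rng.standard_normal(K)

    sigma = 0.1                        # assumed noise level
    n = sigma * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
    y = A @ x + n                      # N×1 measurement vector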
1 / 12
Bayesian interpretation of LASSO
MAP estimate via the unconstrained LASSO formulation

x̂_LASSO(µ) = arg min_{x∈C^N} ‖y − Ax‖₂² + µ‖x‖₁

Bayes rule:

p(x|y) = p(y|x) p(x) / p(y)

MAP estimate:

x̂_MAP = arg max_x ln p(x|y)
      = arg max_x [ln p(y|x) + ln p(x)]
      = arg min_x [−ln p(y|x) − ln p(x)]
2 / 12
MAP estimate via the unconstrained LASSO formulation

Bayes rule:

p(x|y) = p(y|x) p(x) / p(y)

MAP estimate:

x̂_MAP = arg min_x [−ln p(y|x) − ln p(x)]

Gaussian likelihood:

p(y|x) ∝ e^{−‖y − Ax‖₂²/2}

Laplace-like prior:

p(x) ∝ ∏_{i=1}^{N} e^{−√((Re x_i)² + (Im x_i)²)/ν} = e^{−‖x‖₁/ν}

MAP estimate (LASSO): inserting the likelihood and prior,

x̂_MAP = arg min_x [‖y − Ax‖₂² + µ‖x‖₁] = x̂_LASSO(µ),  µ = 2/ν
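This unconstrained problem can be handed to any convex solver (CVX is used for the timings later in the deck). Purely as a self-contained illustration, here is an ISTA (proximal-gradient) sketch for the complex-valued LASSO; the iteration count is an arbitrary assumption.

    import numpy as np

    def lasso_ista(A, y, mu, n_iter=500):
        """Minimize ||y - A x||_2^2 + mu * ||x||_1 over complex x via ISTA."""
        Lip = 2 * np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1], dtype=complex)
        for _ in range(n_iter):
            r = x + 2 * (A.conj().T @ (y - A @ x)) / Lip # gradient step on the quadratic term
            mag = np.abs(r)
            shrink = np.maximum(mag - mu / Lip, 0)       # soft threshold on magnitudes
            x = r * shrink / np.where(mag > 0, mag, 1.0) # shrink magnitudes, keep phases
        return x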
3 / 12
Prior and Posterior densities (Ex. Murphy)
4 / 12
Sparse Bayesian Learning (SBL)
Model: y = Ax + n    (y: N×1 measurements, A: N×M dictionary, x: M×1 sparse signal)

Prior: x ∼ N(x; 0, Γ),  Γ = diag(γ₁, …, γ_M)

Likelihood: p(y|x) = N(y; Ax, σ²I_N)

Evidence: p(y) = ∫ p(y|x) p(x) dx = N(y; 0, Σ_y),  Σ_y = σ²I_N + AΓA^H

SBL solution: Γ̂ = arg max_Γ p(y) = arg min_Γ [log|Σ_y| + y^H Σ_y⁻¹ y]
M. E. Tipping, "Sparse Bayesian learning and the relevance vector machine," Journal of Machine Learning Research, vol. 1, June 2001.
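The evidence-based objective can be evaluated directly; a short sketch (the function name is mine, and constants independent of Γ are dropped):

    import numpy as np

    def sbl_objective(A, y, gamma, sigma2):
        """log|Sigma_y| + y^H Sigma_y^{-1} y, with Sigma_y = sigma^2 I + A Gamma A^H."""
        N = A.shape[0]
        Sigma_y = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T  # A*gamma scales column m by gamma_m
        _, logdet = np.linalg.slogdet(Sigma_y)
        return float(logdet) + float(np.real(y.conj() @ np.linalg.solve(Sigma_y, y)))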
5 / 12
SBL overview
• SBL solution: Γ̂ = arg min_Γ [log|Σ_y| + y^H Σ_y⁻¹ y]
• The SBL objective function is non-convex
• The optimization solution is non-unique
• A fixed-point update using derivatives works well in practice
• Γ = diag(γ₁, …, γ_M)

Update rule: γ_m^new = γ_m^old ( ‖y^H Σ_y⁻¹ a_m‖₂² / (a_m^H Σ_y⁻¹ a_m) )^r,  Σ_y = σ²I_N + AΓA^H
• Multi-snapshot extension: the same Γ is shared across all snapshots (a sketch of the fixed-point iteration follows below)
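A minimal NumPy sketch of this fixed-point iteration. The initialization, stopping tolerance, and the per-snapshot averaging in the numerator are my assumptions; r is the convergence-control exponent from the update rule above.

    import numpy as np

    def sbl_fixed_point(A, Y, sigma2, r=1.0, n_iter=200, tol=1e-6):
        """Fixed-point SBL update of gamma. Y holds L snapshots as columns
        (pass y[:, None] for a single snapshot); the same Gamma is shared
        across all snapshots."""
        N, M = A.shape
        L = Y.shape[1]
        gamma = np.ones(M)                                  # assumed initialization
        for _ in range(n_iter):
            Sigma_y = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T
            SinvA = np.linalg.solve(Sigma_y, A)             # column m is Sigma_y^{-1} a_m
            # ||Y^H Sigma_y^{-1} a_m||_2^2, averaged over snapshots (assumption)
            num = np.sum(np.abs(Y.conj().T @ SinvA) ** 2, axis=0) / L
            den = np.real(np.sum(A.conj() * SinvA, axis=0)) # a_m^H Sigma_y^{-1} a_m
            gamma_new = gamma * (num / den) ** r
            if np.max(np.abs(gamma_new - gamma)) < tol:     # stop when gamma settles
                return gamma_new
            gamma = gamma_new
        return gamma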
6 / 12
SBL overview
• Posterior mean: x_post = ΓA^H Σ_y⁻¹ y (one-line sketch below)
• At convergence, γ_m → 0 for most γ_m
• Γ controls sparsity, E(|x_m|²) = γ_m

[Figure: γ estimates versus angle θ (°) at iteration #0]
• Different ways to show that SBL gives sparse output
• Automatic determination of sparsity
• Also provides a noise estimate σ²
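Given a converged Γ, the posterior mean is one line; a sketch:

    import numpy as np

    def sbl_posterior_mean(A, y, gamma, sigma2):
        """Posterior mean x_post = Gamma A^H Sigma_y^{-1} y for one snapshot."""
        N = A.shape[0]
        Sigma_y = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T
        return gamma * (A.conj().T @ np.linalg.solve(Sigma_y, y))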
7 / 12
Applications to acoustics - Beamforming
• Beamforming
• Direction of arrival (DOA) estimation

[Figure: beamformer output versus θ (°); 80 snapshots, 64 elements]
8 / 12
SBL - Beamforming example
• N = 20 sensors, uniform linear array
• Discretized angle space {−90 : 1 : 90}°, M = 181
• Dictionary A: columns consist of steering vectors (construction sketched below)
• K = 3 sources, DOAs [−20, −15, 75]°, powers [12, 22, 20] dB
• M ≫ N > K

[Figure: CBF power P (dB) and SBL output γ (dB) versus angle θ (°)]
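A sketch of the dictionary construction for this example; the half-wavelength element spacing and the phase-sign convention are assumptions, since the slide does not specify them.

    import numpy as np

    N, M = 20, 181
    d = 0.5                                    # element spacing in wavelengths (assumed)
    theta = np.linspace(-90, 90, M)            # angle grid {-90 : 1 : 90} degrees
    elem = np.arange(N)[:, None]               # sensor indices, column vector
    A = np.exp(-2j * np.pi * d * elem * np.sin(np.deg2rad(theta))) / np.sqrt(N)

    # Conventional beamformer (CBF) spectrum from snapshots Y (N×L), for comparison:
    # P_cbf = np.mean(np.abs(A.conj().T @ Y) ** 2, axis=1)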
9 / 12
SBL - Acoustic hydrophone data processing (from Kai)
[Figure: ship-of-opportunity experiment; panels show eigenrays, CBF output, and SBL output]
10 / 12
Problem with Degrees of Freedom
• As the number of snapshots (= observations) increases, so does the number of unknown complex source amplitudes.
• PROBLEM: LASSO for multiple snapshots estimates the realizations of the random complex source amplitudes.
• However, we would be satisfied with just estimating their powers γ_m = E{|x_ml|²}.
• Note that γ_m does not depend on the snapshot index l; e.g., with M = 181 and L = 100 snapshots, LASSO estimates 18,100 complex amplitudes while SBL estimates 181 powers.
  Thus SBL is much faster than LASSO as the number of snapshots grows.
11 / 12
Example CPU Time
LASSO uses CVX; CPU time grows as L² in the number of snapshots L.
SBL CPU time is nearly independent of the number of snapshots.

[Figure: b) CPU time (s) and c) DOA RMSE (°) versus number of snapshots (1–1000) for SBL, SBL1, and LASSO]
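The trend can be reproduced qualitatively by timing the fixed-point sketch from the SBL overview slide (assumed to be defined as sbl_fixed_point above) on random placeholder snapshots; only the scaling with L is meaningful, not the absolute times.

    import time
    import numpy as np

    rng = np.random.default_rng(1)
    N, M = 20, 181
    theta = np.linspace(-90, 90, M)
    A = np.exp(-2j * np.pi * 0.5 * np.arange(N)[:, None] * np.sin(np.deg2rad(theta))) / np.sqrt(N)

    for L in (1, 10, 100, 1000):
        Y = (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))) / np.sqrt(2)
        t0 = time.perf_counter()
        sbl_fixed_point(A, Y, sigma2=0.1)  # fixed-point sketch from the SBL overview slide
        print(f"L = {L:4d}: {time.perf_counter() - t0:.3f} s")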
12 / 12