
Single Image Segmentation with Estimated Depth
Ryo Yonetani (Kyoto Univ)
Akisato Kimura (NTT)
Hitoshi Sakano (NTT)
Ken Fukuchi (JAIST)
Color and depth integration for object segmentation
Key observation for the integration:
a structural difference between depth maps and color images
[Figure: a color image, its estimated depth map, and the extracted object regions. Panels (a)-(c) show that the intensity distributions of object and background are similar while their depth distributions differ, contrast the spatial intensity variation with the spatial depth variation, and highlight a depth continuity between an object and a floor. The object region obtained using color only is compared with the one obtained using color + depth.]
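To make the observation concrete, here is a minimal sketch (NumPy-based, not part of the poster; the function names and the histogram-intersection score are assumptions) that compares how well a foreground mask separates intensity values versus depth values.

```python
import numpy as np

def region_histograms(values, obj_mask, bins=32):
    """Normalized histograms of `values` (e.g., intensity or depth) inside and
    outside the object mask."""
    lo, hi = float(values.min()), float(values.max())
    h_obj, _ = np.histogram(values[obj_mask], bins=bins, range=(lo, hi))
    h_bkg, _ = np.histogram(values[~obj_mask], bins=bins, range=(lo, hi))
    return h_obj / max(h_obj.sum(), 1), h_bkg / max(h_bkg.sum(), 1)

def histogram_overlap(h_a, h_b):
    """Histogram intersection in [0, 1]: close to 1 for 'similar' distributions
    (typically intensity), close to 0 for 'different' ones (typically depth)."""
    return float(np.minimum(h_a, h_b).sum())
```

For scenes like the one in the figure, the overlap score for depth would be expected to be much lower than the one for intensity.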
Formulation
...Extension of the MRF-based framework [Boykov+, 01; 06; etc.]
Minimization of the following energy function:

E(\mathbf{L} \mid \mathbf{I}) = \sum_{p} \bigl\{ a(L_p) + b(I_p \mid L_p) \bigr\} + \sum_{(p,q) \in \mathcal{N}} \bigl\{ c(L_p, L_q) + d(I_p, I_q \mid L_p, L_q) \bigr\}

where \mathbf{L} is the set of pixel labels (obj / bkg), \mathbf{I} the input image, and \mathcal{N} the set of neighboring pixel pairs.

b(I_p \mid L_p) = -\log p(\mathrm{color}_p \mid L_p) - \alpha \log p(\mathrm{depth}_p \mid L_p)

\alpha: scale factor (can be estimated via cross-validation)
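As one concrete reading of the unary term b(· | label) above, the sketch below models p(color | label) and p(depth | label) with Gaussian mixtures fitted to sampled object/background pixels; the mixture model, the scikit-learn usage, and the function names are assumptions, not details given on the poster.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_likelihood(samples, n_components=3):
    """Fit p(value | label) to sampled pixel values of shape (N, d)."""
    return GaussianMixture(n_components=n_components).fit(samples)

def unary_term(color, depth, color_model, depth_model, alpha=1.0):
    """b(pixel | label) = -log p(color | label) - alpha * log p(depth | label),
    evaluated for every pixel; color is (H, W, 3), depth is (H, W)."""
    h, w = depth.shape
    color_ll = color_model.score_samples(color.reshape(-1, 3))  # log p(color | label)
    depth_ll = depth_model.score_samples(depth.reshape(-1, 1))  # log p(depth | label)
    return (-color_ll - alpha * depth_ll).reshape(h, w)
```

Evaluating this once with the object models and once with the background models gives the two per-pixel cost maps used for MAP estimation below.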
[Pipeline figure: the input image is sampled to build a color likelihood (obj / bkg); depth estimation produces a depth map, from which a depth likelihood (obj / bkg) is built; together with the prior, these feed the MRF setup and MAP estimation to give the result.]
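A minimal sketch of the "MRF setup, MAP estimation" step, assuming the PyMaxflow package as the graph-cut solver (the poster does not name one) and a simple 4-connected Potts smoothing weight standing in for c(·,·) + d(·,·); obj_unary and bkg_unary are the per-label cost maps from the sketch above.

```python
import maxflow  # PyMaxflow; an assumed solver, not specified by the poster

def map_segmentation(obj_unary, bkg_unary, smoothness=1.0):
    """Boolean object mask minimizing the unary + pairwise energy via min-cut."""
    graph = maxflow.Graph[float]()
    node_ids = graph.add_grid_nodes(obj_unary.shape)
    graph.add_grid_edges(node_ids, smoothness)             # 4-connected pairwise term
    graph.add_grid_tedges(node_ids, obj_unary, bkg_unary)  # per-pixel label costs
    graph.maxflow()
    # Nodes in the sink segment pay the obj_unary cost, i.e. they take the object label.
    return graph.get_grid_segments(node_ids)
```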
Depth estimation
...Estimating depth from textures via supervised learning
[Pipeline figure: intensity image * (9 Law's masks + 6 gradient masks) -> convolution -> 15 feature images -> pooling over 1x1-scale neighbors, 3x3-scale neighbors, and 4 column quarters (x2 energy types) -> 420-d feature vector -> ridge regression analysis, trained against actual depth maps.]
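A rough sketch of the depth-from-texture regression, loosely following the pipeline labels above; the particular filter bank, pooling window, and SciPy/scikit-learn calls are assumptions rather than the poster's exact 420-dimensional feature design.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter
from sklearn.linear_model import Ridge

def texture_features(intensity, filters, window=3):
    """Per-pixel features: locally pooled absolute and squared filter responses."""
    feats = []
    for f in filters:                                    # e.g., Law's masks + gradient masks
        response = convolve(intensity, f, mode="reflect")
        feats.append(uniform_filter(np.abs(response), size=window))  # L1 energy
        feats.append(uniform_filter(response ** 2, size=window))     # L2 energy
    return np.stack(feats, axis=-1)                      # (H, W, 2 * len(filters))

def fit_depth_regressor(intensity_images, depth_maps, filters, ridge_alpha=1.0):
    """Ridge regression from pooled texture features to ground-truth depth."""
    X = np.concatenate([texture_features(im, filters).reshape(-1, 2 * len(filters))
                        for im in intensity_images])
    y = np.concatenate([d.reshape(-1) for d in depth_maps])
    return Ridge(alpha=ridge_alpha).fit(X, y)
```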
Experimental results
Improvement of the precision and F-measure under a controlled dataset:
Using color only: Recall 0.92 / Precision 0.47 / F-measure 0.61
Proposed: Recall 0.88 / Precision 0.76 / F-measure 0.80
[Figure: the input color image, its depth map, and the object regions extracted using color only vs. by the proposed method.]
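For reference, a small sketch (not from the poster) of how the recall, precision, and F-measure above can be computed from a predicted object mask and a ground-truth mask:

```python
import numpy as np

def segmentation_scores(pred_mask, gt_mask):
    """(recall, precision, F-measure) for boolean masks of equal shape."""
    tp = np.logical_and(pred_mask, gt_mask).sum()
    recall = tp / max(gt_mask.sum(), 1)
    precision = tp / max(pred_mask.sum(), 1)
    f_measure = 2 * precision * recall / max(precision + recall, 1e-12)
    return recall, precision, f_measure
```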
Comparison with salient region detection (SRD)
SRD is a method to extract salient regions based on the spatial
non-stationarity of statistics in an image.
[Figure: salient regions detected by [Cheng+,11] vs. object regions extracted by the proposed method.]
A fuller comparison with state-of-the-art SRD methods is given in the paper; one example:
Global contrast based salient region detection [Cheng+,11]
Recall: 0.58 / Precision: 0.69 / F-measure: 0.62
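The following is only a loose, assumed illustration of the global-contrast idea behind [Cheng+,11] (a pixel's saliency grows with its average color distance to the rest of the image); it is not the paper's HC/RC algorithms.

```python
import numpy as np

def global_contrast_saliency(image, sample_size=2000, seed=0):
    """image: float array (H, W, 3); returns a saliency map (H, W) where each
    pixel's score is its mean color distance to a random sample of other pixels."""
    h, w, c = image.shape
    pixels = image.reshape(-1, c)
    rng = np.random.default_rng(seed)
    sample = pixels[rng.choice(len(pixels), size=min(sample_size, len(pixels)),
                               replace=False)]
    saliency = np.zeros(len(pixels))
    for s in sample:                       # accumulate distances one sample at a time
        saliency += np.linalg.norm(pixels - s, axis=1)
    return (saliency / len(sample)).reshape(h, w)
```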
Contact Information
Ryo Yonetani
...
Kyoto University
[email protected]
Akisato Kimura
NTT Communication Science Labs.
[email protected]