Edge-Based Color Constancy
IEEE Transactions on Image Processing, vol. 16, no. 9, Sep. 2007
Joost van de Weijer, Theo Gevers, and Arjan Gijsenij
Presented by Ho-Gun Ha
School of Electrical Engineering and Computer Science, Kyungpook National Univ.
Abstract
‹ Definition of color constancy
– The ability to measure the color of objects independent of the color of the light source
‹ Proposed method
– Gray-edge hypothesis
• Achromaticity of the average edge difference
• Based on the derivative structure of images
– A framework unifying a variety of color constancy algorithms
Introduction
‹ Applications of color constancy
– Image retrieval
– Image classification
– Color object recognition
– Object tracking
‹ Two approaches to solving the problem
– Detection of features that are invariant with respect to the light source
• Image retrieval
• No estimation of the light source is necessary
– Image correction for the deviation from a canonical light source
• Requires estimation of the color of the light source
‹ Color constancy methods
– Gamut mapping algorithm
• Observation: only a limited set of RGB values occurs under a given illuminant
– The canonical gamut is a convex hull in RGB space
• The illuminant color is derived from a transformation
– The transformation maps an observed gamut into the canonical gamut
• Among the best results in color constancy
– GCIE
• Improves the gamut mapping algorithm by restricting the set of possible illuminants
• Outperforms the standard gamut algorithm
– Further approaches to color constancy
• Probabilistic approaches
• Learning-based methods
‹ Drawbacks of the above-described algorithms
– Complexity of the algorithms
– Requirement of image data sets with known light sources
→ A less complex color constancy algorithm is proposed
‹ Simple and fast color constancy algorithms
– Max-RGB
– Gray-world hypothesis
‹ Minkowski norm
– The same algorithm interpreted as different instantiations
• Max-RGB method: L∞ norm
• Gray-world method: L1 norm
Gray-World Hypothesis
‹ Gray-world hypothesis
– Image values f = (R, G, B)^T under a single light source:

\mathbf{f} = \int_\omega e(\lambda)\, s(\lambda)\, c(\lambda)\, d\lambda    (1)

where e(λ) is the light source, s(λ) is the surface reflectance, c(λ) = (R(λ), G(λ), B(λ))^T is the camera sensitivity function, and ω is the visible spectrum.
– Estimation of the light source color:

\mathbf{e} = (R_e, G_e, B_e)^T = \int_\omega e(\lambda)\, c(\lambda)\, d\lambda    (2)

– The additional assumption of the gray-world hypothesis, made to avoid further assumptions on camera sensitivities, surface reflectances, and light source spectra: the average reflectance in a scene is achromatic,

\frac{\int s(\lambda, \mathbf{x})\, d\mathbf{x}}{\int d\mathbf{x}} = g(\lambda) = k    (3)

where x is the spatial coordinate in the image.
• The constant k takes values in the range (0, 1):
– 0: no reflectance
– 1: total reflectance of the incident light
• Substituting (3) into the spatial average of the image values:

\frac{1}{\int d\mathbf{x}} \int \mathbf{f}(\mathbf{x})\, d\mathbf{x}
= \frac{1}{\int d\mathbf{x}} \int \int_\omega e(\lambda)\, s(\lambda, \mathbf{x})\, c(\lambda)\, d\lambda\, d\mathbf{x}    (4)
= \int_\omega e(\lambda)\, c(\lambda) \left( \frac{\int s(\lambda, \mathbf{x})\, d\mathbf{x}}{\int d\mathbf{x}} \right) d\lambda    (5)
= k \int_\omega e(\lambda)\, c(\lambda)\, d\lambda = k\mathbf{e}    (6)

where we applied Fubini's theorem to exchange the order of integration.
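As an illustrative sketch of (4)-(6) (not the authors' implementation), the gray-world estimate of ke is simply the per-channel mean of the image values; here an image is assumed to be a flat list of (R, G, B) tuples:

```python
def gray_world(image):
    """Gray-world estimate of k*e (Eq. (6)): the per-channel mean.

    `image` is assumed to be a flat list of (R, G, B) tuples.
    """
    n = len(image)
    return tuple(sum(px[c] for px in image) / n for c in range(3))
```

Under the hypothesis, this average is achromatic up to the light source color, so dividing the image by it discounts the illuminant.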
• Normalized light source color: \hat{\mathbf{e}} = \frac{k\mathbf{e}}{\| k\mathbf{e} \|}
‹ Max-RGB method
– Assumes that the maximum reflectance achieved in each of the three channels is equal:

\max_{\mathbf{x}} \mathbf{f}(\mathbf{x}) = k\mathbf{e}    (7)

where the max operation is executed on the separate channels:

\max_{\mathbf{x}} \mathbf{f}(\mathbf{x}) = \left( \max_{\mathbf{x}} R(\mathbf{x}),\ \max_{\mathbf{x}} G(\mathbf{x}),\ \max_{\mathbf{x}} B(\mathbf{x}) \right)^T    (8)
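A minimal sketch of (7)-(8) under the same flat-list image assumption (illustrative, not the paper's code):

```python
def max_rgb(image):
    """Max-RGB estimate of k*e (Eqs. (7)-(8)): the per-channel maximum,
    computed on each of the three channels separately."""
    return tuple(max(px[c] for px in image) for c in range(3))
```

Note that the three maxima may come from three different pixels, exactly as the separate-channel max in (8) allows.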
– White-patch hypothesis
• The light reflected from a white patch equals the incident light
• Finding the maximum RGB values is a second way of interpreting the method
‹ More general color constancy algorithm
– Minkowski norm:

\left( \frac{\int \left( \mathbf{f}(\mathbf{x}) \right)^p d\mathbf{x}}{\int d\mathbf{x}} \right)^{1/p} = k\mathbf{e}    (9)

where
p = 1: equal to the gray-world assumption
p = ∞: equal to max-RGB
p = 6: the best results are obtained
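The Minkowski-norm family in (9) interpolates between the two special cases; a hedged sketch, with the discrete sum standing in for the integral:

```python
def shades_of_gray(image, p):
    """Minkowski-norm estimator (Eq. (9)) on a flat list of (R, G, B)
    tuples: p = 1 reproduces gray-world; large p approaches max-RGB."""
    n = len(image)
    return tuple((sum(px[c] ** p for px in image) / n) ** (1.0 / p)
                 for c in range(3))
```

With p = 1 this equals the gray-world mean; as p grows, the largest value in each channel dominates the sum, recovering the max-RGB behavior.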
‹ Extension of the gray-world algorithm
– Considering a local average
• Reduces the influence of noise
• Uses local correlation (local smoothing) with a Gaussian filter:

\left( \frac{\int \left( \mathbf{f}_\sigma(\mathbf{x}) \right)^p d\mathbf{x}}{\int d\mathbf{x}} \right)^{1/p} = k\mathbf{e}    (10)

where σ is the standard deviation, G_σ is the Gaussian filter, and f_σ = f ⊗ G_σ.
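The smoothing f_σ = f ⊗ G_σ in (10) can be sketched for one channel of a 1-D signal; the truncated, renormalized kernel with clamped borders is an illustrative choice of ours, not the paper's exact implementation:

```python
import math

def gaussian_smooth_1d(signal, sigma):
    """Convolve a 1-D channel with a truncated Gaussian G_sigma,
    renormalized to sum to one; borders are handled by clamping."""
    radius = max(1, int(3 * sigma))
    kernel = [math.exp(-0.5 * (i / sigma) ** 2)
              for i in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]
    smoothed = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(signal) - 1)
            acc += k * signal[idx]
        smoothed.append(acc)
    return smoothed
```

Because the kernel is normalized, a constant signal passes through unchanged, so the smoothing only suppresses noise, not the average used by (10).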
Gray-Edge Hypothesis
‹ Gray-edge hypothesis
– The average of the reflectance differences in a scene is achromatic:

\frac{\int \left| s_x^\sigma(\lambda, \mathbf{x}) \right| d\mathbf{x}}{\int d\mathbf{x}} = g(\lambda) = k    (11)

where the subscript x indicates the spatial derivative at scale σ.
– Derivation of the light source color from the average color derivative:

\frac{\int \left| \mathbf{f}_x(\mathbf{x}) \right| d\mathbf{x}}{\int d\mathbf{x}}
= \frac{1}{\int d\mathbf{x}} \int \int_\omega e(\lambda) \left| s_x(\lambda, \mathbf{x}) \right| c(\lambda)\, d\lambda\, d\mathbf{x}    (12)
= \int_\omega e(\lambda) \left( \frac{\int \left| s_x(\lambda, \mathbf{x}) \right| d\mathbf{x}}{\int d\mathbf{x}} \right) c(\lambda)\, d\lambda    (13)
= k \int_\omega e(\lambda)\, c(\lambda)\, d\lambda = k\mathbf{e}    (14)

where f_x(x) = (R_x(x), G_x(x), B_x(x))^T.
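In the discrete case, the average of derivatives in (12)-(14) reduces to averaging absolute pixel differences. The sketch below uses only horizontal differences on a 2-D grid of (R, G, B) tuples, an illustrative simplification of the full gradient at scale σ:

```python
def gray_edge(image):
    """Gray-edge estimate of k*e (Eq. (14)): the mean absolute
    horizontal derivative per channel; `image` is a list of rows,
    each row a list of (R, G, B) tuples."""
    sums, count = [0.0, 0.0, 0.0], 0
    for row in image:
        for x in range(len(row) - 1):
            for c in range(3):
                sums[c] += abs(row[x + 1][c] - row[x][c])
            count += 1
    return tuple(s / count for s in sums)
```

A uniform image has no edges and yields (0, 0, 0), so in practice the resulting vector is normalized before use.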
– Opponent color space
• The derivative distribution of an image forms a relatively regular, ellipsoid-like shape
• The long axis of this distribution coincides with the light source color
– White-light direction: O3_x
• Components of the opponent color space:

O1_x = \frac{R_x - G_x}{\sqrt{2}}, \quad O2_x = \frac{R_x + G_x - 2B_x}{\sqrt{6}}, \quad O3_x = \frac{R_x + G_x + B_x}{\sqrt{3}}    (15)
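The transform in (15) is direct to write down; a small sketch for a single derivative triplet:

```python
import math

def opponent_derivatives(rx, gx, bx):
    """Opponent-space transform of channel derivatives (Eq. (15));
    O3x is the white-light (intensity) direction."""
    o1 = (rx - gx) / math.sqrt(2)
    o2 = (rx + gx - 2 * bx) / math.sqrt(6)
    o3 = (rx + gx + bx) / math.sqrt(3)
    return o1, o2, o3
```

An achromatic derivative (equal R_x, G_x, B_x) lands entirely on the O3_x axis, which is why that axis is the white-light direction.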
Fig. 1. Three acquisitions of the same scene under different light sources [19]. In the bottom row, the color derivative distributions are shown; the axes are the opponent color derivatives, and the surfaces indicate derivative values of equal occurrence, with darker surfaces indicating a denser distribution. Note the shift in the orientation of the derivatives as the light source changes.
‹ Minkowski norm in the gray-edge hypothesis
– Assumes that the pth Minkowski norm of the derivative of the reflectance in a scene is achromatic:

\left( \frac{\int \left| \mathbf{f}_{x,\sigma}(\mathbf{x}) \right|^p d\mathbf{x}}{\int d\mathbf{x}} \right)^{1/p} = k\mathbf{e}    (16)

‹ General hypothesis
– Includes higher-order based color constancy:

\left( \int \left| \frac{\partial^n \mathbf{f}_\sigma(\mathbf{x})}{\partial \mathbf{x}^n} \right|^p d\mathbf{x} \right)^{1/p} = k \mathbf{e}^{n,p,\sigma}    (17)
– Description of the three variables:
1) The order n of the image structure determines whether the method is a gray-world or a gray-edge algorithm.
2) The Minkowski norm p determines the relative weights of the multiple measurements from which the final illuminant color is estimated.
3) The scale of the local measurements is denoted by σ.
• Only low-cost computational operations are demanded: the Minkowski norm and derivatives.
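The three-parameter framework of (17) can be sketched for a single image row with σ = 0 (no smoothing) and n ∈ {0, 1}; this restriction is ours for brevity, as the paper also covers n = 2 and σ > 0:

```python
def general_gray(row, n=0, p=1):
    """Simplified instantiation of Eq. (17) on one row of (R, G, B)
    tuples with sigma = 0: n = 0 gives the Minkowski gray-world
    estimator, n = 1 the first-order gray-edge estimator."""
    if n == 1:  # first-order structure: absolute finite differences
        samples = [tuple(abs(b[c] - a[c]) for c in range(3))
                   for a, b in zip(row, row[1:])]
    else:       # zeroth-order structure: the pixel values themselves
        samples = row
    m = len(samples)
    return tuple((sum(s[c] ** p for s in samples) / m) ** (1.0 / p)
                 for c in range(3))
```

Each entry of Table Ⅰ corresponds to one (n, p, σ) triple plugged into a function of this shape.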
Table Ⅰ. Overview of the different illuminant estimation methods together with their hypotheses. These estimators are all instantiations of (17).
Experimental Results
‹ Performance tests under various parameter settings
– Controlled indoor image set
– Real-world image set
‹ Angular error
– The angle between the estimated light source and the actual light source
– Used as the performance measure of a color constancy algorithm:

\text{angular error} = \cos^{-1} \left( \hat{\mathbf{e}}_l \cdot \hat{\mathbf{e}}_e \right)    (18)

where the hat indicates a normalized vector.
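A direct transcription of (18), in degrees, normalizing both vectors before the dot product:

```python
import math

def angular_error(e_est, e_true):
    """Angular error (Eq. (18)) in degrees between the estimated and
    the actual light-source color."""
    dot = sum(a * b for a, b in zip(e_est, e_true))
    norm = math.sqrt(sum(a * a for a in e_est)) * \
           math.sqrt(sum(b * b for b in e_true))
    # clamp against floating-point overshoot before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

Because both vectors are normalized, the measure depends only on the estimated chromaticity, not on the brightness of the light source.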
‹ Controlled indoor image set
– 11 different light sources over 30 different scenes
– Results of multiple methods are summarized in Table Ⅱ
Fig. 2. Examples of the images in the data set.
Table Ⅱ. Median angular error (degrees) on the indoor image data set for various color constancy methods
Fig. 3. Median angular error of the general gray-world, first-order, and second-order gray-edge methods as a function of the Minkowski norm and local smoothing. The angular error axis is inverted for visualization purposes.
Table Ⅲ. Parameter settings for which the performance remains within 10% of the optimal performance given in Table Ⅱ
‹ Real-world image set
– Images extracted from digital video
– A wide variety of locations
Fig. 4. Examples of images from the real-world data set.
Fig. 5. Color constancy results of the gray-world, general gray-edge, and second-order gray-edge methods on the real-world data set. The angular error is indicated in the bottom-right corner. The first row depicts a failure of the edge-based approaches, where the gray-world methods give acceptable results. The second and third rows show examples where the gray-world methods fail and the gray-edge methods obtain superior results.
Table Ⅳ. Median angular error (degrees) for various color constancy methods on the real-world image set
Gray-edge performs best on this set of real-world images.
Discussion
‹ Discussion of the previous experiments
– Results comparable to those of complex color constancy algorithms
– Optimal parameter settings are needed for the various situations
‹ Further work
– Higher-order structure of images
– More elaborate ways to combine the low-level building blocks
– Automatic estimation of the parameters per image
Conclusion
‹ Proposed novel algorithm
– Edge-based color constancy
– Higher-order structure algorithms
‹ Advantages of the proposed method
– Results similar to those of more complex methods, with a simpler algorithm
– Better results on real-world images compared with the gray-world method