
Ben-Gurion University of the Negev
Radiometric Normalization
Spring 2009
Instructor
• Dr. H. B. Mitchell
email: [email protected]
Radiometric Normalization




• Radiometric normalization ensures that all input measurements use the same measurement scale.
• We shall concentrate on statistical relative radiometric normalization.
• These methods do not require spatial alignment, although they assume the images are more-or-less aligned.
• Other methods will be discussed throughout the course.
Histogram Matching



• Input: Reference image A and test image B.
• Normalization: Transform B such that the pdf of the transformed image is the same as the pdf of A,
  i.e. find a function φ such that p_φ(B)(g) = p_A(g).
• The solution is
  φ(b) = F_A^{-1}(F_B(b)),
  where F_A and F_B are the cumulative histograms of A and B.
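A minimal numpy sketch of this mapping, assuming 8-bit images stored as integer arrays (the function name and parameters are illustrative):

```python
import numpy as np

def histogram_match(B, A, n_levels=256):
    """Transform test image B so that its histogram matches that of reference image A."""
    hist_A, _ = np.histogram(A, bins=n_levels, range=(0, n_levels))
    hist_B, _ = np.histogram(B, bins=n_levels, range=(0, n_levels))
    cdf_A = np.cumsum(hist_A) / A.size          # F_A
    cdf_B = np.cumsum(hist_B) / B.size          # F_B
    # phi(b) = F_A^{-1}(F_B(b)): for each gray-level b, find the smallest
    # gray-level a whose cumulative frequency in A reaches F_B(b).
    lut = np.searchsorted(cdf_A, cdf_B)
    lut = np.clip(lut, 0, n_levels - 1)
    return lut[B]                               # apply the look-up table to B
```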
Histogram Matching

• Easy if B has distinct gray-levels.
• Let h_A(a) and h_B(b) be the histograms of A and B, and rank the pixels of B by gray-level.
• Suppose A has h_A(a_1) pixels with its lowest gray-level a_1, h_A(a_2) pixels with the next gray-level a_2, etc.
• Then the pixels in B with rank 1, ..., h_A(a_1) are assigned gray-level a_1,
  the pixels with rank h_A(a_1)+1, ..., h_A(a_1)+h_A(a_2) are assigned gray-level a_2,
  etc.
Histogram Matching

• If gray-levels are not distinct we may break ties randomly. Better to use
  "exact histogram specification".
Exact Histogram Specification

• Convolve the input image with 6 masks, e.g. averaging masks of increasing size.
• Resolve ties in gray-level using the output of the first mask. If no ties remain, stop.
• Resolve the remaining ties using the output of the second mask. If no ties remain, stop.
• etc.
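A sketch of the tie-breaking idea, assuming the masks are simple averaging filters of increasing size (the lecture's exact six masks are not reproduced here); scipy's uniform_filter plays the role of the convolution masks:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def strict_ordering(image, n_masks=6):
    """Return a strict ranking of all pixels: ties in gray-level are broken
    by comparing local averages computed with masks of increasing size."""
    keys = [image.astype(float)]
    for k in range(1, n_masks):
        # Larger and larger averaging masks; each one resolves the ties
        # left by the previous keys in this lexicographic ordering.
        keys.append(uniform_filter(image.astype(float), size=2 * k + 1))
    # np.lexsort sorts by the last key first, so reverse the key list.
    order = np.lexsort(tuple(key.ravel() for key in reversed(keys)))
    ranks = np.empty(image.size, dtype=int)
    ranks[order] = np.arange(image.size)
    return ranks.reshape(image.shape)
```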
Midway Histogram Equalization

• Warp both input histograms p_A and p_B to a common histogram p_mid.
• The common histogram is defined to be as similar as possible to both p_A and p_B.
• A solution: Define the common histogram by its (inverse) cumulative histogram:
  F_mid^{-1}(x) = ( F_A^{-1}(x) + F_B^{-1}(x) ) / 2.
• Implementation is difficult. A fast algorithm (dynamic histogram warping, DHW) is available using
  dynamic programming.
Midway Histogram Equalization
Optical flow with and without histogram equalization
Midway Histogram Equalization
• If the input images have unique gray-levels (use exact histogram specification) then the midway
  histogram is trivial:
  g_k = ( a_k + b_k ) / 2,
  where a_k and b_k are the k-th largest gray-levels in A and B.
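A minimal numpy sketch of this trivial case, assuming A and B have the same number of pixels and strictly ordered gray-levels (e.g. after exact histogram specification):

```python
import numpy as np

def midway_equalize(A, B):
    """Midway histogram equalization for two equal-size images whose pixels
    can be strictly ordered (e.g. after exact histogram specification)."""
    order_A = np.argsort(A, axis=None)   # pixel indices of A sorted by gray-level
    order_B = np.argsort(B, axis=None)
    # g_k = (a_k + b_k) / 2: average the k-th largest gray-levels of A and B.
    midway = (np.sort(A, axis=None) + np.sort(B, axis=None)) / 2.0
    A_out = np.empty(A.size); A_out[order_A] = midway
    B_out = np.empty(B.size); B_out[order_B] = midway
    return A_out.reshape(A.shape), B_out.reshape(B.shape)
```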
Ranking



• Ranking may also be used as a robust method of radiometric normalization.
• Very effective on small images, less so on large images with many ties.
• Solutions? Exact histogram specification; fuzzy ranking.
Ranking. Classical

• Classical ranking works as follows:
  Given M crisp numbers x_1, x_2, ..., x_M.
  Compare each x_i with each x_j.
  Result is c_ij = 1 if x_i ≥ x_j, and c_ij = 0 otherwise.
• The crisp ranks are
  r_i = Σ_j c_ij,
  where r_i = 1 for the smallest number and r_i = M for the largest (assuming no ties).
• Note: We may make the equations symmetrical by redefining
  c_ij = 1 if x_i > x_j, c_ij = 1/2 if x_i = x_j, and c_ij = 0 otherwise.
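A small numpy sketch of these definitions, using the symmetric tie convention from the note above (the example values are illustrative):

```python
import numpy as np

def crisp_ranks(x):
    """Crisp ranks r_i = sum_j c_ij with the symmetric definition:
    c_ij = 1 if x_i > x_j, 1/2 if x_i == x_j, 0 otherwise."""
    x = np.asarray(x, dtype=float)
    greater = (x[:, None] > x[None, :]).astype(float)
    equal = (x[:, None] == x[None, :]).astype(float)
    C = greater + 0.5 * equal
    return C.sum(axis=1)

# Example: crisp_ranks([3, 1, 4, 1]) -> [2.5, 1.0, 3.5, 1.0]
# (the two tied values share the same rank)
```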



Ranking. Classical

Example.
Ranking. Fuzzy


• Fuzzy ranking is a generalization of classical ranking.
• In place of M crisp numbers we have M membership functions μ_1(x), μ_2(x), ..., μ_M(x).
• Compare each μ_i(x) with each μ_j(x).
• Result is a fuzzy comparison value c_ij in [0, 1].
• The fuzzy ranks are
  r_i = Σ_j c_ij,
  where each c_ij is computed from μ_i and μ_j using the "extended min" and "extended max" operations.
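One common way to realize the extended comparison is the possibility degree sup_{u ≥ v} min(μ_i(u), μ_j(v)); the sketch below uses this choice together with triangular membership functions, both of which are illustrative assumptions rather than the lecture's exact definitions:

```python
import numpy as np

def triangular(grid, center, width):
    """Triangular membership function sampled on the grid."""
    return np.clip(1.0 - np.abs(grid - center) / width, 0.0, 1.0)

def possibility_geq(mu_i, mu_j):
    """Degree to which fuzzy number i is >= fuzzy number j:
    sup over u >= v of min(mu_i(u), mu_j(v)), with both membership
    functions sampled on the same increasing grid."""
    running_max_j = np.maximum.accumulate(mu_j)   # max of mu_j(v) over v <= u
    return float(np.max(np.minimum(mu_i, running_max_j)))

grid = np.linspace(0.0, 10.0, 1001)
# Three fuzzy measurements, centred at 2, 5 and 6 with the same spread.
mus = [triangular(grid, c, 1.5) for c in (2.0, 5.0, 6.0)]
C = np.array([[possibility_geq(mi, mj) for mj in mus] for mi in mus])
fuzzy_ranks = C.sum(axis=1)    # analogue of r_i = sum_j c_ij
```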
Thresholding




• Thresholding is mainly used to segment an image into background and foreground.
• It is also used as a normalization method.
• A few unsupervised thresholding algorithms are:
  Otsu
  Kittler-Illingworth
  Kapur, Sahoo and Wong (KSW), etc.
• Example. KSW thresholding. Consider the image as two sources, foreground (A) and background (B),
  separated by a threshold t. The optimum threshold maximizes the sum of
  the entropies of the two sources.
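A direct numpy sketch of the KSW criterion, assuming an 8-bit gray-level image (the bin count and the use of the natural logarithm are illustrative choices):

```python
import numpy as np

def ksw_threshold(image, n_levels=256):
    """Kapur-Sahoo-Wong threshold: choose t maximizing the sum of the
    entropies of the background (gray-levels <= t) and foreground (> t)."""
    hist, _ = np.histogram(image, bins=n_levels, range=(0, n_levels))
    p = hist / hist.sum()
    P = np.cumsum(p)                    # P[t] = prob(gray-level <= t)
    best_t, best_H = 0, -np.inf
    for t in range(n_levels - 1):
        if P[t] <= 0 or P[t] >= 1:
            continue
        pb, pf = p[:t + 1], p[t + 1:]
        # Entropy of each source, with 0 log 0 treated as 0.
        Hb = -np.sum(pb[pb > 0] / P[t] * np.log(pb[pb > 0] / P[t]))
        Hf = -np.sum(pf[pf > 0] / (1 - P[t]) * np.log(pf[pf > 0] / (1 - P[t])))
        if Hb + Hf > best_H:
            best_t, best_H = t, Hb + Hf
    return best_t
```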
Thresholding



• Advantage: Unsupervised thresholding methods automatically adjust to the input image.
• Disadvantage: The quantization is very coarse.
• This may be overcome by using fuzzy thresholding.
(Figure: classical (hard) vs. fuzzy (soft) threshold functions around the threshold t.)
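A minimal sketch contrasting the two quantizations; the sigmoid membership and its width are illustrative assumptions, not the lecture's definition of fuzzy thresholding:

```python
import numpy as np

def hard_threshold(image, t):
    # Classical: every pixel becomes exactly 0 (background) or 1 (foreground).
    return (image > t).astype(float)

def fuzzy_threshold(image, t, width=10.0):
    # Fuzzy: membership in the foreground varies smoothly around t,
    # so less information is lost by the coarse two-level quantization.
    return 1.0 / (1.0 + np.exp(-(image.astype(float) - t) / width))
```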
Aside: Fuzzy Logic

• From this viewpoint we may regard fuzzy logic as a method of normalizing an input x in M different ways:
• We have M membership functions μ_1(x), μ_2(x), ..., μ_M(x),
  which represent different physical qualities, e.g. "hot", "cold", "tepid".
• Then we represent x as three values
  μ_hot(x), μ_cold(x), μ_tepid(x),
  which are the degrees to which x is hot, x is cold and x is tepid.
(Figure: membership function giving the degree to which x is regarded as hot, as a function of x.)
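A toy sketch with illustrative temperature breakpoints (the membership shapes and numbers are assumptions, not values from the lecture):

```python
import numpy as np

def mu_cold(x):   # full membership below 5, none above 15
    return np.clip((15.0 - x) / 10.0, 0.0, 1.0)

def mu_hot(x):    # no membership below 25, full above 35
    return np.clip((x - 25.0) / 10.0, 0.0, 1.0)

def mu_tepid(x):  # triangular membership, peaking at 20
    return np.clip(1.0 - np.abs(x - 20.0) / 10.0, 0.0, 1.0)

x = 22.0
normalized = (mu_hot(x), mu_cold(x), mu_tepid(x))
# -> (0.0, 0.0, 0.8): the degrees to which x is hot, cold and tepid.
```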
Likelihood



• A powerful normalization is to convert the measurements to a likelihood.
• Widely used for normalizing feature maps.
• Requires ground truth, which may be difficult to obtain.
Likelihood. Edge Operators




• Example. Consider multiple edge operators:
  Canny edge operator,
  Sobel edge operator,
  zero-crossing edge operator.
• The resulting feature maps all measure the same phenomenon (i.e. the presence of edges).
• But the feature maps have different scales. They require radiometric normalization.
• We can use methods such as histogram matching etc. But it is
  better to use likelihood. Why?
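A small scipy sketch producing two such maps (Sobel gradient magnitude and a zero-crossing-style Laplacian-of-Gaussian response; Canny is omitted); running it on any image shows the very different value ranges that motivate the normalization:

```python
import numpy as np
from scipy import ndimage

def edge_maps(image):
    """Two edge feature maps for the same image; their value ranges differ
    greatly, which is why radiometric normalization is needed before fusion."""
    image = image.astype(float)
    gx = ndimage.sobel(image, axis=0)
    gy = ndimage.sobel(image, axis=1)
    sobel_map = np.hypot(gx, gy)                                  # Sobel gradient magnitude
    log_map = np.abs(ndimage.gaussian_laplace(image, sigma=2.0))  # zero-crossing style response
    return sobel_map, log_map
```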
Likelihood. Edge and Blob Operators




• Example. Consider edge and blob operators.
• The feature maps measure very different phenomena.
• Radiometric normalization is therefore of no use.
• However, the theory of ATR (automatic target recognition) suggests edge and blob responses are
  causally linked to the presence of a target.
• Edge and blob maps may therefore be normalized by semantically aligning them, i.e. interpreting
  them as giving the likelihood of the presence of a target.
Likelihood. Edge and Blob Operators






• Edge map E(m,n) measures the strength of the edge at (m,n).
• Blob map B(m,n) measures the strength of the blob at (m,n).
• Edge likelihood p(target | E(m,n)) measures the likelihood of a target
  existing at (m,n) given E(m,n).
• Blob likelihood p(target | B(m,n)) measures the likelihood of a target
  existing at (m,n) given B(m,n).
• Calculation of the likelihoods requires ground truth data.
• There are three different approaches to calculating the likelihoods.
Likelihood. Platt Calibration


• Given training data (ground truth):
  K examples of edge values x_1, x_2, ..., x_K,
  and K indicator flags y_1, y_2, ..., y_K ∈ {0, 1} (which describe the presence or
  absence of a true target).
• Suppose the function which describes the likelihood of a target given an edge value x is sigmoid in shape:
  p(target | x) = 1 / ( 1 + exp(A x + B) ).
• Find the optimum values of A and B by maximum likelihood.
Likelihood. Platt Calibration

• The maximum likelihood solution minimizes the negative log-likelihood
  -Σ_k [ t_k log p_k + (1 - t_k) log(1 - p_k) ],  where p_k = 1 / (1 + exp(A x_k + B)).
• If too few training samples have y_k = 1 or y_k = 0
  then the fit is liable to overfit. Correct for this by using modified (smoothed) targets
  t_+ = (N_+ + 1) / (N_+ + 2) in place of y_k = 1, and t_- = 1 / (N_- + 2) in place of y_k = 0,
  where N_+ and N_- are the numbers of positive and negative training samples.
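A numpy sketch of the fit by gradient descent on the negative log-likelihood, using the smoothed targets described above; the learning rate, iteration count and initialization are illustrative choices:

```python
import numpy as np

def platt_fit(x, y, n_iter=20000, lr=1e-3):
    """Fit p(target | x) = 1 / (1 + exp(A*x + B)) by maximum likelihood
    (gradient descent on the cross-entropy), using smoothed targets
    t_+ and t_- in place of the hard 0/1 labels."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n_pos = y.sum()
    n_neg = len(y) - n_pos
    t = np.where(y > 0.5, (n_pos + 1.0) / (n_pos + 2.0), 1.0 / (n_neg + 2.0))
    A, B = 0.0, np.log((n_neg + 1.0) / (n_pos + 1.0))   # a common initialization
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(A * x + B))
        A -= lr * np.sum((t - p) * x)   # d(NLL)/dA = sum_k (t_k - p_k) * x_k
        B -= lr * np.sum(t - p)         # d(NLL)/dB = sum_k (t_k - p_k)
    return A, B
```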
Likelihood. Histogram





• Platt calibration assumes a likelihood function of known shape.
• If we do not know the shape of the function we may simply define it as a discrete curve, or histogram.
• In this case we quantize the edge values and place them in histogram bins.
• In a given bin we count the number of edge values which fall in the bin and the number of times a
  target is detected there.
• Then the likelihood function for bin b is
  p(target | x ∈ b) = (number of targets in b) / (number of edge values in b).
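A minimal numpy sketch of this binning procedure (the bin count and equal-width bins are illustrative choices):

```python
import numpy as np

def histogram_likelihood(edge_values, targets, n_bins=20):
    """Bin the edge values and estimate p(target | bin) as the fraction of
    training samples in each bin that correspond to a true target."""
    edge_values = np.asarray(edge_values, dtype=float)
    targets = np.asarray(targets, dtype=float)
    edges = np.linspace(edge_values.min(), edge_values.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(edge_values, edges) - 1, 0, n_bins - 1)
    counts = np.bincount(bin_idx, minlength=n_bins)                 # edge values per bin
    hits = np.bincount(bin_idx, weights=targets, minlength=n_bins)  # targets per bin
    with np.errstate(invalid="ignore", divide="ignore"):
        likelihood = np.where(counts > 0, hits / counts, 0.0)
    return edges, likelihood
```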
Likelihood. Isotonic Regression



• Isotonic regression assumes the likelihood curve is monotonically increasing (or decreasing).
• It therefore represents an intermediate case between Platt calibration and histogram calibration.
• A simple algorithm for isotonic curve fitting is PAV (Pair-Adjacent Violators algorithm).
(Figure: monotonically increasing likelihood curve of unknown shape.)
Likelihood. Isotonic Regression

• Find a monotonically increasing function f(x) which minimizes
  Σ_k ( y_k - f(x_k) )².
• Use the PAV algorithm. This works iteratively as follows:
  Arrange the samples such that x_1 ≤ x_2 ≤ ... ≤ x_K and initialize f(x_k) = y_k.
  If f is isotonic then f* = f and stop.
  If f is not isotonic then there must exist a label l such that f(x_{l-1}) > f(x_l).
  Eliminate this pair by creating a single entry whose value is the mean of the pooled entries,
  which is now isotonic. Repeat.
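A compact Python sketch of PAV using the pooling rule described above (the example data in the comment is illustrative, not the slide's own table):

```python
import numpy as np

def pav(y):
    """Pool Adjacent Violators: return the isotonic (monotonically increasing)
    fit to the sequence y, minimizing the squared error.
    Assumes the y are already ordered by increasing x."""
    # Each block holds [sum of pooled values, number of pooled entries].
    blocks = []
    for value in y:
        blocks.append([float(value), 1])
        # Pool backwards while the last two block means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, n = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += n
    # Expand the pooled blocks back to one fitted value per entry.
    return np.concatenate([np.full(n, s / n) for s, n in blocks])

# Example: 0/1 target flags sorted by edge value.
# pav([0, 0, 1, 0, 1, 1, 0, 1]) -> [0, 0, 0.5, 0.5, 0.667, 0.667, 0.667, 1]
```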
Likelihood. Isotonic Regression
In the first iteration entries 12 and 13 are removed by pooling the two entries together and
giving them a value of 0.5. This introduces a new violation between entry 11 and the group 12-13,
which are pooled together forming a pool of 3 entries with value 0.33.
Likelihood. Isotonic Regression



• So far we have considered pairwise (two-class) likelihood estimation.
• How can we generalize to problems with more than two classes?
• Project.