Convolution Pyramids
Zeev Farbman, Raanan Fattal and Dani Lischinski
SIGGRAPH Asia Conference (2011)

presented by: Julian Steil
supervisor: Prof. Dr. Joachim Weickert

Seminar - Milestones and Advances in Image Analysis
Prof. Dr. Joachim Weickert, Oliver Demetz
Mathematical Image Analysis Group
Saarland University
13th of November, 2012

Fig. 1.1: Gradient integration example
Fig. 1.2: Reconstruction result of Fig. 1.1
Overview

1. Motivation
   Convolution
   Gaussian Pyramid
   Gaussian Pyramid - Example
   From Gaussian to Laplacian Pyramid
2. Convolution Pyramids
3. Application 1 - Gaussian Kernels
4. Application 2 - Boundary Interpolation
5. Application 3 - Gradient Integration
6. Summary
Motivation
Convolution

Two-Dimensional Convolution:
• discrete convolution of two images g = (g_{i,j})_{i,j∈Z} and w = (w_{i,j})_{i,j∈Z}:

      (g ∗ w)_{i,j} := Σ_{k∈Z} Σ_{ℓ∈Z} g_{i−k,j−ℓ} w_{k,ℓ}                 (1)

• the components of the convolution kernel w can be regarded as mirrored
  weights for averaging the components of g
• the larger the kernel, the larger the runtime
• an ordinary implementation of convolution with a large kernel needs
  O(n²) operations
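A minimal NumPy sketch of Eq. (1) makes this cost explicit: the nested loops touch every pixel for every kernel entry, which is exactly what the pyramid scheme later avoids. Zero values outside the image are an assumption made here for simplicity.

    import numpy as np

    def conv2d(g, w):
        """Direct evaluation of Eq. (1), assuming g is zero outside its domain."""
        n1, n2 = g.shape
        m1, m2 = w.shape
        p1, p2 = m1 // 2, m2 // 2
        padded = np.pad(g, ((p1, p1), (p2, p2)))
        w_flipped = w[::-1, ::-1]              # mirrored weights, as in Eq. (1)
        out = np.zeros((n1, n2))
        for i in range(n1):                    # O(n * |w|): every pixel ...
            for j in range(n2):                # ... times every kernel entry
                out[i, j] = np.sum(padded[i:i + m1, j:j + m2] * w_flipped)
        return out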
Motivation
Gaussian Pyramid

• sequence of images g0, g1, ..., gn
• computed by a filtering procedure equivalent to convolution with a
  local, symmetric weighting function
  =⇒ e.g. a Gaussian kernel

Procedure:
• the image is initialised as an array g0 with C columns and R rows
• each pixel represents a light intensity I between 0 and 255
  =⇒ g0 is the zero level of the Gaussian pyramid
• each pixel value in level i is computed as a weighted average of
  level i − 1 pixel values

Fig. 2: One-dimensional graphic representation of the Gaussian pyramid
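A short sketch of this procedure. Using SciPy's gaussian_filter with σ = 1 as the local, symmetric weighting function is an assumption made here; Burt and Adelson's 5-tap binomial kernel would be the classical choice.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def gaussian_pyramid(g0, levels):
        """Levels g0, g1, ..., g_levels: blur with a local symmetric kernel,
        then keep every second pixel in each dimension."""
        pyramid = [np.asarray(g0, dtype=float)]
        for _ in range(levels):
            blurred = gaussian_filter(pyramid[-1], sigma=1.0)  # weighted average
            pyramid.append(blurred[::2, ::2])                  # subsample by two
        return pyramid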
Motivation
Gaussian Pyramid - Example

Fig. 3: First six levels of the Gaussian pyramid for the “Lena” image. The
original image, level 0, measures 257x257 pixels =⇒ level 5 measures just
9x9 pixels.

Remark:
the density of pixels is reduced by a factor of two in one dimension and
by a factor of four in two dimensions from level to level
Motivation
From Gaussian to Laplacian Pyramid

Fig. 4: First four levels of the Gaussian and Laplacian pyramid of Fig. 3.

• each level of the Laplacian pyramid is the difference between the
  corresponding Gaussian level and the next higher (coarser) Gaussian
  level, expanded back to the same resolution (sketched below)
• full expansion is used in Fig. 4 to help visualise the contents of the
  pyramid images
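A minimal sketch of that difference-of-levels construction. It assumes a list of Gaussian levels such as the one built above, and bilinear zoom for the expansion step (the slide does not specify the expansion filter).

    import numpy as np
    from scipy.ndimage import zoom

    def laplacian_pyramid(gaussian_levels):
        """L_i = g_i - expand(g_{i+1}); the coarsest Gaussian level is kept as
        the last entry so the pyramid can be summed back to the image."""
        laplacian = []
        for fine, coarse in zip(gaussian_levels[:-1], gaussian_levels[1:]):
            factors = [f / c for f, c in zip(fine.shape, coarse.shape)]
            expanded = zoom(coarse, factors, order=1)   # bilinear expansion
            laplacian.append(fine - expanded)
        laplacian.append(gaussian_levels[-1])
        return laplacian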
Overview

2. Convolution Pyramids
   Approach
   Forward and Backward Transform
   Flow Chart and Pseudocode
   Optimisation
Convolution Pyramids
Approach

Task:
• approximate the effect of convolution with large kernels
  =⇒ higher spectral accuracy + translation-invariant operation
• Is this also possible in O(n)?

Idea:
• repeated convolution with small kernels on multiple scales
• disadvantage: not translation-invariant, due to the subsampling
  operation needed to reach O(n) performance

Method:
• pyramids rely on a spectral “divide-and-conquer” strategy
• avoiding subsampling of the decomposed signal increases
  translation-invariance
• use finite impulse response filters to achieve some spatial
  localisation and runtime O(n)
Convolution Pyramids
Forward and Backward Transform

Forward Transform - Analysis Step:
• convolve the signal with a first filter h1
• subsample the result by a factor of two
• repeat the process on the subsampled data
• keep an unfiltered and unsubsampled copy of the signal at each level

      a^l_0   = a^l                                                       (2)
      a^{l+1} = ↓ (h1 ∗ a^l)                                              (3)

Backward Transform - Synthesis Step:
• upsample by inserting a zero between every two samples
• convolve the result with a second filter h2
• combine the upsampled signal with the signal stored at each level,
  after convolving the latter with a third filter g

      â^l = h2 ∗ (↑ â^{l+1}) + g ∗ a^l_0                                  (4)
Convolution Pyramids
Flow Chart and Pseudocode

Fig. 5: Flow chart visualising the pyramid structure, source: taken from [1]

Algorithm 1 Multiscale Transform
  Determine the number of levels L
  {Forward transform (analysis)}
  a^0 = a
  for each level l = 0 ... L − 1 do
      a^l_0   = a^l
      a^{l+1} = ↓ (h1 ∗ a^l)
  end for
  {Backward transform (synthesis)}
  â^L = g ∗ a^L
  for each level l = L − 1 ... 0 do
      â^l = h2 ∗ (↑ â^{l+1}) + g ∗ a^l_0
  end for
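A runnable NumPy/SciPy sketch of Algorithm 1, under a few assumptions not fixed by the slide: zero ("fill") boundary handling for the convolutions and a simple default for the number of levels. The kernels h1, h2, g are inputs; how to choose them is the subject of the next slide.

    import numpy as np
    from scipy.signal import convolve2d

    def conv(a, k):
        return convolve2d(a, k, mode='same', boundary='fill')

    def convolution_pyramid(a, h1, h2, g, L=None):
        """Multiscale transform of Algorithm 1 for a 2D signal a."""
        a = np.asarray(a, dtype=float)
        if L is None:
            L = max(int(np.log2(min(a.shape))) - 2, 1)   # assumed default
        # Forward transform (analysis)
        unfiltered = []                       # a^l_0: unfiltered, unsubsampled copies
        current = a
        for _ in range(L):
            unfiltered.append(current)
            current = conv(current, h1)[::2, ::2]        # a^{l+1} = ↓(h1 ∗ a^l)
        # Backward transform (synthesis)
        a_hat = conv(current, g)                         # â^L = g ∗ a^L
        for a_l0 in reversed(unfiltered):
            up = np.zeros_like(a_l0)
            up[::2, ::2] = a_hat                         # ↑: zero-interleave
            a_hat = conv(up, h2) + conv(a_l0, g)         # â^l = h2∗(↑â^{l+1}) + g∗a^l_0
        return a_hat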
Convolution Pyramids
Optimisation

Kernel Determination:
• the target kernel f is given
• seek a set of kernels F = {h1, h2, g} that minimise

      argmin_F ‖ â^0_F − f ∗ a ‖                                          (5)

  where â^0_F is the result of the multiscale transform, f the target
  kernel, and a the input signal
• the kernels in F should be small and separable
• using larger and/or non-separable filters increases accuracy
  =⇒ the specific choice depends on the application requirements
• remarkable results are obtained with separable kernels in F even for
  non-separable target filters f
• target filters f with rotational and mirroring symmetries enforce
  symmetry on h1, h2, g
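A rough sketch of this kernel fitting, reusing convolution_pyramid() from the Algorithm 1 sketch above. Everything specific here is an assumption for illustration: a single training image, separable kernels with h1 = h2, fixed kernel sizes, and a generic Nelder-Mead optimiser rather than the paper's optimisation setup.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.signal import convolve2d
    # assumes convolution_pyramid() from the Algorithm 1 sketch above

    def fit_kernels(f_target, a_train, size_h=5, size_g=3, levels=4, seed=0):
        """Minimise ‖ â^0_F − f ∗ a ‖ (Eq. 5) over small separable kernels."""
        target = convolve2d(a_train, f_target, mode='same')

        def unpack(p):
            h1d, g1d = p[:size_h], p[size_h:]
            return np.outer(h1d, h1d), np.outer(g1d, g1d)   # separable 2D kernels

        def loss(p):
            h, g = unpack(p)
            approx = convolution_pyramid(a_train, h, h, g, L=levels)
            return np.mean((approx - target) ** 2)

        p0 = np.random.default_rng(seed).normal(0.0, 0.1, size_h + size_g)
        res = minimize(loss, p0, method='Nelder-Mead', options={'maxiter': 2000})
        return unpack(res.x)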
Overview

3. Application 1 - Gaussian Kernels
   Gaussian Kernel Convolution
   Example - Gaussian Filter
   Example - Scattered Data Interpolation
Application 1 - Gaussian Kernels
Gaussian Kernel Convolution

Task:
• approximate Gaussian kernels e^(−‖x‖²/(2σ²)) at the original fine grid
  in O(n)
• no truncated filter support

Determination of F = {h1, h2, g}:

      argmin_F ‖ â^0_F − f ∗ a ‖                                          (5)

  where â^0_F is the result of the multiscale transform, f the target
  Gaussian kernel, and a the image to convolve

Problem:
• Gaussians are rather efficient low-pass filters
• the pyramid contains high-frequency components coming from finer
  levels, introduced by the convolution with g
Application 1 - Gaussian Kernels
Example - Gaussian Filter

Solution:
• modulate g at each level l with a weight w^l (see the sketch below)
• higher w^l at the levels closest to the target size
• for different σ, different sets of kernels F are necessary

Used kernels:
Fig. 6.1: Original image, source: taken from [1]
Fig. 6.2: Exact convolution with a Gaussian filter (σ = 4), source: taken from [1]
Fig. 6.3: Convolution using the optimisation approach for σ = 4, source: taken from [1]
Fig. 7.1: Exact kernels (red) with approximated kernels (blue), source: taken from [1]
Fig. 7.2: Exact Gaussian (red), approximation using 5x5 kernels (blue) and a 7x7 kernel (green), source: taken from [1]
Fig. 7.3: Magnification of Fig. 7.2 shows the better accuracy of larger kernels, source: taken from [1]
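A sketch of this modulation as a small change to the Algorithm 1 sketch above; the per-level weights w^l are passed in as a list, and their values, like the kernels themselves, come out of the optimisation.

    import numpy as np
    from scipy.signal import convolve2d

    def conv(a, k):
        return convolve2d(a, k, mode='same', boundary='fill')

    def weighted_convolution_pyramid(a, h1, h2, g, weights):
        """Algorithm 1 with the g-term modulated by a per-level weight w^l."""
        L = len(weights) - 1
        unfiltered, current = [], np.asarray(a, dtype=float)
        for _ in range(L):
            unfiltered.append(current)
            current = conv(current, h1)[::2, ::2]
        a_hat = weights[L] * conv(current, g)
        for l in range(L - 1, -1, -1):
            up = np.zeros_like(unfiltered[l])
            up[::2, ::2] = a_hat
            a_hat = conv(up, h2) + weights[l] * conv(unfiltered[l], g)
        return a_hat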
Application 1 - Gaussian Kernels
Example - Scattered Data Interpolation

Fig. 8.1: Horizontal slice through the exact wider Gaussian (red) and its approximation (blue), source: taken from [1]
Fig. 8.2: Horizontal slice through the exact narrower Gaussian (red) and its approximation (blue), source: taken from [1]
Fig. 8.3: Scattered data interpolation input, source: taken from [1]
Fig. 8.4: Approximation with the wider Gaussian, source: taken from [1]
Fig. 8.5: Approximation with the narrower Gaussian, source: taken from [1]
Fig. 8.6: Exact result corresponding to the wider (red) Gaussian, source: taken from [1]
Fig. 8.7: Exact result corresponding to the narrower (red) Gaussian, source: taken from [1]
Overview

4. Application 2 - Boundary Interpolation
   How to use boundary interpolation?
   Example - Seamless Cloning
Application 2 - Boundary Interpolation
How to use boundary interpolation?

Seamless Image Cloning:
• formulated as a boundary value problem
• effectively solved by constructing a smooth membrane
• interpolation of the differences along the seam between the two images

Shepard’s Method:
• Ω is the region of interest and the boundary values are given by b(x)
• smoothly interpolates the boundary values to all grid points inside Ω
• defines the interpolant r at x as a weighted average of the boundary
  values:

      r(x) = Σ_k w_k(x) b(x_k) / Σ_k w_k(x)                               (6)

      =⇒ r(x_i) = Σ_{j=0}^{n} w(x_i, x_j) r̂(x_j) / Σ_{j=0}^{n} w(x_i, x_j) χ_r̂(x_j)
                 = (w ∗ r̂) / (w ∗ χ_r̂)

  (r̂ = boundary values b placed on the grid and zero elsewhere,
   χ_r̂ = its characteristic function)
• x_k = boundary points, b(x_k) = boundary values
• the weight function w_k(x) is given by

      w_k(x) = w(x_k, x) = 1 / d(x_k, x)³                                 (7)

• strong spike at x_k, decays rapidly away from it
• computational cost O(Kn) with K boundary values and n points in Ω
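A small sketch of the ratio-of-convolutions form of Eq. (6). The 1/d³ weight kernel is simply truncated to a finite window here so that the example stays self-contained; the point of the paper is to replace both convolutions by the O(n) pyramid with a suitable F.

    import numpy as np
    from scipy.signal import convolve2d

    def shepard_membrane(boundary_values, boundary_mask, radius=64):
        """r = (w ∗ r̂) / (w ∗ χ_r̂) with w(d) = 1/d³ truncated to a window."""
        y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        d = np.hypot(x, y)
        d[radius, radius] = 1.0                 # avoid division by zero at the centre
        w = 1.0 / d ** 3

        r_hat = np.where(boundary_mask, boundary_values, 0.0)   # b on the boundary, 0 inside
        chi = boundary_mask.astype(float)                       # characteristic function
        num = convolve2d(r_hat, w, mode='same')
        den = convolve2d(chi, w, mode='same')
        return num / np.maximum(den, 1e-12)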
Application 2 - Boundary Interpolation
Example - Seamless Cloning

Determination of F = {h1, h2, g}:

      argmin_F ‖ â^0_F − f ∗ a ‖                                          (5)

  where â^0_F is the result of the multiscale transform and f ∗ a the
  exact membrane r(x)

Used kernels:
Fig. 9.1: Source image, source: taken from [2]
Fig. 9.2: Membrane mask, source: taken from [2]
Fig. 9.3: Target image, source: taken from [2]
Fig. 9.4: Approximated membrane, source: taken from [1]
Fig. 9.5: Superimposed image with a cloned patch, source: taken from [1]
Fig. 9.6: Result of applying Fig. 9.4 to Fig. 9.5, source: taken from [1]
Overview

5. Application 3 - Gradient Integration
   Kernel Detection
   Example - Gradient Integration
   What does the target filter look like?
   Reconstruction of Target Filter
Application 3 - Gradient Integration
Kernel Detection

Determination of F = {h1, h2, g}:
• choose a natural image I
• a is the divergence of its gradient field:

      a = div ∇I                                                          (8)
      I = f ∗ a                                                           (9)

      argmin_F ‖ â^0_F − f ∗ a ‖                                          (5)

  where â^0_F is the result of the multiscale transform and f ∗ a the
  natural image I

Fig. 10.1: Natural image I, source: taken from [1]
Fig. 10.2: Corresponding gradient image a of Fig. 10.1, source: taken from [1]
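A minimal sketch of how the training signal a = div ∇I can be computed, using forward differences for the gradient and backward differences for the divergence; the slides do not fix a particular discretisation, so this stencil is an assumption.

    import numpy as np

    def div_grad(I):
        """a = div(grad I), one common discretisation: forward differences
        for the gradient, backward differences for the divergence."""
        I = np.asarray(I, dtype=float)
        gx = np.diff(I, axis=1, append=I[:, -1:])       # forward difference in x
        gy = np.diff(I, axis=0, append=I[-1:, :])       # forward difference in y
        div_x = np.diff(gx, axis=1, prepend=gx[:, :1])  # backward difference in x
        div_y = np.diff(gy, axis=0, prepend=gy[:1, :])  # backward difference in y
        return div_x + div_y                            # 5-point Laplacian in the interior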
Application 3 - Gradient Integration
Example - Gradient Integration

Used kernels:
Fig. 11.1: Gradient image of Fig. 11.4, source: taken from [1]
Fig. 11.2: Reconstruction of Fig. 11.1 with F_{5,3}, source: taken from [1]
Fig. 11.3: Reconstruction of Fig. 11.1 with F_{7,5}, source: taken from [1]
Fig. 11.4: Original image (512x512), source: taken from [1]
Fig. 11.5: Absolute errors of Fig. 11.2 (magnified by x50), source: taken from [1]
Fig. 11.6: Absolute errors of Fig. 11.3 (magnified by x50), source: taken from [1]
Application 3 - Gradient Integration
What does the target filter look like?

Task:
• recover the image u (here: u = â^0_F) by solving the Poisson equation

      Δu = div v                                                          (10)

• v = gradient field

Solution:
• Green’s functions

      G(x, x′) = G(‖x − x′‖) = (1/2π) log(1/‖x − x′‖)                     (11)

  define fundamental solutions to the Poisson equation

      ΔG(x, x′) = δ(x, x′)                                                (12)

• δ = discrete delta function
• (10) is defined over an infinite domain with no boundary constraints
  =⇒ the Laplace operator becomes spatially invariant
  =⇒ the Green’s function becomes translation invariant
• the solution of (10) is given by the convolution

      u = G ∗ div v                                                       (13)
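Why (13) solves (10) can be checked in one line, using the translation invariance of convolution together with Eq. (12):

    \Delta u \;=\; \Delta\,(G * \operatorname{div} v)
             \;=\; (\Delta G) * \operatorname{div} v
             \;=\; \delta * \operatorname{div} v
             \;=\; \operatorname{div} v .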
Application 3 - Gradient Integration
Reconstruction of Target Filter

Target Filter Determination:
• use the previously determined F = {h1, h2, g}
• let a be a centered delta function, so that f ∗ a = f and the pyramid
  output reconstructs the target filter itself (Eqs. 8, 9):

      a = div ∇I                                                          (8)
      I = f ∗ a                                                           (9)

• the Green’s function provides a suitable result for f

Fig. 12.1: Reconstruction of the Green’s function, source: taken from [1]
Fig. 12.2: Corresponding space-invariant kernel of Fig. 12.1, source: taken from [1]
Overview

6. Summary
Summary

• approximation of large convolution filters in O(n)
  =⇒ using kernels of small support F = {h1, h2, g}
     + a multiscale pyramid scheme
• kernel determination by optimisation:

      argmin_F ‖ â^0_F − f ∗ a ‖

  where â^0_F is the result of the multiscale transform, f the target
  kernel, and a the input signal
• suitable for different applications such as
  • gradient integration
  • seamless cloning
  • scattered data interpolation
References

[1] Zeev Farbman, Raanan Fattal, Dani Lischinski
    Convolution Pyramids
    Proc. 2011 SIGGRAPH Asia Conference, Article No. 175
    The Hebrew University (2011)

[2] Computer Graphics & Computational Photography Lab
    Supplementary materials of the paper “Convolution Pyramids”
    The Hebrew University (2011)
    http://www.cs.huji.ac.il/labs/cglab/projects/convpyr/

[3] Mathematical Image Analysis Group
    Lecture notes of the “Image Processing and Computer Vision” lecture
    Saarland University, winter term (2011)
    http://www.mia.uni-saarland.de/Teaching/ipcv06.shtml

[4] Peter J. Burt, Edward H. Adelson
    The Laplacian Pyramid as a Compact Image Code
    IEEE Transactions on Communications, Vol. COM-31, No. 4 (April 1983)
Thank you for your attention!