
SUPER RESOLUTION FOR SURVEILLANCE APPLICATION
GERARD ANAND PAKIAM
This thesis is submitted in partial fulfillment
of the requirements for the award of the
Degree of Master of Electrical Engineering (Electronics and Telecommunications)
Electrical Engineering Faculty
Universiti Teknologi Malaysia
MAY 2007
Dedicated to my brothers and sisters…
ACKNOWLEDGEMENT

I would like to take this opportunity to thank all those who have contributed to the successful completion of this project.

Firstly, I would like to express my gratitude to my project supervisor, PM Dr. Syed Abdul Rahman b. Syed Abu Bakar, for providing the much needed supervision, guidance, encouragement and resources. If not for him, this project would never have materialized.

My appreciation also goes out to Usman Ullah Sheikh and Muhammad Amir bin As’ari from CVVIP for sharing their helpful ideas. I would also like to thank all my friends, colleagues and family for all their kind support.
ABSTRACT

Surveillance activities often require zooming into a region of interest (ROI) in an image, such as the face of a suspect or the number plate of a vehicle. However, because of hardware limitations of image acquisition devices, the zoomed region contains many pixel artifacts and insufficient detail. Super resolution (SR) is an image processing technique for reconstructing a high resolution (HR) image from several low resolution (LR) images. The methodology taken to achieve this can be divided into preprocessing and image processing. In preprocessing, colour image frames are selected from video footage of a scene with object movements. These images are cropped to isolate the ROI and also to minimize processing time. The SR processing used is a frequency domain approach. The RGB (Red Green Blue) images are processed as individual components and then concatenated using several functions from the MATLAB Image Processing Toolbox (IPT) and also several standard MATLAB functions. The resulting SR image shows an increase of 177% in pixel count. The image also contains more detail, which can be exploited for zooming in surveillance applications. From the result analysis, the optimum number of LR input images required for generating an SR image is between four and six images. This is a trade-off between computational efficiency and reconstructed image quality.
ABSTRAK

Surveillance activities often require zooming into a region of interest (ROI) in an image, for example the face of a suspect or a vehicle number plate. However, because of limitations in the image acquisition hardware, the zoomed region of the image will contain many pixel artifacts and insufficient detail. Super resolution (SR) is an image processing technique for reconstructing a high resolution (HR) image from several low resolution (LR) images. The methodology taken to achieve this goal can be divided into preprocessing and image processing. In preprocessing, colour images are selected from video showing scenes with a moving object. These images are cropped to isolate the ROI and also to reduce processing time. The SR processing used is a frequency domain approach. The Red Green Blue (RGB) images are processed using several functions from the MATLAB Image Processing Toolbox (IPT) and also several standard MATLAB functions. The image produced by the SR technique shows an increase of 177% in total pixel count. The image also contains more detail, which can be exploited for zooming in surveillance activities. From the analysis of the results obtained, the optimum number of LR images required to generate an HR image is between four and six images. This is a balance between computational efficiency and the quality of the reconstructed image.
TABLE OF CONTENTS

CHAPTER   TITLE

          TITLE PAGE
          DECLARATION
          DEDICATION
          ACKNOWLEDGEMENT
          ABSTRACT
          ABSTRAK
          TABLE OF CONTENTS
          LIST OF FIGURES
          LIST OF ABBREVIATIONS

1         INTRODUCTION
          1.1 Problem Statement
          1.2 Solution
          1.3 Objective
          1.4 Scope

2         LITERATURE RESEARCH & THEORY
          2.1 Super Resolution Concept
          2.2 Image Acquisition System
          2.3 Observation Model
          2.4 Non-uniform Interpolation Approach
          2.5 Frequency Domain Approach
          2.6 Frequency Domain Registration
          2.7 Planar Motion Estimation
              2.7.1 Rotation Estimation
              2.7.2 Shift Estimation
              2.7.3 Aliasing
          2.8 Reconstruction

3         METHODOLOGY
          3.1 Preprocessing
              3.1.1 Video Sequence Acquisition
              3.1.2 Image Extraction
          3.2 SR Image Processing
              3.2.1 Tukey Window
              3.2.2 Fourier Transform
              3.2.3 Rotation Estimation
              3.2.4 Shift Estimation
              3.2.5 Image Reconstruction
          3.3 Super Resolution Application

4         RESULTS
          4.1 Input Images
          4.2 Comparison of Results
          4.3 Analysis of Results
          4.4 Discussion of Results

5         CONCLUSION
          5.1 Conclusion
          5.2 Proposed Future Works

          REFERENCES
          APPENDIX
LIST OF FIGURES

FIGURE NO.   TITLE

2.1    Basic concept for Super Resolution (SR)
2.2    Typical Image Acquisition System
2.3    Observation Model
2.4    Interpolation in HR Grid Sensor
2.5    Low-resolution sensor PSF
2.6    Scheme for SR in Non-uniform Interpolation Approach
2.7    Registration-interpolation-based reconstruction
2.8    Aliasing relationship between LR image and HR image
3.1    Video Acquisition Setup at CVVIP
3.2    Image Acquisition Equipment
3.3    Easy Grab Software
3.4    McFunSoft Video Capture Solution V6.7
3.5    VirtualDub 1.6.15
3.6    Microsoft Office Picture Viewer
3.7    Image Preprocessing Flowchart
3.8    SR Image Processing Flowchart
3.9    Polar Coordinates
3.10   Super Resolution Application
4.1    Input Images
4.2    Comparison of LR and SR image
4.3    Comparison of LR and 4 input SR Images
4.4    Comparison of LR and 5 input SR Images
4.5    Comparison of LR and 6 input SR Images
4.6    Comparison of LR and 7 input SR Images
4.7    Comparison of LR and 8 input SR Images
LIST OF ABBREVIATIONS

CMOS    Complementary Metal Oxide Semiconductor
DFT     Discrete Fourier Transform
FD      Frequency Domain
FFT     Fast Fourier Transform
FT      Fourier Transform
HR      High Resolution
IPT     Image Processing Toolbox
LR      Low Resolution
MATLAB  Matrix Laboratory
ROI     Region Of Interest
SR      Super Resolution
TD      Time Domain
CHAPTER 1
INTRODUCTION
1.1 Problem Statement
Security personnel involved in surveillance often need a very high resolution (HR) digital image, close to that of analog 35 mm film, that has no visible artifacts when the image is magnified. Therefore, finding a way to increase the current resolution level is very much needed.

The most direct solution for increasing spatial resolution is to increase the number of pixels per unit area, whereby the pixel size is in effect reduced. This is done by the sensor manufacturers themselves, of course at a higher cost that is later passed on to consumers.

However, as the pixel size decreases, the amount of light available also decreases. This generates shot noise that degrades the image quality severely.
There is therefore a limit to how far pixel size can be reduced without suffering the effects of shot noise. The optimally limited pixel size is estimated at about 40 µm² for a 0.35 µm CMOS process.
1.2 Solution
The approach studied in this project is to use signal processing techniques to obtain an HR image from multiple observed low resolution (LR) images.

These signal processing techniques have been heavily studied by several researchers such as S.C. Park, David Capel and Deepu Rajan, among others. They use the term Super Resolution (SR) to refer to the above mentioned signal processing techniques.

The major advantage of the signal processing approach is that it may cost less, and the existing LR imaging systems can still be utilized.

SR image reconstruction is useful in cases where multiple frames of the same scene can be obtained. One application is to reconstruct a higher quality digital image from LR images obtained with an inexpensive LR camera/camcorder for surveillance purposes such as frame freeze or printing.
Synthetic zooming of a Region of Interest (ROI) is another important application for surveillance or forensic purposes. Common situations include magnifying objects in a scene, such as the face of a criminal or the license plate of a car.
1.3 Objective
The main objective of this project is to apply Super Resolution (SR) techniques to enhance surveillance images. The aim is to develop software algorithms for current surveillance systems.

The hardware of the current system can therefore be maintained and used, but with the capability to produce much higher resolution images, providing a more cost efficient system as opposed to an overall hardware upgrade.

The intent in this project is to develop an efficient, application specific algorithm using minimal computational resources such as memory and processing power. This would further lower overall cost.

Super resolution techniques are used for three specific purposes, namely: to increase the pixel density of an image, to increase the number of vertical and horizontal pixels, and to increase the size of a low resolution image.
The result is a high resolution image containing more detail or, in the term coined by Vandewalle (2005), containing more resolving power than the original low resolution images.

The specific use of a super resolution image for surveillance application in this project is to obtain a clearer and more detailed zoom of a vehicle number plate.
1.4 Scope
The basic premise for increasing spatial resolution with SR techniques is the availability of multiple LR images captured from the same scene. If the LR (Low Resolution) images have different subpixel shifts from each other and if aliasing is present, then the new information contained in each LR image can be exploited to obtain an HR (High Resolution) image.

The scope of this project is to use the information gathered from the LR images to obtain an HR image by concentrating efforts on eliminating or minimizing image distortion due to warping and aliasing.

The warping distortions of concern are those resulting from translation of image pixels along the x and y axes due to the movement of the target object. Here the target object refers to a moving vehicle.
Aliasing distortion results from inaccurate interpolation of the pixels of an image when zoomed in. This results in an unsatisfactory appearance of aliasing artifacts in the form of ripples and stair-like edges.

The colour images will be acquired from video captured using an analog CCTV (Closed Circuit Television) camera. SR techniques will be used to take advantage of relative scene motions existing from frame to frame of the video sequence.

Images will be selected from relevant frames of the digitized recorded video sequence to be processed offline and not in real time, since in a real life situation this processing would only be done upon request by surveillance personnel.

Images will be processed using several functions from the MATLAB Image Processing Toolbox, and also several standard MATLAB functions.
CHAPTER 2
LITERATURE RESEARCH & THEORY
2.1 Super Resolution Concept
Super Resolution (SR) is the term used to refer to the image processing done to obtain a High Resolution (HR) image from multiple Low Resolution (LR) images. SR techniques are applied to multiple LR images captured from the same scene in order to increase the spatial resolution of a new image of that same scene.

That is, the LR images are subsampled (aliased) as well as shifted with subpixel precision. If the LR images are shifted by integer units, then each image contains the same information, and there is no new information that can be used to reconstruct an HR image.

If the LR images have different subpixel shifts from each other and if aliasing is present, then each image cannot be obtained from the others. In this case, the new information contained in each LR image can be exploited to obtain an HR image.
To obtain different looks at the same scene, some relative scene motion must exist from frame to frame, via multiple scenes or video sequences. Multiple scenes can be obtained from one camera with several captures or from multiple cameras located in different positions.

Frames of one scene can also be obtained from a video sequence. This is the method chosen for this project.

If these scene motions are known or can be estimated within subpixel accuracy, then, by combining these LR images, SR image reconstruction is possible, as illustrated in Figure 2.1 (Park et al., 2003).

Figure 2.1  Basic concept for Super Resolution (SR). (Park et al., 2003)
2.2 Image Acquisition System
In the process of recording a digital image, there is a natural loss of spatial resolution caused by optical distortions (out of focus, diffraction limit, etc.), motion blur due to limited shutter speed, noise that occurs within the sensor or during transmission, and insufficient sensor density, as shown in Figure 2.2.

Figure 2.2  Typical Image Acquisition System (Park et al., 2003)
Although the main concern of an SR algorithm is to reconstruct HR images from undersampled LR images, it also covers image restoration techniques that produce high quality images from noisy, blurred images. The goal of image restoration is to recover a degraded image, but it does not change the size of the image. SR techniques, however, do increase the size of the image because of image interpolation.
2.3 Observation Model
The first step in comprehensively analyzing the SR image reconstruction problem is to formulate an observation model that relates the original HR image to the observed LR images. The observation model presented here is for still images obtained from a video sequence.
Figure 2.3  Observation Model: the desired HR image x of a continuous scene is warped (translation, rotation) into the kth warped HR image xk, blurred (optical, motion, sensor PSF), downsampled by (L1, L2), and corrupted by noise nk to give the kth observed LR image yk.
Figure 2.3 shows the observation model relating an observed Low Resolution (LR) image to the desired High Resolution (HR) image. Going through the block diagram from left to right shows how a desired HR image is corrupted into an observed LR image. The reverse is applied in order to obtain an HR image from several LR images.

The observation model can be summed up in the equation below:

\[ y_k = D B_k M_k x + n_k \quad \text{for } 1 \le k \le p \]
where \( x = [x_1, x_2, \ldots, x_N]^T \) is the desired HR image vector of size \( N = L_1N_1 \times L_2N_2 \), written in lexicographical notation. The parameters \( L_1 \) and \( L_2 \) represent the downsampling factors in the observation model for the horizontal and vertical directions, respectively. \( y_k = [y_{k,1}, y_{k,2}, \ldots, y_{k,M}]^T \) is the observed LR image vector of size \( M = N_1 \times N_2 \).

\( M_k \) is a warp matrix of size \( L_1N_1L_2N_2 \times L_1N_1L_2N_2 \), \( B_k \) represents an \( L_1N_1L_2N_2 \times L_1N_1L_2N_2 \) blur matrix, \( D \) is an \( N_1N_2 \times L_1N_1L_2N_2 \) subsampling matrix, and \( n_k \) represents a lexicographically ordered noise vector.
The motion that occurs during image acquisition is represented by the warp matrix \( M_k \). It may contain global or local translation, rotation, and so on. Since this information is generally unknown, the scene motion needs to be estimated for each frame with reference to one particular frame. The warping process performed on the HR image x is actually defined in terms of LR pixel spacing when we estimate it. Thus, this step requires interpolation when the fractional unit of motion is not equal to the HR sensor grid.
Figure 2.4  Interpolation in HR Grid Sensor. (Legend: original HR grid, original HR pixels, shifted HR pixels.)
An example of global translation is shown in Figure 2.4, where a diamond does not need interpolation, but a triangle must be interpolated from x since it is not located on the HR grid.

Blurring may be caused by an optical system (e.g., out of focus, diffraction limit, aberration, etc.), relative motion between the imaging system and the original scene, and the point spread function (PSF) of the LR sensor. It can be modeled as linear space invariant (LSI) or linear space variant (LSV), and its effects on HR images are represented by the matrix \( B_k \). In single image restoration applications, the optical or motion blur is usually considered.
Figure 2.5  Low-resolution sensor PSF: each LR pixel is modeled as the spatial average \( \frac{1}{4}\sum_{i=0}^{3} a_i \) of the four HR pixels \( a_0, \ldots, a_3 \) it covers on the HR grid.
In SR image reconstruction, however, the finiteness of the physical dimensions of LR sensors is an important source of blur. This LR sensor PSF is usually modeled as a spatial averaging operator, as shown in Figure 2.5. When using SR reconstruction methods, the characteristics of the blur are assumed to be known. However, if it is difficult to obtain this information, blur identification should be incorporated into the reconstruction procedure.
The subsampling matrix D generates aliased LR images from the warped and blurred HR image. Although the size of the LR images is the same here, in more general cases different LR image sizes can be addressed by using a different subsampling matrix (e.g., \( D_k \) for \( 1 \le k \le p \)).

Based on the observation model, existing SR algorithms are presented in the following sections. Brief theory and application in surveillance will be discussed for each algorithm.
2.4 Non-uniform Interpolation Approach
This approach is the most intuitive method for SR image reconstruction. It will therefore be used to explain the basics of super resolution before proceeding to the main approach used in this project, which is the frequency domain approach.

The three stages presented in Figure 2.6 are performed successively:

Firstly, the relative motion is estimated; this is known as registration when the motion information is not known, which is the typical case.

Next, non-uniform interpolation is performed to produce an improved resolution image. Finally, a deblurring process is applied, depending on the observation model used to model the system.
Figure 2.6  Scheme for Super Resolution in the Non-uniform Interpolation Approach: LR images y1, y2, …, yp-1, yp pass through motion estimation/registration, interpolation onto an HR grid, and restoration for blur and noise removal to produce x.
A pictorial example is shown in Figure 2.7. With the relative motion information estimated, the HR image on non-uniformly spaced sampling points is obtained. Then a direct or iterative reconstruction procedure is followed to produce uniformly spaced sampling points.

Figure 2.7  Registration-interpolation-based reconstruction. (Park et al., 2003)
Once an HR image is obtained by non-uniform interpolation, the restoration problem can be addressed to remove blurring and noise. Restoration can be performed by applying any deconvolution method that considers the presence of noise.
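As a rough illustration of the interpolation stage, and not the method implemented in this project, two registered LR images y1 and y2 with a known shift (dx, dy) in LR pixel units could be scattered onto one plane and resampled on a 2x HR grid in MATLAB as follows; all names are illustrative.

[xx, yy] = meshgrid(1:size(y1,2), 1:size(y1,1));          % LR sample positions
px = [xx(:); xx(:) + dx];                                 % x-positions of both frames
py = [yy(:); yy(:) + dy];                                 % y-positions of both frames
v  = [y1(:); y2(:)];                                      % non-uniformly spaced samples
[Xh, Yh] = meshgrid(1:0.5:size(y1,2), 1:0.5:size(y1,1));  % uniform 2x HR grid
xHR = griddata(px, py, v, Xh, Yh, 'cubic');               % non-uniform interpolation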
2.5 Frequency Domain Approach
The frequency domain approach makes explicit use of the aliasing that exists in each LR image to reconstruct an HR image. It is based on a system equation that describes the relationship between the LR images and a desired HR image through the relative motion between the LR images.
The frequency domain approach is based on the following three principles:

i) the shifting property of the Fourier transform;
ii) the aliasing relationship between the continuous Fourier transform (CFT) of an original HR image and the discrete Fourier transform (DFT) of the observed LR images;
iii) the assumption that an original HR image is bandlimited.
These properties make it possible to formulate a system equation relating the aliased DFT coefficients of the observed LR images to samples of the CFT of the unknown image. For example, assume that there are two 1-D LR signals sampled below the Nyquist sampling rate. From the above three principles, the aliased LR signals can be decomposed into the unaliased HR signal, as shown in Figure 2.8.
Figure 2.8  Aliasing relationship between LR image and HR image: the aliased LR signal (CFT) is decomposed into the dealiased HR signal (DFT).

2.6 Frequency Domain Registration
In order to achieve a higher resolving power for the generated super resolution image, aliasing ambiguity needs to be removed during image registration. To do so, the difference between the low-resolution input images needs to be known precisely.

Images that differ by a planar motion are considered. The focus is therefore on obtaining precise knowledge of the motion parameters.

The frequency domain algorithm is used to register not just low resolution images, but also aliased images. A planar motion model is used to estimate the motion parameters.
When a series of images is taken in a short amount of time with only small object motion between the images, we assume that the motion can be described with such a model. In general, a planar model is simpler and has fewer parameters, often making it more robust in the presence of noise.

The planar shift motion model is also extended to include planar rotations, because they are often part of the camera motion. Even a small rotation has a large influence on the final registration. The rotation estimation algorithm is computationally efficient and adapted to work with aliased images.

The algorithm is tested on real sequences of aliased images. The results from these tests validate the assumptions made about the motion. They show, visually, that the algorithm performs well whenever directionality is present in the images.
2.7 Planar Motion Estimation
A frequency domain algorithm is used to estimate the motion parameters between the reference image and each of the other images. As in Vandewalle (2005), only planar motion parallel to the image plane is considered. The motion can be described as a function of three parameters: horizontal and vertical shifts, ∆x1 and ∆x2, and a planar rotation angle φ.

The frequency domain approach allows estimation of the horizontal and vertical shifts and the (planar) rotation separately. Assume we have a reference signal f1(x) and its shifted and rotated version f2(x):
\[ f_2(\mathbf{x}) = f_1\big(R(\mathbf{x} + \Delta\mathbf{x})\big), \quad \text{with} \quad \mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}, \; \Delta\mathbf{x} = \begin{bmatrix} \Delta x_1 \\ \Delta x_2 \end{bmatrix}, \; R = \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{bmatrix} \tag{1} \]
This can be expressed in the Fourier domain as

\[ F_2(\mathbf{u}) = \iint_{\mathbf{x}} f_1\big(R(\mathbf{x} + \Delta\mathbf{x})\big)\, e^{-j2\pi \mathbf{u}^T\mathbf{x}}\, d\mathbf{x} \tag{2} \]
with \( F_2(\mathbf{u}) \) the Fourier transform of \( f_2(\mathbf{x}) \) and the coordinate transformation \( \mathbf{x}' = \mathbf{x} + \Delta\mathbf{x} \). After another transformation \( \mathbf{x}'' = R\mathbf{x}' \), the relation between the amplitudes of the Fourier transforms can be computed as
\[ |F_2(\mathbf{u})| = \left| \iint_{\mathbf{x}''} f_1(\mathbf{x}'')\, e^{-j2\pi (R\mathbf{u})^T\mathbf{x}''}\, d\mathbf{x}'' \right| = |F_1(R\mathbf{u})| \tag{3} \]
where \( |F_2(\mathbf{u})| \) is a rotated version of \( |F_1(\mathbf{u})| \) over the same angle φ as the spatial domain rotation. \( |F_1(\mathbf{u})| \) and \( |F_2(\mathbf{u})| \) do not depend on the shift values ∆x, because the spatial domain shifts only affect the phase values of the Fourier transforms. Therefore, the rotation angle φ can be estimated first from the amplitudes of the Fourier transforms \( |F_1(\mathbf{u})| \) and \( |F_2(\mathbf{u})| \). After compensating for the rotation, the shift ∆x can be computed from the phase difference between \( F_1(\mathbf{u}) \) and \( F_2(\mathbf{u}) \).
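Property (3) can be checked numerically with a short MATLAB sketch (a hedged illustration, not part of the thesis code; the test image and angle are arbitrary):

f1 = im2double(imread('cameraman.tif'));      % reference image
f2 = imrotate(f1, 10, 'bicubic', 'crop');     % rotated version, phi = 10 degrees
A1 = fftshift(abs(fft2(f1)));                 % |F1(u)|
A2 = fftshift(abs(fft2(f2)));                 % |F2(u)|
A1r = imrotate(A1, 10, 'bicubic', 'crop');    % rotate |F1(u)| by the same angle
% A2 and A1r agree up to border and interpolation effects,
% independently of any shift applied to f2.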
2.7.1 Rotation Estimation
The rotation angle between \( |F_1(\mathbf{u})| \) and \( |F_2(\mathbf{u})| \) can be computed as the angle θ for which the Fourier transform of the reference image \( |F_1(\mathbf{u})| \) and the rotated Fourier transform of the image to be registered \( |F_2(R_\theta\mathbf{u})| \) have maximum correlation.

This implies computing a rotation of \( |F_2(\mathbf{u})| \) for every evaluation of the correlation, which is computationally heavy and thus practically difficult.
If \( |F_1(\mathbf{u})| \) and \( |F_2(\mathbf{u})| \) are transformed into polar coordinates, the rotation over the angle φ is reduced to a (circular) shift over φ. The Fourier transforms of the spectra \( |F_1(\mathbf{u})| \) and \( |F_2(\mathbf{u})| \) can then be computed, and φ follows as the phase shift between the two. This requires a transformation of the spectrum to polar coordinates: the data from the regular x1, x2-grid need to be interpolated to obtain a regular r, θ-grid.

The approach of Vandewalle (2005) is computationally more efficient. First of all, he computes the frequency content h as a function of the angle α by integrating over radial lines:
\[ h(\alpha) = \int_{\alpha - \Delta\alpha/2}^{\alpha + \Delta\alpha/2} \int_{0}^{\infty} |F(r, \theta)|\, dr\, d\theta \tag{4} \]
In practice, \( |F(r, \theta)| \) is a discrete signal, so the discrete function h(α) is computed as the average of the values on the rectangular grid that have an angle α − ∆α/2 < θ < α + ∆α/2. Because the rotation angle is computed with a precision of 0.1 degrees, h(α) is evaluated every 0.1 degrees. To get a similar number of signal values \( |F(r, \theta)| \) at every angle, the average is only evaluated on a circular disc of values for which r < ρ (where ρ is the image radius, or half the image size). Finally, as the values for low frequencies are very large compared to the other values and are very coarsely sampled as a function of the angle, the values for which r < ερ are discarded, with ε = 0.1. Thus, h(α) is computed as the average of the frequency values on a discrete grid with α − ∆α/2 < θ < α + ∆α/2 and ερ < r < ρ.
The exact rotation angle can then be computed as the value for which the correlation between the two functions h(α) reaches a maximum. Note that only a one-dimensional correlation has to be computed.
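A minimal sketch of computing h(α) from a centred Fourier magnitude A is given below; the ±1 degree sector width and the radius bounds follow the description in Section 3.2.3, while the variable names are illustrative assumptions.

[hgt, wdt] = size(A);                             % A = fftshift(abs(fft2(f)))
[u, v] = meshgrid((1:wdt) - wdt/2, (1:hgt) - hgt/2);
[th, r] = cart2pol(u, v);                         % polar coordinates of the grid
rho = min(hgt, wdt)/2;                            % image "radius"
alphas = -90:0.1:90;                              % evaluate every 0.1 degrees
h = zeros(size(alphas));
for k = 1:numel(alphas)
    m = abs(th*180/pi - alphas(k)) < 1 & r > 0.1*rho & r < rho;
    h(k) = mean(A(m));                            % average over the disc sector
end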
2.7.2 Shift Estimation
A shift of the image parallel to the image plane can be expressed in the Fourier domain as a linear phase shift:

\[ F_2(\mathbf{u}) = e^{-j2\pi \mathbf{u}^T \Delta\mathbf{x}} F_1(\mathbf{u}) \tag{5} \]
It is well known that the shift parameters ∆x can thus be computed as the slope of the phase difference \( \angle\big(F_2(\mathbf{u})/F_1(\mathbf{u})\big) \). To make the solution less sensitive to noise, a plane is fitted through the phase differences using a least squares method.
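The method used in this project fits that plane by least squares; as a simpler hedged illustration of the same shifting property, the closely related phase correlation technique recovers an integer-pixel shift between two same-size images f1 and f2:

F1 = fft2(f1);  F2 = fft2(f2);
Q = F2 .* conj(F1) ./ (abs(F2) .* abs(F1) + eps);  % normalized cross-power spectrum
q = real(ifft2(Q));                                % phase correlation surface
[~, idx] = max(q(:));                              % peak marks the shift
[dy, dx] = ind2sub(size(q), idx);
dy = dy - 1;  dx = dx - 1;                         % shift estimate (modulo image size)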
2.7.3 Aliasing
If the low-resolution images are aliased, the methods described earlier no longer result in precise registration. This is due to the difference in frequency content of the low resolution images caused by the aliasing. In this case, (2), (3), and (5) no longer hold. Instead of (5), a shift is now expressed as:
\[ F_2(\mathbf{u}) = \sum_{k=-K}^{K} e^{-j2\pi (\mathbf{u} - k\mathbf{u}_s)^T \Delta\mathbf{x}}\, F_1(\mathbf{u} - k\mathbf{u}_s) \tag{6} \]

with \( \mathbf{u}_s \) the sampling frequency and 2K + 1 overlapping spectrum copies at frequency u.
Aliasing terms disturb the linear phase relation between \( F_1(\mathbf{u}) \) and \( F_2(\mathbf{u}) \). However, in cases of limited aliasing, it is still possible to use the above methods by considering only the frequencies that are free of aliasing or only marginally affected by it.

Assume a one-dimensional, bandlimited signal f(x) (with maximum frequency \( u_{max} \)), sampled at a frequency \( u_{max} < u_s < 2u_{max} \). This does not satisfy the Nyquist criterion, and the sampled signal f[k] will have aliasing artifacts; f(x) cannot be perfectly reconstructed from the samples f[k].
Consider two sampled signals, f1[k] and f2[k], sampled at 0, T, 2T, …, kT, … and ∆x, T + ∆x, 2T + ∆x, …, kT + ∆x, …, respectively (with \( T = 1/u_s \) the sampling period). Due to the aliasing, their Fourier transforms differ by more than just a linear phase shift, and the shift estimation method described above no longer works.

However, the values at frequencies \( -u_s + u_{max} < u < u_s - u_{max} \) are free of aliasing and thus the same for the two sampled signals f1[k] and f2[k] (up to a linear phase shift).
So if a low-pass filter is applied to f1[k] and f2[k], the resulting signals f1,low[k] and f2,low[k] are exactly the same up to their shift ∆x. This shift can then be derived using a correlation operator in the time domain or by estimating the linear phase difference in the frequency domain.

An extension to two dimensions is straightforward. The two sampled signals f1[k] and f2[k] are first low-pass filtered (with cutoff frequency \( u_s - u_{max} \)) in the horizontal and vertical dimensions. The filtered images are identical up to their registration parameters and can be registered using the methods described in the rotation estimation and shift estimation sections.
As both methods are applied in the Fourier domain, the filtering step can be avoided by applying the registration algorithms directly to the low frequencies. The rotation estimation is then based on the frequencies for which \( \varepsilon\rho < r < \rho_{max} \) (with \( \rho_{max} \) the minimum of \( (u_s - u_{max})/u_s \) over the horizontal and vertical directions), and the horizontal and vertical shifts are estimated from the phase differences for \( -u_s + u_{max} < u < u_s - u_{max} \).

Using this approach, high-frequency noise is removed together with the aliasing, which results in more accurate registration.
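A hedged sketch of this idea in MATLAB: both frames are low-pass filtered before registration so that only the (nearly) alias-free band is compared. The Gaussian kernel stands in for a filter with cutoff \( u_s - u_{max} \), which is not known here.

lp = fspecial('gaussian', 9, 2);          % stand-in low-pass filter
f1low = imfilter(f1, lp, 'replicate');    % alias-reduced reference frame
f2low = imfilter(f2, lp, 'replicate');    % alias-reduced shifted frame
% f1low and f2low now agree up to the shift (and rotation) and can be
% registered with the rotation and shift estimation methods above.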
2.8 Reconstruction

When the low-resolution images are accurately registered, the samples of the different images can be combined to reconstruct a high-resolution image. The sampling is assumed to use ideal Dirac impulses.
CHAPTER 3
METHODOLOGY
The methodology used in this project can be divided into two parts, namely preprocessing and image processing. As mentioned in the project scope, image processing is done offline, in the context of a request from security personnel.
3.1 Preprocessing
Preprocessing itself can be divided into two processes, namely video sequence acquisition and image extraction from the video sequence. These two processes involve both hardware and software manipulation.
3.1.1 Video Sequence Acquisition
The first step in preprocessing is the acquisition of a video sequence of a moving vehicle. Figure 3.1 shows the equipment setup at the Computer Vision, Video and Image Processing (CVVIP) Lab in UTM. The lab is situated on the 4th floor of the P05 building in the Electrical Engineering Faculty.

The placement of the camera is ideal, mimicking a camera placed at a high point facing a car park. It is easy to identify the number plates of vehicles parked close by, but difficult for vehicles parked further away.
Figure 3.1  Video Acquisition Setup at CVVIP: the camera focuses on the moving object and interfaces to a PC via a video capture card, from which stills are obtained.
The video acquisition equipment used is shown in Figure 3.2 and listed below:

- Video Adapter Card (EureCard – Picolo Pro2)
- Analog CCTV Camera (GANZ)
- TV Zoom Lens (SPACECOM H851VG)
- Camera Tripod

Figure 3.2  Image Acquisition Equipment: video card, camera, lens and tripod.

3.1.2 Image Extraction
The process of extracting each image that will later be processed using super resolution techniques is detailed here. It begins with interfacing the hardware and software, the hardware being the analog CCTV camera (GANZ) and the video adapter card (EureCard – Picolo Pro2). The interface software provided by the video card manufacturer is called Easy Grab, as shown in Figure 3.3.
Figure 3.3  Easy Grab Software
Easy Grab is useful for setting up the camera system. It is important here to set the video card to the highest image capture setting possible, which is 768 x 576 pixels per frame.

The next step is video recording to the computer hard disk. Here McFunSoft Video Capture Solution V6.7 is used, as shown in Figure 3.4. This is proprietary software from McFunSoft Inc.
Figure 3.4  McFunSoft Video Capture Solution V6.7
This video capture software allows the user to decide whether or not to use compression for recording, should hard disk space be a concern. Compression results in slight losses but is used in most practical cases.

However, for this project no compression was used, in order to take advantage of the VirtualDub 1.6.15 frame selection software.

It is also visually more pleasing to set the frame rate to 25 frames per second. This avoids a jerky appearance in the captured video and also minimizes motion blurring of fast moving objects.
The next step in preprocessing is the use of VirtualDub 1.6.15 to remove frames with no significant movement. This makes frame selection easier and also helps to conserve hard disk space. Figure 3.5 shows this software.
Figure 3.5  VirtualDub 1.6.15
The final step in preprocessing is to crop the selected image frames to isolate only the region of interest (ROI), in this case the part of the picture that features only the moving car. This is done to minimize super resolution computation time and system resource usage, so that the computer does not hang during computation.

This is done using Microsoft Office Picture Viewer or any other picture viewer software. Figure 3.6 shows this process.
Figure 3.6  Microsoft Office Picture Viewer
The entire preprocessing part of the methodology is summarized in the flowchart shown in Figure 3.7.
Figure 3.7  Image Preprocessing Flowchart: Start → Camera Setup → Acquire Video Sequence → Scene Selection → Crop Image → End.
3.2 SR Image Processing
Super resolution processing of the low resolution images is briefly described in the basic flowchart shown in Figure 3.8. This should give the reader an understanding of the main procedures involved in super resolution.
Figure 3.8  SR Image Processing Flowchart: Start → Input LR Images → Multiply by Tukey Window → Fourier Transform → Rotation Estimation → Shift Estimation → Image Reconstruction → End.
After at least four images are input into the system, each image, which is in RGB (Red Green Blue) format, is separated into its individual colour components, and the data type of each component is changed from unsigned 8 bit integer (uint8) to double precision floating point (double).
The image colour components are then processed separately before they are finally recombined using concatenation functions in MATLAB.
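As a hedged illustration of this step (the frame file name is hypothetical), the separation and type conversion can be written as:

rgb = imread('frame430.tif');      % one input LR frame (hypothetical name)
imRed   = double(rgb(:,:,1));      % red component, uint8 -> double
imGreen = double(rgb(:,:,2));      % green component
imBlue  = double(rgb(:,:,3));      % blue component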
3.2.1 Tukey Window
The input images, a set of M low-resolution images fLR,m (m = 1, 2, …, M), are multiplied with a Tukey window to make them circularly symmetric, allowing more accurate registration. The Tukey window, also known as the tapered cosine window, has the mathematical expression below:
\[ w(n) = \begin{cases} 1.0, & 0 \le |n| \le \alpha\dfrac{N}{2} \\[8pt] \dfrac{1}{2}\left(1 + \cos\left(\pi\,\dfrac{|n| - \alpha\frac{N}{2}}{2(1-\alpha)\frac{N}{2}}\right)\right), & \alpha\dfrac{N}{2} \le |n| \le \dfrac{N}{2} \end{cases} \]
The window length is L = N + 1. For documentation purposes, the windowed images are called fLR,w,m.
In MATLAB code it is implemented as below:

s_im = size(im{1});
w1 = window(@tukeywin, s_im(1), 0.25);    % vertical 1-D Tukey window
w2 = window(@tukeywin, s_im(2), 0.25)';   % horizontal 1-D Tukey window
w = w1 * w2;                              % separable 2-D window
for i = 1:IMAGESNUMBER
    % window each image and zero-pad it by 32 pixels on every side
    im{i} = [zeros(32, s_im(2)+64); ...
             zeros(s_im(1), 32)  im{i}.*w  zeros(s_im(1), 32); ...
             zeros(32, s_im(2)+64)];
end
3.2.2 Fourier Transform
The 2-D Discrete Fourier Transform (DFT) of an image f(x,y), denoted by F(u,v), is given by:

\[ F(u,v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x,y)\, e^{-j2\pi(ux/M + vy/N)} \]
This transform can easily be applied to images in MATLAB as follows. After being multiplied with the Tukey window, the images are transformed from the time domain to the frequency domain using the Fast Fourier Transform. This is implemented by calling the standard MATLAB function fft2 together with fftshift, which shifts the zero-frequency component to the center of the spectrum. The syntax is

A{k} = fftshift(abs(fft2(a{k})));

where a{k} is an input image colour component. fft2 computes the 2-D DFT with the origin of the output data at the top left and the four quarter periods meeting at the centre of the frequency rectangle; fftshift then recentres the spectrum, which is equivalent to multiplying a(x,y) by (−1)^(x+y) before transforming.

Padding is not required for this call of fft2 since the input image has already been multiplied with the Tukey window. The Fourier transforms FLR,w,m of all low-resolution images are computed in this manner.
The power spectrum is obtained using the function abs, which computes the magnitude as the square root of the sum of the squares of the real and imaginary parts of each element of the array. The Fourier spectrum is defined mathematically as:

\[ |F(u,v)| = \left[ R^2(u,v) + I^2(u,v) \right]^{1/2} \]

Finally, the origin of the power spectrum is moved to the center of the frequency rectangle to obtain A{k}.
3.2.3 Rotation Estimation
The rotation angle between every image fLR,w,m (m = 2, …, M) and the reference image fLR,w,1 is estimated by calling the following function:

[rot_angle, c] = estimate_rotation(a, dist_bounds, precision)

The input parameter dist_bounds gives the minimum and maximum radius to be used, precision gives the precision with which the rotation angle is computed, and the input images a are specified as a{1}, a{2}, etc.
The first step in rotation estimation is to compute the polar coordinates (r, θ) of the image samples. This is done by calling the function

[th, ra] = cart2pol(x, y);

which transforms two-dimensional Cartesian coordinates stored in the corresponding elements of arrays x and y into polar coordinates. The mapping is based on the following formulas, where θ is the angle of a point P from the x-axis and ρ is the radial distance of P from the origin. In MATLAB these are

theta = atan2(y, x)
rho   = sqrt(x.^2 + y.^2)
This is illustrated in Figure 3.9.

Figure 3.9  Polar Coordinates: point P at radius ρ and angle θ from the x-axis.
Next, for every angle α, the average value hm(α) of the Fourier coefficients for which α − 1 < θ < α + 1 and 0.1ρ < r < ρmax is computed. The angles are expressed in degrees and hm(α) is evaluated every 0.1 degrees. A typical value used for ρmax is 0.6.

After that, the maximum of the correlation between h1(α) and hm(α) between −30 and 30 degrees is found. This is the estimated rotation angle φm. The colour image component is rotated easily in the frequency domain in the polar coordinate plane.
The rotated colour component image is then converted back into the time domain using the inverse Fast Fourier Transform via the MATLAB function

h_C = real(ifft(H_C));

Notice that only the real part is taken into consideration for the next step of rotation compensation.
After an image rotation has been successfully estimated, we need to compensate for it. This is done using an IPT function called imrotate, which rotates image fLR,w,m by −φm to cancel the rotation. The function call is as below:

s2{i} = imrotate(s{i}, -phi_est(i), 'bicubic', 'crop');

This rotates the time domain image s{i} by -phi_est(i) in the counterclockwise direction, i.e. by phi_est(i) in the clockwise direction. 'bicubic' specifies the type of interpolation used; in MATLAB this takes a weighted average of translated pixel values in a 4-by-4 neighbourhood for each output pixel value.

The image size is normally increased automatically by padding to fit the rotation. When 'crop' is included in the argument, the central part of the rotated image is cropped to the same size as the original. This portion is done in the time domain.
3.2.4 Shift Estimation
In shift estimation, the horizontal and vertical shifts between every image fLR,w,m (m = 2, …, M) and the reference image fLR,w,1 are estimated by calling the function:

delta_est = estimate_shift(s, n)

The input images s are specified as s{1}, s{2}, etc., and n specifies the number of low frequency pixels to be used. The output delta_est is an M-by-2 matrix, with M the number of images.
Using the Fast Fourier Transform to get the frequency domain representation, as discussed previously, it is then possible to compute the phase difference between image m and the reference image as \( \angle(F_{LR,w,m} / F_{LR,w,1}) \).

Next, we determine the central part of the frequency spectrum to be used, after which we compute the x and y coordinates of the pixels. For all frequencies \( -u_s + u_{max} < u < u_s - u_{max} \), we write the linear equation describing a plane through the computed phase differences with unknown slopes ∆x.

The shift parameters ∆xm are found as the least squares solution of these equations, whereby the mathematical representation is
\[ \text{residual: } r_i = y_i - \hat{y}_i \]

where \( y_i \) is the observed response value and \( \hat{y}_i \) is the fitted response value, and

\[ \text{summed square of residuals: } S = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 \]

where n is the number of data points included in the fit and S is the sum-of-squares error estimate.
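A minimal sketch of such a plane fit is shown below; it assumes ang holds the phase differences at the retained low frequencies with coordinates (u, v), and the names and sign convention are illustrative.

Amat = [u(:) v(:) ones(numel(u), 1)];   % plane model: a*u + b*v + c
coef = Amat \ ang(:);                   % least squares solution for [a; b; c]
delta_est = -coef(1:2)' / (2*pi);       % slopes a, b give the shift estimate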
3.2.5 Image Reconstruction
A high-resolution image fHR is reconstructed from the registered images fLR,m (m = 1, …, M). For every image fLR,m, the coordinates of its pixels in the coordinate frame of fLR,1 are computed using the estimated registration parameters.

From these known samples, the values on a regular high-resolution grid are interpolated using bicubic interpolation, because of its low computational complexity and good results. Each output pixel value is a weighted average of the pixels in the nearest 4-by-4 neighborhood.
As mentioned earlier, all processing of the colour image is done on the individual RGB components. To obtain the final super resolution image, these components are therefore concatenated into one high resolution image, as demonstrated in the following MATLAB code.
imRed   = interpolation(imRed,   delta_est, phi_est, factor);  % reconstruct red plane
imGreen = interpolation(imGreen, delta_est, phi_est, factor);  % reconstruct green plane
imBlue  = interpolation(imBlue,  delta_est, phi_est, factor);  % reconstruct blue plane

im_result(:,:,1) = imRed;    % concatenate the three planes
im_result(:,:,2) = imGreen;  % into one RGB image
im_result(:,:,3) = imBlue;
where the final super resolution image is called im_result.
3.3 Super Resolution Application
The graphical user interface for the super resolution application is shown in
Figure 3.10.
Figure 3.10  Super Resolution Application
The first step is to give some source images to the algorithm. At least four images are needed to generate the super-resolution image. Click on the “Add” button and select the images to add. They have to be of the same size and the same type.

Once the images are added, they can be previewed by clicking on their names. If a mistake was made, or another image needs to be selected, the image, or all the images, can be removed by clicking on the corresponding button.

The super-resolution image can now be generated with the parameters that have been chosen. Calculation can take a long time. When the calculation is concluded, the super resolution image is displayed in the result frame.

Parts of the image can then be zoomed into by clicking the “zoom image” button. The image will be opened in a new window. The results can then be saved by clicking the “save” button.

The estimated motion parameters are also displayed in the graphical user interface for the user’s reference.
CHAPTER 4
RESULTS
4.1 Input Images
The input images were captured from the CVVIP lab, where the camera was facing the Electrical Engineering Faculty lecturers’ parking area. They were taken in mid-afternoon, so the target vehicle was in a well lit environment.

The vehicle moved from a stationary position (parked) to slowly driving away from the parking area. Vehicle movement was therefore relatively slow and there was no motion blur to worry about.

Figure 4.1 shows the input images along with their frame numbers. Frames 430 to 437 were selected for image processing because the ambient conditions were best in terms of lighting, absence of shadows and absence of obstructions. These images also have a direct line of sight to the vehicle number plate.
Figure 4.1  Input Images (Frames #430 to #437)
These images are each 418 x 316 pixels in size. Figure 4.2 shows the comparison in size between a single input image and a super resolution image generated from four input images. The super resolution image is 554 x 421 pixels in size.

Figure 4.2  Comparison of LR input image and SR output image
4.2 Comparison of Results
After going through all the processes mentioned previously, it is useful to get visual confirmation of the differences between the input images and the output image. Figures 4.3 through 4.7 show zooms of these pictures, as well as the improvements, or lack thereof, as the number of input images is increased.

The comparison starts with the minimum of 4 input images and stops at 8 input images, at which point the processing time becomes too long and the visual quality actually starts to degrade.
Figure 4.3  Comparison of LR and 4 input SR Images

Figure 4.4  Comparison of LR and 5 input SR Images

Figure 4.5  Comparison of LR and 6 input SR Images

Figure 4.6  Comparison of LR and 7 input SR Images

Figure 4.7  Comparison of LR and 8 input SR Images
4.3 Analysis of Results
One of the parameters taken into account in judging the improvement obtained through super resolution techniques is a quantitative measure of the pixel increment between the low resolution input image and the super resolution output image. The percentage pixel increment P is calculated below.
\[ P = \frac{\#\,\text{output SR pixels}}{\#\,\text{input LR pixels}} \times 100\% = \frac{554 \times 421}{418 \times 316} \times 100\% = 176.57\% \]
The qualitative analysis of the input and output images, as well as the comparison between output images generated from varying numbers of input images, is best done by visual inspection. Though visual inspection is sometimes subjective, a general consensus usually gives a reliable judgment.
4.4 Discussion of Results
The output super resolution image has almost twice the number of pixels of the input low resolution image. This is consistent with the theoretical limit of this super resolution application. The SR image also contains more detail than the LR image. This shows a higher resolving power, which is obtained when the aliasing ambiguity in an image is removed.
Another important question raised during the presentation was: what is the optimal number of images to use when reconstructing a high-resolution image? The exact answer depends on many parameters, such as the registration accuracy, the imaging model, the total frequency content, and so forth. Intuitively, two effects need to be balanced.
On one hand, the more images there are, the better the reconstruction should be. However, based on the experiments conducted for this project, this is not the case.

On the other hand, there is a limit to the improvements that can be obtained: even from a very large number of very low-resolution images of a scene, it will not be possible to reconstruct a sharp, high-resolution image. Blur, noise, and inaccuracies in the signal model limit the increase in resolving power that can be obtained.

The performance increases rapidly with the first six images, but the improvement is marginal beyond that. By the seventh image there were already artifacts like ripples present in the image. Finally, by the eighth image there are too many noise-like artifacts, which severely degrade the image quality.
Assuming the low resolution images were subsampled by almost two, this is the theoretical limit for which the algorithm should be able to reconstruct an image of almost double the resolution. In other words, four images are the minimum needed for a well-determined system when upsampling by two. The maximum, on the other hand, should be limited to six.
CHAPTER 5
CONCLUSION
5.1 Conclusion
The frequency domain approach used in this project has fulfilled the objectives of the project well. It has managed to increase the pixel density of the image by increasing the number of vertical and horizontal pixels, and thereby the size of the image. Moreover, the resolving power of the generated super resolution image is also increased, as it contains more detail than the low resolution input images.
The algorithm used for this frequency domain approach is also efficient, provided at least four to six low resolution images are supplied. Computation time is also greatly reduced if only the region of interest is provided, by cropping the larger image. As can be seen in the results, there is a good trade-off between image quality and computational constraints.
It can also be concluded that the approach adopted for this project is well suited for surveillance applications, because the LR images are sufficiently large and have directionality in object movement.

Example applications of this system would be car parks, which are notorious for kidnappings and theft. Vehicles there move slowly from a stationary position, and the many speed bumps prevent them from speeding. When vehicles move slowly there is no motion blur in the captured stills, making the super resolution process simpler.
5.2 Proposed Future Works
Apart from the suggested car park surveillance application, this frequency domain super resolution approach can be adapted to other surveillance applications with slight modifications to the system model.

One such application would be scenes where vehicles move at high speed, such as at traffic lights when vehicles do not stop for a red light. Compared to conventional systems already available, the video camera could be located at a distance or on some tall building in the vicinity.

Doing this, however, would require incorporating a motion blur model into the registration process of the SR system, and compensating for it alongside the necessary rotation and shift estimation.
Another surveillance application would be in poorly lit areas, also notorious for all sorts of crime. Insufficient lighting conditions can generate shot noise in the camera sensors. It is possible to incorporate a shot noise removal model into the registration process of the SR system.
REFERENCES

[1] Sung Cheol Park, Min Kyu Park and Moon Gi Kang, “Super-Resolution Image Reconstruction: A Technical Overview,” IEEE Signal Processing Magazine, Vol. 20, No. 3, pp. 21-36, May 2003.

[2] Patrick Vandewalle, Sabine Süsstrunk and Martin Vetterli, “A Frequency Domain Approach to Registration of Aliased Images with Application to Super-Resolution,” EURASIP Journal on Applied Signal Processing, 2005.

[3] David Capel and Andrew Zisserman, “Computer Vision Applied to Super Resolution,” IEEE Signal Processing Magazine, Vol. 20, No. 3, pp. 75-86, May 2003.

[4] B. Marcel, M. Briot and R. Murrieta, “Calculation of Rotation using Fourier Transform,” Treatment of Signal, Vol. 14, No. 2, pp. 135-149, 1997.

[5] L. Lucchese and G. M. Cortelazzo, “A Noise-Robust Frequency Domain Technique for Estimating Planar Roto-Translations,” IEEE Transactions on Signal Processing, Vol. 48, No. 6, pp. 1769-1786, 2000.

[6] D. Keren, S. Peleg and R. Brada, “Image Sequence Enhancement using Sub-Pixel Displacement,” Proc. IEEE Conference on Computer Vision and Pattern Recognition, pp. 742-746, 1988.

[7] Gonzalez R.C., Woods R.E. and Eddins S.L. (2004). Digital Image Processing using MATLAB. Upper Saddle River, NJ: Prentice Hall.

[8] Roberts M.J. (2004). Signals and Systems: Analysis using Transform Methods and MATLAB. NY: McGraw Hill.
APPENDIX
CODE LISTING
% SUPERRESOLUTION - Graphical User Interface for Super-Resolution Imaging
%
% This program allows a user to perform registration of a set of low
% resolution input images and reconstruct a high resolution image from them.
% Multiple image registration and reconstruction methods have been
% implemented. As input, the user can either select existing images, or
% generate a set of simulated low resolution images from a high resolution
% image.
function varargout = superresolution(varargin)
% Begin GUI initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @superresolution_OpeningFcn, ...
                   'gui_OutputFcn',  @superresolution_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
%%%%%%%%%%%%%%%%%%
% Initialisation %
%%%%%%%%%%%%%%%%%%

% --- Executes just before superresolution is made visible.
function superresolution_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to superresolution (see VARARGIN)
% Global variables of the program
global IMAGESNUMBER;    % number of input images
global IMAGESDATA;      % name and path of the images
global IMAGESSTRING;    % list of image names for the list of images
global IMAGES;          % input images
global IMAGESINFO;      % information about the images, like the bit depth, etc.
global PARAMETERSWILLBEGIVEN;    % true if the user wants to give the parameters
global ISPARAMETERSGIVENBYUSER;  % true if the user has chosen parameters
global PARAMETERSGIVENBYUSER;    % parameters given by the user
global TEMPRESULT;
IMAGESNUMBER = 0;
IMAGESDATA = [];
IMAGESSTRING = {};
IMAGES = {};
IMAGESINFO = {};
PARAMETERSWILLBEGIVEN = false;
ISPARAMETERSGIVENBYUSER = false;
PARAMETERSGIVENBYUSER = {};
TEMPRESULT = [];
% Initialize imagePreview
axes(handles.imagePreview);
set(handles.imagePreview,'Visible', 'off');
% Initialize imageResult
axes(handles.imageResult);
set(handles.imageResult,'Visible', 'off');
% Choose default command line output for superresolution
handles.output = hObject;
% Update handles structure
guidata(hObject, handles);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Add a new image and save its informations %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function addImageFile(handles)
global IMAGESNUMBER;
global IMAGESDATA;
global IMAGESSTRING;
global PARAMETERSWILLBEGIVEN;
global ISPARAMETERSGIVENBYUSER;
global PARAMETERSGIVENBYUSER;
% get the image
[fnameMult, pname] = uigetfile('*.tif', 'Choose an image', 'MultiSelect', 'on');
if iscell(fnameMult)
nbr = size(fnameMult,2);
elseif isnumeric(fnameMult)
if fnameMult == 0
nbr = 0;
end
elseif isstr(fnameMult)
nbr = 1;
fnameMult = {fnameMult};
end
if nbr > 0
for i = 1:nbr
fname = fnameMult{1,i};
% activation of the list
set(handles.listImages, 'Max', 1);
set(handles.listImages, 'Enable', 'on');
set(handles.listImages, 'Value', 1);
% save the informations
IMAGESNUMBER = IMAGESNUMBER + 1;
IMAGESDATA(IMAGESNUMBER).name = fname;
IMAGESDATA(IMAGESNUMBER).path = pname;
IMAGESSTRING{IMAGESNUMBER} = fname;
set(handles.listImages, 'String', IMAGESSTRING);
% activate the selection of the image if it is the first image
if IMAGESNUMBER == 1
selectImage(handles, fname, pname);
end
% erase parameters
removeParameters(handles);
% activate the button for removing images
set(handles.removeButton, 'Enable', 'on');
set(handles.removeAllButton, 'Enable', 'on');
set(handles.removeMenu, 'Enable', 'on');
set(handles.removeAllMenu, 'Enable', 'on');
% activate the other buttons if enough images
if IMAGESNUMBER >= 4
set(handles.radiobutton1, 'Enable', 'on');
set(handles.radiobutton2, 'Enable', 'on');
set(handles.popupmenu2, 'Enable', 'on');
if(PARAMETERSWILLBEGIVEN)
set(handles.popupmenu1, 'Enable', 'off');
if(ISPARAMETERSGIVENBYUSER)
set(handles.removeparameters, 'Enable', 'on');
set(handles.resultButton, 'Enable', 'on');
set(handles.resultMenu, 'Enable', 'on');
else
set(handles.enterparameters, 'Enable', 'on');
set(handles.resultButton, 'Enable', 'off');
set(handles.resultMenu, 'Enable', 'off');
end
else
set(handles.popupmenu1, 'Enable', 'on');
set(handles.resultButton, 'Enable', 'on');
set(handles.resultMenu, 'Enable', 'on');
end
end
end
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Remove an image and its informations %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function removeImageFile(handles, toRemove)
global IMAGESNUMBER;
global IMAGESDATA;
global IMAGESSTRING;
% erase parameters
removeParameters(handles);
% remove the image from the data list
IMAGESDATA(:,toRemove) = [];
% remove the image from the string list
charString = char(IMAGESSTRING);
charString(toRemove,:) = [];
IMAGESSTRING = cellstr(charString);
IMAGESNUMBER = IMAGESNUMBER - 1;
% activate the selection of the nearest image
if IMAGESNUMBER >= 1
if toRemove >= IMAGESNUMBER
toDisplay = IMAGESNUMBER;
else
toDisplay = toRemove;
end
set(handles.listImages, 'Value', toDisplay);
set(handles.listImages, 'String', IMAGESSTRING);
fname = IMAGESDATA(toDisplay).name;
pname = IMAGESDATA(toDisplay).path;
selectImage(handles, fname, pname);
else
set(handles.listImages, 'String', IMAGESSTRING);
set(handles.listImages, 'Max', 2);
set(handles.listImages, 'Value', []);
set(handles.listImages, 'Enable', 'inactive');
set(handles.imageName, 'String', '');
set(handles.imageSize, 'String', '');
% displayImage;
axes(handles.imagePreview);
image([]);
set(handles.imagePreview,'Visible', 'off');
end
% deactivate the button for the calculation if not enough images
if IMAGESNUMBER < 4
set(handles.resultButton, 'Enable', 'off');
set(handles.resultMenu, 'Enable', 'off');
set(handles.enterparameters, 'Enable', 'off');
set(handles.removeparameters, 'Enable', 'off');
set(handles.popupmenu1, 'Enable', 'off');
set(handles.radiobutton1, 'Enable', 'off');
set(handles.radiobutton2, 'Enable', 'off');
set(handles.popupmenu2, 'Enable', 'off');
end
% deactivate the button for removing images if no image left
if IMAGESNUMBER == 0
set(handles.removeButton, 'Enable', 'off');
set(handles.removeAllButton, 'Enable', 'off');
set(handles.removeMenu, 'Enable', 'off');
set(handles.removeAllMenu, 'Enable', 'off');
set(handles.zoomPreview, 'Enable', 'off');
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Remove all images and their information %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function removeAllImages(handles)
global IMAGESNUMBER;
global IMAGESDATA;
global IMAGESSTRING;
global IMAGES;
IMAGESNUMBER = 0;
IMAGESDATA = [];
IMAGESSTRING = {};
IMAGES = [];
removeParameters(handles);
set(handles.resultButton, 'Enable', 'off');
set(handles.resultMenu, 'Enable', 'off');
set(handles.listImages, 'Max', 2);
set(handles.listImages, 'Value', []);
set(handles.listImages, 'String', IMAGESSTRING);
set(handles.listImages, 'Enable', 'inactive');
set(handles.imageName, 'String', '');
set(handles.imageSize, 'String', '');
axes(handles.imagePreview);
image([]);
set(handles.imagePreview,'Visible', 'off');
set(handles.removeButton, 'Enable', 'off');
set(handles.removeAllButton, 'Enable', 'off');
set(handles.removeMenu, 'Enable', 'off');
set(handles.enterparameters, 'Enable', 'off');
set(handles.removeparameters, 'Enable', 'off');
set(handles.radiobutton1, 'Enable', 'off');
set(handles.radiobutton2, 'Enable', 'off');
set(handles.popupmenu2, 'Enable', 'off');
set(handles.zoomPreview, 'Enable', 'off');
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Display an image in an axes area %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function displayImage(handles, filePath, destination)
handles.imSource = imread(filePath);
axes(destination);
image(handles.imSource);
colormap(gray);
set(destination,'Visible', 'off');
set(handles.zoomPreview, 'Enable', 'on');
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Check that the images are correct %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function test = testImages(handles)
global IMAGESNUMBER;
global IMAGESDATA;
global IMAGES;
global IMAGESINFO;
global IMAGESSTRING;
test = true;
IMAGES = {};
IMAGESINFO = {};
IMAGES{1} = imread(strcat(IMAGESDATA(1).path, IMAGESDATA(1).name));
IMAGESINFO{1} = imfinfo(strcat(IMAGESDATA(1).path, IMAGESDATA(1).name));
for i=2:IMAGESNUMBER
tempImage = imread(strcat(IMAGESDATA(i).path, IMAGESDATA(i).name));
tempInfo = imfinfo(strcat(IMAGESDATA(i).path, IMAGESDATA(i).name));
if ~isequal(size(IMAGES{1}), size(tempImage))
errordlg(['Image ' IMAGESSTRING{i} ' does not have the same size as image ' IMAGESSTRING{1}], 'Size error');
test = false;
break;
elseif not(IMAGESINFO{1}.BitDepth == tempInfo.BitDepth)
errordlg(['Image ' IMAGESSTRING{i} ' is not the same type as image ' IMAGESSTRING{1}], 'Type error');
test = false;
break;
else
IMAGES{i} = tempImage;
IMAGESINFO{i} = tempInfo;
end
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Select an image by clicking on it %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function selectImage(handles, fname, pname)
displayImage(handles, [pname filesep fname], handles.imagePreview);
set(handles.imageName, 'String', fname);
info = imfinfo([pname filesep fname]);
set(handles.imageSize, 'String', [num2str(info.Width) ' * ' num2str(info.Height)]);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Enter the motion parameters %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function enterParameters(handles)
global IMAGESNUMBER;
global ISPARAMETERSGIVENBYUSER
global PARAMETERSGIVENBYUSER
tempTest = true;
tempRotations = {};
tempTranslationx = {};
tempTranslationy = {};
canceled = false;
try
% getting the rotation parameters
for i=1:IMAGESNUMBER
prompt(i) = {['Rotation of image ' int2str(i)]};
end
title = 'Please enter the rotation parameters';
lines = 1;
tempRotations = inputdlg(prompt,title,lines);
% test rotation parameters
if size(tempRotations,1) > 0
for i=1:size(tempRotations,1)
if(size(tempRotations{i},1) == 0 | str2num(tempRotations{i}) > 10 | str2num(tempRotations{i}) < -10)
tempTest = false;
end
end
if(tempTest)
PARAMETERSGIVENBYUSER{1} = tempRotations;
end
else
tempTest = false;
canceled = true;
end
% getting the x translation parameters
if(tempTest)
for i=1:IMAGESNUMBER
prompt(i) = {['Translation x of image ' int2str(i)]};
end
title = 'Please enter the translation x parameters';
lines = 1;
tempTranslationx = inputdlg(prompt,title,lines);
% test translation x parameters
if size(tempTranslationx,1) > 0
for i=1:size(tempTranslationx,1)
if(size(tempTranslationx{i},1) == 0 | str2num(tempTranslationx{i}) > 10 | str2num(tempTranslationx{i}) < -10)
tempTest = false;
end
end
if(tempTest)
PARAMETERSGIVENBYUSER{2} = tempTranslationx;
end
else
tempTest = false;
canceled = true;
end
end
% getting the y translation parameters
if(tempTest)
for i=1:IMAGESNUMBER
prompt(i) = {['Translation y of image ' int2str(i)]};
end
title = 'Please enter the translation y parameters';
lines = 1;
tempTranslationy = inputdlg(prompt,title,lines);
% test translation y parameters
if size(tempTranslationy,1) > 0
for i=1:size(tempTranslationy,1)
if(size(tempTranslationy{i},1) == 0 | str2num(tempTranslationy{i}) > 10 | str2num(tempTranslationy{i}) < -10)
tempTest = false;
end
end
if(tempTest)
PARAMETERSGIVENBYUSER{3} = tempTranslationy;
end
else
tempTest = false;
canceled = true;
end
end
if(tempTest)
set(handles.rotationlist, 'String', PARAMETERSGIVENBYUSER{1});
set(handles.shiftxlist, 'String', PARAMETERSGIVENBYUSER{2});
set(handles.shiftylist, 'String', PARAMETERSGIVENBYUSER{3});
ISPARAMETERSGIVENBYUSER = true;
set(handles.enterparameters, 'Enable', 'off');
set(handles.removeparameters, 'Enable', 'on');
set(handles.popupmenu1, 'Enable', 'off');
set(handles.resultButton, 'Enable', 'on');
set(handles.resultMenu, 'Enable', 'on');
for i=1:IMAGESNUMBER
phi(i) = str2num(PARAMETERSGIVENBYUSER{1}{i});
deltax(i,1) = str2num(PARAMETERSGIVENBYUSER{2}{i});
deltay(i,1) = str2num(PARAMETERSGIVENBYUSER{3}{i});
end
PARAMETERSGIVENBYUSER{1} = phi;
PARAMETERSGIVENBYUSER{2} = deltax;
PARAMETERSGIVENBYUSER{3} = deltay;
else
if (size(tempRotations,1) > 0 & (canceled == false))
errordlg('One parameter is not correct');
PARAMETERSGIVENBYUSER = {};
end
end
catch
errordlg('One parameter is not correct');
PARAMETERSGIVENBYUSER = {};
end
%%%%%%%%%%%%%%%%%%%%%%%%%
% Remove all parameters %
%%%%%%%%%%%%%%%%%%%%%%%%%
function removeParameters(handles)
global ISPARAMETERSGIVENBYUSER;
global PARAMETERSGIVENBYUSER;
global PARAMETERSWILLBEGIVEN;
PARAMETERSGIVENBYUSER = {};
ISPARAMETERSGIVENBYUSER = false;
set(handles.rotationlist, 'String', {});
set(handles.shiftxlist, 'String', {});
set(handles.shiftylist, 'String', {});
set(handles.removeparameters, 'Enable', 'off');
set(handles.resultButton, 'Enable', 'off');
set(handles.resultMenu, 'Enable', 'off');
if (PARAMETERSWILLBEGIVEN)
set(handles.enterparameters, 'Enable', 'on');
else
set(handles.enterparameters, 'Enable', 'off');
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Generate the output image %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function generateImage(handles)
global IMAGESNUMBER;
global IMAGESDATA;
global ISPARAMETERSGIVENBYUSER;
global PARAMETERSWILLBEGIVEN;
global PARAMETERSGIVENBYUSER;
global TEMPRESULT;
global IMAGES;
global IMAGESINFO;
% choose the case
if (ISPARAMETERSGIVENBYUSER == true & get(handles.radiobutton1, 'Value') == 0)
registration = 'manual';
else
switch get(handles.popupmenu1,'Value')
case 1
registration = 'vandewalle';
case 2
registration = 'marcel';
case 3
registration = 'lucchese';
case 4
registration = 'keren';
otherwise
disp('Unknown registration method.');
end
end
switch get(handles.popupmenu2,'Value')
case 1
reconstruction = 'interpolation';
case 2
reconstruction = 'papoulis-gerchberg';
otherwise
disp('Unknown reconstruction method.');
end
barNumber = 4;
h = waitbar(0/barNumber, 'Please wait...');
set(h, 'Name', 'Preprocessing images...');
% preprocessing images
for i=1:IMAGESNUMBER
im{i} = IMAGES{i};
im{i} = double(im{i})/(2^(IMAGESINFO{i}.BitDepth/IMAGESINFO{i}.SamplesPerPixel));
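% e.g. an 8-bit-per-channel RGB image has BitDepth 24 and SamplesPerPixel 3,
% so dividing by 2^(24/3) = 256 scales the pixel values into [0, 1)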
imColor{i} = im{i};
if (size(size(IMAGES{1}), 2) == 3)
im{i} = rgb2gray(im{i});
end
%% select part of the image or full image to reconstruct
% full image
if (get(handles.radio_rec_full, 'Value') == 1)
im_part{i} = imColor{i};
else
% only part of the high resolution image will be reconstructed
% because of memory limitations and to show the details appropriately
part = get(handles.text_rec_part, 'String');
part = str2double(part);
half_rem_size = round(size(im{i})*(1-part/100)/2);
im_part{i} = ...
imColor{i}(1+half_rem_size(1):end-half_rem_size(1),...
1+half_rem_size(2):end-half_rem_size(2),:);
end
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
waitbar(1/barNumber);
set(h, 'Name', 'Multiplying images with Tukey window...');
% multiply the images with a Tukey window for more accurate registration
% (by making them circularly symmetric)
s_im=size(im{1});
w1=window(@tukeywin,s_im(1),0.25);
w2=window(@tukeywin,s_im(2),0.25)';
w=w1*w2;
for i=1:IMAGESNUMBER
im{i} = [zeros(32,s_im(2)+64); ...
zeros(s_im(1),32) im{i}.*w zeros(s_im(1),32); ...
zeros(32,s_im(2)+64)];
end
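% the 32-pixel zero border pads the windowed images; this keeps shifted or
% rotated image content inside the frame for the frequency-domain registration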
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
waitbar(2/barNumber);
set(h, 'Name', 'Motion estimation...');
% motion estimation
switch registration
case 'manual'
% user
phi_est = PARAMETERSGIVENBYUSER{1};
delta_est = [PARAMETERSGIVENBYUSER{2} PARAMETERSGIVENBYUSER{3}];
case 'vandewalle'
% patrick
[delta_est, phi_est] = estimate_motion(im,0.6,25);
case 'marcel'
% marcel
[delta_est, phi_est] = marcel(im,2);
case 'lucchese'
% lucchese
[delta_est, phi_est] = lucchese(im,2);
case 'keren'
% keren
[delta_est, phi_est] = keren(im);
end
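% a minimal standalone sketch of this registration step (illustrative only;
% the file names are hypothetical, and the 0.6/25 values mirror the
% 'vandewalle' case above):
%   for k=1:4, im{k} = double(imread(sprintf('frame%d.png',k)))/255; end
%   [delta_est, phi_est] = estimate_motion(im, 0.6, 25);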
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
waitbar(3/barNumber);
set(h, 'Name', 'Image reconstruction...');
%% signal reconstruction
% get interpolation factor
factor = get(handles.text_interp_factor, 'String');
factor = str2double(factor);
% reconstruct high resolution image
switch reconstruction
case 'interpolation'
if (size(size(IMAGES{1}), 2) == 3)
% color image
for i=1:IMAGESNUMBER
imRed{i} = im_part{i}(:,:,1);
imGreen{i} = im_part{i}(:,:,2);
imBlue{i} = im_part{i}(:,:,3);
end
imRed = interpolation(imRed,delta_est,phi_est,factor);
imGreen = interpolation(imGreen,delta_est,phi_est,factor);
imBlue = interpolation(imBlue,delta_est,phi_est,factor);
im_result(:,:,1) = imRed;
im_result(:,:,2) = imGreen;
im_result(:,:,3) = imBlue;
else
% grayscale image
im_result = interpolation(im_part,delta_est,phi_est,factor);
end
case 'papoulis-gerchberg'
% only shift estimation is taken into account here!
if (size(size(IMAGES{1}), 2) == 3)
% color image
for i=1:IMAGESNUMBER
imRed{i} = im_part{i}(:,:,1);
imGreen{i} = im_part{i}(:,:,2);
imBlue{i} = im_part{i}(:,:,3);
end
imRed = papoulisgerchberg(imRed,delta_est,factor);
imGreen = papoulisgerchberg(imGreen,delta_est,factor);
imBlue = papoulisgerchberg(imBlue,delta_est,factor);
im_result(:,:,1) = imRed;
im_result(:,:,2) = imGreen;
im_result(:,:,3) = imBlue;
else
% grayscale image
im_result = papoulisgerchberg(im_part,delta_est,factor);
end
end
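% for colour input, the R, G and B planes are super-resolved independently
% above and then restacked into a single RGB result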
TEMPRESULT = im_result;
set(handles.zoomButton, 'Enable', 'on');
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
close(h)
axes(handles.imageResult);
imshow(im_result);
set(handles.imageResult,'Visible', 'off');
% display parameters
set(handles.rotationlist, 'String', phi_est);
set(handles.shiftxlist, 'String', delta_est(:,1));
set(handles.shiftylist, 'String', delta_est(:,2));
set(handles.rotationlist, 'Enable', 'on');
set(handles.shiftxlist, 'Enable', 'on');
set(handles.shiftylist, 'Enable', 'on');
PARAMETERSGIVENBYUSER{1} = phi_est;
PARAMETERSGIVENBYUSER{2} = delta_est(:,1);
PARAMETERSGIVENBYUSER{3} = delta_est(:,2);
PARAMETERSWILLBEGIVEN = true;
set(handles.popupmenu1, 'Enable', 'off');
ISPARAMETERSGIVENBYUSER = true;
set(handles.removeparameters, 'Enable', 'on');
set(handles.radiobutton2, 'Value', 1.0);
%%%%%%%%%%%%%%%%%%%%%%%%%
% Generate input images %
%%%%%%%%%%%%%%%%%%%%%%%%%
function generateInputImages(handles)
generation();
%%%%%%%%%%%%%%%%%%%%%%
% Callback functions %
%%%%%%%%%%%%%%%%%%%%%%
% --- Outputs from this function are returned to the command line.
function varargout = superresolution_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Get default command line output from handles structure
varargout{1} = handles.output;
% --- Executes on selection change in listbox1.
function listbox1_Callback(hObject, eventdata, handles)
% hObject    handle to listbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns listbox1 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from listbox1
% --- Executes during object creation, after setting all properties.
function listbox1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to listbox1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on button press in addButton.
function addButton_Callback(hObject, eventdata, handles)
% hObject    handle to addButton (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
addImageFile(handles);
% --- Executes on button press in removeButton.
function removeButton_Callback(hObject, eventdata, handles)
% hObject    handle to removeButton (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
imageNumber = get((handles.listImages), 'Value');
removeImageFile(handles, imageNumber);
% --- Executes on selection change in listImages.
function listImages_Callback(hObject, eventdata, handles)
% hObject    handle to listImages (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns listImages contents as cell array
%        contents{get(hObject,'Value')} returns selected item from listImages
if get(hObject,'Value') >=1
global IMAGESDATA;
fname = IMAGESDATA(get(hObject,'Value')).name;
pname = IMAGESDATA(get(hObject,'Value')).path;
selectImage(handles, fname, pname);
end
% --- Executes during object creation, after setting all properties.
function listImages_CreateFcn(hObject, eventdata, handles)
% hObject    handle to listImages (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on button press in resultButton.
function resultButton_Callback(hObject, eventdata, handles)
% hObject    handle to resultButton (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
if testImages(handles)
generateImage(handles);
end
% --- Executes on button press in togglebutton1.
function togglebutton1_Callback(hObject, eventdata, handles)
% hObject    handle to togglebutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of togglebutton1
% --- Executes on button press in zoomButton.
function zoomButton_Callback(hObject, eventdata, handles)
% hObject    handle to zoomButton (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global TEMPRESULT;
figure;
imshow(TEMPRESULT);
% --- Executes on button press in zoomPreview.
function zoomPreview_Callback(hObject, eventdata, handles)
% hObject    handle to zoomPreview (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
if get(handles.listImages,'Value') >=1
global IMAGESDATA;
fname = IMAGESDATA(get(handles.listImages,'Value')).name;
pname = IMAGESDATA(get(handles.listImages,'Value')).path;
figure;
imshow([pname fname]);
end
% --- Executes on selection change in listbox3.
function listbox3_Callback(hObject, eventdata, handles)
% hObject    handle to listbox3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns listbox3 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from listbox3
% --- Executes during object creation, after setting all properties.
function listbox3_CreateFcn(hObject, eventdata, handles)
% hObject    handle to listbox3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on selection change in listbox5.
function listbox5_Callback(hObject, eventdata, handles)
% hObject    handle to listbox5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns listbox5 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from listbox5
% --- Executes during object creation, after setting all properties.
function listbox5_CreateFcn(hObject, eventdata, handles)
% hObject    handle to listbox5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on selection change in popupmenu1.
function popupmenu1_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns popupmenu1 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu1
% --- Executes during object creation, after setting all properties.
function popupmenu1_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on selection change in popupmenu2.
function popupmenu2_Callback(hObject, eventdata, handles)
% hObject    handle to popupmenu2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns popupmenu2 contents as cell array
%        contents{get(hObject,'Value')} returns selected item from popupmenu2
% --- Executes during object creation, after setting all properties.
function popupmenu2_CreateFcn(hObject, eventdata, handles)
% hObject    handle to popupmenu2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: popupmenu controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on button press in pushbutton5.
function pushbutton5_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton5 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% --- Executes on button press in removeAllButton.
function removeAllButton_Callback(hObject, eventdata, handles)
% hObject    handle to removeAllButton (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
removeAllImages(handles);
% --- Executes on button press in enterparameters.
function enterparameters_Callback(hObject, eventdata, handles)
enterParameters(handles);
% --- Executes on button press in removeparameters.
function removeparameters_Callback(hObject, eventdata, handles)
% hObject    handle to removeparameters (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
removeParameters(handles);
% --- Executes on selection change in rotationlist.
function rotationlist_Callback(hObject, eventdata, handles)
% hObject    handle to rotationlist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns rotationlist contents as cell array
%        contents{get(hObject,'Value')} returns selected item from rotationlist
% --- Executes during object creation, after setting all properties.
function rotationlist_CreateFcn(hObject, eventdata, handles)
% hObject    handle to rotationlist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on selection change in shiftxlist.
function shiftxlist_Callback(hObject, eventdata, handles)
% hObject    handle to shiftxlist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns shiftxlist contents as cell array
%        contents{get(hObject,'Value')} returns selected item from shiftxlist
% --- Executes during object creation, after setting all properties.
function shiftxlist_CreateFcn(hObject, eventdata, handles)
% hObject    handle to shiftxlist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes during object creation, after setting all properties.
function imagePreview_CreateFcn(hObject, eventdata, handles)
% hObject    handle to imagePreview (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: place code in OpeningFcn to populate imagePreview
% --- Executes on button press in radiobutton1.
function radiobutton1_Callback(hObject, eventdata, handles)
% hObject    handle to radiobutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of radiobutton1
global PARAMETERSWILLBEGIVEN;
PARAMETERSWILLBEGIVEN = false;
set(handles.enterparameters, 'Enable', 'off');
set(handles.removeparameters, 'Enable', 'off');
set(handles.resultButton, 'Enable', 'on');
set(handles.resultMenu, 'Enable', 'on');
set(handles.popupmenu1, 'Enable', 'on');
set(handles.rotationlist, 'Enable', 'off');
set(handles.shiftxlist, 'Enable', 'off');
set(handles.shiftylist, 'Enable', 'off');
% --- Executes on button press in radiobutton2.
function radiobutton2_Callback(hObject, eventdata, handles)
% hObject    handle to radiobutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of radiobutton2
global PARAMETERSWILLBEGIVEN;
global ISPARAMETERSGIVENBYUSER;
PARAMETERSWILLBEGIVEN = true;
set(handles.popupmenu1, 'Enable', 'off');
set(handles.rotationlist, 'Enable', 'on');
set(handles.shiftxlist, 'Enable', 'on');
set(handles.shiftylist, 'Enable', 'on');
if(ISPARAMETERSGIVENBYUSER)
set(handles.removeparameters, 'Enable', 'on');
set(handles.resultButton, 'Enable', 'on');
set(handles.resultMenu, 'Enable', 'on');
else
set(handles.enterparameters, 'Enable', 'on');
set(handles.resultButton, 'Enable', 'off');
set(handles.resultMenu, 'Enable', 'off');
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Menu Callback functions %
%%%%%%%%%%%%%%%%%%%%%%%%%%%
% --------------------------------------------------------------------
function addMenu_Callback(hObject, eventdata, handles)
% hObject    handle to addMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
addImageFile(handles);
% --------------------------------------------------------------------
function removeMenu_Callback(hObject, eventdata, handles)
% hObject    handle to removeMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
imageNumber = get((handles.listImages), 'Value');
removeImageFile(handles, imageNumber);
% --------------------------------------------------------------------
function removeAllMenu_Callback(hObject, eventdata, handles)
% hObject    handle to removeAllMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
removeAllImages(handles);
% --------------------------------------------------------------------
function exitMenu_Callback(hObject, eventdata, handles)
% hObject    handle to exitMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
close all;
% --------------------------------------------------------------------
function generateInputMenu_Callback(hObject, eventdata, handles)
% hObject    handle to generateInputMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
generateInputImages(handles);
% --------------------------------------------------------------------
function resultMenu_Callback(hObject, eventdata, handles)
% hObject    handle to resultMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
generateImage(handles);
% --------------------------------------------------------------------
function documentationMenu_Callback(hObject, eventdata, handles)
% hObject    handle to documentationMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
text = {'Graphical User Interface for Super-Resolution Imaging',
'',
'This program allows a user to perform registration of a set of low',
'resolution input images and reconstruct a high resolution image from them.',
'Multiple image registration and reconstruction methods have been',
'implemented. As input, the user can either select existing images, or',
'generate a set of simulated low resolution images from a high resolution',
'image.',
'More information about the implemented methods is available online:',
'http://lcavwww.epfl.ch/reproducible_research/VandewalleSV05/',
'',
'',
'To use the program:',
'1. Input images',
'   Click on the "Add" button to add at least 4 images or choose the',
'   "Generate input images" option from the "Processing" menu to generate',
'   input images.',
'2. Algorithm',
'   Choose the algorithm you want to use, or enter the motion parameters',
'   if you already know them.',
'3. Result generation',
'   Once you have given all the necessary information, click on the',
'   "Generate result" button to generate the super-resolution image.',
'4. View results',
'   You can now click on the "Zoom the image" button to zoom the image',
'   and be able to save it.',
};
helpdlg(text,'Documentation');
% --------------------------------------------------------------------
function aboutMenu_Callback(hObject, eventdata, handles)
% hObject    handle to aboutMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
text = {'Super-Resolution Application',
'Graphical User Interface for Super-Resolution Image Registration',
'and Reconstruction.',
'More information at ',
'http://lcavwww.epfl.ch/reproducible_research/VandewalleSV05/',
'',
'Copyright (C) 2005 Laboratory of Audiovisual Communications (LCAV)',
'Ecole Polytechnique Federale de Lausanne (EPFL)',
'Station 14, CH-1015 Lausanne, Switzerland',
'',
'Send comments to [email protected]'
};
helpdlg(text,'About');
% --------------------------------------------------------------------
function processingMenu_Callback(hObject, eventdata, handles)
% hObject    handle to processingMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function helpMenu_Callback(hObject, eventdata, handles)
% hObject    handle to helpMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function fileMenu_Callback(hObject, eventdata, handles)
% hObject    handle to fileMenu (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% --- Executes on selection change in rotationylist.
function rotationylist_Callback(hObject, eventdata, handles)
% hObject    handle to rotationylist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns rotationylist contents as cell array
%        contents{get(hObject,'Value')} returns selected item from rotationylist
% --- Executes during object creation, after setting all properties.
function rotationylist_CreateFcn(hObject, eventdata, handles)
% hObject    handle to rotationylist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
% --- Executes on selection change in shiftylist.
function shiftylist_Callback(hObject, eventdata, handles)
% hObject    handle to shiftylist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: contents = get(hObject,'String') returns shiftylist contents as cell array
%        contents{get(hObject,'Value')} returns selected item from shiftylist
% --- Executes during object creation, after setting all properties.
function shiftylist_CreateFcn(hObject, eventdata, handles)
% hObject    handle to shiftylist (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: listbox controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc
set(hObject,'BackgroundColor','white');
else
set(hObject,'BackgroundColor',get(0,'defaultUicontrolBackgroundColor'));
end
function text_rec_part_Callback(hObject, eventdata, handles)
% hObject    handle to text_rec_part (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of text_rec_part as text
%        str2double(get(hObject,'String')) returns contents of text_rec_part as a double
% --- Executes during object creation, after setting all properties.
function text_rec_part_CreateFcn(hObject, eventdata, handles)
% hObject    handle to text_rec_part (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end
function text_interp_factor_Callback(hObject, eventdata, handles)
% hObject    handle to text_interp_factor (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% Hints: get(hObject,'String') returns contents of text_interp_factor as text
%        str2double(get(hObject,'String')) returns contents of text_interp_factor as a double
% --- Executes during object creation, after setting all properties.
function text_interp_factor_CreateFcn(hObject, eventdata, handles)
% hObject    handle to text_interp_factor (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    empty - handles not created until after all CreateFcns called
% Hint: edit controls usually have a white background on Windows.
%       See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end
function [delta_est, phi_est] = estimate_motion(s,r_max,d_max)
% ESTIMATE_MOTION - shift and rotation estimation using algorithm by Vandewalle et al.
%    [delta_est, phi_est] = estimate_motion(s,r_max,d_max)
%    R_MAX is the maximum radius in the rotation estimation
%    D_MAX is the number of low frequency components used for shift estimation
%    input images S are specified as S{1}, S{2}, etc.
if (nargin==1) % default values
r_max = 0.8;
d_max = 8;
end
% rotation estimation
[phi_est, c_est] = estimate_rotation(s,[0.1 r_max],0.1);
% rotation compensation, required to estimate shifts
s2{1} = s{1};
nr=length(s);
for i=2:nr
s2{i} = imrotate(s{i},-phi_est(i),'bicubic','crop');
end
% shift estimation
delta_est = estimate_shift(s2,d_max);
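%
% Example call (an illustrative sketch, not part of the GUI code): with four
% grayscale frames g1..g4 already loaded as doubles,
%   s = {g1, g2, g3, g4};
%   [delta_est, phi_est] = estimate_motion(s); % defaults: r_max=0.8, d_max=8
% phi_est(k) and delta_est(k,:) then hold the estimated rotation and shift
% of frame k relative to the first frame.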
function [rot_angle, c] = estimate_rotation(a,dist_bounds,precision)
% ESTIMATE_ROTATION - rotation estimation using algorithm by Vandewalle et al.
%    [rot_angle, c] = estimate_rotation(a,dist_bounds,precision)
%    DIST_BOUNDS gives the minimum and maximum radius to be used
%    PRECISION gives the precision with which the rotation angle is computed
%    input images A are specified as A{1}, A{2}, etc.
nr=length(a);
d=1*pi/180; % width of the angle over which the average frequency value is computed
s = size(a{1})/2;
center=[floor(s(1))+1 floor(s(2))+1]; % center of the image and the frequency domain matrix
x = ones(s(1)*2,1)*[-1:1/s(2):1-1/s(2)]; % X coordinates of the pixels
y = [-1:1/s(1):1-1/s(1)]'*ones(1,s(2)*2); % Y coordinates of the pixels
x=x(:);
y=y(:);
[th,ra] = cart2pol(x,y); % polar coordinates of the pixels
for k=1:nr
A{k} = fftshift(abs(fft2(a{k}))); % Fourier transform of the image
for i=-180/precision:180/precision % for every angle, compute the average value
i_=i*pi*precision/180; % angle
% use only the part between angle-d and angle+d, and between dist_bounds(1) and dist_bounds(2)
v = (th>i_-d)&(th<i_+d)&(ra>dist_bounds(1))&(ra<dist_bounds(2));
if ~(sum(v)>0) % if no values are found that satisfy the above test
h_A{k}(i+180/precision+1) = 0;
else
h_A{k}(i+180/precision+1) = mean(A{k}(v));
end
end
end
% compute the correlation between h_A{1} and h_A{2-4} and set the estimated rotation angle
% to the maximum found between -30 and 30 degrees
H_A = fft(h_A{1});
rot_angle(1)=0;
c{1}=[];
for k=2:nr
H_Binv = fft(h_A{k}(end:-1:1));
H_C = H_A.*H_Binv;
h_C = real(ifft(H_C));
[m,ind] = max(h_C(150/precision+1:end-150/precision));
rot_angle(k) = (ind-30/precision-1)*precision;
c{k} = h_C;
end
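%
% Note: h_A{k} samples the average spectrum amplitude as a function of angle;
% rotating an image circularly shifts this 1-D function, so its correlation
% with h_A{1} (computed above via FFT products) peaks at the rotation angle.
% The slice h_C(150/precision+1:end-150/precision) restricts the search to
% [-30, 30] degrees, and (ind-30/precision-1)*precision maps the peak index
% back to degrees.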
function delta_est = estimate_shift(s,n)
% ESTIMATE_SHIFT - shift estimation using algorithm by Vandewalle et al.
%    delta_est = estimate_shift(s,n)
%    estimate shift between every image and the first (reference) image
%    N specifies the number of low frequency pixels to be used
%    input images S are specified as S{1}, S{2}, etc.
%    DELTA_EST is an M-by-2 matrix with M the number of images
nr = length(s);
delta_est=zeros(nr,2);
p = [n n]; % only the central (aliasing-free) part of NxN pixels is used for shift estimation
sz = size(s{1});
S1 = fftshift(fft2(s{1})); % Fourier transform of the reference image
for i=2:nr
S2 = fftshift(fft2(s{i})); % Fourier transform of the image to be registered
S2(S2==0)=1e-10;
Q = S1./S2;
A = angle(Q); % phase difference between the two images
% determine the central part of the frequency spectrum to be used
beginy = floor(sz(1)/2)-p(1)+1;
endy = floor(sz(1)/2)+p(1)+1;
beginx = floor(sz(2)/2)-p(2)+1;
endx = floor(sz(2)/2)+p(2)+1;
% compute x and y coordinates of the pixels
x = ones(endy-beginy+1,1)*[beginx:endx];
x = x(:);
y = [beginy:endy]'*ones(1,endx-beginx+1);
y = y(:);
v = A(beginy:endy,beginx:endx);
v = v(:);
% compute the least squares solution for the slopes of the phase difference plane
M_A = [x y ones(length(x),1)];
r = M_A\v;
delta_est(i,:) = -[r(2) r(1)].*sz/2/pi;
end
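%
% Note: by the Fourier shift property a spatial shift only changes the phase
% of the spectrum, so the phase difference angle(S1./S2) is ideally a plane
% whose slopes are proportional to the shifts. M_A\v fits that plane by
% least squares over the low (aliasing-free) frequencies, and the slopes
% are rescaled to pixels by sz/(2*pi).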
function rec = interpolation(s,delta_est,phi_est,factor)
% INTERPOLATION - reconstruct a high resolution image using bicubic interpolation
%    rec = interpolation(s,delta_est,phi_est,factor)
%    reconstruct an image with FACTOR times more pixels in both dimensions
%    using bicubic interpolation on the pixels from the images in S
%    (S{1},...) and using the shift and rotation information from DELTA_EST
%    and PHI_EST
n=length(s);
ss = size(s{1});
if (length(ss)==2), ss=[ss 1]; end
center = (ss+1)/2;
phi_rad = phi_est*pi/180;
% compute the coordinates of the pixels from the N images, using DELTA_EST and PHI_EST
for k=1:ss(3) % for each color channel
for i=1:n % for each image
s_c{i}=s{i}(:,:,k);
s_c{i} = s_c{i}(:);
r{i} = [1:factor:factor*ss(1)]'*ones(1,ss(2)); % create matrix with row indices
c{i} = ones(ss(1),1)*[1:factor:factor*ss(2)]; % create matrix with column indices
r{i} = r{i}-factor*center(1); % shift rows to center around 0
c{i} = c{i}-factor*center(2); % shift columns to center around 0
coord{i} = [c{i}(:) r{i}(:)]*[cos(phi_rad(i)) -sin(phi_rad(i)); sin(phi_rad(i)) cos(phi_rad(i))]; % rotate
r{i} = coord{i}(:,2)+factor*center(1)+factor*delta_est(i,1); % shift rows back and shift by delta_est
c{i} = coord{i}(:,1)+factor*center(2)+factor*delta_est(i,2); % shift columns back and shift by delta_est
rn{i} = r{i}((r{i}>0)&(r{i}<=factor*ss(1))&(c{i}>0)&(c{i}<=factor*ss(2)));
cn{i} = c{i}((r{i}>0)&(r{i}<=factor*ss(1))&(c{i}>0)&(c{i}<=factor*ss(2)));
sn{i} = s_c{i}((r{i}>0)&(r{i}<=factor*ss(1))&(c{i}>0)&(c{i}<=factor*ss(2)));
end
s_ = []; r_ = []; c_ = []; sr_ = []; rr_ = []; cr_ = [];
for i=1:n % for each image
s_ = [s_; sn{i}];
r_ = [r_; rn{i}];
c_ = [c_; cn{i}];
end
clear s_c r c coord rn cn sn
% interpolate the high resolution pixels using cubic interpolation
rec_col = griddata(c_,r_,s_,[1:ss(2)*factor],[1:ss(1)*factor]','cubic',{'QJ'}); % option QJ added to make it work
rec(:,:,k) = reshape(rec_col,ss(1)*factor,ss(2)*factor);
end
rec(isnan(rec))=0;
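%
% Note: each low resolution pixel is mapped to its location on the
% FACTOR-times finer grid using the estimated rotation and shift, the
% samples of all images are pooled, and griddata resamples the scattered
% points onto the regular high resolution grid with cubic interpolation;
% grid points that receive no data (NaN) are set to zero above.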
function s_low = lowpass(s,part)
% LOWPASS - low-pass filter an image by setting coefficients to zero in frequency domain
%    s_low = lowpass(s,part)
%    S is the input image
%    PART indicates the relative part of the frequency range [0 pi] to be kept
% set coefficients to zero
if size(s,1)>1 % 2D signal
S = fft2(s); % compute the Fourier transform of the image
S(round(part(1)*size(S,1))+1:round((1-part(1))*size(S,1))+1,:) = 0;
S(:,round(part(2)*size(S,2))+1:round((1-part(2))*size(S,2))+1) = 0;
s_low = real(ifft2(S)); % compute the inverse Fourier transform of the filtered image
else % 1D signal
S = fft(s); % compute the Fourier transform of the image
S(round(part*length(S))+1:round((1-part)*length(S))+1) = 0;
s_low = real(ifft(S)); % compute the inverse Fourier transform of the filtered signal
end
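%
% Example (illustrative): lowpass(s, [0.25 0.25]) keeps roughly the lowest
% quarter of the frequency range in each dimension of a 2-D image and zeros
% the rest, which can be used to simulate a band-limited low resolution
% observation.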
function im2 = shift(im1,x1,y1)
% SHIFT - shift an image over a non-integer amount of pixels
%    im2 = shift(im1,x1,y1)
%    shift an image over X1 in horizontal direction and Y1 in vertical
%    direction and set the added pixels to 0
[y0,x0,z0]=size(im1);
x1int=floor(x1); x1dec=x1-x1int;
y1int=floor(y1); y1dec=y1-y1int;
im2=im1;
for z=1:z0
if y1>=0
for y=-y0:-y1int-2
im2(-y,:,z)=(1-y1dec)*im2(-y1int-y,:,z)+y1dec*im2(-y1int-y-1,:,z);
end
if y1int<y0
im2(y1int+1,:,z)=(1-y1dec)*im2(1,:,z);
end
for y=max(-y1int,-y0):-1
im2(-y,:,z)=zeros(1,x0);
end
else
if y1dec==0
y1dec=y1dec+1;
y1int=y1int-1;
end
for y=1:y0+y1int
im2(y,:,z)=y1dec*im2(-y1int+y-1,:,z)+(1-y1dec)*im2(-y1int+y,:,z);
end
if -y1int<=y0
im2(y0+y1int+1,:,z)=y1dec*im2(y0,:,z);
end
for y=max(1,y0+y1int+2):y0
im2(y,:,z)=zeros(1,x0);
end
end
if x1>=0
for x=-x0:-x1int-2
im2(:,-x,z)=(1-x1dec)*im2(:,-x1int-x,z)+x1dec*im2(:,-x1int-x-1,z);
end
if x1int<x0
im2(:,x1int+1,z)=(1-x1dec)*im2(:,1,z);
end
for x=max(-x1int,-x0):-1
im2(:,-x,z)=zeros(y0,1);
end
else
if x1dec==0
x1dec=x1dec+1;
x1int=x1int-1;
end
for x=1:x0+x1int
im2(:,x,z)=x1dec*im2(:,-x1int+x-1,z)+(1-x1dec)*im2(:,-x1int+x,z);
end
if -x1int<=x0
im2(:,x0+x1int+1,z)=x1dec*im2(:,x0,z);
end
for x=max(1,x0+x1int+2):x0
im2(:,x,z)=zeros(y0,1);
end
end
end
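%
% Example (illustrative): im2 = shift(im1, 0.5, 0) moves im1 half a pixel to
% the right; each interior output column becomes the 50/50 average of the
% two nearest input columns, i.e. linear interpolation at the sub-pixel
% offset.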
function impolar = c2p(im)
% C2P - compute the polar coordinates of the pixels of an image
%    impolar = c2p(im)
%    convert an image in cartesian coordinates IM
%    to an image in polar coordinates IMPOLAR
[nrows, ncols] = size(im);
% create the regular rho,theta grid
r = ones(nrows,1)*[0:nrows-1]/2;
th = [0:nrows-1]'*ones(nrows,1)'*2*pi/nrows-pi;
% convert the polar coordinates to cartesian
[xx,yy] = pol2cart(th,r);
xx = xx + nrows/2+0.5;
yy = yy + nrows/2+0.5;
% interpolate using bilinear interpolation to produce the final image
partx = xx-floor(xx); partx = partx(:);
party = yy-floor(yy); party = party(:);
impolar = (1-partx).*(1-party).*reshape(im(floor(yy)+nrows*(floor(xx)-1)),[nrows*ncols 1])...
 + partx.*(1-party).*reshape(im(floor(yy)+nrows*(ceil(xx)-1)),[nrows*ncols 1])...
 + (1-partx).*party.*reshape(im(ceil(yy)+nrows*(floor(xx)-1)),[nrows*ncols 1])...
 + partx.*party.*reshape(im(ceil(yy)+nrows*(ceil(xx)-1)),[nrows*ncols 1]);
impolar = reshape(impolar,[nrows ncols]);
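%
% Note: in the (radius, angle) representation produced here, a rotation of
% the input image becomes a circular shift along the angle axis, which is
% presumably what the polar-domain rotation estimators (e.g. the 'marcel'
% method) exploit.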