Investigating Circles in a Square Packing Problems as a Realistic Benchmark for Continuous Metaheuristic Optimization Algorithms

Marcus Gallagher
School of Information Technology and Electrical Engineering, The University of Queensland
Brisbane, 4072, Australia
[email protected]

1 Introduction
In recent years, there has been a growing interest in the development of experimental methodology
in metaheuristics. In the field, researchers continue to develop and propose new algorithms at a
rapid rate. While theoretical analysis has made progress and continues to develop, it is often the
case that algorithms are evaluated and analyzed via empirical techniques. A vital component of
experimental algorithmic research is the development of benchmarks or sets of test problems that
can be used to evaluate and compare the performance of algorithms.
Artificial problems based on mathematical functions are most commonly used to evaluate continuous metaheuristic optimization algorithms. The advantages of using such test functions are that
they are simple to implement and have a specified geometric structure (as well as information such
as the location and value of the global optimum). On the other hand, there is typically no known
(or intended) relationship between artificial test functions and any real-world optimization problem.
This makes it dangerous to predict that an algorithm which performs extremely well on a set of
artificial problems would also perform well if eventually applied to a real optimization problem.
Applying newly developed metaheuristics immediately to real-world problems is a possibility
but, from the point of view of algorithm evaluation, also comes with disadvantages. From the
perspective of the metaheuristics researcher, significant time must be invested in understanding a
(typically complex) real-world problem domain and in defining and formulating the problem in terms
of an objective function, constraints, etc. such that the algorithm can be applied. Furthermore, the
complexity or difficulty of the problem may not be well-understood. This means that the results of
the experiment may not shed much light on the workings of the algorithm: it may be difficult to
explain why the algorithm works well or not, or if the problem could have been solved equally well
or better by established methods.
A compromise between using purely artificial and real-world optimization problems for benchmarking is to formulate benchmark problem suites that are based on or inspired by real-world
problems, but abstracted and simplified to some extent to make them easier and more suitable for
use in experimental algorithmic research. In combinatorial optimization such test suites exist for
generic classes of problems (e.g. Travelling Salesman, bin packing, constraint satisfaction); however, these kinds of benchmark suites are rare in continuous optimization. The hope is that results on
such “realistic” benchmark suites will be a better reflection of real-world performance while at the
same time giving the researcher the ability to control test problems in order to carry out systematic and statistically sound experiments. This will lead to an improved understanding of
when and why new and existing algorithms work well (or not).
The contributions of this paper are to develop a benchmark problem set for the evaluation
and comparison of continuous metaheuristic optimization algorithms, based on circles in a square
packing problems. Section 2 describes the problem, surveys relevant literature and discusses the
suitability of this problem set for benchmarking continuous metaheuristics. Some analysis of the
general properties of circles in a square problems is presented in Section 3. To illustrate the usage
of the problem set, a simple continuous Estimation of Distribution Algorithm is applied to a range
of problems. This is discussed in Section 4 and results are presented in Section 5, including results
for the Nelder-Mead Simplex algorithm. Conclusions are presented in Section 6.
2 The Circles in a Square Packing Problem and Benchmarking Metaheuristics
Circles in a square (CiaS) is one of the most well-studied geometric packing problems. Given the
unit square defined in a 2D Euclidean space and a pre-specified number of circles, nc , constrained
to be of equal size, the problem is to find an optimal packing; i.e. to position the circles and
compute the radius length of the circles such that the circles occupy the maximum possible area
within the square. All circles must remain fully enclosed within the square, and cannot overlap.
Mathematically, the problem can be stated as follows [1]. Let $C(z^i, r)$ be the circle with radius $r$ and center $z^i = (y_1^i, y_2^i) \in \mathbb{R}^2$. Then the optimization problem is:

$$r_n = \max r \qquad (1)$$
$$C(z^i, r) \subseteq [0, 1]^2, \quad i = 1, \ldots, n_c \qquad (2)$$
$$C^{\mathrm{int}}(z^i, r) \cap C^{\mathrm{int}}(z^j, r) = \emptyset \quad \forall\, i \neq j \qquad (3)$$
where $C^{\mathrm{int}}$ is the interior of a circle. Alternatively, the problem can be reformulated as finding the positions of nc points inside the unit square such that their minimum pairwise distance is maximized. In this case the problem (and constraint) can be restated as:

$$d_n = \max \min_{i \neq j} \| w^i - w^j \|_2 \qquad (4)$$
$$w^i \in [0, 1]^2, \quad i = 1, \ldots, n_c \qquad (5)$$

It is known that a solution to (4) can be transformed into a solution to (1) using the relation

$$r_n = \frac{d_n}{2(d_n + 1)}.$$
From the point of view of evaluating metaheuristic optimization algorithms, the problem given by (4) is convenient because generating a feasible candidate solution simply requires placing a set of nc
points within the unit square. Note that the optimization problem is over 2nc continuous variables (the coordinates of each point w^i in the unit square).
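To make this formulation concrete, the following sketch (an illustrative Python/NumPy implementation written for this discussion, not code from the paper or its Matlab release) evaluates the objective in (4) for a candidate solution vector of length 2nc and applies the relation above to recover the corresponding circle radius rn:

```python
import numpy as np

def cias_objective(x):
    """Objective of Eq. (4): minimum pairwise distance between the n_c points
    encoded in x (a vector of length 2*n_c with entries in [0, 1]).
    Larger values are better, since (4) is a maximization problem."""
    pts = np.asarray(x).reshape(-1, 2)          # n_c points in the unit square
    diff = pts[:, None, :] - pts[None, :, :]    # pairwise coordinate differences
    dists = np.sqrt((diff ** 2).sum(axis=-1))   # Euclidean distance matrix
    iu = np.triu_indices(len(pts), k=1)         # index pairs with i < j
    return dists[iu].min()

def radius_from_distance(d_n):
    """Transform a solution value of (4) into the circle radius of (1)."""
    return d_n / (2.0 * (d_n + 1.0))

# Example: evaluate a random candidate packing of n_c = 5 circles.
rng = np.random.default_rng(0)
x = rng.random(2 * 5)
print(cias_objective(x), radius_from_distance(cias_objective(x)))
```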
CiaS packing problems represent a challenging class of optimization problems. In general, they
cannot be solved using analytical approaches or via gradient-based mathematical optimization.
These problems are also believed to generally contain an extremely large number of local optima.
For the related problem of packing equal circles into a larger circular region, Grosso et al. use a
computational approach to estimating the number of local optima by repeatedly running a local
optimization algorithm over a large number of trials [9]. Although a conservative estimate, this
indicates that the number of local optima grows unevenly but steadily, with at least 4000 local
optima for nc = 25 and more than 16000 local optima for nc = 40.
CiaS packing problems have received a large amount of attention in the mathematical, optimization and operations research literature (see [4] for a recent overview). For most values of nc
below 60 and for certain other values, provably optimal packings have been found using either
theoretical or computational approaches (see [15] and the references therein). For larger values of
nc , finding provably optimal packings in general becomes increasingly difficult and time-consuming.
The Packomania website [13] maintains an extensive list of the optimal (or best known) packings
for nc = 2, . . . , 300 along with references and other related resources.
Previous work has also applied heuristics and global optimization algorithms to CiaS problems
with the aim of finding good solutions but without a guarantee of global optimality. Some general-purpose metaheuristics such as simulated annealing have been applied [16], as well as special-purpose metaheuristics designed for the problem (see [1, 4, 15]). However, there have been very few, if any,
previous applications of population-based or other recently developed metaheuristics to the CiaS
problem (in our survey of the literature, we were unable to find a single publication). From the point
of view of experimental evaluation and comparison of such metaheuristics, CiaS problems therefore
represent a new and potentially useful source of continuous benchmark optimization problems.
Castillo et al. [4] also present a survey of industrial problems and application areas that involve circle packing, including cutting, container loading, cylinder packing, facility dispersion, communication networks, and facility and dashboard layout problems. The CiaS packing problem can be
considered as a simplified version of such real-world problems.
Previous work has argued for the importance of experimental research in the development and
evaluation of metaheuristics [17, 7]. There has been some recognition of the need to improve the
standards of experimental studies in this area. This has encouraged the development of more
rigorous approaches to experimental evaluation (e.g. using statistical techniques [3, 2, 6]) and the
proposal of new classes of test problems [8, 11]. In recent years, competitions have been held at major
conferences (e.g. the Genetic and Evolutionary Computation Conference (GECCO) and Congress
on Evolutionary Computation (CEC)) to further encourage larger experimental comparisons of
algorithms.
The CiaS problem set has many of the properties previously identified as being desirable from the
point of view of evaluating algorithms [17]. CiaS problems are in general difficult to solve with simple
optimization methods. They are also nonlinear, nonseparable and nonsymmetric (the problem
variables clearly have significant dependencies with respect to solution quality). The problem is also
clearly scalable in dimensionality (via the number of circles nc , the only parameter of the problem
set). In addition, globally optimal (or near) solutions and objective function values are available
for nc < 300 and certain other larger values [13], to facilitate the evaluation of experimental results.
The formulation of the problem from (4) above is simple and convenient for use in experiments with
metaheuristic algorithms. Finally, the objective function is easy to implement and fast to evaluate.
Therefore, the CiaS problems are potentially a very useful source of benchmark problems for the
evaluation of metaheuristics.
3 Landscape Analysis
All benchmark problems come with advantages and disadvantages (which is one reason why
algorithms should be evaluated on a variety of different problem sets). The CiaS problem set
resembles a real-world problem and as a consequence, precise information regarding the structure
of the “fitness landscape” is not available (see next Section). It is expected that the CiaS problems
would generally become more difficult (with respect to a given algorithm) as nc increases (producing
higher-dimensional optimization problems), but this is unlikely to be strictly true due to symmetries
for certain problem sizes.
It is possible to gain a number of insights into the structure of CiaS problems using ideas previously
developed in the context of fitness landscape/search-space analysis (e.g. statistical or geometric
properties of the objective function and/or the space of feasible solution variable values under the
objective function). Such analyses may provide indications of problem difficulty or suggest ideas
for problem-specific algorithmic development.
3.1 Distribution of Random Solutions
A simple way to explore the solution space of an optimization problem is to look at the distribution
of objective function values from a sample of randomly generated solutions. Figure 1 (left) shows
histograms for CiaS problems with nc = 20, 40, 60, 80 and 100, each constructed from samples of
10^6 solutions. The optimal objective function values for these problem sizes [13] are approximately 0.2866, 0.1882, 0.1495, 0.1296 and 0.1146 (the best known value for nc = 100), respectively. The histograms have a similar shape with a single mode and a long right tail extending towards better solutions. As nc increases, the distribution
shifts left (towards worse packings) and becomes narrower. One insight into problem difficulty is
to examine the ratio of the optimal (or best known) solution value to the median of the randomly
generated solutions. Figure 1 (right) shows the value of this ratio for nc = 10, 20, . . . , 100. The
trend is linear, suggesting that, for CiaS problems, randomly generated solutions can be expected
to grow proportionally worse than the optimal solution as nc increases. This can be thought of as a
“curse of dimensionality” result (though apparently only at a linear rate in this range of problem
sizes).
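This kind of analysis is straightforward to reproduce. The small sketch below (illustrative Python, reusing the hypothetical cias_objective function from the earlier example and a much smaller sample size than the 10^6 solutions used here) estimates the ratio plotted in Figure 1 (right):

```python
import numpy as np

def random_solution_ratio(n_c, d_optimal, n_samples=10_000, seed=0):
    """Ratio of the optimal (or best known) d_n to the median value of
    uniformly random candidate solutions (cf. Figure 1, right)."""
    rng = np.random.default_rng(seed)
    values = np.array([cias_objective(rng.random(2 * n_c))
                       for _ in range(n_samples)])
    return d_optimal / np.median(values)

# Example for n_c = 20, using the approximate optimal value quoted above.
print(random_solution_ratio(20, d_optimal=0.2866))
```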
3.2 Local Optima
As mentioned in Section 2, CiaS problems seem to have a large number of local optima that
grows rapidly with the problem dimensionality. Similarly to the results presented in Section 3.1 for
random solutions, it is possible to examine the distribution of objective function values of (apparent) local optima.
Figure 1: LEFT - Histograms showing the distribution of objective function values of random solutions on the CiaS problem for nc = 20, 40, 60, 80 and 100. The distributions appear in this order moving from right to left in the graph, also becoming increasingly narrow. Different line thicknesses are used to allow the curves to be more easily distinguished. RIGHT - Ratio of the optimal (or best known) solution to the median of randomly generated solutions as a function of nc.
To do so, the Nelder-Mead Simplex algorithm (as implemented via the standard
Matlab fminsearch function) was used to sample local optima for several CiaS problem sizes (nc =
3, 7, 11, 15, 19, 23). For each problem, the algorithm was run for 1000 trials with termination criteria
of either 10^6 function evaluations or until a precision of 10^-12 on either the objective function or solution
values had been reached.
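The sampling procedure used here relies on Matlab's fminsearch; a rough analogue using SciPy's Nelder-Mead implementation (an assumption on my part, reusing the hypothetical cias_objective function from Section 2 and simple clipping to keep candidate points feasible) might look like this:

```python
import numpy as np
from scipy.optimize import minimize

def sample_local_optima(n_c, n_trials=100, seed=0):
    """Sample (apparent) local optima by restarting Nelder-Mead from random points."""
    rng = np.random.default_rng(seed)
    values = []
    for _ in range(n_trials):
        x0 = rng.random(2 * n_c)
        # SciPy minimizes, so negate the (maximization) CiaS objective; candidate
        # points are clipped back into the unit square before evaluation.
        res = minimize(lambda x: -cias_objective(np.clip(x, 0.0, 1.0)), x0,
                       method="Nelder-Mead",
                       options={"maxfev": 10**6, "xatol": 1e-12, "fatol": 1e-12})
        values.append(-res.fun)
    return np.array(values)

# Example: distribution of apparent local optima for n_c = 7 (cf. Figure 2, centre).
vals = sample_local_optima(7, n_trials=50)
print(vals.min(), np.median(vals), vals.max())
```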
Figure 2 shows histograms of apparent local optima objective function values. For nc = 3
(Figure 2 (left)) (global optimum value ≈ 1.0353), the algorithm finds solutions quite close to the global optimum on almost 50% of trials (note however that a large proportion of trials find a nearby solution of 1.0). A very small number of other solutions with various objective values are found. When we move to nc = 7 (Figure 2 (centre)), the picture changes considerably. A noticeable peak representing around 9% of solutions is found around the value 0.5, with a few solutions found between 0.5 and the global optimum value (≈ 0.5359); however, no solutions come very close to the optimal value (e.g. to more than a couple of decimal places of precision).
The majority of solutions found are spread widely across different objective function values. It was
found that, over the values of nc examined, the distribution of local optima values took on more of
a symmetric unimodal shape as dimensionality increased (Figure 2 (right) shows a representative
histogram for nc = 23). The distributions also shift away from the optimal value, suggesting that
the distribution of local optima behaves somewhat similarly to the distribution of random solutions
presented above. In other words, an average local optimum becomes steadily worse (proportionally)
as the problem dimensionality increases. This suggests that the problem may be best approached
via global optimization algorithms, a fact supported by the direction of the literature on the CiaS
problem.
3.3 Solution Space Symmetry
It is also known that the solution spaces of CiaS problems have a large degree of redundancy in the sense that multiple, geometrically equivalent solutions exist for any candidate solution [12].
Figure 2: Histograms showing the distribution of objective function values of apparent local optima on the CiaS problem. LEFT - nc = 3. CENTRE - nc = 7. RIGHT - nc = 23.
In fact, there are 8 · nc! equivalent solutions for any packing specified as a solution to the optimization problem in (4). Firstly, the ordering or labelling of point coordinates (and consequently circle positions) in a candidate solution vector is arbitrary: there are nc! equivalent solutions corresponding to permutations of these labels. Secondly, a packing of circles in a square can be rotated or reflected, leading to 8 equivalent configurations.
The permutation symmetry of CiaS solution spaces is analogous to symmetry in the weight
space of multi-layer perceptron neural networks [5] and is similar to the “label switching” problem
in mixture models [14]. Such symmetries have no direct effect on the application of an optimization
algorithm to solving the problem. Nevertheless, they give an insight into the complexity of the
solution space and may be a property of other real-world optimization problems. It would also be
possible to analyze a set of solutions found by an algorithm to see whether or not different distinct
packings are found, or if the algorithm is rediscovering the same solution multiple times.
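As an illustration of how such an analysis could be automated, the sketch below (hypothetical Python, not from the paper) maps a candidate solution to a canonical representative of its equivalence class by applying the 8 square symmetries and sorting the points, so that two solutions can be compared up to relabelling, rotation and reflection:

```python
import numpy as np

def canonical_form(x, decimals=6):
    """Canonical representative of a CiaS solution under the 8 square symmetries
    and permutation of the point labels."""
    pts = np.asarray(x).reshape(-1, 2)
    candidates = []
    for transform in (
        lambda p: p,                                         # identity
        lambda p: np.column_stack([1 - p[:, 0], p[:, 1]]),   # reflect in x
        lambda p: np.column_stack([p[:, 0], 1 - p[:, 1]]),   # reflect in y
        lambda p: 1 - p,                                     # rotate 180 degrees
        lambda p: p[:, ::-1],                                # reflect about diagonal
        lambda p: np.column_stack([1 - p[:, 1], p[:, 0]]),   # rotate 90 degrees
        lambda p: np.column_stack([p[:, 1], 1 - p[:, 0]]),   # rotate 270 degrees
        lambda p: 1 - p[:, ::-1],                            # reflect about anti-diagonal
    ):
        q = np.round(transform(pts), decimals)
        # Sort points lexicographically to remove the labelling permutation.
        order = np.lexsort((q[:, 1], q[:, 0]))
        candidates.append(tuple(map(tuple, q[order])))
    return min(candidates)

# Two relabelled/reflected versions of the same 2-circle packing map to one form.
a = np.array([0.0, 0.0, 1.0, 1.0])
b = np.array([1.0, 0.0, 0.0, 1.0])
print(canonical_form(a) == canonical_form(b))   # True
```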
4 Applying UMDA_c^G to Circles in a Square Problems
In this Section experimental results are presented to illustrate the usage of CiaS problems as a benchmarking tool using the Univariate Marginal Distribution Algorithm UMDA_c^G [10] (using Gaussian distributions for continuous problems), a simple continuous Estimation of Distribution Algorithm. The intention is to run a simple metaheuristic yielding results that can serve as a baseline for future experimental work. In UMDA_c^G, new individuals/candidate solutions are generated using a factorized product of univariate Gaussian distributions. Following truncation selection, the parameters of these distributions are updated via their maximum likelihood estimates over the selected individuals. The algorithm is summarized in Table 1. Using the formulation given in (4), the feasible search space of a CiaS problem is defined by the unit hypercube [0, 1]^{2nc} ⊂ R^{2nc}. The UMDA_c^G algorithm starts with an initial population generated uniformly across the search space. It is clearly easy to achieve this for CiaS problems and this is the approach taken here. Note however that the use of heuristics exploiting problem knowledge may lead to improved starting populations.
The feasible search space for CiaS problems is similar to the simple (often symmetric) box boundary constraint assumed for many commonly used mathematical test functions for benchmarking continuous metaheuristics. However, for CiaS problems, any candidate solution with one or more coordinate values outside this region will be infeasible (with an undefined objective function value), in contrast to an artificial test function (e.g. a mathematical equation) where the objective function can still be evaluated outside the feasible search space.
Table 1: General pseudocode for UMDA_c^G.

Given: population size M, selection parameter τ
BEGIN (set t = 0): Generate M individuals uniformly at random in the feasible search space
REPEAT for t = 1, 2, ... until stopping criterion is met
  Select M_sel < M individuals via truncation selection
  Estimate model parameters μ_t, σ_t^2 via maximum likelihood (i.e. sample mean and variance)
  Sample M individuals from N(μ_t, σ_t^2)
  t = t + 1
ENDREPEAT
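A minimal Python/NumPy sketch of the pseudocode in Table 1, specialized to the CiaS search space, is given below. It reuses the hypothetical cias_objective function from Section 2, uses placeholder parameter values rather than the settings reported in Section 5.1, and applies the boundary repair described in the next paragraph by clipping sampled values into [0, 1]:

```python
import numpy as np

def umda_c_gaussian(objective, dim, pop_size=100, tau=0.5, generations=200, seed=0):
    """Univariate Gaussian EDA with truncation selection (cf. Table 1)."""
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, dim))            # uniform initial population in [0, 1]^dim
    best_x, best_f = None, -np.inf
    for _ in range(generations):
        fitness = np.array([objective(x) for x in pop])
        if fitness.max() > best_f:
            best_f = float(fitness.max())
            best_x = pop[fitness.argmax()].copy()
        # Truncation selection: keep the best tau * pop_size individuals.
        n_sel = max(2, int(tau * pop_size))
        selected = pop[np.argsort(fitness)[::-1][:n_sel]]
        # Maximum likelihood estimates of the univariate Gaussian parameters.
        mu = selected.mean(axis=0)
        sigma = selected.std(axis=0) + 1e-12     # guard against zero variance
        # Sample a new population and repair infeasible values onto the boundary.
        pop = np.clip(rng.normal(mu, sigma, size=(pop_size, dim)), 0.0, 1.0)
    return best_x, best_f

# Example: a short run on the 5-circle (10-dimensional) problem.
x_best, d_best = umda_c_gaussian(cias_objective, dim=2 * 5)
print(d_best)
```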
A naive application of UMDA_c^G will thus result in large numbers of infeasible solutions being generated, since all of the univariate Gaussian distributions within the model are capable of generating component solution values anywhere in (−∞, ∞). While it is possible to employ a general constraint handling technique (e.g. creating a penalty function), a simple approach is taken here by repairing infeasible solutions, utilizing a small amount of prior knowledge about the problems. Given that the objective of the problem (in Eqn. 4) is to maximize the minimum pairwise distance of the nc points to be positioned in the unit square, it is to be expected that optimal solutions for any size problem will involve positioning a subset of points on the boundary of the square (equivalently, an optimal packing of circles will always contain circles that touch the boundary of the square). Therefore, to facilitate the generation of such candidate solutions, any value in a solution vector generated during a run of UMDA_c^G that lies outside the feasible region is reset to the (nearest) boundary. That is, for all w^i = (w1, w2), i = 1, ..., nc: if w1 < 0 then set w1 = 0, and if w1 > 1 then set w1 = 1, with identical conditions for w2. This simple check is performed on every generated individual and guarantees the feasibility of every candidate solution.
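In vectorized form this repair amounts to clipping every coordinate onto the unit hypercube; a one-line sketch (assuming NumPy, as in the earlier examples):

```python
import numpy as np

def repair(x):
    """Reset any coordinate outside [0, 1] to the nearest boundary value."""
    return np.clip(x, 0.0, 1.0)

print(repair(np.array([-0.3, 0.4, 1.7, 0.9])))   # -> [0.  0.4 1.  0.9]
```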
5 Experiments

5.1 Parameter Settings and Details
Experiments were conducted on CiaS problems of size nc = 2, ..., 100 (i.e. of dimensionality 4, 6, ..., 200). UMDA_c^G was applied with a selection parameter of 0.8, a population size of 1000, and executed for 2000 generations. The Nelder-Mead Simplex algorithm via the Matlab fminsearch function was also tested on this problem set. The method described above for repairing infeasible solutions in UMDA_c^G was also used for fminsearch. By default, fminsearch terminates after (200 × the number of problem dimensions) function evaluations. We assumed this would be adequate for a local search algorithm, but tightened the default tolerance (on the solution variable values) from 10^-4 to 10^-12. Each algorithm was run for 30 randomly initialized trials on each problem.
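A sketch of how such a trial loop might be driven, reusing the hypothetical umda_c_gaussian and cias_objective functions from the earlier examples (the example call uses deliberately small settings for a quick test; the known optimal value dn would be taken from Packomania [13]):

```python
import numpy as np

def run_trials(n_c, d_optimal, n_trials=30, **umda_kwargs):
    """Run repeated UMDA trials and report the median performance ratio d_n / f(x_b)."""
    ratios = []
    for trial in range(n_trials):
        _, d_found = umda_c_gaussian(cias_objective, dim=2 * n_c,
                                     seed=trial, **umda_kwargs)
        ratios.append(d_optimal / d_found)       # lower is better; 1.0 is optimal
    return float(np.median(ratios))

# Quick smoke test for n_c = 20 (the paper's own settings of population size 1000,
# 2000 generations and 30 trials are far more expensive to run).
print(run_trials(20, d_optimal=0.2866, n_trials=3,
                 pop_size=50, tau=0.8, generations=100))
```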
5.2 Results
Figure 3 shows the results of UMDA_c^G in terms of the distribution of performance using boxplots. The x-axis denotes the problem size (nc) while the y-axis is a performance ratio given by d_n/f(x_b), where d_n is the objective function value of the known global optimum (or best known solution) and f(x_b) is the objective function value of a solution found by the algorithm (or a statistic over such values).
Figure 3: Performance distribution for UMDA_c^G on CiaS problems over 30 trials. Box plots show the median (line inside box), interquartile range (box height), data points within 1.5 times the interquartile range and any points outside this range (with a “+” symbol). LEFT - nc = 2–34. CENTRE - nc = 35–67. RIGHT - nc = 68–100.
Figure 4: Median performance for UMDA_c^G and fminsearch over 30 trials: nc = 2–100. For UMDA_c^G the line corresponds to the median line through the boxes in Figure 3.
For example (from Figure 3 (left)), the median performance value for UMDA_c^G on the 14-circle problem is approximately 1.5, meaning that the globally optimal value is approximately 1.5 times better than the median solution found (i.e. lower ratio values are better).
The algorithm clearly does not come close to the global optimum value for any size problem
except nc = 2 and perhaps 3. Nevertheless, the results seem relatively acceptable up to about
nc = 58 (median approximately 1.8 times worse than the optimum). For larger problems, the results then deteriorate more quickly. The result distributions for individual problems are mostly symmetric with fairly small variance. An informal comparison was also made between the UMDA_c^G results and the Nelder-Mead Simplex algorithm. Figure 4 shows the median performance of UMDA_c^G from Figure 3 and the median performance of fminsearch for nc = 2, ..., 100. For smaller problem sizes (nc < 10) the performance of the two algorithms is comparable. Then for nc > 10 up to around nc = 65, UMDA_c^G finds better results than fminsearch. For larger problems, the performance of fminsearch is far superior. It is important to make some comments to qualify these results. While this comparison of algorithms is a useful illustration of the CiaS benchmark problems, the results
were not intended to be a rigorous head-to-head comparison of the algorithms. As mentioned above, UMDA_c^G was run for a fixed number of generations across all problem sizes, while the maximum number of objective function evaluations for fminsearch increases with dimensionality. Examination of model parameter values (not shown here) reveals that the fixed number of generations became a limiting factor around the point identified above where the results began to deteriorate (around nc = 58). Allowing more generations (or scaling this with dimensionality) should lead to significant improvement in the results on larger problems. It should also be noted that no attempt was made to optimize the selection parameter or the population size/generations relationship for UMDA_c^G. Nevertheless, the performance of fminsearch is relatively impressive given the small number of objective function evaluations that it uses (at least an order of magnitude fewer than UMDA_c^G).
6 Conclusions

This paper has proposed 2D circles in a square geometric packing problems as a useful benchmark problem set for the evaluation and comparison of continuous metaheuristic and evolutionary optimization algorithms. The problem set is closely related to a number of real-world optimization problems, but simplified and therefore more easily applied in the experimental evaluation of algorithms. The features of the problem set are reviewed and discussed and it is found to have a number of useful properties for a benchmark problem set. Some analysis of the landscape properties of the CiaS problems is conducted to provide guidelines on the characteristics and difficulty of the problems.
To illustrate the usage of the problem set, experimental results are presented for UMDA_c^G for problem sizes from 2 to 100 circles. The performance of UMDA_c^G is also compared with the Nelder-Mead Simplex algorithm. These experimental results are intended to serve as a baseline for more rigorous evaluations and comparisons of algorithms. It will be interesting to see by how much other metaheuristics are able to improve on the results presented here. Even more impressive would be a metaheuristic able to improve on the best known solution for CiaS problem sizes where the global optimum has not yet been determined.
Note that the CiaS problem is but one instance of continuous geometric packing problems, so there is considerable scope to adopt more benchmark problems from this area for algorithm comparison. For example, the geometric dimensionality of the CiaS problem can be increased (packing (hyper)spheres in a (hyper)cube), other instances considered (e.g. packing circles into a larger circle), and so on (see, e.g., [9, 13]). It is expected that experimental research on the evaluation of metaheuristics will benefit significantly from the usage of a wider variety of challenging benchmark problem sets. To encourage usage of the CiaS benchmark problem set, a Matlab implementation will be made available by the author at http://www.itee.uq.edu.au/~marcusg/cias.html.
References
[1] B. Addis, M. Locatelli, and F. Schoen. Disk packing in a square: A new global optimization
approach. INFORMS Journal on Computing (online, Articles in Advance), 2008.
[2] T. Bartz-Beielstein. Experimental Research in Evolutionary Computation: The New Experimentalism. Springer-Verlag New York Inc, 2006.
[3] T. Bartz-Beielstein, K. E. Parsopoulos, and V. N. Vrahatis. Design and analysis of optimization
algorithms using computational statistics. Applied Numerical Analysis and Computational
Mathematics, 1(2):413–433, 2004.
[4] I. Castillo, F. J. Kampas, and J. D. Pintér. Solving circle packing problems by global optimization: Numerical results and industrial applications. European Journal of Operational Research,
191:786–802, 2008.
[5] A. M. Chen, H. Lu, and R. Hecht-Nielsen. On the geometry of feedforward neural network
error surfaces. Neural Computation, 5(6):910–927, 1993.
[6] A. Czarn, C. MacNish, K. Vijayan, B. Turlach, and R. Gupta. Statistical exploratory analysis
of genetic algorithms. IEEE Transactions on Evolutionary Computation, 8(4):405–421, 2004.
[7] A. E. Eiben and M. Jelasity. A critical note on experimental research methodology in EC. In
2002 Congress on Evolutionary Computation (CEC 2002), pages 582–587. IEEE, 2002.
[8] M. Gallagher and B. Yuan. A general-purpose, tunable landscape generator. IEEE Transactions
on Evolutionary Computation, 10(5):590–603, 2006.
[9] A. Grosso, J. U. A. Jamali, M. Locatelli, and F. Schoen. Solving the problem of packing equal and unequal circles in a circular container. Technical report, Eprint - Optimization Online, http://www.optimization-online.org/DB_HTML/2008/06/1999.html, March 2008.
[10] P. Larrañaga, R. Etxeberria, J. A. Lozano, and J. M. Peña. Optimization by learning and
simulation of Bayesian and Gaussian networks. Technical Report KZZA-IK-4-99, University of
the Basque Country, Spain, 1999.
[11] J. J. Liang, P. N. Suganthan, and K. Deb. Novel composition test functions for numerical global optimization. In Proceedings of the 2005 IEEE Swarm Intelligence Symposium (SIS 2005), pages 68–75, 2005.
[12] M. Locatelli and U. Raber. Packing equal circles in a square: a deterministic global optimization
approach. Discrete Applied Mathematics, 122:139–166, 2002.
[13] E. Specht. Packomania. Retrieved from http://www.packomania.com/ (24/10/08), 2008.
[14] M. Stephens. Dealing with label switching in mixture models. Journal of the Royal Statistical Society (B), 62(4):795–809, 2000.
[15] P. G. Szabó, M. C. Markót, and T. Csendes. Global optimization in geometry - circle packing
into the square. In P. Audet, P. Hansen, and P. Savard, editors, Essays and Surveys in Global
Optimization. Kluwer, 2005.
[16] V. E. Theodoracatos and J. L. Grimsley. The optimal packing of arbitrarily-shaped polygons using simulated annealing and polynomial-time cooling schedules. Computer Methods in
Applied Mechanics and Engineering, 125:53–70, 1995.
[17] D. Whitley, K. Mathias, S. Rana, and J. Dzubera. Evaluating evolutionary algorithms. Artificial Intelligence, 85(1-2):245–276, 1996.