CS 6776
Evolutionary Computation
January 21, 2014

Outline
• Problem modeling includes representation design and fitness function definition.
• Fitness function:
– Unconstrained optimization/modeling
– Constrained optimization/modeling
– Multi-objective optimization/modeling
– Relative fitness: co-evolution
• Population size and population models
• Convergence and termination criteria

Fitness Function
• A mathematical function that quantifies how good a solution is.
• A problem can be modelled as a maximization or a minimization problem.
• Example: TSP
– A tour X = {x_i}, i = 1…n
– minimize f(X) = \sum_{i=1}^{n} \sum_{j=1}^{n} d_{ij}
– where d_{ij} = dist(x_i, x_j) if the edge from x_i to x_j is in the tour, and 0 otherwise
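A minimal sketch of how the TSP fitness above could be evaluated, assuming a tour is stored as a permutation of city indices and distances are Euclidean (the function and variable names are illustrative, not part of the lecture):

```python
import math

def tour_length(tour, coords):
    """TSP fitness to be minimized: total length of the closed tour."""
    total = 0.0
    n = len(tour)
    for k in range(n):
        i, j = tour[k], tour[(k + 1) % n]      # edge x_i -> x_j, wrapping around
        (xi, yi), (xj, yj) = coords[i], coords[j]
        total += math.hypot(xi - xj, yi - yj)  # dist(x_i, x_j)
    return total

# Example: four cities on a unit square; visiting the corners in order gives 4.0
coords = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(tour_length([0, 1, 2, 3], coords))
```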
Numerical Optimization
• Maximize f(x_1, x_2) = x_1^2 + x_2^2, −5.0 ≤ x_1, x_2 ≤ 5.0
• How many optima? What are they?
Regression Model
• For regression models, such as symbolic regression, where the output is a real value, the most commonly used fitness functions minimize an error measure:
– MSE: mean squared error
f(x) = (1/N) \sum_{i=1}^{N} (y_i − ŷ_i)^2
• Weights large errors more heavily than small ones
• Biased: puts more weight on outliers
– MAE: mean absolute error
f(x) = (1/N) \sum_{i=1}^{N} |y_i − ŷ_i|
• Less biased

Example
• The true value of the outlier point is y = 60:
• Non-linear model:
– the predicted value ŷ = 80:
– MSE: |80 − 60|^2 = 400
– MAE: |80 − 60| = 20
• Linear model:
– the predicted value ŷ = 70:
– MSE: |70 − 60|^2 = 100
– MAE: |70 − 60| = 10
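A small sketch of the two error-based fitness functions, assuming plain Python lists of true and predicted values (the function names are illustrative); the calls at the bottom reproduce the single-point example above:

```python
def mse(y_true, y_pred):
    """Mean squared error: weights large errors (outliers) more heavily."""
    return sum((y - yh) ** 2 for y, yh in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    """Mean absolute error: less biased toward outliers."""
    return sum(abs(y - yh) for y, yh in zip(y_true, y_pred)) / len(y_true)

print(mse([60], [80]), mae([60], [80]))  # 400.0 20.0 (non-linear model)
print(mse([60], [70]), mae([60], [70]))  # 100.0 10.0 (linear model)
```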
Classification Model
• For classification models, where the output is a discrete label, the most commonly used fitness function maximizes accuracy:
f(x) = (1/N) \sum_{i=1}^{N} t_i, where t_i = 1 if y_i = ŷ_i and t_i = 0 if y_i ≠ ŷ_i
• Problematic when the data set is imbalanced.

Constrained Optimization
• One way to handle solutions that violate the constraints is to assign a penalty p in the fitness function:
f(x)' = f(x) ± p
• For maximization problems, p is subtracted from f(x).
• For minimization problems, p is added to f(x).
• There are other constraint-handling methods besides the sum-penalty approach.
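A minimal sketch of the sum-penalty idea above, assuming a hypothetical maximization problem with a single inequality constraint (the objective, penalty weight, and names are invented for illustration):

```python
def penalized_fitness(f, penalty, x, maximizing=True):
    """f(x)' = f(x) - p for maximization, f(x) + p for minimization;
    penalty(x) should return 0 for feasible solutions."""
    p = penalty(x)
    return f(x) - p if maximizing else f(x) + p

# Hypothetical: maximize x1 + x2 subject to x1 + x2 <= 5
f = lambda x: x[0] + x[1]
penalty = lambda x: 10.0 * max(0.0, (x[0] + x[1]) - 5.0)  # scaled violation
print(penalized_fitness(f, penalty, (4.0, 4.0)))  # 8 - 30 = -22.0
```

The penalty weight controls how strongly infeasible solutions are pushed away from the feasible region, and choosing it is itself part of the problem modeling.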
Multi-Objective Optimization
• Problems may require satisfying more than one objective, e.g. maximizing profit & minimizing cost.
• Weighted sum approach (see the sketch below):
– Convert all fitness functions to maximization, e.g. by multiplying a minimization objective by −1;
– λ = [w_1, …, w_m], w_i ≥ 0, \sum_{i=1}^{m} w_i = 1
– F(x) = \sum_{i=1}^{m} w_i f_i(x)
• Maintain a set of Pareto front solutions:
– Fitness is a vector F(x) = [f_1(x), f_2(x), f_3(x), …, f_m(x)]
– Select Pareto-optimal solutions (discussed in a later lecture)

Relative Fitness
• The fitness of an individual is measured in relation to that of other individuals in the same population or in a competing population.
• Interaction patterns:
– Single population
• Relative (competitive) fitness vs. absolute fitness
– Multiple populations
• All vs. previous-best
• Neighborhood interaction
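Referring back to the weighted-sum approach above, here is a minimal sketch assuming all objectives have already been converted to maximization (the profit and cost functions are invented for illustration):

```python
def weighted_sum_fitness(objectives, weights, x):
    """F(x) = sum_i w_i * f_i(x), with w_i >= 0 and sum_i w_i = 1."""
    assert all(w >= 0 for w in weights) and abs(sum(weights) - 1.0) < 1e-9
    return sum(w * f(x) for w, f in zip(weights, objectives))

profit = lambda x: 10.0 * x          # maximization objective
neg_cost = lambda x: -(x ** 2)       # minimization objective multiplied by -1
print(weighted_sum_fitness([profit, neg_cost], [0.7, 0.3], 3.0))  # 18.3
```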
Standard Evolutionary Algorithm
• Individual fitness is based on its absolute
performance without interacting with
others.
Co-evolution – Single Population
• Individuals are evaluated by having them
interact with each other, e.g. play a game.
– Each individual plays N games against N randomly selected individuals from the population.
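A rough sketch of such a relative-fitness evaluation, assuming play_game(a, b) returns a positive value when a beats b (the toy game and all names are placeholders):

```python
import random

def relative_fitness(population, play_game, n_games=5):
    """Each individual plays n_games against randomly chosen members of the
    same population; its fitness is its number of wins."""
    scores = []
    for ind in population:
        opponents = random.sample(population, n_games)
        scores.append(sum(1 for opp in opponents if play_game(ind, opp) > 0))
    return scores

play_game = lambda a, b: a - b   # toy stand-in: the larger number "wins"
print(relative_fitness([1, 5, 3, 9, 2, 7], play_game, n_games=3))
```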
Co-evolution – Multiple Populations
• In asymmetric games (the strategies for player 1 are different from those of player 2), each member of Pop. 1 interacts with each member of Pop. 2.
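A sketch of this multi-population evaluation under similar assumptions, where play_game(a, b) is the payoff to the Pop. 1 member and a zero-sum toy payoff stands in for a real game (illustration only):

```python
def two_population_fitness(pop1, pop2, play_game):
    """Asymmetric co-evolution: every member of pop1 plays every member of
    pop2; Pop. 2 members receive the negated payoff."""
    fit1 = [sum(play_game(a, b) for b in pop2) for a in pop1]
    fit2 = [sum(-play_game(a, b) for a in pop1) for b in pop2]
    return fit1, fit2

payoff = lambda a, b: a - 2 * b   # toy asymmetric payoff
print(two_population_fitness([1, 2], [0, 1], payoff))
```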
Population Size
• Intuitively, the population size can be
viewed as a measure of the degree of
parallel search an EA supports.
• A larger population size provides:
– better coverage of the search space (diversity), which helps high-fitness individuals to be included in the initial population
– a larger past memory, so that good individuals are not lost so quickly during evolution.
Evolutionary Algorithms Workflow
• To design an effective evolutionary algorithm, one needs to consider the problem at hand.
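As a rough illustration of the overall workflow, here is a generic EA loop where initialization, fitness, selection, and variation are plugged in as problem-specific choices; the toy instantiation reuses the earlier numerical example, and all names and parameter values are illustrative:

```python
import random

def evolve(init, fitness, select, vary, pop_size=30, max_gens=50):
    """Generic EA workflow: initialize, then repeat evaluate/select parents,
    create offspring by variation, and replace, until the budget is spent."""
    population = [init() for _ in range(pop_size)]
    for _ in range(max_gens):
        parents = select(population, fitness)
        population = [vary(random.choice(parents), random.choice(parents))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

# Toy instantiation: maximize f(x1, x2) = x1^2 + x2^2 on [-5, 5]^2
init    = lambda: [random.uniform(-5, 5), random.uniform(-5, 5)]
fitness = lambda x: x[0] ** 2 + x[1] ** 2
select  = lambda pop, fit: sorted(pop, key=fit, reverse=True)[:len(pop) // 2]
vary    = lambda a, b: [min(5.0, max(-5.0, (ai + bi) / 2 + random.gauss(0, 0.5)))
                        for ai, bi in zip(a, b)]
print(evolve(init, fitness, select, vary))
```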
Population Size - Continued
• Depending on the complexity of the problem's fitness landscape, different population sizes are needed to search for a solution.
• Increasing the population size beyond what is necessary makes the EA take longer to find a solution.
Generational Model
• Non-overlapping population: gen-0 → gen-1 → gen-2 → … → gen-n
• Canonical Genetic Algorithms:
• Parent selection only; all offspring are kept in
the following generation.
• Evolution Strategies (µ, λ):
• Random parent selection to generate a large
number of offspring;
• Select fitter offspring to form the new
generation.
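A compact sketch of one (µ, λ) generation under the non-overlapping model described above, with a toy Gaussian mutation (the problem and parameter values are invented for illustration):

```python
import random

def es_comma_generation(population, fitness, mu, lam, mutate):
    """(mu, lambda): lam offspring are created from randomly chosen parents,
    and only the mu fittest offspring survive; no parent is carried over."""
    offspring = [mutate(random.choice(population)) for _ in range(lam)]
    return sorted(offspring, key=fitness, reverse=True)[:mu]

pop = [random.uniform(-1, 1) for _ in range(5)]
pop = es_comma_generation(pop, fitness=lambda x: -abs(x), mu=5, lam=20,
                          mutate=lambda x: x + random.gauss(0, 0.1))
print(pop)
```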
Steady-state Model
• Overlapping population: parents are selected from the pop, and their offspring replace members of that same pop
• Generation Gap:
– The proportion of the population that is replaced
[Sarma & De Jong, 1995].
– Generational model: pop_size/pop_size=1.0
– Steady-state model: #_replaced/pop_size
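A corresponding sketch of one steady-state step with a small generation gap; replace-worst is used here as one possible replacement policy, not the only one, and maximization is assumed:

```python
import random

def steady_state_step(population, fitness, crossover, mutate, n_replace=1):
    """Overlapping population: each new offspring competes with (and may
    replace) the current worst member; the generation gap is
    n_replace / len(population) rather than 1.0."""
    for _ in range(n_replace):
        p1, p2 = random.sample(population, 2)
        child = mutate(crossover(p1, p2))
        worst = min(range(len(population)), key=lambda i: fitness(population[i]))
        if fitness(child) > fitness(population[worst]):
            population[worst] = child
    return population
```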
Generational Model - Continued
• Under stochastic selection, an individual, regardless of how fit it is, may live for only one generation, and hence has a short-term impact on evolution.
• Under stochastic selection, the best solutions might get lost and never be carried over to the new generation.
• These issues can be fixed by deterministic selection or elite selection.
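One common fix mentioned above is elitism; a minimal sketch, assuming maximization, that copies the best of the old generation into the new one:

```python
def with_elitism(old_pop, new_pop, fitness, n_elite=1):
    """Keep the n_elite best individuals from the old generation so the best
    solution found so far is never lost to stochastic selection."""
    elite = sorted(old_pop, key=fitness, reverse=True)[:n_elite]
    rest = sorted(new_pop, key=fitness, reverse=True)[:len(new_pop) - n_elite]
    return elite + rest
```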
Steady-state Model - Continued
• Offspring and parents compete to survive in the population.
• A fit individual can live for a long period of time and keep impacting the evolution.
• A fit offspring can have an impact on the evolution immediately after its birth, without waiting until the next generation.
• How does this impact the evolutionary search?
EA Termination Criteria
• EA search process termination criteria:
– When a specified number of generations is reached.
– When the known best solution is found.
– When the population has “converged”: no further changes in the population may occur.
Convergence
• Practical ways to detect convergence:
– Measure the degree of homogeneity of the population using spatial dispersion or entropy: when the homogeneity measure approaches 0, the population has converged.
– Measure the global fitness improvement: when the best fitness does not improve for a certain number of generations (typically 10–20), the population is considered converged.
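A small sketch combining two of the criteria above into a termination check, assuming maximization and that best_history records the best fitness of each generation (the window size and tolerance are illustrative):

```python
def should_terminate(best_history, generation, max_gens=200, patience=15, eps=1e-9):
    """Stop when the generation budget is reached, or when the best fitness
    has not improved by more than eps for `patience` generations."""
    if generation >= max_gens:
        return True
    if len(best_history) > patience:
        recent = best_history[-(patience + 1):]
        return max(recent) - recent[0] <= eps
    return False
```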