Multi-Objective Evolutionary Algorithms
Håken Jevne, Kazi Ripon and Pauline Haddow
Outline
• Introduction to MOO
– Conceptual example
• Pareto-Optimality and Metrics
• Diversity Preservation
– Example: NSGA II
• Other MOO Algorithms
• MOO Application example
Single Objective Optimization
• The problem has a one-dimensional performance space, and the optimum is the point furthest toward the desired extreme.
• There is typically only a single solution that gives the best objective value.
Best in all dimensions?
• But what happens in a case like this, where two objectives F1 and F2 conflict? Each objective pulls toward a different "optimum", and no single point is best in both.
Why Multi-Objective Optimization?
Buying an Automobile
• Objectives: minimize cost while maximizing comfort.
• Which of the solutions (1, A, B, C, 2) is best?
• A solution x that does not satisfy all of the (J + K) constraints and all of the 2N variable bounds is called an infeasible solution.
• On the other hand, a solution x that satisfies all the constraints and variable bounds is called a feasible solution.
Multi-Objective Optimization
• A MOOP will have many alternative
solutions in the feasible region.
• This is because a solution that is
optimal with respect to one objective
might be a poor candidate for another
objective.
• Even though we may not be able to
assign numerical relative importance to
multiple objectives, we can still
classify some possible solutions as
better than others.
Outline
• Introduction to MOO
– Conceptual example
• Evolutionary MOO concepts
• Pareto-Optimality and Metrics
• Diversity Preservation
– Example: NSGA II
• Other MOO Algorithms
• MOO Application example
Evolutionary Approach: General Concept
Multi-objective optimization is the process of simultaneously optimizing
several incommensurable and often competing objectives subject to
certain constraints.
Example
[Figure: candidate solutions A–E plotted by Price ($, 0–5000) against Flight Time (0–25 hrs), illustrating trade-offs between the two objectives.]
Solution to MOOP
• The solution to a MOOP consists of a set of trade-offs between the objectives.
• The goal of MOO algorithms is to generate these trade-offs.
Traditional Approaches
• Weighted Sum Method.
Weighted Sum Method
• Multiple objectives are combined into a single objective using weighting coefficients:

Minimize: {f1(x), f2(x), …, fm(x)}  →  F(x) = w1 f1(x) + w2 f2(x) + … + wm fm(x)
Evolutionary approach vs. classical (mathematical programming) approach:
• EAs can simultaneously deal with a set of possible solutions (the so-called population); classical methods normally work with a single solution.
• EAs allow us to find several members of the Pareto-optimal set in a single run of the algorithm; classical methods need to perform a series of separate runs to find a set of alternative solutions.
• EAs are less susceptible to the shape or continuity of the Pareto front and can easily deal with discontinuous or concave Pareto fronts; these two issues are a real concern for mathematical programming techniques.
• EAs can be implemented in a parallel environment; parallel implementation is difficult for classical methods.
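To make the one-solution-per-run limitation concrete, here is an illustrative sketch (not from the slides) applying the weighted-sum method to a hypothetical two-objective problem, f1(x) = x² and f2(x) = (x − 2)² over x in [−5, 5]:

```python
# Sketch of the weighted-sum method on a hypothetical two-objective problem:
# minimize f1(x) = x^2 and f2(x) = (x - 2)^2 over x in [-5, 5].
def f1(x):
    return x ** 2

def f2(x):
    return (x - 2) ** 2

def weighted_sum(x, w1, w2):
    """Combine both objectives into a single objective with weights w1, w2."""
    return w1 * f1(x) + w2 * f2(x)

# One crude grid search per weight vector: each run returns a single
# solution, so tracing out the Pareto front needs many separate runs.
grid = [i / 100 for i in range(-500, 501)]
front = []
for w1 in (0.1, 0.3, 0.5, 0.7, 0.9):
    x_best = min(grid, key=lambda x: weighted_sum(x, w1, 1 - w1))
    front.append((x_best, f1(x_best), f2(x_best)))
```

Each weight vector (w1, 1 − w1) yields exactly one trade-off point, so five points on the front require five separate searches.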
Outline
• Introduction to MOO
– Conceptual example
• Pareto-Optimality and Metrics
• Diversity Preservation
– Example: NSGA II
• Other MOO Algorithms
• MOO Application example
Pareto-Optimality
• Pareto optimization handles the problems associated with the weighted-sum approach very efficiently.
• The term domination is used to find the trade-off solutions.
• A solution x(1) is said to dominate another solution x(2) if both of the following conditions are true:
1. The solution x(1) is no worse than x(2) in all objectives.
2. The solution x(1) is strictly better than x(2) in at least one objective.
• The line through the non-dominated solutions is called the Pareto front, and the solutions on it are called Pareto-optimal.
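The two domination conditions translate directly into code; a minimal sketch, assuming all objectives are minimized and each solution is a tuple of objective values:

```python
def dominates(a, b):
    """True if solution a dominates solution b (all objectives minimized):
    a is no worse than b in every objective (condition 1) and strictly
    better in at least one objective (condition 2)."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))
```

For example, (1, 2) dominates (2, 2), while (1, 3) and (2, 2) are mutually non-dominated trade-offs.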
Shape of Pareto-Front
Objectives in MOOP
• Two goals: convergence of the obtained solutions toward the true Pareto front, and diversity of the solutions along the front (here both f1 and f2 are minimized).
[Figure: possible solutions converging toward the true Pareto front while spreading out along it.]
• A convergence metric example: the minimum distance from each solution on the obtained Pareto front to the true Pareto front.
[Figure: obtained Pareto front plotted against the true Pareto front in function-1 / function-2 space, with the minimum distance from each obtained solution to the true front marked.]
Diversity Metrics: One Example
• The consecutive distances d1, d2, …, dn-1 between neighboring solutions along the obtained front, from one extreme solution to the other, indicate how evenly the front is covered.
[Figure: diversity plot of a Pareto front in function-1 / function-2 space, with the two extreme solutions marked and the consecutive distances d1 … dn-1 between neighboring solutions.]
Niching: ranking (MOGA)
• The figure shows a two-objective minimization problem having 10 solutions.
Niching
• The idea is to segment the GA population into disjoint sets so as to have at least one member in each region of the fitness function that is "interesting" (local/global optima).
– The general intention is to cover more than one local optimum.
Niching : Sharing Function Model
• To maintain diversity among non-dominated solutions, niching is performed among the solutions of each rank.
• The focus is on degrading the fitness of similar solutions.
• The niche count provides an estimate of the extent of crowding near a solution.
• The normalized distance between any two solutions i and j in a rank is calculated (in one common form) as

dij = sqrt( Σk=1..M [ (fk(i) − fk(j)) / (fk_max − fk_min) ]² )

where M is the number of objectives and fk_max, fk_min are the maximum and minimum values of objective k.
Niching : Sharing Function Model
• For solution i, dij is computed for each solution j (including i itself) having the same rank.
• With α = 1, the sharing function value is computed as

Sh(d) = 1 − (d / σshare)^α  if d ≤ σshare,  and Sh(d) = 0 otherwise.

• The niche count is nci = Σj Sh(dij), and the shared fitness value is f'i = fi / nci.
Niching : Sharing Function Model
• The sharing function is used to obtain an estimate of the number of solutions belonging to each optimum.
• The parameter d is the distance between any two solutions in the population.
• The function takes a value in [0, 1], depending on the values of d and σshare.
• If d is zero (the two solutions are identical, or their distance is zero), Sh(d) = 1: a solution has full sharing effect on itself.
• If d ≥ σshare (the two solutions are at least a distance σshare away from each other), Sh(d) = 0: the two solutions have no sharing effect on each other.
• Any other distance d between two solutions has a partial effect on each.
Example * (fitness sharing, not MO)
Assume:
σshare = 0.5
α=1
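The worked numbers of this example were on image slides, so here is an illustrative sketch of the sharing computation with the slide's parameters (σshare = 0.5, α = 1) and assumed one-dimensional solutions:

```python
SIGMA_SHARE = 0.5   # from the slide
ALPHA = 1           # from the slide

def sh(d):
    """Sharing function: Sh(0) = 1, Sh(d) = 0 for d >= sigma_share,
    and a linear fall-off in between (alpha = 1)."""
    if d >= SIGMA_SHARE:
        return 0.0
    return 1 - (d / SIGMA_SHARE) ** ALPHA

def shared_fitness(xs, fitness):
    """Degrade each raw fitness by its niche count nc_i = sum_j Sh(d_ij),
    where d_ij is the distance between solutions i and j (j includes i)."""
    shared = []
    for i, xi in enumerate(xs):
        nc = sum(sh(abs(xi - xj)) for xj in xs)
        shared.append(fitness[i] / nc)
    return shared

# Two identical solutions split their fitness; an isolated one keeps it:
print(shared_fitness([0.0, 0.0, 2.0], [4.0, 4.0, 4.0]))  # → [2.0, 2.0, 4.0]
```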
Crowding Distance Assignment
• The crowded-comparison operator (≺c) guides the selection process at the various stages of the algorithm toward a uniformly spread-out Pareto-optimal front.
Crowding Tournament Selection Operator
• A solution i wins a tournament against another solution j if any of the following conditions is true:
1. Solution i has a better rank: ri < rj.
2. Both have the same rank, but solution i has a larger crowding distance: ri = rj and di > dj.
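These two conditions can be sketched as a comparison function (a tie in both rank and distance is broken here in favor of i, an assumption not specified on the slide):

```python
def crowded_tournament_winner(i, j, rank, distance):
    """Return the tournament winner between solutions i and j:
    the lower (better) rank wins; on equal rank, the larger
    crowding distance (less crowded region) wins."""
    if rank[i] != rank[j]:
        return i if rank[i] < rank[j] else j
    return i if distance[i] >= distance[j] else j

# Illustrative ranks r_i and crowding distances d_i for three solutions:
rank = [1, 2, 1]
distance = [0.5, 9.0, 2.0]
```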
Crowding Distance Assignment Procedure: Crowding-sort(F, ≺c)

f1(x) = x², f2(x) = (x − 2)², where −5 ≤ x ≤ 5

• The search space is of a single dimension (given).
• The objective space is of two dimensions (given).
• Let the population size be 10.
• Initialize the population with 10 chromosomes, each a single-dimensional real value randomly distributed in [−5, 5]:
0.4678, 1.7355, 0.8183, −0.414, 3.2105, −1.272, −1.508, −1.832, −2.161, −4.105
Crowding Distance : Example
• Find the objective function values for all chromosomes, e.g.:

x        f1(x)    f2(x)
−0.414    0.171    5.829
 0.467    0.218    2.347
 3.210   10.308    1.465
−1.272    1.618   10.708
−1.508    2.275   12.308
−1.832    3.355   14.682
−2.161    4.671   17.317
−4.105   16.854   37.275

• The crowding distance is then assigned rank by rank: the extreme solutions of a front receive an infinite (∞) distance, and each interior solution i receives a distance computed from its neighbors i − 1 and i + 1 along the front.
[Figure: the solutions plotted in f1(x)–f2(x) objective space with their ranks and crowding distances marked.]
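The crowding-distance assignment within a single front can be sketched as follows, reusing the example's objectives f1(x) = x² and f2(x) = (x − 2)² (using just three of the chromosomes, which happen to be mutually non-dominated, is an illustrative simplification):

```python
def crowding_distance(front):
    """front: objective vectors of the solutions in one non-domination
    rank. The extreme solution of each objective gets an infinite
    distance; each interior solution accumulates the normalized spread
    between its neighbors i-1 and i+1 in that objective's sorted order."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        fmin, fmax = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if fmax == fmin:
            continue
        for p in range(1, n - 1):
            dist[order[p]] += (front[order[p + 1]][k]
                               - front[order[p - 1]][k]) / (fmax - fmin)
    return dist

# Three of the example chromosomes; f1(x) = x^2, f2(x) = (x - 2)^2.
xs = [-0.414, 0.467, 3.210]
front = [(x ** 2, (x - 2) ** 2) for x in xs]
d = crowding_distance(front)   # the two extreme solutions get infinity
```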
Tournament Selection
• A 'tournament' is held among a few individuals chosen at random from the population, and the winner (the one with the best fitness) is selected for crossover.
• NSGA-II uses elitism:
NSGA-II: Elitism
• The offspring population Qt is first created by using the parent population Pt.
• The combined population Rt = {Pt, Qt} (size 2N) is sorted into non-dominated fronts F1, F2, F3, …
• The new parent population Pt+1 (size N) is filled front-wise; the fronts that do not fit are rejected (elitist replacement).
• Instead of arbitrarily discarding some members from the last front that only partially fits, a niching strategy is used to choose the members of that front which reside in its least crowded region.
[Figure: Pt and Qt combined into Rt, sorted into fronts F1–F3; Pt+1 is filled from the best fronts and the remainder rejected.]
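The elitist replacement step can be sketched end to end: a fast non-dominated sort of Rt, front-wise filling of Pt+1, and crowding-based truncation of the last front. This is a simplified sketch operating directly on objective vectors (minimization assumed); the population values below are illustrative:

```python
def elitist_replacement(R, N):
    """NSGA-II survival selection sketch. R is the combined parent +
    offspring population R_t = {P_t, Q_t} given as objective vectors
    (minimization); returns the indices of the N survivors. Whole fronts
    are accepted while they fit; the last, partially fitting front is
    truncated by descending crowding distance (least crowded kept)."""

    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    # Fast non-dominated sort into fronts F1, F2, ...
    n = len(R)
    n_dom = [0] * n                   # how many solutions dominate i
    dom_set = [[] for _ in range(n)]  # solutions that i dominates
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(R[i], R[j]):
                dom_set[i].append(j)
            elif dominates(R[j], R[i]):
                n_dom[i] += 1
    fronts, current = [], [i for i in range(n) if n_dom[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dom_set[i]:
                n_dom[j] -= 1
                if n_dom[j] == 0:
                    nxt.append(j)
        current = nxt

    def crowding(front):
        # Crowding distance of each front member (dict: index -> distance).
        m = len(R[0])
        d = {i: 0.0 for i in front}
        for k in range(m):
            order = sorted(front, key=lambda i: R[i][k])
            fmin, fmax = R[order[0]][k], R[order[-1]][k]
            d[order[0]] = d[order[-1]] = float("inf")
            if fmax > fmin:
                for p in range(1, len(order) - 1):
                    d[order[p]] += (R[order[p + 1]][k]
                                    - R[order[p - 1]][k]) / (fmax - fmin)
        return d

    survivors = []
    for front in fronts:
        if len(survivors) + len(front) <= N:
            survivors += front
        else:
            d = crowding(front)
            survivors += sorted(front, key=lambda i: -d[i])[: N - len(survivors)]
            break
    return survivors

# Hypothetical combined population (objective vectors) with N = 3 survivors:
R = [(1, 4), (2, 3), (3, 2), (4, 1), (3, 3), (5, 5)]
survivors = elitist_replacement(R, 3)
```

Here the first front holds four solutions but only three slots remain, so the two boundary solutions (infinite distance) are kept and one interior solution is chosen by crowding distance.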
NSGA II: Example
Step 3:
• Next, we consider solutions of the second front only and observe that 3
of the 4 solutions must be chosen to fill up 3 remaining slots in the
new population.
NSGA II: Crowding Distance Assignment
NSGA II: Example
• The offspring population Qt+1 has to be created next by using this
parent population.
Outline
• Introduction to MOO
– Conceptual example
• Pareto-Optimality and Metrics
• Diversity Preservation
– Example: NSGA II
• Other MOO Algorithms
• MOO Application example
Other MOO Algorithms
• Multiple Objective Genetic Algorithm (MOGA)
Comparison of MOO Algorithms

MOGA
• Fitness: very simple.
• Effect of parameters: the sharing-function parameter σshare needs fixing.
• Convergence: slow.
• When to use: if a spread of Pareto-optimal solutions is required on the objective space.

NSGA
• Fitness: excellent, due to the assignment of fitness according to non-dominated sets.
• Effect of parameters: very sensitive to σshare.
• Convergence: front-wise selection helps better convergence speed.
• When to use: it ensures that diversity is maintained among the non-dominated solutions.

NPGA
• Fitness: no explicit fitness assignment is needed, unlike most other MOEAs.
• Effect of parameters: requires fixing two important parameters: σshare and tdom.
• Convergence: the choice of σshare has more effect.
• When to use: computationally efficient in solving problems with many objectives.

NSGA-II
• Fitness: due to the global non-domination check among offspring and parent solutions, elitism is implemented perfectly.
• Effect of parameters: solutions compete with their crowding distances, so no extra niching parameter is required.
• Convergence: elitism helps speedy convergence.
• When to use: when a global non-domination check among offspring and parent solutions is necessary.

SPEA
• Fitness: easy to calculate.
• Effect of parameters: parameter-less, thereby making it attractive to use.
• Convergence: not so speedy (non-dominated sorting of the whole population is not used for assigning fitness).
• When to use: ensures a better spread among the obtained non-dominated solutions.

PAES
• Fitness: a more direct approach is used in calculating the density while comparing the fitness.
• Effect of parameters: in addition to the archive size, the depth parameter is also important.
• Convergence: depends on the choice of parameter values.
• When to use: has direct control on the diversity that can be achieved, via an appropriate depth parameter.
Outline
• Introduction to MOO
– Conceptual example
• Pareto-Optimality and Metrics
• Diversity Preservation
– Example: NSGA II
• Other MOO Algorithms
• MOO Application example
Layout Optimization for a Wireless Sensor Network using NSGA-II
• Objectives:
a) Coverage
b) Lifetime
[Figure: example of a WSN in which sensor nodes (Node 1 to Node 9) communicate with the DPU through the High Energy Communication Node (HECN).]
Optimization of Coverage
Optimization of Lifetime
• The lifetime of the whole network is the time until one of the participating nodes runs out of energy.
• In every sensing cycle, the data from every node is routed to the HECN through a route of minimum weight.

Lifetime = Tfailure / Tmax

where Tfailure = maximum number of sensing cycles before failure of any node, and Tmax = maximum number of possible sensing cycles.
Competing Objectives
• Lifetime: try to arrange the nodes as close as possible to the HECN to maximize lifetime.
• Coverage: try to spread out the nodes to maximize coverage.
Simulation Parameters
Parameters of NSGA-II
NSGA-II Results
[Figure: Pareto-optimal front obtained by NSGA-II, plotting Lifetime against Coverage, both in [0, 1].]
[Figure: example node layouts from the front (Coverage = 0.63709; Coverage = 0.5353 with Lifetime = 0.249), plotted on a −6 to 6 grid around the HECN, with the best-coverage layout highlighted.]
Additional Reading + References
• Fonseca, C. M., & Fleming, P. J. (1993). Multiobjective genetic algorithms. In IEE Colloquium on Genetic Algorithms for Control Systems Engineering (pp. 6-1).
• Deb, K. (2001). Multi-Objective Optimization Using Evolutionary Algorithms (Vol. 16). John Wiley & Sons.
• Deb, K., Pratap, A., Agarwal, S., & Meyarivan, T. A. M. T. (2002). A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2), 182-197.
• Horn, J., Nafpliotis, N., & Goldberg, D. E. (1994). A niched Pareto genetic algorithm for multiobjective optimization. In IEEE World Congress on Computational Intelligence, Evolutionary Computation (pp. 82-87). IEEE.
• Zitzler, E., & Thiele, L. (1999). Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation, 3(4), 257-271.