Title: Evolutionary Algorithms (EVO) Satisfying Multiple Objectives (L10)
1 Evolutionary Algorithms (EVO) Satisfying Multiple Objectives (L10)
- John A. Clark
- Professor of Critical Systems
- Non-Standard Computation Group, Dept. of Computer Science, University of York, UK
2 References
- A Comprehensive Survey of Evolutionary-Based Multiobjective Optimization Techniques, by Carlos A. Coello Coello (on web, and link from web page).
- Multi-objective Evolutionary Algorithms: Analysing the State of the Art, by David A. van Veldhuizen and Gary Lamont.
- A Short Tutorial on Evolutionary Multiobjective Optimization, by Carlos A. Coello Coello (on web, and link from web page): www.cs.cinvestav.mx/emooworkgroup/tutorial-slides-coello.pdf
- Evolutionary Algorithms for Multi-criterion Optimization: A Survey, by Ashish Ghosh and Satchidananda Dehuri.
3 Definition of Multi-objective Problem
- The multi-objective problem can be defined as the problem of finding a vector of decision variables which satisfies the constraints and optimizes a vector function whose elements represent the objective functions. These functions form a mathematical description of performance criteria which are usually in conflict with each other. Hence the term "optimize" means finding a solution which gives values of all the objective functions that are acceptable to the designer.
4 Definition
- More formally: find the vector $\mathbf{x}^* = [x_1^*, x_2^*, \dots, x_n^*]^T$ which will satisfy
- the m inequality constraints $g_i(\mathbf{x}) \le 0, \quad i = 1, \dots, m$
- and the p equality constraints $h_i(\mathbf{x}) = 0, \quad i = 1, \dots, p$
- and optimizes the vector of functions (objectives) $\mathbf{f}(\mathbf{x}) = [f_1(\mathbf{x}), f_2(\mathbf{x}), \dots, f_k(\mathbf{x})]^T$
5 Examples of Multiple Objectives
- Pervasive computing ("chips with everything"): only a limited amount of processing power is available (often very small chips).
- Amount of memory (e.g. stack usage).
- Time taken to reach a solution.
- Precision of the answer.
- Power consumption: if you have a smart dust mote with a 1 mm³ battery this is important. It also has effects on on-chip temperature.
- Note that the multiple objectives may be in conflict.
6 Examples of Multiple Objectives
- Car design:
- Need to reduce drag, since this impacts on petrol consumption.
- Needs to be able to accommodate people (!!!) and provide a reasonable amount of other carrying space, e.g. for luggage.
- Needs to have aesthetic appeal (but people's tastes differ).
- Performance-related factors: speed, acceleration and weight of vehicle.
- Strength, but also the ability to absorb impacts.
- And so on.
- Again, multiple objectives in conflict.
7 Examples of Multiple Objectives
- Dependable systems: task distribution. We want:
- Low communications overheads, so put tasks on the same processor.
- Fault tolerance: a bad idea to place them on the same processor.
- Timing of end-to-end transactions.
8 Pareto Optimality (Minimisation)
- Pareto optimality is a major concept in multi-objective optimisation.
- A solution $\mathbf{x}^*$ is Pareto optimal if for every $\mathbf{x}$ and $I = \{1, \dots, k\}$ either $f_i(\mathbf{x}) = f_i(\mathbf{x}^*)$ for all $i \in I$, or there is at least one $i \in I$ such that $f_i(\mathbf{x}) > f_i(\mathbf{x}^*)$.
- Thus, if any vector causes a decrease in the value of some objective, it causes a simultaneous increase in some other objective.
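A minimal Python sketch of the dominance test implied by this definition, assuming minimisation and representing each solution simply by its vector of objective values (the function name dominates is illustrative, not from the slides):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b under minimisation:
    a is no worse than b on every objective and strictly better on at least one."""
    no_worse = all(ai <= bi for ai, bi in zip(a, b))
    strictly_better = any(ai < bi for ai, bi in zip(a, b))
    return no_worse and strictly_better

# (1.0, 2.0) dominates (1.5, 2.0); (1.0, 3.0) and (2.0, 1.0) are mutually non-dominated.
print(dominates((1.0, 2.0), (1.5, 2.0)))  # True
print(dominates((1.0, 3.0), (2.0, 1.0)))  # False
```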
9 Pareto Optimality: 2 Objectives
- For any two Pareto optimal vectors, neither is better than the other.
- Gives rise to the notion of a Pareto front. Generally it is difficult to compute a mathematical expression for this front.
- [Figure: objective space (Objective 1 vs Objective 2) showing dominated points and the Pareto-optimal points forming the front.]
10 Aggregating Functions
11 Weighted Sums
- An obvious way to couch a multi-objective problem is as the minimisation of a weighted sum of objective values: $\min \sum_{i=1}^{k} w_i f_i(\mathbf{x})$.
- In practice the magnitude of the weights matters, even when they are all scaled by some common factor (due to the interaction with the specifics of the optimisation technique).
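To make the scalarisation concrete, here is a minimal sketch assuming minimisation; the two objective functions and the weights are invented for the example and are not from the slides:

```python
# Hypothetical two-objective problem: minimise f1 and f2 over a scalar x.
def f1(x):
    return x ** 2            # e.g. cost

def f2(x):
    return (x - 2.0) ** 2    # e.g. error

def weighted_sum(x, weights=(0.5, 0.5)):
    """Collapse the objective vector to a single scalar using fixed weights."""
    w1, w2 = weights
    return w1 * f1(x) + w2 * f2(x)

# Different weight choices pick out different Pareto-optimal points.
best = min((x / 100.0 for x in range(-500, 500)), key=weighted_sum)
print(best)  # 1.0 for equal weights
```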
12 Strengths and Weaknesses
- Computationally fairly straightforward and efficient.
- Can also use the results as inputs into other techniques, i.e. seed other techniques with the generated solutions.
13 Strengths and Weaknesses
- Difficulty finding appropriate weights:
- Need to choose weights so that one objective does not dominate.
- Need to understand to some extent the ranges of the possible objectives; this can be very hard for many difficult problems.
- Weights need to be chosen by the designer; often a trial-and-error approach is needed.
- Weights do not reflect importance:
- Weights are just a means to an end; we just need them to identify Pareto optimal points.
- Optimality by reducing to a single meta (sum) objective is a function of the weights.
- Using linear combinations cannot reach solutions on non-convex portions of the trade-off surface.
- Gives a single solution - what if you want more?
14 Goal Programming
- Here the designer sets targets $T_i$ for each objective and the optimisation minimises the total deviation from them, e.g. $\min \sum_{i=1}^{k} |f_i(\mathbf{x}) - T_i|$.
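A minimal sketch of the goal-programming idea under the common sum-of-absolute-deviations formulation; the targets, candidate solutions and names below are made up for illustration:

```python
def goal_programming_score(objective_values, targets):
    """Sum of absolute deviations of each objective from its target;
    a solution meeting every target exactly scores 0."""
    return sum(abs(f - t) for f, t in zip(objective_values, targets))

# Hypothetical targets for two objectives, and three candidate solutions.
targets = (10.0, 0.5)
candidates = {"A": (12.0, 0.4), "B": (10.5, 0.9), "C": (9.0, 0.6)}
best = min(candidates, key=lambda name: goal_programming_score(candidates[name], targets))
print(best)  # "B": total deviation 0.5 + 0.4 = 0.9
```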
15 Strengths and Weaknesses
- Efficient.
- The designer needs to determine the targets; this may require significant knowledge in its own right.
- Can also use a weighting regime as in the previous method, but this is subject to many of the same problems.
- May prove inefficient if the feasible region is difficult to approach.
16 Non-Pareto Optimality Based Methods
17 Vector Evaluated Genetic Algorithm (VEGA)
- k sub-populations of size N/k are selected.
- The i-th sub-population is selected according to fitness on the i-th objective.
- Once selected, these sub-populations are merged and shuffled and the usual crossover and mutation operators are applied.
- So selection varies in its objective, and solutions performing well on each objective go through to be further processed (a sketch of the selection step follows the discussion below).
18 Vector Evaluated Genetic Algorithm (VEGA)
- Based around the Genesis program.
- [Figure: VEGA schematic - a population of N individuals (1, 2, ..., N) is split into k sub-populations, one selected per objective, then merged and shuffled.]
19 VEGA
- Very simple.
- Work by Schaffer used proportional fitness selection (with fitnesses proportional to the objective values).
- Some parts of the search space will not be sampled.
- Possible danger of speciation, where species arise that are good at one particular objective but not necessarily at the others.
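A minimal sketch of VEGA's selection step, assuming minimisation and using simple truncation selection within each sub-population rather than Schaffer's proportional selection; the function names and scalar encoding of individuals are illustrative:

```python
import random

def vega_select(population, objectives, N=None):
    """VEGA-style selection: split the mating pool into k quotas of size N/k,
    fill the i-th quota by selecting on the i-th objective alone,
    then merge and shuffle before crossover/mutation are applied.
    `objectives` is a list of k functions, each mapping an individual to a value to minimise."""
    N = N or len(population)
    k = len(objectives)
    quota = N // k
    mating_pool = []
    for f in objectives:
        ranked = sorted(population, key=f)   # best on this objective first
        mating_pool.extend(ranked[:quota])   # sub-population selected on this objective
    random.shuffle(mating_pool)              # merge and shuffle, as in VEGA
    return mating_pool

# Example with two objectives over scalar individuals.
pop = [random.uniform(-5, 5) for _ in range(20)]
pool = vega_select(pop, [lambda x: x ** 2, lambda x: (x - 2) ** 2])
```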
20 Lexicographic Ordering
- Rank objectives in order of importance and adopt a stepwise optimization approach.
21 Lexicographic Ordering
- Proceeds in criticality or importance order. Optimizing each objective introduces constraints for the subsequent optimisations:
- Optimise the most important objective.
- Now do the best you can on the second, but without getting worse on the first.
- And so on.
- A major issue is a tendency to favour certain objectives disproportionately.
- Ranking objectives and then dealing with each in turn has consequences; you might prefer a more global view when carrying out the optimisation - it depends on the problem.
- As with any technique, it really depends on whether you are happy with the answers.
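A minimal sketch of this stepwise scheme over a finite candidate set, assuming minimisation and a small tolerance for "no worse than the best so far"; the tolerance, objectives and data are assumptions for the example:

```python
def lexicographic_optimise(candidates, objectives, tol=1e-9):
    """Optimise objectives in order of importance: after each objective,
    keep only the candidates that are (near-)optimal on it, then move on."""
    survivors = list(candidates)
    for f in objectives:                      # objectives listed most important first
        best = min(f(c) for c in survivors)
        survivors = [c for c in survivors if f(c) <= best + tol]
    return survivors

# Example: minimise the first coordinate first, then break ties on the second.
points = [(1, 5), (1, 2), (3, 0)]
f1 = lambda p: p[0]
f2 = lambda p: p[1]
print(lexicographic_optimise(points, [f1, f2]))  # [(1, 2)]
```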
22 Pareto Optimality Based Methods
23 Pareto Ranking by Fronts (Goldberg's scheme)
- curr_rank = 1
- m = n
- while n != 0 do
-   for i = 1..m
-     if x_i(t) is non-dominated then rank(x_i, t) = curr_rank
-   for i = 1..m
-     if rank(x_i, t) = curr_rank then remove x_i from the population; n = n - 1
-   curr_rank = curr_rank + 1
-   m = n
24 Pareto Ranking by Fronts (Goldberg's scheme)
- Identify all the non-dominated individuals and give them rank 1.
- Remove these from the population.
- Now identify all the non-dominated individuals from the remaining population and give them rank 2, and so on.
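A runnable Python sketch of this ranking-by-fronts procedure, reusing a dominance test for minimisation; it is a direct transcription of the scheme described above, not an optimised implementation, and the function names are illustrative:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b under minimisation."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def rank_by_fronts(population):
    """Goldberg-style ranking: non-dominated individuals get rank 1,
    are removed, the next non-dominated set gets rank 2, and so on."""
    ranks = {}
    remaining = list(population)
    curr_rank = 1
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        for p in front:
            ranks[p] = curr_rank
        remaining = [p for p in remaining if p not in front]
        curr_rank += 1
    return ranks

pop = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
print(rank_by_fronts(pop))
# front 1: (1,4), (2,2), (4,1); front 2: (3,3); front 3: (4,4)
```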
25 Pareto Ranking by Fronts (Goldberg's scheme)
- [Figure: objective space (Objective 1 vs Objective 2) with the population partitioned into successive non-dominated fronts ranked 1 to 4.]
26 Further Domination Based Ranking (Fonseca and Fleming)
- Identify all non-dominated individuals in the population and give them rank 1.
- For each other member, determine the number of individuals p_i(t) in the rest of the population that dominate it; its rank is based on this count (1 + p_i(t)), much as before.
- Sort according to rank.
- Use a linear gradient of probabilities for selection.
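A minimal sketch of this domination-count ranking, assuming minimisation and a rank of 1 plus the number of dominators; the dominance helper and names are illustrative:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b under minimisation."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fonseca_fleming_ranks(population):
    """Rank each individual as 1 + the number of population members that dominate it;
    non-dominated individuals therefore get rank 1."""
    return [1 + sum(dominates(q, p) for q in population if q is not p)
            for p in population]

pop = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 4.0)]
print(fonseca_fleming_ranks(pop))  # [1, 1, 2, 4]
```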
27 Cute Examples of Curve and Surface Fitting!
- Some nice pictures!
- Work by Rony Goldenthal and Michel Bercovier.
- www.cs.huji.ac.il/ronygold/multi_objective.html
28 Problem
- The work is about curves and surfaces.
- Fitting: finding surfaces (curves) as close as possible to a given set of points.
- Design/Fairing: generating a surface achieving certain quality measures - the design objective (minimal length, appropriate curvature, ...).
- NURBS: Non-Uniform Rational B-Spline.
29 NURBS Surface
- Surface Area: 155.77, Surface Curvature: 0.099
30 NURBS Surface
- Surface Area: 155.16, Surface Curvature: 0.04
31 NURBS Surface
- Surface Area: 154.72, Surface Curvature: 0.12
32 NURBS Surface II
- Order: 4,4
- Input points: 16,16
- Control Points: 5,5
- Cost functions:
- Approximation Error
- Surface Curvature
33 NURBS Surface
- Approximation Error: 5.807, Surface Curvature: 3.39
34 NURBS Surface
- Approximation Error: 5.19, Surface Curvature: 3.56
35 NURBS Surface
- Approximation Error: 2.46, Surface Curvature: 5.23
36 NURBS Surface
- Approximation Error: 1.348, Surface Curvature: 9.95
37 NURBS Surface
- Approximation Error: 1.256, Surface Curvature: 11.51
38 NURBS Surface
- Approximation Error: 0.937, Surface Curvature: 20.84
39 Many, Many More
- There are many more multi-objective techniques.
- The paper by Ghosh and Dehuri discusses many of them.
- There are many good survey papers around on the issue of multiple objectives; several are referenced at the beginning of the talk.