Title: 010.141 Engineering Mathematics II Lecture 13 Evolutionary Algorithms
1. 010.141 Engineering Mathematics II, Lecture 13: Evolutionary Algorithms
- Bob McKay
- School of Computer Science and Engineering
- College of Engineering
- Seoul National University
2. Outline
- Feedback
- Simple example of applying an evolutionary algorithm
- Variants of the classical evolutionary algorithm
- Realistic evolutionary algorithm example
3. Reminder: Evolutionary Algorithm
- Pop(0) ← random; t ← 0
- While not (solution ∈ Pop(t)) do
- while size(Pop(t+1)) < size(Pop(t)) do
- randomly select Ind1, Ind2 ∈ Pop(t)
- with probability p1
- New ← random_mutate(Ind1)
- else with probability p2
- New ← random_combine(Ind1, Ind2)
- else New ← Ind1
- add New to Pop(t+1)
- end while
- t ← t+1
- End while
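As a concrete reference, here is a minimal Python sketch of this generic loop (not the lecture's own code): the mutate, combine, and fitness functions are placeholders to be supplied per problem, and it stops after a fixed number of generations rather than testing whether a solution is in Pop(t).

```python
import random

def evolve(init_pop, fitness, mutate, combine, p1=0.2, p2=0.7, max_gens=100):
    """Generic evolutionary loop following the pseudocode above."""
    pop = list(init_pop)
    for _ in range(max_gens):
        new_pop = []
        while len(new_pop) < len(pop):
            ind1, ind2 = random.choice(pop), random.choice(pop)
            r = random.random()
            if r < p1:                    # mutate with probability p1
                new = mutate(ind1)
            elif r < p1 + p2:             # recombine with probability p2
                new = combine(ind1, ind2)
            else:                         # otherwise copy the parent unchanged
                new = ind1
            new_pop.append(new)
        pop = new_pop
    return max(pop, key=fitness)          # best individual in the final population
```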
4. Genetic Algorithm Example
- We will use the simple GA to maximise the function f(x) = x^2, with x in the integer interval [0, 31], i.e., x ∈ {0, 1, ..., 30, 31}
- Of course, the maximum is at x = 31, but pretend we don't know this
- The first step of EA applications is encoding
- i.e., the representation of chromosomes
- We adopt binary representation for integers.
- Five bits are necessary for integers up to 31
- Suppose the population size is 4
5. Generate initial population at random
- e.g., 01101, 11000, 01000, 10011
6. Calculate fitness value for each individual
- Decode the individual into an integer:
- 01101 → 13, 11000 → 24, 01000 → 8, 10011 → 19
- Evaluate the fitness according to f(x) = x^2:
- 13 → 169, 24 → 576, 8 → 64, 19 → 361
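A minimal sketch of this decode-and-evaluate step for the 5-bit encoding above:

```python
def decode(bits):
    """Convert a binary string such as '01101' to its integer value."""
    return int(bits, 2)

def fitness(bits):
    """Fitness f(x) = x^2 of a decoded chromosome."""
    x = decode(bits)
    return x * x

pop = ["01101", "11000", "01000", "10011"]
print([decode(b) for b in pop])    # [13, 24, 8, 19]
print([fitness(b) for b in pop])   # [169, 576, 64, 361]
```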
7. Selection Probabilities
- p1(13) = 169/1170 ≈ 0.14
- p2(24) = 576/1170 ≈ 0.49
- p3(8) = 64/1170 ≈ 0.06
- p4(19) = 361/1170 ≈ 0.31
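These probabilities drive fitness-proportionate (roulette wheel) selection; a minimal sketch using Python's random.choices:

```python
import random

pop = ["01101", "11000", "01000", "10011"]
fits = [169, 576, 64, 361]
total = sum(fits)                     # 1170
probs = [f / total for f in fits]     # approximately [0.14, 0.49, 0.06, 0.31]

# Select two parents with replacement, with probability proportional to fitness
parents = random.choices(pop, weights=fits, k=2)
```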
8. Select individuals for crossover
- Select two individuals (with replacement) according to the above probabilities for crossover
- Say we have crossover(01101, 11000) and crossover(10011, 11000)
- We obtain offspring 01100 and 11001 from crossover(01101, 11000) by choosing a random crossover point at 4
- We obtain 10000 and 11011 from crossover(10011, 11000) by choosing a random crossover point at 2
- Now we have 01100, 11001, 10000, 11011
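A sketch of the one-point crossover used here; with the cut points from the slide it reproduces the offspring above:

```python
def one_point_crossover(parent1, parent2, cut_point):
    """Swap the tails of two equal-length bit strings after cut_point."""
    child1 = parent1[:cut_point] + parent2[cut_point:]
    child2 = parent2[:cut_point] + parent1[cut_point:]
    return child1, child2

print(one_point_crossover("01101", "11000", 4))  # ('01100', '11001')
print(one_point_crossover("10011", "11000", 2))  # ('10000', '11011')
```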
9. Select individuals for mutation
- in the newly formed (intermediate) population, with a small probability
- i.e., randomly change a 0 to 1 (or a 1 to 0) in 01100, 11001, 10000, 11011
- Now we have the new population P(1) = 01101, 11001, 00000, 11011
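A sketch of bit-flip mutation applied independently to each position with a small probability:

```python
import random

def bit_flip_mutation(bits, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return "".join(
        ("1" if b == "0" else "0") if random.random() < rate else b
        for b in bits
    )

pop = ["01100", "11001", "10000", "11011"]
new_pop = [bit_flip_mutation(ind) for ind in pop]
```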
10. Repeat if the population has not converged
- P(1) = 01101, 11001, 00000, 11011
- P(2) = 00001, 11101, 00011, 11010
- P(3) = 00001, 11111, 11110, 11001
- P(4) = 11111, 11110, 11101, 11101
- P(5) = 11111, 11101, 11111, 11110
- Etc
11. Remarks
- There is no restriction on the fitness or objective function
- It can be non-differentiable or even discontinuous
- There is no need to know the exact form of the objective function
- If the objective function is too complex to express explicitly, it can still be simulated, since EAs only use function values
- In fact, more modern selection algorithms just use solution ordering, so objective values just have to be orderable
- There can be many different genetic operators and selection mechanisms
- The initial population does not have to be generated at random
- The representation of chromosomes does not have to be binary
- Domain knowledge can be incorporated into representation and genetic operators
12. Variants: Crossover
- One-point crossover
- 011011 → 011101
- 101101 → 101011
- k-point crossover (k > 1)
- xxxxxxxxxx → xxoooxxxxo
- oooooooooo → ooxxxoooox
- Uniform crossover (with crossover mask 1001101011)
- xxxxxxxxxx → xooxxoxoxx
- oooooooooo → oxxooxoxoo
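A sketch of uniform crossover with a randomly drawn mask (a 1 takes the gene from the first parent, a 0 from the second):

```python
import random

def uniform_crossover(parent1, parent2):
    """Uniform crossover on two equal-length strings via a random binary mask."""
    mask = [random.randint(0, 1) for _ in parent1]
    child1 = "".join(a if m else b for m, a, b in zip(mask, parent1, parent2))
    child2 = "".join(b if m else a for m, a, b in zip(mask, parent1, parent2))
    return child1, child2

print(uniform_crossover("xxxxxxxxxx", "oooooooooo"))
```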
13. Variants: Crossover
- Intermediate crossover
- for continuous representation of chromosomes only
- x_offspring = x2 + r(x1 - x2)
- where r is a uniformly distributed random number in [0, 1]
- Other crossovers, e.g., order-based crossover
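A sketch of intermediate crossover on real-valued chromosomes, following the formula above (one r shared across all genes here; drawing a separate r per gene is another common variant):

```python
import random

def intermediate_crossover(x1, x2):
    """Offspring = x2 + r * (x1 - x2), with r uniform in [0, 1]."""
    r = random.random()
    return [b + r * (a - b) for a, b in zip(x1, x2)]

child = intermediate_crossover([1.0, 4.0, -2.0], [3.0, 0.0, 5.0])
```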
14. Variants: Mutation
- Mutation on binary strings (bit-flipping)
- Mutation on real values (Gaussian mutation)
- Adaptive Mutation
- Gaussian mutation with varying σ
- or σ may be part of the chromosome
- Problem-specific mutation
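A sketch of Gaussian mutation on a real-valued chromosome; sigma is the step size σ that adaptive schemes vary over time or encode in the chromosome itself:

```python
import random

def gaussian_mutation(x, sigma=0.1):
    """Add zero-mean Gaussian noise with standard deviation `sigma` to each gene."""
    return [xi + random.gauss(0.0, sigma) for xi in x]

mutant = gaussian_mutation([1.0, 4.0, -2.0], sigma=0.5)
```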
15. Variants: Selection
- Truncation selection
- Select the k fittest individuals
- Roulette wheel selection
- Fitness-proportionate selection
- Likelihood of selection is proportional to fitness
- Rank-based selection
- Likelihood of selection is proportional to rank
- Tournament selection (size k)
- Randomly choose k individuals
- Select the fittest of them
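A sketch of tournament selection of size k, assuming a fitness function that maps an individual to a number:

```python
import random

def tournament_select(pop, fitness, k=3):
    """Pick k individuals at random and return the fittest of them."""
    contestants = random.sample(pop, k)
    return max(contestants, key=fitness)

pop = ["01101", "11000", "01000", "10011"]
winner = tournament_select(pop, fitness=lambda b: int(b, 2) ** 2, k=3)
```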
16. Variants: other mutation operators
- Problem-specific operators
- You can design operators to match the problem representation
- Biologically inspired operators
- Insertion
- Deletion
- Replication
- Transposition
- Correspond to the real mutations that occur in
DNA
17. Variants: other recombination operators
- Differential evolution
- A kind of 3-way crossover: given x1, x2, x3
- x_new = x3 + r(x1 - x2)
- Multi-parent crossover
- given x1, x2, ..., xr
- x_new = a1 x1 + a2 x2 + ... + ar xr
- a1, a2, ..., ar ∈ [-ε, 1 + ε]
- ε usually chosen as 0.5
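A sketch of the differential-evolution-style recombination above; r is taken here as a fixed scale factor (an assumption, since many variants sample or adapt it):

```python
def de_recombine(x1, x2, x3, r=0.8):
    """x_new = x3 + r * (x1 - x2), applied component-wise."""
    return [c + r * (a - b) for a, b, c in zip(x1, x2, x3)]

x_new = de_recombine([1.0, 2.0], [0.5, 1.5], [2.0, 2.0])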
18. Travelling Salesman Problem (TSP)
- Find the (a) shortest tour of N cities, visiting each city exactly once and returning to the starting city
- Or, expressed mathematically:
- Given N cities, 1, 2, ..., N, and distances d_ij, 1 ≤ i, j ≤ N, among them
- Find a permutation (x1, x2, ..., xN) of (1, 2, ..., N)
- Such that D = Σ_{i=1}^{N} d_{x_i x_{i+1}} is minimised
- (where x_{N+1} = x1)
- An NP-complete problem
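A sketch of the tour-length objective D for the permutation representation, assuming a distance matrix d indexed by city:

```python
def tour_length(tour, d):
    """Total length D of a closed tour; d[i][j] is the distance from city i to city j."""
    n = len(tour)
    return sum(d[tour[i]][tour[(i + 1) % n]] for i in range(n))

# Tiny 4-city example with a symmetric distance matrix
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 8],
     [10, 4, 8, 0]]
print(tour_length([0, 1, 3, 2], d))  # 0->1->3->2->0 = 2 + 4 + 8 + 9 = 23
```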
19. TSP Representation
- Represent each tour as a tuple
- (x1, x2, ..., xN) means visiting city x1 first, then x2, ..., then xN, and finally going back to x1
20. TSP Mutation
- Reverse any segment of the tour
- Similar to Lin's 2-opt neighbourhood system
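A sketch of this segment-reversal mutation: choose two cut positions at random and reverse the cities between them:

```python
import random

def reverse_segment_mutation(tour):
    """Reverse a randomly chosen segment of the tour (a 2-opt style move)."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

mutant = reverse_segment_mutation([0, 1, 2, 3, 4, 5])
```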
21. TSP Recombination
- Construct an edge map from 2 parental tours
22. TSP Recombination
- Construct a child tour from the edge map
- Choose the initial city at random as the current city
- Determine which of the cities in the edge list of the current city has the fewest entries in its own edge list
- The city with the fewest entries becomes the current city
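A minimal sketch of this edge-map recombination under the rules above; ties and the case where the current city has no remaining neighbours are broken at random here, which is a simplifying assumption:

```python
import random

def edge_recombination(parent1, parent2):
    """Build a child tour from the combined edge map of two parent tours."""
    n = len(parent1)
    edges = {city: set() for city in parent1}
    for tour in (parent1, parent2):
        for i, city in enumerate(tour):
            edges[city].update({tour[i - 1], tour[(i + 1) % n]})

    current = random.choice(parent1)
    child = [current]
    while len(child) < n:
        for neighbours in edges.values():
            neighbours.discard(current)          # the current city is now used
        candidates = edges.pop(current)
        if candidates:
            # next city = neighbour with the fewest entries in its own edge list
            current = min(candidates, key=lambda c: len(edges[c]))
        else:
            current = random.choice([c for c in parent1 if c not in child])
        child.append(current)
    return child

child = edge_recombination([0, 1, 2, 3, 4], [2, 4, 1, 0, 3])
```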
23. Some Variants: Selection
- Fitness can be defined as the inverse of tour length
- f = 1 / D
- Roulette wheel selection
- Let the fitness of solution j in a population, G(j), be f_j, 1 ≤ j ≤ M
- Then solution j's reproduction probability is
- P_GA5(j) = f_j / Σ_{k=1}^{M} f_k
24. Some Variants: Selection
- Rank-based selection
- Tours in a population are first sorted in non-descending order according to their lengths
- Let the M sorted tours be numbered 0, 1, ..., M-1
- Then the (M-j)th tour is selected with probability
- P_GA6(M - j) = j / Σ_{k=1}^{M} k
- Competition
- The probability of winning against the opponent is the opponent's fitness divided by the sum of the two competing solutions' fitnesses
- For example, if solution i's fitness is 100 and solution j's fitness is 200, then solution i wins with probability 2/3
25. We look at six approaches
- GA1, GA2, and GA3 use recombination only, no mutation
- GA4, GA5, and GA6 use mutation only, no recombination
- GA1: Recombination as described above
- GA2: Favour the nearest neighbour
- GA3: Ties are broken in favour of a nearer neighbour
- GA4: Competition
- GA5: Roulette wheel selection
- GA6: Rank-based selection
26. Experiments
- We compare GA1 to GA6 on a TSP
- Ten randomly generated TSPs with 100 cities
- The expected minimum tour length is 100
- Performance Metrics
- Best (found so far)
- Mean (in a population)
- Entropy (in a population)
- H = (1/N) Σ_{i=1}^{N} H_i
- where N is the number of cities and
- H_i = -(1/log N) Σ_{j=1}^{N} (n_ij / 2P) log(n_ij / 2P)
- P is the population size, and n_ij is the number of edges connecting city i and city j in the population
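A sketch of this entropy measure, assuming the population is a list of tours over cities 0..N-1 and edges are counted in both directions (so each city's edge counts sum to 2P):

```python
import math
from collections import Counter

def population_entropy(population, n_cities):
    """H = (1/N) * sum_i H_i, where H_i is the entropy of the edges incident to city i."""
    P = len(population)
    edge_counts = Counter()
    for tour in population:
        for i, city in enumerate(tour):
            nxt = tour[(i + 1) % len(tour)]
            edge_counts[(city, nxt)] += 1
            edge_counts[(nxt, city)] += 1
    H = 0.0
    for i in range(n_cities):
        Hi = 0.0
        for j in range(n_cities):
            p = edge_counts[(i, j)] / (2 * P)
            if p > 0:
                Hi -= p * math.log(p)
        H += Hi / math.log(n_cities)
    return H / n_cities
```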
27. Results: Best Tour Length vs Number of Generations
28. Results: Mean Tour Length vs Number of Generations
29. Results: Entropy vs Number of Generations
30. Analysis of Results
- GA2 looks best
- But is it?
- If we look early, GA3 looks better
- But it stops searching effectively
- Because it loses diversity
- Crossover becomes ineffective
- Notice GA1, 4, 5, 6 retain more diversity
- Maybe they will get better if we search longer
31. Exploration vs Exploitation
- An evolutionary algorithm has a trade-off between
- Exploitation
- Looking very carefully near already good points
- Hill-climbing is the extreme case
- Exploration
- Looking very widely at good points
- Random search is the extreme case
32. Exploration vs Exploitation
- The right trade-off is problem-specific
- If the fitness landscape is very smooth
- Find the maximum of f(x) = x^2
- Then the algorithm should emphasise exploitation
- Stochastic hillclimbing will out-perform any evolutionary algorithm
- Deterministic hillclimbing (gradient descent) will do even better
33. Exploration vs Exploitation
- If the fitness landscape is very rough
- Find good paths in TSP
- Then the algorithm should emphasise exploration
- Stochastic hillclimbing will perform very poorly
- Methods to increase diversity (fitness sharing)
may be very important
34. Fitness Landscapes
- Sometimes we know about the fitness landscape before we start
- More commonly, we have to discover the fitness landscape
- By experimenting with the performance of algorithms
- By directly measuring
- Fitness-distance correlation
- The better the correlation, the smoother the landscape
35. Summary
- Simple example of applying an evolutionary algorithm
- Variants of the classical evolutionary algorithm
- Realistic evolutionary algorithm example
36. Thank you