1
introduction to multicriteria optimisation
  • previous slides have considered scalar objective
    functions, i.e. those that return a single
    (nominally real-valued) indicator of merit
  • multicriteria optimisation (also referred to as
    multi-objective or vector-evaluated optimisation)
    considers problems having more than one
    evaluation criterion
  • typical of many engineering design problems
  • for example, consider optimising an engine design
    based on criteria of acceleration and fuel
    consumption
  • searching for a solution that maximises
    acceleration AND minimises fuel consumption
  • there may be one single best solution, i.e. one
    that offers both the most acceleration and the
    least fuel consumption
  • the fitness function can combine the criteria,
    e.g. by summation or multiplication (a minimal
    sketch follows this list)
  • such a solution can then be readily found using
    a standard genetic algorithm
  • it is more likely, however, that some solutions
    will offer better acceleration but worse fuel
    consumption and vice versa
  • the optimal trade-off is generally to be selected
    by an expert user
  • the objective of the search algorithm is to find
    a set of good solutions for the user to choose
    from
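
As a concrete illustration of combining the criteria by summation, here is a minimal sketch in Python. The design encoding and the two evaluation models are assumptions made for the example (they are not from the slides); the weights play the role of the relative-importance weights discussed later.

```python
# Hypothetical criteria for the engine example: maximise acceleration,
# minimise fuel consumption.  Both models below are toy placeholders.

def acceleration(design):
    # assumed toy model: more power and less mass -> more acceleration
    return design["power"] / design["mass"]

def fuel_consumption(design):
    # assumed toy model: more power costs more fuel
    return 0.1 * design["power"]

def scalar_fitness(design, w_acc=1.0, w_fuel=1.0):
    # weighted summation of the criteria: reward acceleration and
    # penalise fuel consumption, so a single value can be maximised
    # by a standard single-objective genetic algorithm
    return w_acc * acceleration(design) - w_fuel * fuel_consumption(design)

print(scalar_fitness({"power": 120.0, "mass": 900.0}))
```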

2
comparing fitness
  • genetic algorithms are driven by bias in
    selection and/or replacement
  • this requires the ability to compare absolute or
    relative fitness
  • e.g. roulette wheel or tournament selection
  • how does this work when there is more than one
    criterion?
  • some solutions may be poor on all criteria
  • others may be better on one criterion and worse
    on another

3
the optimal set
  • individual A is said to dominate individual B
    when
  • for all evaluation criteria A is at least as good
    as B
  • for at least one criterion A is better than B
  • the objective is to find all non-dominated
    solutions, or at least a satisfactory or
    feasible number of them (a dominance test is
    sketched after this list)
  • often referred to as the Pareto-optimal set
  • in continuous spaces this set may be infinite
  • the number of solutions found will be limited by
    the population size
  • a simple solution is to convert the multiple
    criteria to a single value
  • for example, weights, wi, could be applied to
    the criteria to determine their relative
    importance to the application
  • a standard genetic algorithm can then tackle the
    problem
  • it is difficult to identify suitable values for
    the weights (this implies considerable domain
    knowledge) and unexpectedly good solutions can
    be missed
  • multiple runs could be made with different
    weights to find a set of solutions
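
Below is a minimal sketch of the dominance test and of filtering a population down to its non-dominated solutions. It assumes every criterion is to be minimised and that each solution is represented simply as a tuple of objective values; both choices are illustrative assumptions.

```python
def dominates(a, b):
    """a dominates b when a is at least as good on every criterion
    and strictly better on at least one (all criteria minimised)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated(solutions):
    """Return the solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# toy population: each solution is a tuple of two minimised objectives
population = [(3.0, 5.0), (2.0, 6.0), (4.0, 4.0), (3.5, 5.5)]
print(non_dominated(population))   # (3.5, 5.5) is dominated by (3.0, 5.0)
```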

4
simultaneous search
  • because genetic algorithms are population based,
    they offer the possibility to simultaneously
    search for multiple elements of the optimal set
  • no assumptions need to be made about the relative
    importance of criteria or about the properties of
    the Pareto-optimal front
  • population size is a critical parameter to ensure
    satisfactory coverage
  • algorithms adopt a range of measures including
  • modifications to selection, typically based on
    dominance
  • techniques to preserve diversity - the usual
    convergence of the population to a single
    solution is clearly a bad thing when a set of
    solutions is required
  • preservation of the non-dominated solutions
    discovered
  • VEGA (vector evaluated genetic algorithm) is an
    early example (a selection sketch follows this
    list)
  • divides the population into n sets
  • selection within each set is based on a different
    criterion
  • reproduction is carried out across the whole
    population
  • tendency towards extreme solutions, i.e. those
    with best performance on individual criteria
    rather than good performance across all criteria
  • no measures are taken to preserve diversity or
    to retain the non-dominated solutions discovered
  • the majority of algorithms now adopt some form of
    Pareto-ranking based on the number of solutions
    an individual dominates or is dominated by
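
A minimal sketch of VEGA-style selection as described above. Tournament selection within each group, minimised criteria, and a population size that is a multiple of the number of criteria are illustrative assumptions; crossover and mutation over the pooled parents are omitted.

```python
import random

def vega_select(population, criteria, tournament_size=2):
    """Build a mating pool the size of the population by splitting it
    into one group per criterion and selecting within each group on
    that criterion alone (all criteria minimised)."""
    n = len(criteria)
    random.shuffle(population)
    group_size = len(population) // n   # assumes len(population) % n == 0
    pool = []
    for k, criterion in enumerate(criteria):
        group = population[k * group_size:(k + 1) * group_size]
        for _ in range(group_size):
            # tournament selection using only this group's criterion
            contestants = random.sample(group, tournament_size)
            pool.append(min(contestants, key=criterion))
    return pool   # reproduction then operates on the pooled parents

# toy usage: two criteria over two-parameter designs
designs = [[random.random(), random.random()] for _ in range(20)]
parents = vega_select(designs, [lambda d: d[0], lambda d: d[1]])
```

Because each group is selected on a single criterion, the pool is pulled towards solutions that excel on individual criteria, which is the tendency towards extreme solutions noted above.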

5
  • MOGA is an early example
  • all non-dominated solutions are assigned a rank
    of 1
  • all dominated solutions are assigned a rank based
    on the number of solutions to which they are
    inferior
  • the population is sorted according to rank and
    fitness values assigned by interpolation, with
    scores averaged for individuals of the same rank
    (a ranking sketch follows this list)
  • most approaches also incorporate some form of
    niching technique to maintain diversity across
    the Pareto front
  • the solution set is flat in these problems (all
    non-dominated solutions are equally ranked), so
    there is no need to maintain stable
    sub-populations
  • the population size is finite and generally small
    compared with the Pareto front so schemes that
    spread the population out across this front are
    useful
  • different approaches are possible, but fitness
    sharing schemes and nearest neighbour density
    estimates are popular
  • good solutions are often retained by elitist
    population structures
  • alternatives include the use of an archive of
    known solutions
  • the archive might simply provide a record of past
    solutions
  • alternatively solutions from the archive could
    form part of the selection pool
  • the archive typically contains the current
    approximation of the Pareto-optimal set, i.e. all
    non-dominated solutions
  • some approaches exploit properties of
    coarse/fine-grained parallel models
  • for example, local selection in fine-grained
    models means that the Pareto-rank does not have
    to be computed for the entire population, just
    the local neighbourhood
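
A minimal sketch of MOGA-style Pareto ranking and fitness assignment as described above, followed by a simple fitness-sharing step to spread the population across the front. The linear interpolation of raw scores, the triangular sharing function, and the sharing radius sigma are illustrative assumptions; criteria are assumed to be minimised.

```python
def dominates(a, b):
    # a dominates b: no worse on every criterion, better on at least one
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def moga_fitness(objs):
    """objs: list of objective-value tuples.  Returns (ranks, fitness)."""
    n = len(objs)
    # rank = 1 + number of individuals that dominate this one, so every
    # non-dominated individual receives rank 1
    ranks = [1 + sum(dominates(t, s) for t in objs if t is not s)
             for s in objs]
    # sort by rank and interpolate raw scores linearly from best to worst
    order = sorted(range(n), key=lambda i: ranks[i])
    raw = {i: n - position for position, i in enumerate(order)}
    # average the raw scores of individuals sharing the same rank
    fitness = [0.0] * n
    for r in set(ranks):
        members = [i for i in range(n) if ranks[i] == r]
        mean = sum(raw[i] for i in members) / len(members)
        for i in members:
            fitness[i] = mean
    return ranks, fitness

def shared_fitness(objs, fitness, sigma=1.0):
    """Divide each fitness by a niche count so that crowded regions of
    the front are penalised (triangular sharing function with an
    assumed radius sigma in objective space)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    shared = []
    for i, a in enumerate(objs):
        niche = sum(max(0.0, 1.0 - distance(a, b) / sigma) for b in objs)
        shared.append(fitness[i] / niche)
    return shared

# toy usage
objs = [(3.0, 5.0), (2.0, 6.0), (4.0, 4.0), (3.5, 5.5)]
ranks, fit = moga_fitness(objs)
print(ranks, shared_fitness(objs, fit))
```

An archive as described above could then be maintained as the non-dominated subset of everything evaluated so far, for example by re-applying the non-dominated filter from the earlier sketch to the union of the archive and each new generation.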