1
ACTUAL STATE IN OPTIMIZATION TECHNIQUES
  • EXTREMA OF A REAL FUNCTION: Free Extrema
  • f(x): a real function of the real variable x in an interval (a, b); derivative f'(x) = df(x)/dx.
  • f(x1,...,xn) = f(x) for x ∈ ℝⁿ: a scalar function of n real variables x1,...,xn.

2
  • f(x1, x2,...,xn): continuous function of all the independent variables (x1, x2,...,xn).
  • At a maximum or a minimum (free extremum): Δf = f(x + Δx) − f(x) ≥ 0 (minimum) or ≤ 0 (maximum) for any small variation Δx.
  • When the function is differentiable: ∂f/∂xi (x1, x2,..., xn) = 0, ∀ i = 1,...,n (see the sketch below).
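A minimal sketch of the stationarity condition ∂f/∂xi = 0, using SymPy on an assumed example function (not from the slides):

    import sympy as sp

    # Assumed example: f(x1, x2) = (x1 - 1)**2 + (x2 + 2)**2
    x1, x2 = sp.symbols('x1 x2')
    f = (x1 - 1)**2 + (x2 + 2)**2

    # Free extremum: all partial derivatives vanish simultaneously
    grad = [sp.diff(f, v) for v in (x1, x2)]
    print(sp.solve(grad, (x1, x2), dict=True))   # [{x1: 1, x2: -2}] -> the free minimum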

3
  • Linked Extrema: the n variables (x1,...,xn) are not independent.
  • mE conditions (mE < n) (or equality constraints) cj(x1,...,xn) = 0, j = 1,...,mE ⇒ linked extrema. Δf = f(x + Δx) − f(x) ≥ 0 for any compatible variation Δx.
  • i) eliminate mE variables ⇒ select (n − mE) independent variables
  • ii) Lagrange multipliers: a new function of n + mE variables (the Lagrangian) L(x1,...,xn, μ1,...,μmE) = f(x1,...,xn) + Σj=1..mE μj cj(x1,...,xn) ⇒ free extrema (a sketch follows after this list)
  • ΔL = Σi=1..n (∂L/∂xi) Δxi + Σj=1..mE (∂L/∂μj) Δμj = 0, ∀ Δxi, Δμj ⇒ ∂L/∂xi = ∂f/∂xi + Σj=1..mE μj ∂cj/∂xi = 0, ∀ i = 1,...,n, and ∂L/∂μj = cj = 0, ∀ j = 1,...,mE
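A minimal sketch of the Lagrangian stationarity conditions with SymPy; the objective f = x1² + x2² and the single condition c = x1 + x2 − 1 = 0 are assumptions chosen for illustration:

    import sympy as sp

    x1, x2, mu = sp.symbols('x1 x2 mu')
    f = x1**2 + x2**2            # assumed objective
    c = x1 + x2 - 1              # assumed equality condition c(x) = 0

    # Lagrangian L = f + mu*c; stationarity with respect to x1, x2 and mu
    L = f + mu * c
    eqs = [sp.diff(L, v) for v in (x1, x2, mu)]
    print(sp.solve(eqs, (x1, x2, mu), dict=True))
    # [{x1: 1/2, x2: 1/2, mu: -1}] -> the linked extremum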

4
  • Constrained Extrema: f(x) = f(x1,...,xn), the cost or objective function; mE equality conditions (simple conditions) cj(x) = cj(x1,...,xn) = 0, j = 1,...,mE (mE < n), and m − mE inequality conditions (unilateral constraints) cj(x) = cj(x1,...,xn) ≤ 0, j = mE+1,...,m ⇒ great difficulty for the solution. Notation: xT = (x1,...,xn).

5
CONVEX ANALYSIS
  • minimum of a convex function f(x), with all conditions cj(x) ≤ 0 (j = 1,...,m) also convex functions,
  • ⇒ convex optimization
  • a point x satisfying the constraints c(x) ≤ 0 ⇒ a feasible point; the set of feasible points ⇒ a convex set C.
  • Generalized Lagrangian: L(x, λ) = f(x) + λT c(x) = f(x) + Σj=1..m λj cj(x), a function of n + m variables, xT = (x1,...,xn) and λT = (λ1,...,λm), with all λj ≥ 0. The λj are the generalized Lagrange multipliers.

6
  • Kuhn-Tucker conditions: saddle point L(x*, λ) ≤ L(x*, λ*) ≤ L(x, λ*) for all feasible x and all λ ≥ 0, where (x*, λ*) is the solution.
  • Particular cases
  • i) min f(x) = bT x + e with c(x) = A x − d ≤ 0, i.e. f(x) = Σi=1..n bi xi + e and cj(x) = Σi=1..n Aji xi − dj ≤ 0 ⇒ linear programming, simplex algorithm ⇒ the solution is a corner of the set C limited by the hyperplanes c(x) = 0 (a sketch follows after this list).
  • ii) f(x) = 1/2 xT C x + DT x + e, c(x) = 1/2 xT A x + BT x + b ≤ 0, C and A being positive symmetric matrices ⇒ quadratic programming, Uzawa algorithm
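A minimal linear-programming sketch with SciPy's linprog; the small cost vector and constraint data below are illustrative assumptions, and linprog's convention A_ub x ≤ b_ub matches c(x) = A x − d ≤ 0:

    import numpy as np
    from scipy.optimize import linprog

    # Assumed example: min b^T x  subject to  A x - d <= 0,  x >= 0
    b = np.array([-1.0, -2.0])            # cost vector
    A = np.array([[-1.0, 1.0],
                  [ 1.0, 1.0]])           # constraint matrix
    d = np.array([1.0, 4.0])              # right-hand side

    res = linprog(c=b, A_ub=A, b_ub=d, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)                 # optimum (1.5, 2.5) sits at a corner of the polytope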

7
MATHEMATICAL PROGRAMMING
  • find the minimum of f(x), a function of the n variables x = (x1,...,xn), which satisfy
  • x ∈ ℝⁿ and (a) ci(x) = 0, i = 1,...,mE
  • (b) ci(x) ≤ 0, i = mE+1,...,m
  • (c) xjl ≤ xj ≤ xju, j = 1,...,n
  • in ℝⁿ ⇒ the feasible domain.
  • algorithms ⇒ reduce the cost function
  • ⇒ keep the point in the feasible domain at all times (a sketch follows below).
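A minimal sketch of this general program with SciPy's minimize (SLSQP); the objective, conditions, and bounds are assumptions. Note that SciPy expects inequality constraints as g(x) ≥ 0, so a condition ci(x) ≤ 0 is passed as −ci(x) ≥ 0:

    import numpy as np
    from scipy.optimize import minimize

    f    = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2   # assumed cost function
    c_eq = lambda x: x[0] + x[1] - 2                 # equality condition   c(x) = 0
    c_in = lambda x: x[0]**2 + x[1]**2 - 4           # inequality condition c(x) <= 0

    res = minimize(
        f, x0=np.array([0.5, 0.5]), method="SLSQP",
        bounds=[(0.0, 3.0), (0.0, 3.0)],             # bounds x_l <= x_j <= x_u
        constraints=[{"type": "eq",   "fun": c_eq},
                     {"type": "ineq", "fun": lambda x: -c_in(x)}],  # SciPy wants >= 0
    )
    print(res.x, res.fun)                            # approx. [0.5, 1.5], 0.5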

8
  • Gradient methods: ∇x h, the first derivative of the function h relative to x (the gradient): ∇x h(x) = (∂h(x)/∂x1, ..., ∂h(x)/∂xn)T
  • Generalized Lagrangian L(x, μ, λ) = f(x) + Σi∈E μi ci(x) + Σi∈I λi ci(x); in the feasible domain, a constraint is active if ci(x) = 0, i ∈ I, and inactive otherwise. Optimization algorithms ⇒ iterative processes:
  • first estimation of a feasible point x0
  • xk+1 = xk + α dk, with k the iteration index, the vector dk ⇒ search direction in ℝⁿ, e.g. the gradient or the conjugate gradient, and the scalar α ⇒ step of the motion in that direction. 2 main steps ⇒ a good direction and a good step (a sketch follows below).
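A minimal sketch of the iteration xk+1 = xk + α dk with the steepest-descent direction dk = −∇f(xk); the quadratic test function and the fixed step α are assumptions:

    import numpy as np

    # Assumed test function and its gradient
    f     = lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2
    gradf = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

    x = np.array([5.0, 5.0])      # first estimation x0
    alpha = 0.04                  # step of the motion (kept fixed here for simplicity)
    for k in range(200):
        d = -gradf(x)             # search direction: negative gradient
        x = x + alpha * d
    print(x, f(x))                # converges toward the minimum (1, -2)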

9
  • Modified Newton method
  • RQP, Recursive Quadratic Programming
  • Generalized Reduced Gradient (GRG).
  • Sequential Unconstrained Minimization Techniques (SUMT) with penalty functions (a sketch follows below)
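A minimal SUMT-style quadratic-penalty sketch: each penalized subproblem is unconstrained and solved here with SciPy's minimize; the problem data and the penalty schedule are illustrative assumptions:

    import numpy as np
    from scipy.optimize import minimize

    f = lambda x: (x[0] - 2)**2 + (x[1] - 1)**2       # assumed cost function
    c = lambda x: x[0] + x[1] - 1                     # assumed equality condition c(x) = 0

    x = np.array([0.0, 0.0])
    for r in [1.0, 10.0, 100.0, 1000.0]:              # increasing penalty parameter
        penalized = lambda x, r=r: f(x) + r * c(x)**2 # unconstrained subproblem
        x = minimize(penalized, x).x                  # warm-start from the previous solution
    print(x)                                          # approaches the constrained minimum (1, 0)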

10
Stochastic Evolutionary Algorithms: Genetic Algorithms
  • In the feasible domain, points are "cleverly randomly" chosen.
  • Genetic algorithms ⇒ always a great number of iterations.
  • First, the coding: each point or individual x is codified
  • ⇒ one string or a list of chromosomes.
  • In the chromosome ⇒ a sequence of genes ⇒ representation of a continuous or discrete variable as one item in a list or one exclusive item in a list. A continuous chromosome is represented, according to the required accuracy, by either 8, 16, or 32 genes as binary bits (for a gene, 0 and 1 ⇒ the alleles) (a sketch follows below).
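A minimal sketch of coding a continuous variable as a 16-gene binary chromosome over an assumed range [X_MIN, X_MAX]:

    import random

    N_GENES = 16                      # accuracy: 16 binary genes (bits)
    X_MIN, X_MAX = -5.0, 5.0          # assumed range of the continuous variable

    def encode(x):
        """Continuous value -> list of 0/1 genes (alleles)."""
        level = round((x - X_MIN) / (X_MAX - X_MIN) * (2**N_GENES - 1))
        return [(level >> i) & 1 for i in reversed(range(N_GENES))]

    def decode(chromosome):
        """List of 0/1 genes -> continuous value."""
        level = int("".join(map(str, chromosome)), 2)
        return X_MIN + level * (X_MAX - X_MIN) / (2**N_GENES - 1)

    x = random.uniform(X_MIN, X_MAX)
    print(x, decode(encode(x)))       # equal up to the 16-bit quantization step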

11
  • Enumerated chromosomes are used in combinatorial problems. Multiple chromosomes make up the individual. Each point ⇒ an internal representation within the algorithm ⇒ one point in the real world at the end (decoding). Iterations: one loop ⇒ one generation.
  • creation of one population of individuals represented by their chromosomes; through a fitness function, evaluation of the performance of each individual
  • generation of a new population by applying the operations (see the sketch after this list):
  • selection, proportional to their performance,
  • cross-over and mutation on the individuals,
  • substitution by the new population, and loop until one criterion is reached.
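A minimal sketch of one such generation loop on binary chromosomes; the fitness function, population size, and operator rates are assumptions, and selection is fitness-proportional (roulette wheel):

    import random

    N_GENES, POP, GENERATIONS = 16, 40, 100
    fitness = lambda c: sum(c)                           # assumed fitness: count of 1-alleles

    def select(pop, fits):
        """Selection proportional to performance (roulette wheel)."""
        return random.choices(pop, weights=fits, k=2)

    def crossover(p1, p2):
        cut = random.randint(1, N_GENES - 1)             # same random cut for both parents
        return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

    def mutate(c, rate=0.01):
        return [g ^ 1 if random.random() < rate else g for g in c]

    population = [[random.randint(0, 1) for _ in range(N_GENES)] for _ in range(POP)]
    for generation in range(GENERATIONS):
        fits = [fitness(c) for c in population]          # evaluate each individual
        children = []
        while len(children) < POP:
            p1, p2 = select(population, fits)
            c1, c2 = crossover(p1, p2)
            children += [mutate(c1), mutate(c2)]
        population = children                            # substitution by the new population
    print(max(population, key=fitness))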

12
  • cross-over ⇒ 2 chromosomes, or parents, are cut at the same position (randomly chosen) and exchange their parts to generate 2 new chromosomes, or children.
  • mutation ⇒ randomly, a small perturbation is introduced in the chromosome.
  • reproduction ⇒ new individuals are created after cross-over and mutation.
  • selection ⇒ favors the individuals with the best performance. An interesting and practical program ⇒ GENEHUNTER from Ward Systems (a short demonstration of the operators follows below).
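A short stand-alone demonstration of the cut-and-exchange cross-over and of a one-allele mutation on two assumed 8-gene parents:

    import random

    p1 = [0, 0, 0, 0, 0, 0, 0, 0]                # assumed parent chromosomes
    p2 = [1, 1, 1, 1, 1, 1, 1, 1]

    cut = random.randint(1, len(p1) - 1)         # same randomly chosen cut position
    child1 = p1[:cut] + p2[cut:]                 # the parents exchange their parts
    child2 = p2[:cut] + p1[cut:]

    pos = random.randrange(len(child1))          # mutation: small random perturbation
    child1[pos] ^= 1                             # flip one allele

    print(cut, child1, child2)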

13
  • Several iterations (>> 1000 !)
  • Normal use in optimal design ⇒ IMPOSSIBLE
  • Introduction ⇒ surrogates, surface representation (a sketch follows below)
  • Still, Design Of Experiments ⇒ too expensive !!
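A minimal sketch of a surrogate by surface representation: a quadratic response surface fitted by least squares to a few samples of an assumed expensive function, then minimized cheaply:

    import numpy as np

    expensive_f = lambda x: (x - 1.5)**2 + 0.3 * np.sin(5 * x)   # assumed costly simulation

    x_samples = np.linspace(0.0, 3.0, 7)          # small design of experiments (7 runs)
    y_samples = expensive_f(x_samples)

    coeffs = np.polyfit(x_samples, y_samples, 2)  # quadratic response surface a*x**2 + b*x + c
    surrogate = np.poly1d(coeffs)

    x_star = -coeffs[1] / (2 * coeffs[0])         # minimize the cheap surrogate analytically
    print(x_star, surrogate(x_star), expensive_f(x_star))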