1
Grammar Model-based Program Evolution
  • Shan Yin et al.

KIM KANG IL
2
Contents
  • Introduction
  • Model
  • Experiments
  • Discussion
  • Conclusion

3
Introduction
  • Protect building blocks from disruption by
    mutation and crossover
  • Building blocks carry useful information for
    finding a solution (the EDA view)
  • Proposes a model using a GP-style tree
    representation instead of the conventional
    GA-style linear representation
  • A stochastic context-free grammar (SCFG) model
    represents the building blocks

4
Model
  • Flow diagram (repeated each generation)
    1. Evaluate and select
    2. Learn an SCFG model from the selected
       individuals
    3. Sample the SCFG model to obtain a new
       population
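The three-step flow above can be sketched as a loop. This is a minimal sketch with illustrative function names (not from the paper); the SCFG learning and sampling steps are stubbed out here and shown in more detail on the later slides. With k=1 the selection step keeps only the single best individual, as slide 5 describes.

```python
import random

def evaluate_and_select(population, fitness, k):
    # Step 1: rank by fitness, keep the k best individuals.
    return sorted(population, key=fitness, reverse=True)[:k]

def learn_model(selected):
    # Step 2 (stub): a real implementation learns an SCFG
    # from the selected individuals; here we just keep them.
    return list(selected)

def sample_model(model, n, rng):
    # Step 3 (stub): a real implementation samples derivations
    # from the SCFG; here we resample from the kept individuals.
    return [rng.choice(model) for _ in range(n)]

def gmpe_loop(init_pop, fitness, generations, k, rng):
    pop = init_pop
    for _ in range(generations):
        elite = evaluate_and_select(pop, fitness, k)   # step 1
        model = learn_model(elite)                     # step 2
        pop = sample_model(model, len(init_pop), rng)  # step 3
    return max(pop, key=fitness)
```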
5
Step 1: Evaluate and select
  • Generate an initial random population
  • Compute the fitness of each individual with the
    defined fitness function
  • Find the single individual with the best fitness
    value and specialize the grammar to it
    (specification)

(Figure: derivation tree of the initial individual, with nodes
S, R1(Exp), R2(Exp), R3(A), R4(B), R5(C), R6(B), R7(B))

Initial grammar:
  S -> Exp Exp
  Exp -> A B
  Exp -> C B B

Changed grammar (specialized to the initial individual):
  S -> R1(Exp) R2(Exp)
  R1(Exp) -> R3(A) R4(B)
  R2(Exp) -> R5(C) R6(B) R7(B)
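The specialization step, giving every non-root node of the chosen individual's derivation tree a fresh nonterminal R1, R2, … and emitting one production per internal node, can be sketched as follows. The tree encoding and function name are illustrative assumptions, not from the paper:

```python
from collections import deque

def specialize(tree):
    """Specialize a grammar to one derivation tree: every non-root node
    gets a fresh nonterminal 'Ri(symbol)', numbered breadth-first, and
    each internal node contributes one production."""
    productions = []
    counter = 0
    queue = deque([(tree, tree[0])])  # (node, assigned name); root keeps its name
    while queue:
        (symbol, children), name = queue.popleft()
        if not children:              # terminals produce nothing
            continue
        rhs = []
        for child in children:
            counter += 1
            child_name = f"R{counter}({child[0]})"
            rhs.append(child_name)
            queue.append((child, child_name))
        productions.append((name, rhs))
    return productions
```

Applied to the slide's example tree, this reproduces the "changed grammar" shown above.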
6
Step 2: Learn the model
  • Uses a stochastic context-free grammar (SCFG)
  • Assigns a probability to each production
  • The probabilities of productions sharing the
    same LHS sum to 1
  • The probabilities of productions used by the
    elite individuals increase

S -> R1(Exp)                  0.5
S -> R2(Exp)                  0.5
R1(Exp) -> R3(A) R4(B)        1
R2(Exp) -> R5(C) R6(B) R7(B)  1
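A minimal sketch of how such a probability table could be computed from production-usage counts, so that the probabilities of productions with the same LHS sum to 1. The representation (productions as `(lhs, rhs)` pairs) is an assumption for illustration, not the paper's:

```python
from collections import Counter, defaultdict

def learn_probabilities(used_productions):
    """Assign each production a probability proportional to how often
    it was used, normalized per left-hand side."""
    counts = Counter(used_productions)          # production -> usage count
    lhs_totals = defaultdict(int)               # lhs -> total usage
    for (lhs, rhs), c in counts.items():
        lhs_totals[lhs] += c
    return {prod: c / lhs_totals[prod[0]] for prod, c in counts.items()}
```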
7
S -> R1(Exp) R2(Exp)
R1(Exp) -> R3(A) R4(B)
R2(Exp) -> R5(C) R6(B) R7(B)
  • Problem: specification alone makes the model
    converge to a local minimum
  • A process to generalize is therefore needed
  • Generalization: merge productions
  • Find the smallest grammar using MML (Minimum
    Message Length)

Merge R1 and R2:
S -> R1(Exp) R1(Exp)
R1(Exp) -> R3(A) R4(B)
R1(Exp) -> R5(C) R6(B) R7(B)
8
Step 3: Sample the model
  • A new population is generated by sampling from
    the probability table of the learned grammar
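Sampling can be sketched as a recursive expansion that picks each production according to its probability. The grammar encoding (`{lhs: [(rhs-tuple, prob), ...]}`) is an assumption for illustration; symbols with no productions are treated as terminals:

```python
import random

def sample(grammar, symbol, rng):
    """Expand `symbol` into a terminal string by repeatedly choosing
    productions with their learned probabilities."""
    if symbol not in grammar:          # terminal: emit as-is
        return [symbol]
    alternatives, weights = zip(*grammar[symbol])
    rhs = rng.choices(alternatives, weights=weights)[0]
    out = []
    for s in rhs:
        out.extend(sample(grammar, s, rng))
    return out
```

Repeating this from the start symbol S produces the individuals of the new population.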

9
Experiments
  • Royal tree problem
  • The arity increases by 1 at each depth level
  • In this experiment, the maximum depth is 6
  • Terminal: x
  • Non-terminals: a, b, c, d, e
  • Fitness measures resemblance to the perfect tree
  • 50 runs
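For these settings, the perfect tree (the fitness target) has an e root of arity 5, whose children are d nodes of arity 4, and so on down to terminal x at depth 6. A sketch that builds just this target structure (the royal tree scoring function itself is more involved and is not shown):

```python
# Arity grows by one per function symbol: a=1, b=2, c=3, d=4, e=5.
ARITY = {"a": 1, "b": 2, "c": 3, "d": 4, "e": 5}

def perfect_tree(symbol):
    """Perfect royal tree rooted at `symbol`; each node's children are
    perfect trees of the next-lower symbol, ending in terminal x."""
    order = "xabcde"
    if symbol == "x":
        return ("x", [])
    child = order[order.index(symbol) - 1]
    return (symbol, [perfect_tree(child) for _ in range(ARITY[symbol])])

def count_nodes(tree):
    return 1 + sum(count_nodes(c) for c in tree[1])

def depth(tree):
    return 1 + (max(map(depth, tree[1])) if tree[1] else 0)
```

The level-e perfect tree has depth 6, matching the experiment's maximum depth.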

10
  • Max problem
  • Operators: +, ×
  • Terminal: x
  • Goal: find the tree with the maximum value
  • Depth limited to 7
  • Maximum fitness is 65536
  • 50 runs

11
Discussion
  • Internal structure representation
  • Parent-child dependency
  • Structural representation
  • Locality: local dependency with nearby nodes

12
  • Positional independence
  • vs. position-dependent models such as PIPE
  • Building blocks should be preserved
  • Variable model complexity
  • vs. PIPE (fixed model: number of nodes, depth,
    etc.)
  • A fixed model may have trouble expressing the
    dependencies or more complex individuals

13
Conclusion
  • GMPE is a kind of EDA for GP using a stochastic
    grammar
  • The grammar preserves building blocks better
    than conventional GP