1. Genetic Algorithms: Part III - Applications
2. A Genetic Algorithm for the Multidimensional Knapsack Problem
3. Multi-Dimensional Knapsack
4. Pseudo-Utility
- Need some way to pick or drop items so that each decision is a best decision
- A natural way is to evaluate items in terms of benefit per unit cost
- The "bang for the buck" approach
- The general form of this is u_i = c_i / v_i
- The penalty factor, v_i, varies from approach to approach
- For a single knapsack, it is just the constraint coefficient
- For multiple constraints, need some method to collapse the cost within each of the constraints into a single measure
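The ratio above can be sketched in a few lines of Python. The function name and the tiny instance (3 items, 2 constraints, equal constraint weights) are my own illustration values, not from the slides; the weighted sum in the denominator is one of the multiplier-based collapsing approaches discussed on the next slide.

```python
# Sketch: pseudo-utility ratios for the MKP (illustrative values only).
# u_i = c_i / v_i, where v_i collapses item i's cost across all constraints.

def pseudo_utilities(profits, coeffs, weights):
    """profits[i]: profit c_i of item i.
    coeffs[k][i]: coefficient of item i in constraint k.
    weights[k]: multiplier (e.g. a shadow price) for constraint k.
    Returns u_i = c_i / sum_k weights[k] * coeffs[k][i]."""
    n = len(profits)
    utils = []
    for i in range(n):
        v_i = sum(w * row[i] for w, row in zip(weights, coeffs))
        utils.append(profits[i] / v_i)
    return utils

# Tiny made-up example: 3 items, 2 constraints, equal weights.
u = pseudo_utilities([10, 6, 8], [[2, 1, 4], [3, 2, 1]], [1.0, 1.0])
```

With equal weights the denominator is just the summed constraint coefficients, so item 2 (profit 8, total cost 5) scores lowest here.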
5. Penalty Factor Approaches
- Resource cost as a function of remaining resources in a constraint
- Resources remaining in each constraint if an item is selected
- Resource cost in the most constrained resource
- Sum of remaining resources in a constraint weighted by some multiplier
- Sum of resource cost in a constraint weighted by some multiplier
- There are also various ways to get these multipliers
- Chu and Beasley use shadow prices, or the Lagrange multipliers
6. Finding a Good Trajectory
7. Chu and Beasley GA Design
8. GA Design
- Representation
- Since the MKP is a 0-1 combinatorial optimization problem, the representation is a binary string of length n
- Initial population
- Random selection, constructed in a primal manner
- Not a purely random population
- The initial population is guaranteed to be entirely feasible
- Subsequent populations are kept feasible
- Chromosome repair
- As the authors point out, not literally repairing; it is an accepted term
- Projecting infeasible points back into the feasible space
- Projection based on a greedy DROP/ADD heuristic
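A primal construction of a feasible random member can be sketched as below. This is my own minimal interpretation of "constructed in a primal manner" (add items in random order while every constraint still holds), not Chu and Beasley's exact code; the instance sizes are made up.

```python
import random

def random_feasible(n, coeffs, capacities):
    """Build one random feasible 0-1 chromosome in a primal manner:
    visit items in random order and set an item's bit only if every
    constraint still holds afterwards (illustrative sketch)."""
    x = [0] * n
    used = [0] * len(capacities)
    for i in random.sample(range(n), n):  # random visiting order
        if all(used[k] + coeffs[k][i] <= capacities[k]
               for k in range(len(capacities))):
            x[i] = 1
            for k in range(len(capacities)):
                used[k] += coeffs[k][i]
    return x

# Made-up instance: 5 items, one constraint with capacity 6.
pop0 = [random_feasible(5, [[2, 1, 4, 3, 2]], [6]) for _ in range(10)]
```

Every member produced this way is feasible by construction, which is what lets the GA avoid penalty functions entirely.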
9. GA Design
- Intermediate population
- Selected via tournament selection
- Used binary tournament to increase the selective pressure
- Crossover
- Uniform crossover employed
- For each bit, flip a coin: heads takes the bit from Parent 1, tails from Parent 2
- This increases the selective pressure in the subsequent populations
- Mutation
- Kept small: 2 bits changed per string (bits randomly selected)
- A fixed 2 bits means the effective mutation rate changes with the problem chromosome length
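The three operators on this slide can be sketched as follows; function names are my own, and the snippet is a minimal reading of the slide rather than the authors' implementation.

```python
import random

def binary_tournament(pop, fitness):
    """Pick two members at random; the fitter one wins."""
    a, b = random.sample(range(len(pop)), 2)
    return pop[a] if fitness[a] >= fitness[b] else pop[b]

def uniform_crossover(p1, p2):
    """For each bit, flip a fair coin: heads from Parent 1, tails from Parent 2."""
    return [b1 if random.random() < 0.5 else b2 for b1, b2 in zip(p1, p2)]

def mutate_two_bits(x):
    """Flip exactly 2 randomly chosen bits, per the Chu and Beasley design."""
    x = x[:]
    for i in random.sample(range(len(x)), 2):
        x[i] = 1 - x[i]
    return x

winner = binary_tournament([[0], [1]], [5, 1])  # fitter member [0] wins
child = uniform_crossover([0] * 8, [1] * 8)
mutant = mutate_two_bits([0] * 8)
```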
10. GA Design
- Repair operation
- Meant to ensure all members remain feasible
- Easier than other approaches, particularly penalty methods
- Starts with the LP relaxation solution of the problem being solved
- The shadow prices are used to weight each constraint
- The pseudo-utility ratio for each variable is the profit of the variable divided by the sum of the products of the constraint coefficients and each constraint's assigned weight
- Based on the pseudo-utility values, repair the solution using a DROP/ADD greedy heuristic
- This approach was actually first used in the Senju and Toyoda (1968) heuristic paper
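The DROP/ADD repair can be sketched as below, assuming the pseudo-utility values have already been computed from the shadow prices. The function name and the small test instance are mine; this is a sketch of the general idea, not the paper's code.

```python
def repair(x, coeffs, capacities, utils):
    """Greedy DROP/ADD repair (sketch): drop set items in increasing
    pseudo-utility order until feasible, then add unset items in
    decreasing pseudo-utility order while feasibility is preserved."""
    x = x[:]
    m, n = len(capacities), len(x)
    used = [sum(coeffs[k][i] * x[i] for i in range(n)) for k in range(m)]
    feasible = lambda: all(used[k] <= capacities[k] for k in range(m))
    # DROP phase: remove the least valuable selected items first.
    for i in sorted(range(n), key=lambda i: utils[i]):
        if feasible():
            break
        if x[i]:
            x[i] = 0
            for k in range(m):
                used[k] -= coeffs[k][i]
    # ADD phase: re-insert the most valuable items that still fit.
    for i in sorted(range(n), key=lambda i: -utils[i]):
        if not x[i] and all(used[k] + coeffs[k][i] <= capacities[k]
                            for k in range(m)):
            x[i] = 1
            for k in range(m):
                used[k] += coeffs[k][i]
    return x

# Made-up instance: [1,1,1] violates capacity 5, so the lowest-utility
# item (index 2) is dropped.
fixed = repair([1, 1, 1], [[2, 3, 4]], [5], [3.0, 2.0, 1.0])
```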
11. Other Details
- Add an approach to ensure population members are unique
- Delete any duplicate members
- Replace the weakest members in the population
- Generate a child via crossover and mutation
- Evaluate that child
- Place the child (and its fitness measure) into the population
- Return the population to steady state by removing the weakest member
- Any concerns about this design?
- No duplicates?
- Same greedy repair approach?
- Decreasing mutation rate?
12. Time Complexity
- A measure of algorithm running time as a function of problem size
- Generally calculated in terms of basic computing operations
- O(n^2) says the number of operations grows as the square of the problem size
- The authors list their GA as O(mn) per iteration
- The caveat is that this is per iteration
- Each iteration (generation) does work comparable to an entire run of a competitor heuristic
- Timing results indicate up to 31 minutes per problem
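The O(mn) per-iteration figure is plausible just from the evaluation step: checking one length-n chromosome against m constraints is one pass over each constraint row. A sketch (my own function, not from the paper):

```python
def evaluate(x, profits, coeffs, capacities):
    """Evaluate one chromosome: O(mn) work, i.e. one pass over each of
    the m constraint rows of length n (sketch)."""
    value = sum(p * b for p, b in zip(profits, x))          # O(n)
    for row, cap in zip(coeffs, capacities):                # m rows
        if sum(a * b for a, b in zip(row, x)) > cap:        # O(n) each
            return None  # infeasible (the GA would invoke repair instead)
    return value

ok = evaluate([1, 0, 1], [10, 6, 8], [[2, 1, 4]], [6])      # feasible
bad = evaluate([1, 1, 1], [10, 6, 8], [[2, 1, 4]], [6])     # violates cap
```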
13. Problem Sets
14. Larger Problems
15. Correlation Graph
16. Results
17. Case Study by Hoff
- Focused on GA Parameterization
18. Initial Parameterization
- Population size: 50
- Single-point crossover
- Steady-state regeneration
- Feasible initial population
- Penalize infeasibility
- Run the algorithm for 30,000 generations
- Fitness value is the objective function value
- Test set: 57 standard problems
- Small problems referenced by Chu and Beasley
19. Examined Population Size
20. Things Examined
- Mutation
- Max 1 bit per chromosome
- Examined each bit for mutation
- Tried swap mutation
- Used a 1/n rate of invert (bit-flip) mutation
- Crossover
- Single point
- Two point
- Burst (or uniform) with 50% probability
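The two mutation variants examined can be sketched as below; function names are mine, and the snippet is only a minimal reading of the slide.

```python
import random

def invert_mutation(x, rate=None):
    """Flip each bit independently with probability `rate` (default 1/n)."""
    rate = rate if rate is not None else 1.0 / len(x)
    return [1 - b if random.random() < rate else b for b in x]

def swap_mutation(x):
    """Exchange the values at two randomly chosen positions; for a 0-1
    knapsack string this preserves the number of selected items."""
    x = x[:]
    i, j = random.sample(range(len(x)), 2)
    x[i], x[j] = x[j], x[i]
    return x

flipped = invert_mutation([0] * 8, rate=1.0)  # rate 1.0 flips every bit
swapped = swap_mutation([1, 0, 0, 0])
```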
21. Crossover Results
22. Population Types Considered
- Feasible
- Initial feasible
- Infeasibility allowed, but penalized
- Random
- Uniformly generated
- At least one member feasible
- Weighted Random
- Some portion required to be feasible
- Filtered
- Always maintain feasibility
- Found to work the best
- This was a surprise to the authors
23. Hoff et al. Results
- Good results
- On average within 1.6% of the optimum
- Actually found the optimum in many cases
- Results better than comparable methods
- Careful GA parameter tuning is very useful
- Proximate Optimality Principle
- Extensively test a small subset of problems
- Apply the parameters to the full set
24. Some Experiences on Solving Multiconstraint Zero-One Knapsack Problems with Genetic Algorithms
25. Overview of Method
- Same standard type of encoding
- Infeasible chromosomes repaired via an ADD/DROP operation
- A filter operation randomly drops items until feasibility is achieved
- Evaluation
- Objective function if all members of the population are feasible
- Penalty function if infeasible solutions are allowed
- Mutation
- A bit strange: an item is dropped, after which there is an attempt to add some other item
- Local search
- Added to solutions; implemented as a basic tabu search
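The filter operation described above can be sketched in a few lines; the function name and the tiny instance are mine, and this is an illustration of the idea rather than the paper's implementation.

```python
import random

def filter_to_feasible(x, coeffs, capacities):
    """Filter operation (sketch): while any constraint is violated,
    randomly drop one currently selected item."""
    x = x[:]
    def violated():
        return any(sum(a * b for a, b in zip(row, x)) > cap
                   for row, cap in zip(coeffs, capacities))
    while violated():
        ones = [i for i, b in enumerate(x) if b]
        x[random.choice(ones)] = 0  # drop a random selected item
    return x

# Made-up instance: [1,1,1] exceeds capacity 5, so items are dropped at
# random until the constraint holds.
filtered = filter_to_feasible([1, 1, 1], [[2, 3, 4]], [5])
```

Unlike the greedy DROP/ADD repair, this makes no use of pseudo-utility information, which is the main difference between the two approaches.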
26. Thiel and Voss Results
- Tests run on the same 57 test problems
- Hard to get feasible solutions from randomly generated test problems
- A crossover rate of 90% and a mutation rate of 0.09 were deemed best
- Population of 50, run for 100 generations
- Larger populations overcame some of the problems with infeasible solutions
- The tabu search operator allowed the GA to be effective with smaller sample sizes
27. Gaining Insights Into Initial Genetic Algorithm Performance On Multi-Dimensional Knapsack Problems Under Varied Parameterizations And Initial Population Compositions
- Masters Thesis by Chaitr Hiremath
- Advisor: Dr. Raymond Hill
- Committee: Dr. Frank Ciarallo and Dr. Xinhui Zhang
- Department of Biomedical, Industrial, and Human Factors Engineering, Wright State University, July 6, 2004
28. Introduction
Figure 1: Notional Best-So-Far Curves
29. Introduction
- Initial Population Generation (Hill, WSC 1999)
- Chart: Probability of Feasible Random Solutions, A ~ U(1,40)
- Chart: Probability of Feasible Random Solutions, A ~ U(1,15)
- New Heuristic: develop a reasonable estimate for Pr(x = 1)
30. Introduction
- Initial Population Generation (cont'd)
- Chart: Percentage of Feasible Solutions Produced by Each Approach
- Chart: Average Infeasibility Ratios for Infeasible Solutions
31. Introduction
- Initial Population Generation (cont'd)
- Chart: Average Objective Function Values by Constraint Slackness Settings
- Chart: Average Pr(x = 1) Values by Constraint Slackness Settings
32. Research Results
Comparing Convergence Results to the Best So Far
33. Research Focus
Comparing Convergence Results to 1% of the Best So Far
34. Hiremath's Results
- A better way of generating an initial population can in fact shift the best-so-far curve to the left
- This shift to the left can result in savings of computational requirements
- The GA still needs mechanisms to get to steady state, but with this improved initial population, dynamic parameterization (such as changing the mutation rate) might help jump the best-so-far curve up earlier in the process
- It is also quite possible this result will be more pronounced when applied to higher-dimensional problems versus the relatively simple 2KP set used in this research
- The Beasley set was not examined in this work
35. Final Genetic Algorithm Comments
36. General Comments About GAs
- Coding of the problem moves the GA to operate in a different space than that of the problem
- Performance of most GA implementations is comparable to, or better than, the performance of many other search techniques
- Still, GAs fail to live up to the high expectations engendered by the theory
- A GA can be considered a pressure system
- Strong selective pressure promotes premature convergence of the GA search
- Weak selective pressure can make the search ineffective
- Population size has a large impact
- If the size is too small, the GA may converge too quickly
- If the size is too large, the GA may waste computational resources
37. GA Summary
- Biological underpinnings of GAs
- Search algorithms based on the genetic processes of biological organisms
- GAs are an intelligent exploitation of a random search
- They successfully deal with a wide range of problem areas
- Not guaranteed to find the globally optimal solution
- Generally good at finding acceptably good solutions acceptably quickly
- Extremely robust
- Balance between efficiency and efficacy needed for survival in many different environments
- Applicable to a variety of applications
38. GA Summary
- Differences between GAs and other search algorithms
- GAs consider many points in the search space simultaneously, not a single point
- GAs work directly with strings of characters representing the parameter set, not the parameters themselves
- GAs are domain independent
- GAs use probabilistic rules to guide their search
39. Web Sites
- Hitch-Hiker's Guide to Evolutionary Computation
- http://www.cs.purdue.edu/coast/archive/clife/FAQ/www/
- Source Code Collection, GA Archive
- http://www.aic.nrl.navy.mil/galist/src/
40. Genetic Algorithms: Questions?