Ecole Centrale Nantes, 2014
Genetic algorithms
Genetic methods are inspired by Darwin's theory of evolution.
Darwin's theory of evolution: individuals whose natural traits are better adapted to their environment have greater chances of survival.
Darwin's theory has no definitive analytical proof; however, it is supported by
statistical and experimental data.
The members of a population (human / animal / plant / ...) produce new generations by mating (breeding).
An individual's chance of surviving into the next generation depends on the particular composition of its chromosome in that generation. (In each generation, elite individuals (elitism) have more opportunities for reproduction, while individuals with unfavorable characteristics gradually disappear.)
Except in exceptional cases where a mutation occurs in the characteristics of the new generation, each new generation is better adapted to its environment. (In most cases, mutants are incompatible with their environment; in rare cases, a mutation produces a creature with excellent features and high fitness.)
As a result, successive generations evolve over time (evolution).
Ecole Centrale de Nantes, 2015
1. Initialize the population (randomly choose n individuals)
2. Fitness evaluation (solution quality)
3. Select individuals for the mating pool
4. Apply crossover to each consecutive pair
5. Mutate the children
6. Replace the current population by the resulting mating pool (new generation)
7. Fitness evaluation
8. Go to 3 until the stopping criterion is met
[Flowchart: initialization of the first generation of n individuals → evaluation of the current generation → selection → crossing → mutation → loop, until the end of the algorithm]
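The steps above can be sketched in Python (a minimal sketch, not the lecture's implementation; all parameter defaults and the OneMax fitness in the usage note are illustrative choices):

```python
import random

def run_ga(fitness, n=20, n_bits=10, pc=0.7, pm=0.05, generations=50):
    # 1. Initialize population: n random bit strings (n assumed even)
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n)]
    for _ in range(generations):                      # 8. loop until stopping criterion
        # 2./7. Fitness evaluation
        scores = [fitness(ind) for ind in pop]
        # 3. Roulette-wheel selection fills the mating pool
        pool = random.choices(pop, weights=[s + 1e-9 for s in scores], k=n)
        # 4. Crossover on each consecutive pair, with probability pc
        children = []
        for a, b in zip(pool[::2], pool[1::2]):
            if random.random() < pc:
                cut = random.randint(1, n_bits - 1)   # random crossover point
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            children += [a, b]
        # 5. Mutation: flip each bit with probability pm
        # 6. The resulting mating pool replaces the current population
        pop = [[bit ^ (random.random() < pm) for bit in ind] for ind in children]
    return max(pop, key=fitness)
```

Usage: `best = run_ga(sum)` maximizes the number of 1-bits (the OneMax toy problem), since `sum` of a 0/1 list counts its ones.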
Selection: choosing which individuals of the population enter the mating pool
Roulette wheel selection
Tournament selection
Elitist selection
Linear Rank Selection
Exponential Rank Selection (Boltzmann)
Truncation Selection (Steady State Selection)
Purely Random Selection
Sigma Scaling Selection
Roulette wheel selection
The fittest individuals have a greater chance of survival than weaker ones. This replicates nature, in that fitter individuals tend to have a better probability
of survival and go forward to form the mating pool for the next generation.
Weaker individuals are not without a chance: in nature, such individuals may carry genetic coding that proves useful to future generations.
The probability of being selected is proportional to the VALUE of the fitness function.
Example: The following table lists a sample population of 5 individuals (a typical population of 400 would
be difficult to illustrate).
If f_i is the fitness of individual i in the population, its probability of being selected is:

    p_i = f_i / Σ_j f_j

[Table: the 5 individuals (ind) with their fitness f and selection probability p]
Example: These individuals consist of 10 bit chromosomes and are being used to optimise a simple
mathematical function
This gives the strongest individual a
value of 38% and the weakest 5%.
These percentage fitness values can then
be used to configure the roulette wheel.
Figure highlights that individual No. 3 has
a segment equal to 38% of the area.
The roulette wheel is spun a number of times equal to the size of the population.
Each time the wheel stops, the fitter individuals have the greatest chance of being
selected for the next generation and the subsequent mating pool.
Drawback: in a very heterogeneous population, the presence of one very
high-performing individual can lead to premature convergence.
Conversely, if the population is very homogeneous, all individuals have nearly the same probability
of being selected and convergence is slow.
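A minimal sketch of roulette wheel selection (the fitness values in the usage note are illustrative, not the slide's 5-individual table):

```python
import random

def roulette_select(population, fitnesses, k):
    """Spin the wheel k times; P(select i) = f_i / sum_j f_j."""
    total = sum(fitnesses)
    selected = []
    for _ in range(k):
        r = random.uniform(0, total)          # where the wheel stops
        acc = 0.0
        for ind, f in zip(population, fitnesses):
            acc += f                          # each individual owns a segment of size f
            if acc >= r:
                selected.append(ind)
                break
        else:                                 # guard against floating-point round-off
            selected.append(population[-1])
    return selected
```

Usage: `roulette_select(['A', 'B', 'C'], [38, 5, 20], 3)` picks 3 individuals, with 'A' favored in proportion to its fitness.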
Tournament selection
The probability of being selected is proportional to the RANK of the individual in the population.
It involves running several "tournaments" among a few individuals chosen at random from the population.
The winner of each tournament (the one with the best fitness) is selected for crossover.
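Tournament selection can be sketched as follows (the tournament size of 3 is an illustrative choice):

```python
import random

def tournament_select(population, fitness, k, tournament_size=3):
    """Run k tournaments among randomly chosen individuals; each winner enters the mating pool."""
    winners = []
    for _ in range(k):
        contestants = random.sample(population, tournament_size)  # a few random individuals
        winners.append(max(contestants, key=fitness))             # best fitness wins
    return winners
```

Usage: `tournament_select(list(range(10)), fitness=lambda x: x, k=10)` returns 10 winners biased toward the high-rank individuals; the weakest individuals can never win a 3-way tournament.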
Crossing (Crossover):
combines two individuals to create new individuals for possible inclusion in the next
generation
main operator for local search (looking close to existing solutions)
each crossover is performed with probability pc ∈ [0.5, 0.8]
crossover points are selected at random
individuals that are not crossed are carried over into the population
Crossover example:

               Initial strings                        Offspring
Single-Point:  11000101 01011000 01101010            11000101 01011001 01111000
               00100100 10111001 01111000            00100100 10111000 01101010
Two-Point:     11000101 01011000 01101010            11000101 01111001 01101010
               00100100 10111001 01111000            00100100 10011000 01111000
Uniform:       11000101 01011000 01101010            01000101 01111000 01111010
               00100100 10111001 01111000            10100100 10011001 01101000
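The three operators can be sketched for '0'/'1' strings (a sketch; cut points and the uniform mask are drawn at random, so the offspring will generally differ from the table above):

```python
import random

def single_point(a, b):
    """Swap the tails after one random cut point."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def two_point(a, b):
    """Swap the segment between two random cut points."""
    i, j = sorted(random.sample(range(1, len(a)), 2))
    return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]

def uniform(a, b):
    """Each position is taken from either parent with probability 1/2."""
    mask = [random.random() < 0.5 for _ in a]
    c1 = ''.join(x if m else y for m, x, y in zip(mask, a, b))
    c2 = ''.join(y if m else x for m, x, y in zip(mask, a, b))
    return c1, c2
```

All three operators conserve the parents' bits: at every position the two offspring carry the two parents' values, just redistributed.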
Mutation:
random modification of the characteristics of an individual
each component of every individual is modified with probability pm
if the mutation probability is too high, the algorithm degenerates into a random search; if it is too low, the algorithm loses its robustness
main operator for global search (looking at new areas of the search space)
mutation is usually applied after crossover
individuals that are not mutated are carried over into the population
Mutation example (pm = 0.05):

Before mutation                      After mutation
00010101 00111001 01111000           00010101 00110001 01111010
00100100 10111010 11110000           10100110 10111000 11110000
11000101 01011000 01101010           11000101 01111000 01101010
11000101 01011000 01101010           11010101 01011000 00101010
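Bit-flip mutation can be sketched as:

```python
import random

def mutate(individual, pm=0.05):
    """Flip each bit of a '0'/'1' string independently with probability pm."""
    return ''.join(
        ('1' if bit == '0' else '0') if random.random() < pm else bit
        for bit in individual
    )
```

Usage: `mutate('11000101 01011000 01101010'.replace(' ', ''), pm=0.05)` typically flips one or two bits of a 24-bit chromosome.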
Crossover or mutation?
Which one is better? Which one is necessary?
it depends on the problem itself
each one has its own role
generally, it is better to use both
however, it is possible to have an algorithm that uses only mutation, whereas an algorithm that uses only crossover will not work
Other operators can be added:
Elitism: guarantees that the best element of the population is kept
Sharing: for different individuals of comparable performance, it preserves
the diversity of the population with respect to these individuals
Multi-objective optimization
The majority of real-life problems need to account for multiple objectives:
performance vs. cost, performance in situation 1 vs. performance in situation 2, etc.
For hull optimization, one is interested in the performance at maximum and cruise
speeds, for different displacements, etc. For sailing boats, a dozen typical
regimes are targeted.
One therefore seeks the minimization of:

    f_1(x), ..., f_m(x)

In general there is no set of parameters x for which all the functions f_1(x), ..., f_m(x)
are simultaneously minimal. There is therefore no unique optimum to a multi-objective
optimization problem, but a set of points representing the best compromises between
the various objectives.
The set of non-dominated points is called the Pareto frontier.

[Figure: objective space (f_1(x), f_2(x)). Relative to a point x, the regions of points that dominate x and of points that x dominates are shown, together with two regions (marked "?") where neither dominates, and the Pareto frontier.]
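Pareto dominance and the extraction of the non-dominated points can be sketched as (minimization of every objective assumed):

```python
def dominates(u, v):
    """u dominates v (minimization): u is no worse in every objective and strictly better in at least one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def pareto_front(points):
    """The non-dominated points: those that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Usage: `pareto_front([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])` returns `[(1, 5), (2, 2), (5, 1)]`; the points (3, 3) and (4, 4) are dominated by (2, 2).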
Multi-objective methods
Weighting method
One defines a weighted function as the barycenter of all the functions used as criteria:

    f_weighting(x) = Σ_{i=1..m} α_i f_i(x),   with Σ_{i=1..m} α_i = 1

Drawback: in this way one gets an optimum which is weighted by coefficients defined a
priori, i.e. without knowing the problem. One thus gets a single point of the Pareto frontier
(an arbitrary one).

Constraint method
For a problem with m objectives, m-1 objectives are replaced by constraints
and the remaining function is minimized:

    minimize   f_i(x)
    subject to f_j(x) ≤ ε_j,   j ≠ i

Drawback: in practice it is slow to obtain the Pareto frontier by this method.

[Figure: the constraint method illustrated in the (f_1(x), f_2(x)) plane.]
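The weighting method can be sketched as follows (the two one-dimensional objectives and the grid search are illustrative stand-ins for a real problem and optimizer):

```python
def weighted_objective(objectives, alphas):
    """f_weighting(x) = sum_i alpha_i * f_i(x), with the alpha_i summing to 1."""
    assert abs(sum(alphas) - 1.0) < 1e-9
    return lambda x: sum(a * f(x) for a, f in zip(alphas, objectives))

# Two competing one-dimensional objectives (illustrative):
f1 = lambda x: x ** 2            # minimal at x = 0
f2 = lambda x: (x - 2) ** 2      # minimal at x = 2
fw = weighted_objective([f1, f2], [0.5, 0.5])

# A crude grid search stands in for the optimizer:
x_best = min((i / 100 for i in range(301)), key=fw)
# x_best = 1.0: one compromise point on the Pareto frontier,
# determined entirely by the a-priori choice of the weights
```

Changing the weights (e.g. [0.9, 0.1]) moves the solution toward the minimizer of f1; each weight vector yields one point of the Pareto frontier.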
Multi-objective methods
Genetic-algorithm based methods
There are different methods, among which the Vector Evaluated Genetic
Algorithm:
For a problem with m objectives, the population is divided into m groups.
Selection is performed within each group with respect to one objective.
Overall, the selection thus accounts for all the objectives.
[Diagram: population X_1, X_2, ..., X_i, ..., X_n split into m groups, each group selected with one objective f_1, ..., f_i, ..., f_m]
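A simplified sketch of VEGA's group-wise selection (truncation selection of the better half stands in for VEGA's proportional selection, to keep the sketch short; minimization assumed):

```python
import random

def vega_mating_pool(population, objectives):
    """VEGA sketch: split the population into m groups; select within each group
    with respect to a single objective, so that overall all objectives are covered."""
    m = len(objectives)
    shuffled = random.sample(population, len(population))  # random group assignment
    group_size = len(population) // m
    pool = []
    for i, f in enumerate(objectives):
        group = sorted(shuffled[i * group_size:(i + 1) * group_size], key=f)
        pool += group[:group_size // 2] * 2    # better half, duplicated to keep the size
    return pool
```

Usage: with `objectives = [lambda x: x ** 2, lambda x: (x - 2) ** 2]`, one group favors individuals near 0 and the other favors individuals near 2.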
Inclusion of constraints
The majority of industrial problems present some constraints. Several methods
exist to account for them:
Penalty method
Principle: a new functional to minimize is defined, which includes the
constraints:

    f̃(x) = f(x) + ρ C(x)

ρ is called the penalty coefficient. In practice, one optimizes this new functional while
gradually increasing the value of ρ.
Drawback: the choice of ρ can be tricky: if it is too small, the constraints will not be correctly accounted for;
if it is too high, it will be hard for the optimization algorithm to converge.
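The penalty method can be sketched as follows (the constrained toy problem and the grid search are illustrative):

```python
def penalized(f, constraints, rho):
    """f_tilde(x) = f(x) + rho * C(x), with C(x) the summed squared violations of g_i(x) <= 0."""
    def violation(x):
        return sum(max(0.0, g(x)) ** 2 for g in constraints)
    return lambda x: f(x) + rho * violation(x)

# Illustrative problem: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0.
f = lambda x: x ** 2
g = lambda x: 1 - x

# Gradually increase rho; the minimizer approaches the constrained optimum x = 1.
for rho in (1.0, 10.0, 100.0):
    fp = penalized(f, [g], rho)
    x_best = min((i / 100 for i in range(201)), key=fp)  # grid search stands in for the optimizer
```

With a small ρ the penalized minimizer still violates the constraint noticeably; by ρ = 100 it sits just inside x = 1, illustrating why ρ is increased gradually rather than set large from the start.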
Conclusions
There is no single best algorithm, only different algorithms that are more or less
well suited to each problem
Algorithm families can be sorted depending on their intrinsic performance:

Algorithm | Nature        | Derivative needed | Convergence speed | Accuracy | Robustness
Gradient  | Deterministic | Yes               | Quick             | High     | Null
SIMPLEX   | Deterministic | No                | Medium            | Medium   | Medium
Genetic   | Stochastic    | No                | Slow              | Low      | Strong

and depending on their capability to account for multiple objectives and to take
advantage of parallel hardware architectures:

Algorithm | Multi-objectives | Multi-core
Gradient  | No               | No
SIMPLEX   | No               | No
Genetic   | Yes              | Yes
Conclusions
In practice, it is often interesting to proceed step by step:
Exploration of the design space to identify the problem to optimize
Choice of a dedicated algorithm depending on:
- the shape of the function and its noise
- the number of objectives and parameters
- the CPU time needed to evaluate the function
Sometimes, a combination of algorithms (e.g. genetic, then gradient)
Industrial tools exist which answer engineering needs very well:
Simple to use, intuitive, and suitable for many kinds of problems
They enclose numerous algorithms, including the most advanced and up-to-date ones
However, knowledge of the main advantages/drawbacks of the different algorithms is
needed to make the best use of them
Some well known tools: Optimus (LMS), ModeFrontier (Esteco), Isight...
References
Practical Methods of Optimization, Fletcher, 1987
Practical Optimization, P.E. Gill & W. Murray, 1981
ModeFrontier v4.0 manual, Esteco
Contribution à l'optimisation de forme pour des écoulements à fort nombre de Reynolds autour de géométries complexes, Régis Duvigneau, PhD thesis, Ecole
Centrale Nantes, 2002