Multi-Objective Optimization (MOEA)
TRANSCRIPT
7/27/2019 Multi-Objective Optimization (MOEA) 1/31
Multi-Objective Optimization Using
Evolution Strategies (ESs) were developed in Germany and have been extensively studied in Europe.
1. ESs use real coding of design parameters, since they model organic evolution at the level of the individual's phenotype.
2. ESs depend on deterministic selection and mutation for their evolution.
3. ESs use strategic parameters, such as on-line self-adaptation of mutability parameters.
Short Review (I)
The selection of parents to form offspring is less constrained than it is in genetic algorithms and genetic programming. For instance, due to the nature of the representation, it is easy to average vectors from many individuals to form a single offspring.
In a typical evolution strategy, parents are selected uniformly randomly (i.e., not based upon fitness), more than μ offspring are generated through the use of recombination (λ > μ of them), and then survivors are selected deterministically. The survivors are either the best μ offspring (i.e., no parents survive: the (μ,λ)-ES) or the best μ among parents and offspring (the (μ+λ)-ES).
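The two survivor-selection schemes above can be sketched in a few lines (a minimal illustration, assuming minimization of a toy sphere objective; the function names are made up for this example):

```python
def fitness(x):
    # toy sphere objective on a real-coded vector (minimization)
    return sum(v * v for v in x)

def es_survivor_selection(parents, offspring, mu, plus=False):
    """(mu,lambda)-ES when plus=False: survivors are chosen from offspring only.
    (mu+lambda)-ES when plus=True: survivors are chosen from parents + offspring."""
    pool = (parents + offspring) if plus else offspring
    return sorted(pool, key=fitness)[:mu]
```

Note how, under the comma strategy, even an excellent parent is discarded, while the plus strategy is elitist.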
Genetic programming and genetic algorithms are similar in most other aspects, except that the reproduction operators are tailored to a tree representation. The most commonly used operator is subtree crossover, in which an entire subtree is swapped between two parents.
In a standard genetic program, the representation used is a variable-sized tree of functions and values. Each leaf in the tree is labeled from an available set of value labels. Each internal node in the tree is labeled from an available set of function labels.

Short Review (II)
The entire tree corresponds to a single function that may be evaluated. Typically, the tree is evaluated in a left-most, depth-first manner. A leaf is evaluated as the corresponding value. A function is evaluated using as arguments the results of the evaluation of its children.
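The evaluation procedure above can be sketched as a short recursive function (an illustrative sketch; the nested-tuple representation and the FUNCTIONS table are assumptions of this example, not a standard API):

```python
# A GP tree as nested tuples: internal nodes are (function_label, child, ...),
# leaves are plain values. Children are evaluated left-most depth-first,
# then their results become the arguments of the node's function.
FUNCTIONS = {
    "+": lambda a, b: a + b,
    "*": lambda a, b: a * b,
    "neg": lambda a: -a,
}

def eval_tree(node):
    if not isinstance(node, tuple):          # leaf: evaluates to its value
        return node
    func, *children = node
    args = [eval_tree(c) for c in children]  # evaluate children first
    return FUNCTIONS[func](*args)
```

For instance, the tree ("+", ("*", 2, 3), ("neg", 4)) evaluates to 2*3 + (-4) = 2.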
Overview
Principles of Multi-Objective Optimization.
Difficulties with the classical multi-objective optimization methods.
Schematic of an ideal multi-objective optimization procedure.
The original Genetic Algorithm (GA). Why use a GA?
Multi-Objective Evolutionary Algorithm (MOEA).
An example of using an MOEA for solving an engineering design problem.
Multiobjective algorithms classification based on how the objectives are integrated within
We will use the following simple classification of Evolutionary Multi-Objective Optimization (EMOO) approaches:
Non-Pareto Techniques:
Aggregating approaches
Lexicographic ordering
VEGA (Vector Evaluated Genetic Algorithm)
Pareto Techniques:
Pure Pareto ranking
MOGA
NSGA
Recent Approaches:
PAES
SPEA
Bio-inspired Approaches:
PSO
Ant-colony based
Principles of Multi-Objective Optimization
Real-world problems have more than one objective function, each of which may have a different individual optimal solution. The optimal solutions corresponding to different objectives differ because the objective functions are often in conflict (competing) with each other.
This leads to a set of trade-off optimal solutions, generally known as Pareto-optimal solutions (named after the Italian economist Vilfredo Pareto, 1906). No one solution can be considered better than any other with respect to all objective functions; this is the non-dominated solution concept.
Multi-Objective Optimization
Multi-objective optimization is the optimization of different objective functions at the same time; at the end, the algorithm returns n different optimal values, unlike a normal optimization problem, which returns a single value. Thus, there is more than one objective function.
Pareto-optimal solutions and Pareto-optimal front:
Pareto-optimal solutions: the optimal solutions found in a multiple-objective optimization problem.
Pareto-optimal front: the curve formed by joining all these solutions (the Pareto-optimal solutions).
Nondominated and dominated solutions
Non-dominated: given two objectives, a solution is non-dominated when neither of the two solutions is better than the other with respect to both objectives. Both objectives are equally important, e.g., speed and price.
If solution a is no worse than b in all objectives, and solution a is strictly better than b in at least one objective, then solution a dominates solution b. A weakly dominated solution: solution a is no worse than b in all objectives.
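These dominance definitions translate directly into code (a minimal sketch, assuming all objectives are to be minimized and each solution is given as a tuple of its objective values):

```python
def dominates(a, b):
    """True if solution a dominates b (all objectives minimized):
    a is no worse than b in every objective and strictly better in at least one."""
    no_worse = all(ai <= bi for ai, bi in zip(a, b))
    strictly_better = any(ai < bi for ai, bi in zip(a, b))
    return no_worse and strictly_better

def weakly_dominates(a, b):
    """a is no worse than b in all objectives."""
    return all(ai <= bi for ai, bi in zip(a, b))
```

For the car example, a design with both lower cost and lower accident rate dominates another; two designs that each win on one objective are mutually non-dominated.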
Principles of Multi-Objective Optimization (cont.)
Simple car design example: two objectives, cost and accident rate, both of which are to be minimized.
A, B, D: one objective can only be improved at the expense of at least one other objective!
A multi-objective optimization algorithm must:
1. Guide the search towards the global Pareto-optimal front.
2. Maintain solution diversity in the Pareto-optimal front.
Non-Pareto Classification Techniques (Traditional Approaches)
Aggregating the objectives into a single, parameterized objective function and performing several runs with different parameter settings to achieve a set of solutions approximating the Pareto-optimal set.
Weighting Method (Cohon, 1978)
Constraint Method (Cohon, 1978)
Goal Programming (Steuer, 1986)
Minimax Approach (Koski, 1984)
Vector Evaluated Genetic Algorithm (VEGA)
Proposed by Schaffer in the mid-1980s (1984, 1985).
Only the selection mechanism of the GA is modified, so that at each generation a number of sub-populations is generated by performing proportional selection according to each objective function in turn. Thus, for a problem with k objectives and a population size of M, k sub-populations of size M/k each would be generated. These sub-populations are then shuffled together to obtain a new population of size M, on which the GA applies the crossover and mutation operators in the usual way.
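The selection scheme above can be sketched as follows (an illustrative sketch, assuming maximization with strictly positive objective values so that fitness-proportional selection is well defined; the function name `vega_selection` is invented for this example):

```python
import random

def vega_selection(population, objectives, M):
    """One VEGA generation's selection: for each of the k objectives, fill a
    sub-population of size M/k by fitness-proportional selection on that
    objective alone, then shuffle the sub-populations together."""
    k = len(objectives)
    new_pop = []
    for f in objectives:
        weights = [f(ind) for ind in population]          # per-objective fitness
        new_pop.extend(random.choices(population, weights=weights, k=M // k))
    random.shuffle(new_pop)   # crossover and mutation would follow as usual
    return new_pop
```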
Schematic of VEGA selection
Advantages and Disadvantages of VEGA
Efficient and easy to implement.
If proportional selection is used, then the shuffling and merging of all the sub-populations corresponds to averaging the fitness components associated with each of the objectives. In other words, under these conditions VEGA behaves as an aggregating approach and is therefore subject to the same problems as such techniques.
Problems in Multi-Objective Optimization
Weighting Method example
Fitness Function = w1 F1(x) + w2 F2(x)
Consider the problem: minimize response time, maximize throughput.
F1(x) = response time
F2(x) = throughput
wi = weight value
Then:
It is hard to find the values of w1 and w2.
It is hard to form a fitness function.
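A toy version of this scalarization illustrates both difficulties (a sketch with made-up objective functions; throughput is negated so that the aggregate is purely minimized, which is itself one of the modeling decisions the user must make):

```python
def weighted_sum_fitness(x, w1, w2, response_time, throughput):
    """Scalarized fitness to minimize. Throughput is negated so that
    maximizing it fits a minimization problem. Choosing w1 and w2 is the
    hard part: different weights yield different Pareto-optimal points."""
    return w1 * response_time(x) - w2 * throughput(x)
```

Each (w1, w2) pair selects at most one point of the Pareto front, which is why several runs with different weights are needed.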
Traditional Approaches
Difficulties with classical methods:
Being sensitive to the shape of the Pareto-optimal front (e.g., the weighting method).
Restrictions on their use in some application areas.
The need for several optimization runs to achieve the best parameter setting and obtain an approximation of the Pareto-optimal set.
Difficulties with the classical multi-objective optimization methods
Such as the weighted sum, ε-perturbation, goal programming, min-max, and others:
1. They must be repeated many times to find multiple optimal solutions.
2. They require some knowledge about the problem being solved.
3. Some methods have difficulties with certain shapes of the Pareto-optimal front (e.g., non-convex).
4. The spread of optimal solutions depends on the efficiency of the single-objective optimizer.
5. Not reliable in problems involving uncertainties or stochasticity.
6. Not efficient for problems having a discrete search space.
Lexicographic Ordering (LO)
In this method, the user is asked to rank the objectives in order of importance. The optimum solution is then obtained by minimizing the objective functions one at a time, starting with the most important one and proceeding according to the assigned order of importance of the objectives. It is also possible to select a single objective at random to optimize at each run of a GA.
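A minimal sketch of lexicographic selection over a finite candidate set (assuming minimization; the tie-breaking tolerance is an assumption of this example, not part of the method's definition):

```python
def lexicographic_best(population, objectives, tol=1e-9):
    """Pick the best solution by lexicographic ordering (minimization):
    optimize on the most important objective first; use the next objective
    only to break (near-)ties, and so on down the ranking."""
    candidates = list(population)
    for f in objectives:                  # objectives ranked by importance
        best = min(f(x) for x in candidates)
        candidates = [x for x in candidates if f(x) <= best + tol]
    return candidates[0]
```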
Advantages and Disadvantages of LO
Efficient and easy to implement.
Requires a pre-defined ordering of objectives, and its performance will be affected by it.
Selecting an objective at random is equivalent to a weighted combination of objectives, in which each weight is defined in terms of the probability that each objective has of being selected. However, if tournament selection is used, the technique does not behave like VEGA, because tournament selection does not require scaling of the objectives (thanks to its pair-wise comparisons). Therefore, the approach may work properly with concave Pareto fronts.
Inappropriate when there is a large number of objectives.
Schematic of an ideal Multi-Objective optimization procedure
[Schematic: a multi-objective optimization problem (minimize f1, f2, ..., fn subject to constraints) is fed to an IDEAL multi-objective optimizer (Step 1); multiple trade-off solutions are found, and one solution is chosen using higher-level information (Step 2).]

In 1967, Rosenberg hinted at the potential of Genetic Algorithms in multi-objective optimization.
There was no significant study until 1989, when Goldberg outlined a new non-dominated sorting procedure.
There has been a lot of interest recently, because a GA is capable of finding multiple optimal solutions in one single run (more than 63 publications in this research area).
Pareto-based Techniques
Suggested by Goldberg (1989) to solve the problems with Schaffer's VEGA.
Use of non-dominated ranking and selection to move the population towards the Pareto front.
Requires a ranking procedure and a technique to maintain diversity in the population (otherwise, the GA will tend to converge to a single solution, because of the stochastic noise involved in the process).
The original Genetic Algorithm (GA)
Initially introduced by Holland in 1975. A general-purpose heuristic search algorithm that mimics the natural selection process in order to find optimal solutions.
1. Generate a population of random individuals, or candidate solutions to the problem at hand.
2. Evaluate the fitness of each individual in the population.
3. Rank individuals based on their fitness.
4. Select the best individuals to survive into the next generation.
5. Use the genetic operations crossover and mutation to generate a new population.
6. Continue the process by going back to step 2 until the problem's objectives are satisfied.
The best individuals are allowed to survive, mate, and reproduce offspring. Evolving solutions over time leads to better solutions.
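The six steps above can be sketched as a minimal GA loop (illustrative only, assuming maximization; the truncation selection, operator signatures, and default parameters are choices of this sketch, not part of Holland's original formulation):

```python
import random

def genetic_algorithm(fitness, init, crossover, mutate,
                      pop_size=50, generations=100):
    """Minimal GA following steps 1-6 above (maximization)."""
    population = [init() for _ in range(pop_size)]          # step 1
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)          # steps 2-3
        parents = population[: pop_size // 2]               # step 4: best survive
        children = []
        while len(parents) + len(children) < pop_size:      # step 5
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        population = parents + children                     # step 6: loop
    return max(population, key=fitness)
```

On a toy one-dimensional problem such as maximizing -(x - 3)^2, the loop steadily concentrates the population near x = 3.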
The original Genetic Algorithm (GA) Flow Chart
A real-coded GA represents parameters without coding, which makes the representation of the solutions very close to the natural formulation of many problems. The crossover and mutation operators are designed to work with real parameters.
Multi-objective fitness:
1. Non-dominated (best)
2. Dominated but feasible (average)
3. Infeasible points (worst)
Why use a GA?
Use a GA when the search space is large, not well understood, and unstructured. A GA can provide a surprisingly powerful heuristic search. It is simple, yet it performs well on many different types of problems:
optimization of functions with linear and nonlinear constraints,
the traveling salesman problem,
machine learning,
parallel semantic networks,
simulation of gas pipeline systems,
problems of scheduling, web search, software testing,
financial forecasting, and others.
Multi-Objective Evolutionary Algorithm (MOEA)
An EA is a variation of the original GA.
An MOEA has additional operations to maintain multiplePareto-optimal solutions in the population.
Advantages:
They deal simultaneously with a set of possible solutions.
They enable finding several members of the Pareto-optimal set in a single run of the algorithm.
They explore solutions over the entire search space.
They are less susceptible to the shape or continuity of the Pareto front.
Disadvantages:
Not completely supported theoretically yet (compared to other methods, such as Stochastic Approximation, which has been around for half a century).
Multi-Objective Genetic Algorithm (MOGA)
Proposed by Fonseca and Fleming (1993).
The approach consists of a scheme in which the rank of a certain individual corresponds to the number of individuals in the current population by which it is dominated.
It uses fitness sharing and mating restrictions.
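This ranking scheme is easy to sketch (assuming minimization; here non-dominated individuals get rank 0, whereas Fonseca and Fleming add 1 to every rank):

```python
def dominates(a, b):
    # minimization: a no worse everywhere, strictly better somewhere
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def moga_ranks(population):
    """MOGA-style rank of each individual: the number of individuals
    in the current population by which it is dominated."""
    return [sum(dominates(other, ind) for other in population)
            for ind in population]
```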
Advantages and Disadvantages of MOGA
Efficient and relatively easy to implement.
Its performance depends on the appropriate selection of the sharing factor.
MOGA has been very popular and tends to be used as a reference against other EMOO approaches.
Some applications:
Fault diagnosis
Control system design
Wing planform design
Nondominated Sorting Genetic Algorithm (NSGA)
Proposed by Srinivas and Deb (1994). It is based on several layers of classification of the individuals.
Nondominated individuals get a certain dummy fitness value and then are removed from the population; the process is repeated until the entire population has been classified.
To maintain the diversity of the population, classified individuals are shared (in decision-variable space) with their dummy fitness values.
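The layered classification above can be sketched as repeated peeling of non-dominated fronts (a direct, unoptimized sketch assuming minimization; NSGA-II's fast non-dominated sort computes the same fronts more efficiently):

```python
def nondominated_fronts(population):
    """Peel off layers of non-dominated individuals, as NSGA does:
    front 0 is non-dominated in the whole population; remove it,
    re-classify the remainder, and repeat until everyone is classified."""
    def dominates(a, b):  # minimization
        return (all(x <= y for x, y in zip(a, b)) and
                any(x < y for x, y in zip(a, b)))

    remaining = list(population)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts
```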
NSGA Flow Chart
Multi-objective fitness:
1. Non-dominated (best)
2. Dominated but feasible (average)
3. Infeasible points (worst)
Before selection is performed, the population is ranked on the basis of domination: all non-dominated individuals are classified into one category (with a dummy fitness value, which is proportional to the population size). To maintain the diversity of the population, these classified individuals are shared (in decision-variable space) with their dummy fitness values. Then this group of classified individuals is removed from the population, and another layer of non-dominated individuals is considered (the remainder of the population is re-classified). The process continues until all the individuals in the population are classified. Since individuals in the first front have the maximum fitness value, they always get more copies than the rest of the population. This allows us to search the non-dominated regions, and it results in convergence of the population toward such regions. Sharing, for its part, helps to distribute the population over this region.
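Fitness sharing can be sketched as follows (an illustrative sketch: the triangular sharing function and Euclidean distance in decision-variable space are common choices, but the function name and the default `sigma_share` are assumptions of this example):

```python
def shared_dummy_fitness(front, dummy, sigma_share=0.5):
    """Degrade each individual's dummy fitness by its niche count,
    computed with a triangular sharing function over decision-variable
    distance: crowded individuals end up with lower shared fitness."""
    def sh(d):
        return max(0.0, 1.0 - d / sigma_share)

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    return [dummy / sum(sh(dist(p, q)) for q in front) for p in front]
```

Individuals farther apart than sigma_share do not share fitness at all, so isolated individuals keep the full dummy value while clustered ones are penalized.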
Demo NSGA II
[Figure: Pareto fronts of Energy (7.00E+09 to 4.70E+10) vs. CPI (0.25 to 0.5) for three configurations: run without fuzzy, run with fuzzy, and manual.]
[Gellert et al., 2012] Multi-Objective Optimizations for a Superscalar Architecture with Selective Value Prediction, IET Computers & Digital Techniques, Vol. 6, No. 4 (July), pp. 205-213, ISSN: 1751-8601.
http://webspace.ulbsibiu.ro/adrian.florea/html/docs/IET_MultiObjective.pdf
Features of NSGA II
The research area
Problems:
1. The so-called standard settings (De Jong, 1990): a population size of 50-100, a crossover rate of 0.6-0.9, and a mutation rate of 0.001 do not work for complex problems.
2. For complex real-world problems, GAs require parameter tuning in order to achieve the optimal solutions.
3. The task of tuning GA parameters is not trivial, due to the complex and nonlinear interactions among the parameters and the problem being solved (e.g., the density of the search space).
Research:
1. Self-adaptive MOEA: use information fed back from the MOEA during its execution to adjust the values of parameters attached to each individual in the population.
2. Improve the performance of MOEAs: finding widely spread Pareto-optimal solutions and reducing computing resources.
3. Make them easier to use and available to more users.
Multi-Objective Evolutionary Algorithms: references (MOEAs)
Some representatives of MOEAs in operational research through past years:
a) Non-Dominated Sorting Genetic Algorithm (NSGA), Srinivas and Deb, 1994.
b) NSGA-II, Deb et al., 2002.
c) Strength Pareto Evolutionary Algorithm (SPEA), Zitzler and Thiele, 1999.
d) SPEA2, Zitzler et al., 2001.
e) Epsilon-NSGAII, Kollat and Reed, 2005.
f) Multi-objective Shuffled Complex Evolution Metropolis Algorithm (MOSCEM-UA), Vrugt et al., 2003.