A new genetic/evolutionary algorithm for multiobjective optimization that attempts to find tradeoff solutions and scales easily with growth in the parameter space as well as the objective space. It does not use the complex niche calculations employed by existing multiobjective genetic algorithms.
1. Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization
Hariharane Ramasamy
Evolutionary Systems Inc., Cupertino, CA.
Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization – p. 1/58
2. Outline
1. Problem Description
2. Practical Examples
3. Classical Algorithms
4. Genetic Algorithms
5. Multiobjective Genetic Algorithms
6. Extended Genetic Algorithms
7. Results
8. Protein Folding—Lattice Model
9. Future Research
10. Conclusion
3. Problem Description
1. In most practical optimization problems, one deals with
simultaneous optimization of multiple objectives that may
conflict with one another.
2. We seek efficient ways to optimize systems specified by a
tuple of parameters (not necessarily numerical). Each tuple
determines a system with fitnesses for the individual
objectives; this maps the parameter space into the fitness
space.
3. The best compromise solutions are called the tradeoff or
nondominated solutions and form the Pareto front. The
parameter set associated with the Pareto front is called
the Pareto set.
4. An Example
Figure 1: Multiobjective problem example with two objectives
The plot is an example of a biobjective problem (f1, f2) that depends on two parameters
(x1, x2). Assuming minimization, the thick dark line represents the best tradeoff solutions
and constitutes the Pareto front. In the figure, point O is dominated by all points in the
interior of the quadrant OAB. The lines OA, OB, OC, and OD define the boundary points
where f1 or f2 gets better or worse when moving horizontally or vertically with respect to
point O. The points inside the regions OAC and ODB, however, neither dominate nor are
dominated by point O: each of them is better in either f1 or f2, but not both, with respect
to point O.
5. Local and Global Pareto Front
A local Pareto front dominates every point in its neighborhood, while a global Pareto front
dominates every point in the objective space. In the left plot, A and B are two disjoint
objective regions; assuming minimization of objectives, B contains the global Pareto front
and A a local Pareto front. In the right plot, the curve AOBCD contains the entire objective
space; OB is a local Pareto front, and the global Pareto front is the union of AO and CD.
6. Multiobjective problem
Thus a multiobjective problem can be stated as:
Minimize F(x) = (f1(x), f2(x), ..., fm(x))
subject to x ∈ S
where x = (x1, x2, ..., xn),
S is the feasible parameter space, and
m is the number of objectives.
We seek feasible configurations, tuples (x1, x2, ..., xn), that map into points
(f1, f2, ..., fm) in objective space which cannot be improved with respect to any
objective without being worsened with respect to some other objective.
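The dominance relation behind these definitions can be made concrete with a short sketch. This is an illustrative helper (minimization convention), not part of the algorithms presented in this talk:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

front = pareto_front([(1, 4), (2, 2), (3, 1), (3, 3), (4, 4)])
# front == [(1, 4), (2, 2), (3, 1)]: the first three are mutually
# incomparable, while (3, 3) and (4, 4) are dominated by (2, 2).
```

Note that the nondominated points are pairwise incomparable, exactly the OAC/ODB situation of the previous slide.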
7. Practical Examples
1. Protein folding problem
2. Multiple knapsack problem
8. Protein Folding Problem
Protein Fold
The primary sequence of a protein (left), determined by its composition of amino acids, folds
into a unique three-dimensional structure (right) under certain physiological conditions. The
folding is driven by multiple physicochemical factors, such as hydrogen bonding and shielding
hydrophobes from water, making it a multiobjective problem.
9. Protein Folding—Lattice Model
(Left: allowed moves. Right: buried hydrophobe rule.)
The amino acids in the primary sequence of a protein are connected to each other by peptide
bonds, forming a chain that maps naturally onto a rectangular lattice. Using a three-dimensional
rectangular lattice, protein folding is simulated with the move rules defined in the left picture.
The right picture defines the condition under which a side chain in the rectangular lattice is
considered buried.
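A standard way to score a lattice fold is to count hydrophobic-hydrophobic contacts between residues that are lattice neighbors but not bonded neighbors in the chain. The sketch below is a generic HP-style contact count on a 3-D rectangular lattice, not the talk's own scoring function; `coords` and `seq` are illustrative names:

```python
def hh_contacts(coords, seq):
    """Count nonbonded hydrophobic-hydrophobic contacts for a chain on a
    3-D rectangular lattice. coords: list of (x, y, z) per residue;
    seq: string of 'H' (hydrophobic) / 'P' (polar). A higher count means
    more hydrophobes buried away from water."""
    occupied = {c: i for i, c in enumerate(coords)}
    contacts = 0
    for i, (x, y, z) in enumerate(coords):
        if seq[i] != 'H':
            continue
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            j = occupied.get((x + dx, y + dy, z + dz))
            # j > i + 1 counts each contact once and skips bonded neighbors
            if j is not None and j > i + 1 and seq[j] == 'H':
                contacts += 1
    return contacts
```

For example, an all-H chain folded into a unit square has exactly one nonbonded contact (between its first and last residues), while a straight all-H chain has none.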
10. Multiple Knapsack Problem
Problems such as bin packing, cutting stock, and financial management
can all be modeled as multiple knapsack problems, in which:
1. we are given a set of items with specific weights and profits, and
multiple knapsacks with fixed capacities;
2. we have to fill each knapsack, maximizing its profit and space
utilization.
We present results for two variations of the multiple knapsack problem.
1. In the first, when an item is selected, it is included in all
knapsacks.
2. In the second, the selected item is included in only one knapsack.
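For the exclusive variant, a candidate solution can be represented as an assignment of each item to one knapsack (or to none), with one profit objective per knapsack. A minimal evaluation sketch, with illustrative names and toy data (not the talk's actual encoding):

```python
def evaluate(assignment, weights, profits, capacities):
    """Evaluate a candidate for the exclusive multiple knapsack problem.
    assignment[i] is the knapsack index for item i, or -1 if unselected.
    Returns the per-knapsack profit vector (one objective per knapsack),
    or None if any capacity is exceeded."""
    m = len(capacities)
    load = [0] * m
    profit = [0] * m
    for i, k in enumerate(assignment):
        if k < 0:
            continue  # item not selected
        load[k] += weights[i]
        profit[k] += profits[i]
    if any(l > c for l, c in zip(load, capacities)):
        return None  # infeasible: some knapsack overflows
    return profit

# Three items, two knapsacks of capacity 5 each.
print(evaluate([0, 0, 1], [2, 3, 4], [3, 4, 5], [5, 5]))  # [7, 5]
```

Maximizing the resulting profit vector is then a multiobjective problem, since loading one knapsack with the most profitable items starves the others.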
11. Existing Methods
1. Traditional methods
2. Genetic algorithm methods
12. Summary of Traditional Methods
Weighted Methods — Advantages: Simple; works well when the Pareto front is simple.
Disadvantages: Computationally expensive; fails with complicated Pareto fronts.
ǫ-constraint Method — Advantages: Works with complicated regions. Disadvantages: Success
largely depends on the initial solution, which might be selected in an infeasible region.
Lexicographic Approach — Advantages: Simple; optimizes objectives sequentially by
predefined priority. Disadvantages: Only a limited number of Pareto optimal points are found.
Normal Boundary Intersection (NBI) — Advantages: Finds well-spread Pareto points.
Disadvantages: Fails with complicated landscapes in objective space.
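The weighted method's behavior is easy to demonstrate in a few lines. This sketch (illustrative names `weighted_sum_front` and `weight_grid`, toy candidate set) picks, for each weight vector, the candidate minimizing the weighted sum of objectives; sweeping the weights traces convex portions of the Pareto front but cannot reach points in nonconvex regions, which is the failure mode noted in the table:

```python
def weighted_sum_front(candidates, weight_grid):
    """Weighted-method sketch: for each weight vector w, keep the candidate
    minimizing sum(w_i * f_i) over the objective vectors in candidates."""
    front = set()
    for w in weight_grid:
        best = min(candidates, key=lambda f: sum(wi * fi for wi, fi in zip(w, f)))
        front.add(best)
    return front

cands = [(0, 3), (1, 1), (3, 0)]
weights = [(1, 0), (0, 1), (0.5, 0.5)]
print(weighted_sum_front(cands, weights))  # {(0, 3), (1, 1), (3, 0)}
```

Here all three candidates lie on a convex front, so each is recovered by some weight vector; a point lying in a nonconvex dent of the front would never minimize any weighted sum.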
13. Genetic Algorithm
Genetic algorithms, motivated by natural evolution, repeatedly apply genetic operators to a
pool of solutions called a population.
generate initial population P randomly
set new population Pn empty
while desired convergence is not achieved in the population do
    perform crossover with probability pc
    if the fitness of the offspring is better than the parents' then
        add the offspring to Pn
    else
        decide with very low probability to include the offspring in Pn
    end if
    perform mutation with probability pm
    perform reproduction with probability pr
    if size of(Pn) ≥ N then
        replace P with N members from Pn
        reduce the size of Pn by N
    end if
end while
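The loop above can be sketched as a runnable single-objective GA. This is a minimal illustration of the pseudocode's structure (the parameter names and the toy bitstring task are assumptions, not the talk's implementation):

```python
import random

def genetic_algorithm(fitness, init, crossover, mutate, n=50,
                      pc=0.9, pm=0.05, generations=100):
    """Minimal single-objective GA following the pseudocode above:
    offspring replace parents when fitter, and are otherwise admitted
    only with a small probability. fitness is maximized."""
    pop = [init() for _ in range(n)]
    for _ in range(generations):
        new = []
        while len(new) < n:
            a, b = random.sample(pop, 2)
            child = crossover(a, b) if random.random() < pc else a[:]
            if random.random() < pm:
                child = mutate(child)
            if fitness(child) > max(fitness(a), fitness(b)) or random.random() < 0.05:
                new.append(child)          # accept improving (or rarely, worse) offspring
            else:
                new.append(max(a, b, key=fitness)[:])  # keep the fitter parent
        pop = new
    return max(pop, key=fitness)

# Toy run: maximize the number of 1s in a 10-bit string.
random.seed(0)
best = genetic_algorithm(
    fitness=sum,
    init=lambda: [random.randint(0, 1) for _ in range(10)],
    crossover=lambda a, b: a[:5] + b[5:],
    mutate=lambda c: [1 - g if random.random() < 0.1 else g for g in c],
)
```

The rare acceptance of worse offspring keeps some exploration pressure; without it, the population collapses onto a few dominant solutions, the diversity problem discussed next.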
14. Genetic Algorithm—Metrics
1. Diversity—Genetic algorithms often converge to a few dominant solutions and
lose diversity. Further optimization is then ineffective, since repeated
application of the operators yields the same few converged solutions. Diversity
is essential for multiobjective problems.
2. Elitism—Selecting individuals with a bias toward creating better individuals is
called elitism; selecting the best parents to replace the less fit members drives
the population to converge to a few best parents.
3. Scalability—Performance of the algorithm should not deteriorate with an
increase in the number of objectives and parameters.
4. Exploration—Ability of an algorithm to find new solutions or reproduce lost
solutions is called exploration.
5. Exploitation—Ability to retain the current best solutions in the population is
called exploitation.
15. Niching
1. Goldberg and Richardson proposed the initial method to promote diversity
within the population.
2. The method uses a distance function, called the sharing function, which
calculates similarities among the population members using a parameter
called the niche radius, in parameter or objective space.
3. Members in a crowded neighborhood get their fitness degraded, preventing
their selection for the next generation.
4. The sharing function helps keep the solutions diverse.
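The mechanism can be sketched directly from this description. A common form of the sharing function is sh(d) = 1 − (d/σ)^α for d < σ and 0 otherwise; each member's raw fitness is divided by its niche count, the sum of sh over all members. This is a generic illustration of the Goldberg-Richardson scheme, not the talk's code:

```python
def shared_fitness(population, raw_fitness, distance, sigma, alpha=1.0):
    """Fitness sharing sketch: degrade each individual's raw fitness by
    its niche count, the sum of sh(d) over all members within niche
    radius sigma, where sh(d) = 1 - (d / sigma) ** alpha (0 beyond sigma).
    Crowded individuals are penalized, preserving diversity."""
    out = []
    for p in population:
        niche = sum(max(0.0, 1.0 - (distance(p, q) / sigma) ** alpha)
                    for q in population)  # includes sh(0) = 1 for p itself
        out.append(raw_fitness(p) / niche)
    return out

# Two crowded points and one isolated point, all with raw fitness 10.
pop = [0.0, 0.1, 5.0]
print(shared_fitness(pop, lambda p: 10.0, lambda p, q: abs(p - q), sigma=1.0))
```

The isolated point keeps its full fitness of 10, while the two crowded points are each degraded to 10/1.9, so selection pressure shifts toward the empty region.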
16. Niching
1. The niche radius is difficult to determine: too large a radius ignores good tradeoff
solutions; too small a radius can retain points that are already dominated in the set.
2. When there are more than two objectives, the number of solutions on the Pareto front
increases, along with the complexity of the niche calculation.
3. In practical problems with more than two objectives, the sharing function has met with
little success.
17. Multiobjective Genetic Algorithms
Pareto Archived Evolution Strategy — Advantages: Gives a clear description of the class of
problems in which a genetic algorithm should excel. Disadvantages: Does not scale well
with problem dimension.
Vector Evaluated Genetic Algorithm — Advantages: Executes by population reshuffling using
each objective function. Disadvantages: Loses good solutions and does not obtain a good
spread of the Pareto optimal set.
Weight-Based Genetic Algorithm — Advantages: Similar to weighted methods; evolves the
weights along with the optimization of the objective function. Disadvantages: Fails when
the objective function is nonconvex.
18. Multiobjective Genetic Algorithms
Niched Pareto Genetic Algorithm — Advantages: Builds a small nondominated set along with
the algorithm. Disadvantages: Success largely depends on the small nondominated subset.
Nondominated Sorting Genetic Algorithm — Advantages: Niching is performed in the decision
space. Disadvantages: The niching function is too complex and scales poorly as the number
of objectives increases.
Strength Pareto Evolutionary Algorithm — Advantages: Finds well-spread Pareto points by
maintaining an external nondominated set of fixed size. Disadvantages: Clustering is
performed to maintain the size of the external Pareto set; this clustering scales poorly
with the set size.
19. Extended Genetic Algorithms
The extended genetic algorithms will be published in a paper; only results are presented here.
20. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. find more than one distinct point in the Pareto set that maps to the same point
in the Pareto front?
8. have a framework for flexible extension?
22. Schaeffer-1
(Plot: f1 vs f2, 0–25, showing the obtained Pareto front.)
Table 1: Summary of Results—Schaeffer 1
Name    Pop. Size   Total Pareto front points   Duration
NSGA    100                                     ≤ 30 sec
SPEA    100         100                         ≤ 30 sec
MOEAD   100         100                         ≤ 30 sec
NSGA    1000        1000                        2 min
SPEA    1000        1000                        8 min
MOEAD   1000        300                         11 sec
MOEAD   10000       300                         11 sec
SEGA    100         2541                        ≤ 30 sec
SEGA    100         18500+                      3 min
PEGA    100         608                         ≤ 30 sec
PEGA    100         900+                        3 min
23. Schaeffer-2
(Plot: f1 vs f2; f1 from −1 to 4, f2 from 0 to 35.)
Table 2: Summary of Results—Schaeffer 2
Name    Pop. Size   Total Pareto front points   Duration
NSGA    100         100                         ≤ 30 sec
SPEA    100         100                         ≤ 30 sec
MOEAD   100         100                         ≤ 30 sec
SEGA    100         3767                        ≤ 30 sec
PEGA    100 x 2     409                         ≤ 30 sec
SEGA    100         114000+                     3 min
PEGA    100 x 2     2300+                       3 min
24. Viennet
Table 3: Viennet—Summary of Results
(Plots: f1 vs f2 and f3 projections of the obtained front.)
Name    Unique PF points   Unique PS points
NSGA    100                100
SPEA    1                  1
MOEAD   40                 65
SEGA    2394               2405
PEGA    3522               3605
25. Kalyanamoy
(Plot: DTLZ4 SEGA and PEGA population, f1 vs f2 vs f3.)
Table 4: Kalyanamoy Scalable Multiobjective Problem
DTLZ4—Seven Objectives—Summary of Results
Name    Unique Pareto front points   Unique Pareto set points
NSGA2   999                          999
SEGA    744                          773
PEGA    3728                         4815
26. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. find more than one distinct point in the Pareto set that maps to the same point
in the Pareto front?
8. have a framework for flexible extension?
27. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 500 Generations
28. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 1000 Generations
29. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 1500 Generations
30. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 2000 Generations
31. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 2500 Generations
32. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 3000 Generations
33. Pareto front—Progress
Knapsack 750-3 - F1 Versus F2 after 3300 Generations
34. Pareto front
Knapsack 750-3 - PF F1 Versus F2 after 3300 Generations
(Plot: snapshots of the Pareto front, f1 vs f2 profits (20000–35000), at generations 500,
1000, 1500, 2000, 2500, 3000, and 3300.)
35. Pareto front
Knapsack 750-3 - PF F1 Versus F3 after 3300 Generations
(Plot: snapshots of the Pareto front, f1 vs f3 profits (20000–35000), at generations 500,
1000, 1500, 2000, 2500, 3000, and 3300.)
36. Pareto front
Knapsack 750-3 - PF F2 Versus F3 after 3300 Generations
(Plot: snapshots of the Pareto front, f2 vs f3 profits (20000–35000), at generations 500,
1000, 1500, 2000, 2500, 3000, and 3300.)
37. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. find more than one distinct point in the Pareto set that maps to the same point
in the Pareto front?
8. have a framework for flexible extension?
38. Exploration And Exploitation
PEGA - F1 Vs F2 Population
(Plot: population, f1 vs f2 profits (20000–35000); legend: F1, F2, F3, F1-F2, F1-F3,
F2-F3, F1-F2-F3.)
39. Exploration And Exploitation
PEGA - F2 Vs F3 Population
(Plot: population, f2 vs f3 profits (20000–35000); legend: F1, F2, F3, F1-F2, F1-F3,
F2-F3, F1-F2-F3.)
40. Exploration And Exploitation
PEGA - F1 Vs F3 Population
(Plot: population, f1 vs f3 profits (20000–35000); legend: F1, F2, F3, F1-F2, F1-F3,
F2-F3, F1-F2-F3.)
41. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. have a framework for flexible extension?
42. Scaling with Objectives
Kalyanamoy scalable objective test function (DTLZ4) with five objectives
43. Scaling with Objectives
DTLZ4—f1 vs f2 —NSGA (left), SEGA (middle), PEGA(right)
(Plots: f1 vs f2, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 2: Results of DTLZ4 with five objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
44. Scaling with Objectives
DTLZ4—f3 vs f4 —NSGA (left), SEGA (middle), PEGA(right)
(Plots: f3 vs f4, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 3: Results of DTLZ4 with five objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
45. Scaling with Objectives
DTLZ4—f1 vs f5 —NSGA (left), SEGA (middle), PEGA(right)
(Plots: f1 vs f5, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 4: Results of DTLZ4 with five objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
46. Scaling with objectives
Kalyanamoy scalable objective test function (DTLZ4) with seven objectives
47. Scaling with Objectives
DTLZ4—f1 vs f2 —NSGA (left), SEGA (middle), PEGA(right)
(Plots: f1 vs f2, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 5: Results of DTLZ4 with seven objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
48. Scaling with Objectives
DTLZ4—f3 vs f4 —NSGA (left), SEGA (middle), PEGA(right)
(Plots: f3 vs f4, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 6: Results of DTLZ4 with seven objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
49. Scaling with Objectives
DTLZ4—f5 vs f6 —NSGA (left), SEGA (middle), PEGA(right)
(Plots: f5 vs f6, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 7: Results of DTLZ4 with seven objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
50. Scaling with Objectives
DTLZ4—f1 vs f7—NSGA (left), SEGA (middle), PEGA (right)
(Plots: f1 vs f7, each axis 0–2, for NSGA2, SEGA, and PEGA.)
Figure 8: Results of DTLZ4 with seven objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
51. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. have a framework for flexible extension?
52. Scaling with Parameter Space
Kalyanamoy scalable objective test function (DTLZ4) with five objectives
53. Scaling with Parameter Space
DTLZ4—Pareto set (x1 , x2 , x3 )—NSGA (left), SEGA (middle), PEGA(right)
(3-D plots: Pareto set (x1, x2, x3), each axis 0–1, for NSGA2, SEGA, and PEGA.)
Figure 9: Results of DTLZ4 with five objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
54. Scaling with Parameter Space
Kalyanamoy scalable objective test function (DTLZ4) with seven objectives
55. Scaling with Parameter Space
DTLZ4—Pareto set (x1 , x2 , x3 )—NSGA (left), SEGA (middle), PEGA(right)
(3-D plots: Pareto set (x1, x2, x3), each axis 0–1, for NSGA2, SEGA, and PEGA.)
Figure 10: Results of DTLZ4 with seven objectives
Among the tests we ran, only the NSGA results can be compared qualitatively with SEGA
and PEGA. The graphs clearly show the superior performance of the EGAs.
56. Multiple Knapsack Problem (Exclusive)
We present results for 750 items and 3 knapsacks.
57. Three Knapsacks, 750 Items—Profits
SEGA f1 vs f3 (left), PEGA f1 vs f3 (right)
Figure 11: Profits, KP 750-3. The left and right plots show the profits obtained by the
sequential and parallel extended genetic algorithms for f1 vs f3. The conflicting nature
of the objective functions is easily seen in the plots.
58. Three Knapsacks, 750 Items—Items
SEGA f1 vs f3 (left), PEGA f1 vs f3 (right)
Figure 12: Items, KP 750-3. The plots show the items (0–750 in KS1 vs KS3) obtained by
the sequential and parallel extended genetic algorithms for f1 vs f3. The conflicting
nature of the objective functions is easily seen in the plots.
59. Parameter Space—Exploration
Solutions that Map to a Similar Sum
(Plot: total items included in all knapsacks (SEGA+PEGA), 600–800, versus number of
occurrences, 0–100.)
Figure 13: Items, KP 750-3. The plot shows the number of solutions assigned to the three
knapsacks that map to the same sum.
60. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. have a framework for flexible extension?
61. Scattered Pareto Set
Nondominated Points / Nondominated Set
(Two rows of plots: nondominated points, f1 versus f2 (0–1), with the corresponding
nondominated set (x1, x2, x3).)
62. Results
We’ll see that our versions outperformed a large number of existing algorithms on
a battery of tests commonly used in the literature to evaluate optimization
methods. Specifically, do our proposed methods:
1. find more Pareto front points with ease?
2. move effectively toward the Pareto front?
3. explore and exploit the search space effectively?
4. scale with an increasing number of objectives and parameters?
5. find different Pareto set points that map to one point in objective space?
6. scale well with a scattered Pareto set?
7. have a framework for flexible extension?
63. Flexibility
The results presented so far were run with the same algorithm
without altering any parameters.
Niching methods require suitable parameter selection to reach the
Pareto front; no such tuning is needed here.
The population size need not be increased to obtain a larger Pareto set.
This is a great advantage, since we cannot provide unbounded storage to
hold the population.
64. Protein Folding - Interesting Results
Helix Beta Sheets
Figure 14: Two interesting results obtained by the extended genetic algorithms. Red and blue
represent hydrophobic and hydrophilic amino acids. The two green lattice sites, connecting the
two helices, represent glycine, which has four additional lattice moves; glycine is often found
near sharp turns in proteins due to its small size. In the right result, every other member of
the sequence is hydrophobic, and hence the algorithm produced a beta sheet that folds against
itself to bury the hydrophobes.
67. Conclusion
Pseudo-parallelismis a feature in the genetic algorithm in which parameter space is
explored simultaneously by different members in the population. Extended genetic
algorithms successfully extended pseudo-parallelism not only in parameter space but
also in objective space.
Parallel and sequential extended genetic algorithms represent two different extensions
of the genetic algorithm. Both the algorithms were applied for the first time to the
protein folding problem and were presented in several conferences in the year 1996.
Extended genetic algorithms have better performance in exploring objective and
parameter space than do existing multiobjecitve genetic algorithms.
The new algorithms leave much room for extension and improvements. A few key
enhancements are mentioned in Future research.
Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization – p. 55/58
68. Conclusion
Pseudo-parallelismis a feature in the genetic algorithm in which parameter space is
explored simultaneously by different members in the population. Extended genetic
algorithms successfully extended pseudo-parallelism not only in parameter space but
also in objective space.
Parallel and sequential extended genetic algorithms represent two different extensions
of the genetic algorithm. Both the algorithms were applied for the first time to the
protein folding problem and were presented in several conferences in the year 1996.
Extended genetic algorithms have better performance in exploring objective and
parameter space than do existing multiobjecitve genetic algorithms.
Duplication and transposon operators were introduced for the first time in the genetic
algorithm.
The new algorithms leave much room for extension and improvements. A few key
enhancements are mentioned in Future research.
Toward a Natural Genetic/Evolutionary Algorithm for Multiobjective Optimization – p. 55/58
69. Conclusion
Pseudo-parallelism is a feature of the genetic algorithm in which the parameter space is
explored simultaneously by different members of the population. Extended genetic
algorithms successfully extend pseudo-parallelism from the parameter space into the
objective space.
Parallel and sequential extended genetic algorithms represent two different extensions
of the genetic algorithm. Both algorithms were applied for the first time to the
protein folding problem and were presented at several conferences in 1996.
Extended genetic algorithms explore the objective and parameter spaces better than
existing multiobjective genetic algorithms do.
Duplication and transposon operators were introduced into the genetic algorithm for the
first time.
Extended genetic algorithms scaled better with the number of objectives and variables
than the multiobjective genetic algorithms did.
The new algorithms leave much room for extension and improvement. A few key
enhancements are mentioned under Future Research.
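The conclusion credits duplication and transposon operators, but the slides do not spell out their mechanics. The following is therefore only one plausible sketch, on a list-of-genes chromosome: duplication copies a random segment in place, and the transposon operator excises a segment and splices it back in elsewhere.

```python
# Hypothetical sketch of duplication and transposon operators on a
# variable-length chromosome (one plausible scheme, not the author's).
import random

def duplicate(chrom, rng=random):
    """Copy a random segment and insert the copy right after the original."""
    i = rng.randrange(len(chrom))
    j = rng.randrange(i, len(chrom)) + 1       # segment is chrom[i:j], non-empty
    return chrom[:j] + chrom[i:j] + chrom[j:]

def transpose(chrom, rng=random):
    """Cut a random segment out and splice it back in at a random point."""
    i = rng.randrange(len(chrom))
    j = rng.randrange(i, len(chrom)) + 1
    segment, rest = chrom[i:j], chrom[:i] + chrom[j:]
    k = rng.randrange(len(rest) + 1)           # reinsertion point
    return rest[:k] + segment + rest[k:]

random.seed(0)
print(duplicate([0, 1, 2, 3]))   # a longer chromosome with one segment repeated
print(transpose([0, 1, 2, 3]))   # same genes, one segment relocated
```

Duplication lengthens the chromosome; transposition only reorders it, so the gene multiset is preserved.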
72. Future Research
Natural Genetic Algorithm: described on the next slide; it represents the combination of
the parallel and sequential extended genetic algorithms.
Coding: binary coding is used in most of the problems. The choice of coding depends
on the type of problem and the parameters associated with it.
Operators: the duplication and transposon operators used one arbitrary scheme; there
are several other ways the operators could be performed.
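As one concrete instance of the coding question (my illustration, not a choice made in the slides), Gray coding is a common alternative to plain binary coding: adjacent parameter values then differ by a single bit flip, which can make mutation behave more smoothly.

```python
# Standard binary <-> Gray code conversion (illustrative, assumed example).

def to_gray(n):
    """Encode an integer as its reflected Gray code."""
    return n ^ (n >> 1)

def from_gray(g):
    """Decode a reflected Gray code back to the integer."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Neighbouring values 3 and 4 differ in three binary bits (011 vs 100)
# but in only one Gray bit (010 vs 110).
print(bin(to_gray(3)), bin(to_gray(4)))  # → 0b10 0b110
```

Which coding helps depends, as the slide notes, on the problem and its parameters.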
73. Future Research
Toward a Natural Genetic / Ramasamy Algorithm