MULTIOBJECTIVE OPTIMIZATION AND
PERFORMANCE METRICS ENSEMBLE

                Gary G. Yen, FIEEE
                    gyen@okstate.edu

         Professor, Oklahoma State University
Past President, IEEE Computational Intelligence Society
ieee-wcci2014.org
Multiobjective Optimization

•  Optimization problems that involve more than one objective
   function
•  Very common, yet difficult, problems in the fields of science,
   engineering, and business management
•  Nonconflicting objectives: a single optimal solution satisfies all
   objectives simultaneously, so the problem reduces to an SOP
•  Competing objectives: cannot be optimized simultaneously
•  MOP: our goal is to search for a set of "acceptable" solutions,
   each perhaps suboptimal with respect to any single objective
•  In operations research/management terms: multiple criteria
   decision making (MCDM) (International Society on MCDM;
   http://www.terry.uga.edu/mcdm/)
Why MOP? Buying an Automobile

•  Objective: reduce cost while maximizing comfort
•  Which solution (1, A, B, C, 2) is best?
•  No solution from this set makes both objectives look better than
   any other solution from the set
•  No single optimal solution
•  Trade-off between the conflicting objectives: cost and comfort
Mathematical Definition

•  Mathematical model to formulate the optimization problem:

   $$\min_{\mathbf{x} \in \mathbb{R}^n} \left\{ \mathbf{y} = \mathbf{f}(\mathbf{x}, \mathbf{e}) \;:\; \mathbf{h}(\mathbf{x}, \mathbf{e}) = 0,\; \mathbf{g}(\mathbf{x}, \mathbf{e}) \le 0,\; \mathbf{x}^L \le \mathbf{x} \le \mathbf{x}^U \right\}$$

   where y is the objective vector, x the decision vector, e the
   environment states, h the equality constraints, g the inequality
   constraints, and x^L, x^U the variable bounds.

   o  Design variables: the decision and objective vectors
   o  Constraints: equality and inequality
   o  A greater-than-or-equal-to inequality constraint can be converted
      to a less-than-or-equal-to constraint by multiplying by -1
   o  Objective function: maximization can be converted to minimization
      by the duality principle, $\max f(\mathbf{x}) \Leftrightarrow \min\left(-f(\mathbf{x})\right)$
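As a quick sketch of these two conversions (the toy functions below are our own illustration, not from the talk):

```python
# Maximization becomes minimization by negation, and a >= 0 constraint
# becomes a <= 0 constraint after multiplying by -1.
f_max = lambda x: -(x - 2.0) ** 2      # objective to be maximized
f_min = lambda x: -f_max(x)            # equivalent objective to be minimized

g_ge = lambda x: x - 1.0               # constraint g(x) >= 0
g_le = lambda x: -g_ge(x)              # equivalent constraint -g(x) <= 0
```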
Pareto Optimality

•  Formal definition: consider the minimization of the $n$ components
   $f_k$, $k = 1, \dots, n$, of a vector function $\mathbf{f}$ of a
   vector variable $\mathbf{x}$ in a universe $\mu$, where

   $$\mathbf{f}(\mathbf{x}) = (f_1(\mathbf{x}), f_2(\mathbf{x}), \dots, f_n(\mathbf{x}))$$

•  Then a decision vector $\mathbf{x}_u \in \mu$ is said to be
   Pareto-optimal if and only if there is no $\mathbf{x}_v \in \mu$ for
   which $\mathbf{v} = \mathbf{f}(\mathbf{x}_v) = (v_1, \dots, v_n)$
   dominates $\mathbf{u} = \mathbf{f}(\mathbf{x}_u) = (u_1, \dots, u_n)$,
   that is, there is no $\mathbf{x}_v \in \mu$ such that

   $$\forall i \in \{1, \dots, n\},\; v_i \le u_i \qquad \text{and} \qquad \exists i \in \{1, \dots, n\} \;:\; v_i < u_i$$
•  When encountering problems with many objectives (more than five),
   nearly all algorithms perform poorly because of the loss of
   selection pressure when fitness evaluation is based solely upon
   Pareto domination.
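The two conditions above translate directly into a dominance test; below is a minimal sketch (our own illustration, assuming minimization and NumPy arrays):

```python
import numpy as np

def dominates(v, u):
    """True if objective vector v Pareto-dominates u under minimization:
    v is no worse in every objective and strictly better in at least one."""
    v, u = np.asarray(v), np.asarray(u)
    return bool(np.all(v <= u) and np.any(v < u))

def non_dominated(points):
    """Return the subset of objective vectors not dominated by any other."""
    return [p for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if i != j)]
```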
Distinctions from SOP

•   Multiple conflicting objectives as opposed to a single one
•   Multiple optima vs. single optimum
•   Two goals instead of one
     o Progressing towards the Pareto front
     o Maintaining a diverse set of solutions in the non-dominated front
•   Dealing with two search spaces
     o A decision variable space plus an objective space
      o Proximity of two solutions in one space does not imply
        proximity in the other space
     o Search is performed in the decision space
Disadvantages of Classical Methods
•  We need prior knowledge of the problem domain to reduce the MOP to
   a single-objective optimization problem (e.g., a weight vector,
   ε-constraints)

•  Each run results in only a single solution

•  Non-uniformity in the Pareto-optimal solutions obtained

•  They require the fitness functions to be linear, continuous, and
   differentiable

•  They cannot deal with MOPs having discontinuous or concave Pareto
   fronts
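For instance, the classical weighted-sum approach collapses the MOP into an SOP, so each run yields a single Pareto point; a minimal sketch with hypothetical toy objectives (using SciPy's general-purpose minimizer) follows:

```python
import numpy as np
from scipy.optimize import minimize

def weighted_sum_solve(weights, objectives, x0):
    """Scalarize a list of objective functions with a weight vector and
    solve the resulting single-objective problem."""
    scalar = lambda x: sum(w * f(x) for w, f in zip(weights, objectives))
    return minimize(scalar, x0).x      # one solution per weight vector

f1 = lambda x: (x[0] - 1.0) ** 2       # hypothetical objective 1
f2 = lambda x: (x[0] + 1.0) ** 2       # hypothetical objective 2
print(weighted_sum_solve([0.5, 0.5], [f1, f2], np.zeros(1)))  # ~ [0.]
```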
Why Population-Based Heuristics?

•  An unorthodox, stochastic, population-based parallel search
   algorithm may be more suitable for MOPs
•  Classification of EAs:
    o Genetic Algorithm;
    o Genetic Programming;
    o Evolutionary Strategy;
    o Ant Colony;
    o Artificial Immune System;
    o Particle Swarm Optimization;
    o Differential Evolution;
    o Memetic Algorithm
Efforts in Enhancing a PSO for MOPs

• Modifying the fitness assignment

• Improving PSO flight mechanism

• Enhancing the convergence

• Preserving the diversity

• Managing the population

• Constraints and uncertainty handling

• Knowledge Management through Culture/Meme
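To make the "flight mechanism" concrete, here is a minimal sketch of one MOPSO-style update (the parameter values and the random archive-leader choice are our own illustrative assumptions, not the talk's specific algorithm):

```python
import numpy as np

def mopso_flight(x, v, pbest, archive, w=0.4, c1=1.5, c2=1.5, lo=0.0, hi=1.0):
    """One velocity/position update: the particle is pulled toward its
    personal best and a leader drawn at random from the external archive
    of non-dominated solutions."""
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    leader = archive[np.random.randint(len(archive))]
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (leader - x)
    x_new = np.clip(x + v_new, lo, hi)   # respect variable bounds
    return x_new, v_new
```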
Performance Metrics

•  To quantify the performance of evolutionary multiobjective
   algorithms according to two essential aspects dictated by Pareto
   optimality:

   o  Convergence measure
   o  Diversity measure
Current Practice

•  In the literature, when an MOEA is proposed, a number of benchmark
   problems are chosen to quantify its performance, and, based on a
   set of heuristically chosen performance metrics, the proposed MOEA
   and some competitive representatives are evaluated statistically
   over a large number of independent trials.
•  The conclusion, if any is drawn, is often indecisive and reveals no
   additional insight into the specific problem characteristics for
   which the proposed MOEA would do best.
•  By the No Free Lunch theorem, any algorithm's elevated performance
   over one class of problems is exactly paid for in loss over another
   class.

•  Our goal is to rank the MOEAs considered based on a more
   comprehensive measure (a hybrid performance metric), revealing the
   specific problem characteristics for which the underlying MOEA
   performs best.
Case Study

•  Five state-of-the-art MOEAs
   –  SPEA 2, NSGA-II, PESA-II, IBEA, and MOEA/D
•  Nine benchmark problems
   –  2-objective ZDT1, ZDT2, ZDT3, ZDT4, ZDT6
   –  3-objective DTLZ2, 5-objective WFG1 and WFG2, and
   –  10-objective DTLZ1
•  Five performance metrics
   –  Inverted Generational Distance (IGD; sketched below),
   –  Pareto Dominance Indicator (NR),
   –  Maximum Spread (MS),
   –  Spacing, and
   –  Hypervolume Indicator (the S-metric)
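As one example, IGD can be sketched in a few lines (our own illustration: it averages, over a sampled reference front, the distance to the nearest point of the approximation front, so lower is better):

```python
import numpy as np

def igd(reference_front, approx_front):
    """Inverted Generational Distance: mean distance from each reference
    Pareto-front point to its nearest approximation-front point."""
    ref = np.asarray(reference_front)            # shape (m, n_obj)
    app = np.asarray(approx_front)               # shape (k, n_obj)
    dists = np.linalg.norm(ref[:, None, :] - app[None, :, :], axis=2)
    return float(dists.min(axis=1).mean())
```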
Performance Metrics Ensemble

•  From the same initial population, each of the five MOEAs generates
   a non-dominated front for a given benchmark function with specific
   problem characteristics.
•  A randomly chosen performance metric is used to identify the
   winning non-dominated front and its associated MOEA.
•  This process is repeated 50 times to gain meaningful statistics,
   as sketched after this list.
•  These 50 non-dominated fronts can therefore come from any of the
   five MOEAs, and each of the five performance metrics may be used
   multiple times.
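A minimal sketch of this seeding step (identifiers are our own; we assume each metric is normalized so that a larger score is better):

```python
import random

def seed_tournament(fronts_by_alg, metrics, runs=50):
    """fronts_by_alg: {algorithm: [front per run]}. For each run, a randomly
    drawn metric scores every algorithm's front; the best front (and its
    algorithm) enters the double-elimination tournament."""
    winners = []
    for run in range(runs):
        metric = random.choice(metrics)
        best = max(fronts_by_alg, key=lambda alg: metric(fronts_by_alg[alg][run]))
        winners.append((best, fronts_by_alg[best][run]))
    return winners   # 50 (algorithm, front) pairs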
ZDT1

This generates 50 non-dominated fronts as the initial population of
the double-elimination tournament selection:

Winning fronts per MOEA:
  SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
    19       11        3       5        12

Times each metric was drawn:
   IGD       NR     Spacing  S-metric    MS
    11       10       12        10        7
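For reference, ZDT1 itself is the standard two-objective benchmark (this is the well-known definition, not code from the talk); decision variables lie in [0, 1]^n and the true Pareto front satisfies f2 = 1 − √f1:

```python
import numpy as np

def zdt1(x):
    """Standard 2-objective ZDT1 benchmark, x in [0, 1]^n."""
    x = np.asarray(x, dtype=float)
    f1 = x[0]
    g = 1.0 + 9.0 * x[1:].sum() / (len(x) - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])
```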
Flow Chart

[Flow chart] Input: the MOEAs and a specific benchmark problem,
yielding 50 approximation fronts. Loop: run double elimination to
obtain the best front; identify the winner algorithm and assign its
rank value; eliminate all fronts from the winner algorithm; if the
number of remaining fronts is not 0, repeat. Output: rank value of
all MOEAs.
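In code, the flow chart's outer loop might look like this sketch (the `double_elimination` helper is hypothetical, standing in for the tournament detailed on the next slides):

```python
def rank_moeas(fronts, double_elimination):
    """fronts: list of (algorithm, front) pairs. Repeatedly run the
    double-elimination tournament, rank the winning algorithm, drop all
    of its fronts, and continue until no fronts remain."""
    ranks, next_rank = {}, 1
    while fronts:
        winner_alg = double_elimination(fronts)
        ranks[winner_alg] = next_rank
        next_rank += 1
        fronts = [(a, f) for a, f in fronts if a != winner_alg]
    return ranks
```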
Double-Elimination Tournament

[Bracket diagram] The 50 winners from the 50 runs are split into a
winner bracket (25) and a loser bracket (25). Each bracket plays down
to 13 winners and 13 losers; the winner bracket's losers then meet
the loser bracket's winners, producing 13 winners and 13 losers. The
winner bracket's winners are reserved as the next round's winner
bracket, the cross-match winners as the next round's loser bracket,
and the remaining losers are eliminated.
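One round of the bracket can be sketched as follows (our own helper names; byes for odd-sized brackets are omitted for brevity):

```python
import random

def play_round(winner_bracket, loser_bracket, metrics):
    """Entries are (algorithm, front) pairs. Winner-bracket losers get a
    second life against loser-bracket winners; losing twice eliminates."""
    def duel(a, b):
        metric = random.choice(metrics)            # random metric per match
        return (a, b) if metric(a[1]) >= metric(b[1]) else (b, a)

    wb_win, wb_lose, lb_win = [], [], []
    for a, b in zip(winner_bracket[::2], winner_bracket[1::2]):
        w, l = duel(a, b); wb_win.append(w); wb_lose.append(l)
    for a, b in zip(loser_bracket[::2], loser_bracket[1::2]):
        w, _ = duel(a, b); lb_win.append(w)        # loser-bracket losers are out
    next_lb = [duel(a, b)[0] for a, b in zip(wb_lose, lb_win)]
    return wb_win, next_lb                         # next round's two brackets
```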
Round 1

•  The 50 fronts compete, down-selecting to 26 fronts (13 in the
   winner bracket and 13 in the loser bracket) through
   25 + 12 + 12 + 13 = 62 binary tournaments:

   SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
      9        8        0       1         8

    IGD       NR     Spacing  S-metric    MS
     13       13       11        13       12
Round 2

[Bracket diagram] The winner bracket (13) and loser bracket (13) each
play down to 7 winners and 7 losers; the winner bracket's losers meet
the loser bracket's winners. Cross-match winners are reserved as the
next round's loser bracket; the losers are eliminated.
•  The 26 fronts compete, down-selecting to 14 fronts (7 in the winner
   bracket and 7 in the loser bracket) through 6 + 6 + 7 = 19 binary
   tournaments:

   SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
      6        2        0       1         5

    IGD       NR     Spacing  S-metric    MS
      5        4        5        2         3
Round 3

[Bracket diagram] The winner bracket (7) and loser bracket (7) each
play down to 4 winners and 4 losers; the winner bracket's losers meet
the loser bracket's winners. Cross-match winners are reserved as the
next round's loser bracket; the losers are eliminated.
•  The 14 fronts compete, down-selecting to 8 fronts (4 in the winner
   bracket and 4 in the loser bracket) through 3 + 3 + 4 = 10 binary
   tournaments:

   SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
      3        2        0       0         3

    IGD       NR     Spacing  S-metric    MS
      3        1        3        3         0
Round 4

[Bracket diagram] The winner bracket (4) and loser bracket (4) each
play down to 2 winners and 2 losers; the winner bracket's losers meet
the loser bracket's winners. Cross-match winners are reserved as the
next round's loser bracket; the losers are eliminated.
•  The 8 fronts compete, down-selecting to 4 fronts (2 in the winner
   bracket and 2 in the loser bracket) through 2 + 2 + 2 = 6 binary
   tournaments:

   SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
      2        0        0       0         2

    IGD       NR     Spacing  S-metric    MS
      1        0        2       1         2
Round 5

[Bracket diagram] The winner bracket (2) and loser bracket (2) each
play down to 1 winner and 1 loser; the winner bracket's loser meets
the loser bracket's winner. The cross-match winner is reserved as the
next round's loser bracket; the loser is eliminated.
•  The 4 fronts compete, down-selecting to 2 fronts (1 in the winner
   bracket and 1 in the loser bracket) through 1 + 1 + 1 = 3 binary
   tournaments:

   SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
      1        0        0       0         1

    IGD       NR     Spacing  S-metric    MS
      1        0        0       1         1
Round 6


   Winner Bracket (1)                     Loser Bracket (1)




                        1 Final Winners
•  In the final, the 2 remaining fronts compete to determine the
   final winner.
•  About 152 binary tournaments were held to decide a final winner.

   SPEA 2   NSGA-II   IBEA   PESA-II   MOEA/D
      1        0        0       0         0

    IGD       NR     Spacing  S-metric    MS
      0        0        0       1         0

•  Removing the 18 fronts generated by SPEA 2, the remaining 32
   fronts go through the process again…
Final Ranking
•  35 repeated, independent experiments were performed for each
   function, and the findings have been consistent

Ranking   2-obj     2-obj     2-obj     2-obj     2-obj     3-obj     5-obj     5-obj     10-obj
Order     ZDT1      ZDT2      ZDT3      ZDT4      ZDT6      DTLZ2     WFG1      WFG2      DTLZ1
1         SPEA 2    SPEA 2    NSGA-II   MOEA/D    MOEA/D    IBEA      IBEA      IBEA      IBEA
2         MOEA/D    MOEA/D    MOEA/D    NSGA-II   IBEA      MOEA/D    MOEA/D    MOEA/D    NSGA-II
3         NSGA-II   NSGA-II   IBEA      PESA-II   NSGA-II   SPEA 2    SPEA 2    NSGA-II   MOEA/D
4         PESA-II   IBEA      SPEA 2    IBEA      SPEA 2    NSGA-II   NSGA-II   SPEA 2    SPEA 2
5         IBEA      PESA-II   PESA-II   SPEA 2    PESA-II   PESA-II   PESA-II   PESA-II   PESA-II
Observations on SPEA2

•  SPEA 2 is the final winner on problems ZDT1 and ZDT2.
•  ZDT1 and ZDT2 have no local Pareto-optimal fronts, and their
   global Pareto-optimal fronts are continuous.
•  IBEA and PESA-II dropped out of the competition in the first round.
•  SPEA2, MOEA/D, and NSGA-II competed fiercely until round 4.
•  SPEA 2 should perform well on problems with continuous
   Pareto-optimal fronts and no local Pareto-optimal fronts.
•  In ZDT1, SPEA 2 is the final winner: it wins under the other four
   metrics but is inferior to NSGA-II under the S-metric.
•  In ZDT2, SPEA 2 is the final winner: it wins under the other four
   metrics but is slightly worse than NSGA-II under the Spacing metric.
•  In ZDT3, NSGA-II is the final winner: it wins under the other four
   metrics but is inferior to MOEA/D under the S-metric.
•  In ZDT4, MOEA/D is the final winner: it wins under the other four
   metrics but is slightly worse than NSGA-II under the NR metric.
•  In ZDT6, MOEA/D is the final winner, but it is inferior to IBEA
   under the MS metric and slightly worse than NSGA-II under the
   Spacing metric.
•  In DTLZ2, IBEA is the final winner: it wins under the other four
   metrics but is inferior to MOEA/D under the Spacing metric.
Observations on NSGA-II

•  NSGA-II has the best performance on ZDT3.
•  ZDT3 exhibits discreteness: its Pareto-optimal front consists of
   several noncontiguous convex parts.
•  MOEA/D is comparable in performance.
•  NSGA-II should perform well on problems whose Pareto-optimal front
   consists of several noncontiguous convex parts.
Observations on MOEA/D

•  MOEA/D wins on both ZDT4 and ZDT6.
•  ZDT4 has many local Pareto-optimal fronts, forcing EAs to
   demonstrate their ability to deal with multi-modality.
•  ZDT6's Pareto-optimal solutions are non-uniformly distributed.
•  On ZDT4, SPEA2 was eliminated at an early stage of the
   competition. On ZDT6, SPEA2 and PESA-II were eliminated very early.
•  MOEA/D should perform well on problems with many local
   Pareto-optimal fronts, or whose Pareto-optimal solutions are not
   uniformly distributed along the global Pareto front.
Observations on IBEA

•  IBEA wins outright on DTLZ2, WFG1, WFG2, and DTLZ1, the test
   problems having more than two objectives.
•  Many credible publications support this ranking for
   higher-dimensional benchmark problems.

•  We can tentatively conclude that IBEA performs better than the
   others on test problems with high-dimensional objective spaces.
Overall Findings

•  The double-elimination design allows a quality algorithm whose
   performance is poor under one specific setting to still survive
   the competition and win it all.
•  It gives every individual two chances to take part in the
   competition, which helps preserve good individuals, especially
   under such special conditions.
Remarks
•  Knowing that no single metric alone can faithfully quantify the
   performance of a given MOEA under real-world scenarios, this study
   is intended to reveal insight into the specific problem
   characteristics for which the underlying MOEA performs best.

•  For a given real-world problem, if we know its problem
   characteristics (e.g., a Pareto front with a number of disconnected
   segments and a high number of local optima), we can make an
   educated judgment and choose the specific MOEA with superior
   performance for those characteristics.
Grand Challenges in EMO

•  Groundbreaking applications with smashing success
•  Toward many-objective optimization under constraints and
   uncertainties
•  Universal fundamentals in all algorithm formulations
•  Publicity in the interdisciplinary world
•  Education for the next generations
Q&A
