Particle Swarm Optimization (PSO) Algorithm and Its Application in Engineering Design Optimization

By

Sushanta Kumar Mandal
Research Scholar

School of Information Technology
Indian Institute of Technology Kharagpur
September 9, 2005
Outline

• Introduction to Optimization
• Optimization Procedure
• Different Optimization Algorithms
• Different Global Optimization Algorithms
• Particle Swarm Optimization (PSO) Algorithm
• Application of PSO in Design Optimization Problems
Optimization

As ageless as time

Calculus

The maximum and minimum of a smooth function are reached at a stationary point, where its gradient vanishes.
Optimization is Everywhere

• The more we know about something, the more we see where optimization can be applied.
• Some personal decision making:
  - Finding the fastest route home or to class
  - Optimal allocation of time for homework
  - Optimal budgeting
Goal of Optimization

Find values of the variables that minimize or maximize the objective function while satisfying the constraints.
Components of an Optimization Problem

Objective Function: the function that we want to minimize or maximize.
• For example, in a manufacturing process, we might want to maximize the profit or minimize the cost.
• In fitting experimental data to a user-defined model, we might minimize the total deviation of the observed data from the predictions of the model.
• In designing an inductor, we might want to maximize the quality factor and minimize the area.
Components of an Optimization Problem

Design Variables: a set of unknowns or variables that affect the value of the objective function.
• In the manufacturing problem, the variables might include the amounts of different resources used or the time spent on each activity.
• In the data-fitting problem, the unknowns are the parameters that define the model.
• In the inductor design problem, the variables define the layout geometry of the panel.
Components of an Optimization Problem

Constraints: a set of conditions that allow the unknowns to take on certain values but exclude others.
• For the manufacturing problem, it does not make sense to spend a negative amount of time on any activity, so we constrain all the "time" variables to be non-negative.
• In the inductor design problem, we would probably want to limit the upper and lower values of the layout parameters and to target an inductance value within the tolerance level.
Are All These Ingredients Necessary?

• Almost all optimization problems have an objective function.
• No objective function. In some cases (for example, design of integrated circuit layouts), the goal is to find a set of variables that satisfies the constraints of the model. The user does not particularly want to optimize anything, so there is no reason to define an objective function. This type of problem is usually called a feasibility problem.
Are All These Ingredients Necessary?

• Variables are essential. If there are no variables, we cannot define the objective function and the problem constraints.
• Constraints are not essential. In fact, the field of unconstrained optimization is a large and important one for which many algorithms and software packages are available. It has been argued, however, that almost all problems really do have constraints.
What We Need for Optimization

• Models: Modeling is the process of identifying the objective function, variables and constraints. The goal of a model is insight, not numbers. A good mathematical model of the optimization problem is needed.
• Algorithms: Typically, an interesting model is too complicated to solve with paper and pencil, so an effective and reliable numerical algorithm is needed. There is no universal optimization algorithm. An algorithm should have robustness (good performance on a wide class of problems), efficiency (not too much computer time) and accuracy (the ability to identify and control error).
Flowchart of Optimal Design Procedure

1. Need for optimization
2. Choose design variables
3. Formulate constraints
4. Formulate objective function
5. Set up variable bounds
6. Select an optimization algorithm
7. Obtain solution(s)
Mathematical Formulation of Optimization Problems

Minimize the objective function

  min f(x),   x = (x1, x2, ..., xn)

subject to constraints

  ci(x) ≥ 0   (inequality constraints)
  ci(x) = 0   (equality constraints)

Example:

  min [(x1 − 2)² + (x2 − 1)²]
  subject to:  x1 − x2 ≤ 0
               x1² + x2² ≤ 2
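As a concrete illustration, the small example above can be written down directly in Python. The functions below simply transcribe the formulation as reconstructed here (the feasibility-check helper and the test point are my own additions, not part of the slides):

def objective(x1, x2):
    """Objective of the example problem: (x1 - 2)^2 + (x2 - 1)^2."""
    return (x1 - 2) ** 2 + (x2 - 1) ** 2

def constraints(x1, x2):
    """Inequality constraints of the example, each written as g(x) <= 0."""
    return [x1 - x2,                 # x1 - x2 <= 0
            x1 ** 2 + x2 ** 2 - 2]   # x1^2 + x2^2 <= 2

def is_feasible(x1, x2):
    return all(g <= 0 for g in constraints(x1, x2))

print(objective(1.0, 1.0), is_feasible(1.0, 1.0))   # the point (1, 1) is feasible with objective value 1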
Constraints

• Inequality constraints: x1² − x2² ≥ 0
• Equality constraints: x1 = 2
Variable Bounds

• Maximum and minimum bounds on each design variable.
• Without variable bounds the constraints completely surround the feasible region.
• Variable bounds are used to confine the search algorithm within these bounds.
• Ex: xi(L) ≤ xi ≤ xi(U)
Classification of Optimization Methods

• Single variable
• Multi-variable
• Constrained
• Non-constrained
• Single objective
• Multi-objective
• Linear
• Non-linear
Classifications of Optimization Methods
Local and Global Optimizers

• A local minimizer, xB*, of the region B, is defined so that f(xB*) ≤ f(x) ∀ x ∈ B.
• Ex: gradient-based search methods, Newton-Raphson algorithms, Steepest Descent, Conjugate-Gradient algorithms, the Levenberg-Marquardt algorithm, etc.
• Shortcomings: 1) One requires an initial guess to start with. 2) Convergence to an optimal solution depends on the chosen initial guess. 3) Most algorithms tend to get stuck at a sub-optimal solution. 4) An algorithm efficient in solving one optimization problem may not be efficient in solving another. 5) They are useful only over a relatively narrow range.
Local and Global Optimizers

• The global optimizer, x*, is defined so that f(x*) ≤ f(x) ∀ x ∈ S, where S is the search space.
• Ex: the Simulated Annealing algorithm, Genetic Algorithms, Ant Colony Optimization, Geometric Programming, Particle Swarm Optimization, etc.
Local and Global Optimizers
Particle Swarm Optimization

• An evolutionary computational technique based on the movement and intelligence of swarms looking for the most fertile feeding location.
• It was developed in 1995 by James Kennedy and Russell Eberhart.
• A simple algorithm, easy to implement, with few parameters to adjust (mainly the velocity).
• A “swarm” is an apparently disorganized collection (population) of moving individuals that tend to cluster together, while each individual seems to be moving in a random direction.
Continued…

• It uses a number of agents (particles) that constitute a swarm moving around in the search space looking for the best solution.
• Each particle is treated as a point in a D-dimensional space which adjusts its “flying” according to its own flying experience as well as the flying experience of other particles.
• Each particle keeps track of its coordinates in the problem space which are associated with the best solution (fitness) that it has achieved so far. This value is called pbest.
Continued…

• Another best value that is tracked by the PSO is the best value obtained so far by any particle in the neighborhood of the particle. This value is called gbest.
• The PSO concept consists of changing the velocity of (or accelerating) each particle toward its pbest and the gbest position at each time step.
Continued…

Each particle tries to modify its current position and velocity according to the distance between its current position and pbest, and the distance between its current position and gbest.

  v[n+1] = v[n] + c1·rand1( )·(pbest[n] − CurrentPosition[n]) + c2·rand2( )·(gbest[n] − CurrentPosition[n])

  CurrentPosition[n+1] = CurrentPosition[n] + v[n+1]

where
  v[n+1]: velocity of the particle at the (n+1)th iteration
  v[n]: velocity of the particle at the nth iteration
  c1: acceleration factor related to pbest (cognitive component)
  c2: acceleration factor related to gbest (social component)
  rand1( ), rand2( ): random numbers between 0 and 1
  CurrentPosition[n]: position of the particle at the nth iteration
  CurrentPosition[n+1]: position of the particle at the (n+1)th iteration
  pbest[n]: best position found so far by the particle
  gbest[n]: best position found so far by the swarm (or by the particle's neighborhood)
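The two update equations translate almost line for line into code. The following is a minimal NumPy sketch for a single particle, assuming all quantities are stored as 1-D arrays; the choice c1 = c2 = 2 follows the gbest/lbest pseudocode a few slides below, and the fixed random seed is an arbitrary assumption of mine.

import numpy as np

rng = np.random.default_rng(0)

def pso_update(position, velocity, pbest, gbest, c1=2.0, c2=2.0):
    """One PSO step for a single particle (all arguments are 1-D NumPy arrays)."""
    r1 = rng.random(position.shape)                    # rand1( ): uniform in [0, 1)
    r2 = rng.random(position.shape)                    # rand2( ): uniform in [0, 1)
    new_velocity = (velocity
                    + c1 * r1 * (pbest - position)     # pull toward the particle's own best
                    + c2 * r2 * (gbest - position))    # pull toward the swarm's best
    new_position = position + new_velocity
    return new_position, new_velocity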
PSO Algorithm

For each particle
    Initialize particle with feasible random numbers
End

Do
    For each particle
        Calculate the fitness value
        If the fitness value is better than the best fitness value (pbest) in history
            Set current value as the new pbest
    End
    Choose the particle with the best fitness value of all the particles as the gbest
    For each particle
        Calculate particle velocity according to the velocity update equation
        Update particle position according to the position update equation
    End
While maximum iterations or minimum error criterion is not attained
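Putting the pseudocode together, a complete gbest-PSO loop can be sketched as below. This is only an illustrative implementation under stated assumptions: the sphere test objective, the swarm size, the velocity-clamp fraction of 0.2 and the random seed are mine, not values from the slides; the velocity clamp itself anticipates the "PSO Parameters: Velocity" slide and the [Vmin, Vmax] limit in the extraction flowchart.

import numpy as np

def pso(objective, bounds, n_particles=30, max_iter=200, c1=2.0, c2=2.0, tol=1e-6):
    """Minimize `objective` over the box `bounds` (list of (low, high) pairs) with gbest PSO."""
    rng = np.random.default_rng(42)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    vmax = 0.2 * (hi - lo)                               # velocity clamp per dimension (assumed fraction)

    pos = rng.uniform(lo, hi, size=(n_particles, dim))   # feasible random initial positions
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    g = pbest_val.argmin()
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]

    for _ in range(max_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        vel = np.clip(vel, -vmax, vmax)                  # limit velocity to [-vmax, vmax]
        pos = np.clip(pos + vel, lo, hi)                 # keep particles inside the variable bounds

        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val                      # update each particle's pbest
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        g = pbest_val.argmin()
        if pbest_val[g] < gbest_val:                     # update the swarm's gbest
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
        if gbest_val < tol:                              # minimum-error stopping criterion
            break
    return gbest, gbest_val

# Example: minimize the 2-D sphere function f(x) = x1^2 + x2^2 on [-5, 5]^2.
best_x, best_f = pso(lambda x: float(np.sum(x ** 2)), [(-5.0, 5.0), (-5.0, 5.0)])
print(best_x, best_f)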
gbest and lbest

• global version:
vx[ ][ ] = vx[ ][ ] + 2*rand( )*(pbestx[ ][ ] – presentx[ ][ ])
  + 2*rand( )*(pbestx[ ][gbest] – presentx[ ][ ])

• local version:
vx[ ][ ] = vx[ ][ ] + 2*rand( )*(pbestx[ ][ ] – presentx[ ][ ])
  + 2*rand( )*(pbestx[ ][lbest] – presentx[ ][ ])
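The only difference between the two versions is which neighborhood supplies the "best" term. For the ring topology shown on the next slide (neighborhood of 3), the lbest index could be picked as in this small sketch; the function name and the assumption that smaller fitness is better are mine, not from the slides.

def ring_lbest_index(pbest_fitness, i):
    """Index of the best pbest among particle i and its two ring neighbours (minimization)."""
    n = len(pbest_fitness)
    neighbours = [(i - 1) % n, i, (i + 1) % n]   # ring topology, neighbourhood of 3
    return min(neighbours, key=lambda j: pbest_fitness[j])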
Particle Swarm Optimization: Swarm Topology

• In PSO, there have been two basic topologies used in the literature:
  – Ring Topology (neighborhood of 3)
  – Star Topology (global neighborhood)

(Figure: ring and star topologies over five particles I0–I4.)
PSO Parameters: Velocity

• An important parameter in PSO; typically the only one adjusted
• Clamps particles' velocities on each dimension
• Determines the "fineness" with which regions are searched
• If set too high, particles can fly past optimal solutions
• If set too low, particles can get stuck in local minima
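A minimal sketch of this velocity clamp, applied per dimension, is given below; expressing Vmax as a fraction of each variable's range is a common heuristic and an assumption here, not a value taken from the slides.

import numpy as np

def clamp_velocity(velocity, lower_bound, upper_bound, fraction=0.2):
    """Clamp each velocity component to [-vmax, vmax], with vmax a fraction of the variable range."""
    vmax = fraction * (np.asarray(upper_bound) - np.asarray(lower_bound))
    return np.clip(velocity, -vmax, vmax)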
Flow Chart for Extraction Procedure using PSO

Input: number of particles, iteration count, error = 0.05

1. Initialize randomly: (1) the position of each particle, (2) the velocity of each particle.
2. For each particle:
   - Simulate the model parameters Y11, Y12, Y21, Y22.
   - Evaluate the fitness function against the measurement data.
   - If fitness(current position) > fitness(gbest), update gbest = current position.
   - If fitness(current position) > fitness(lbest), update lbest = current position.
   - Go to the next particle, until the particle index exceeds the maximum number.
3. Evaluate the fitness function based on gbest.
4. If the fitness error < the defined error, stop: the extracted parameters are obtained.
5. If the iteration count > the maximum count, stop.
6. Otherwise, update the velocity (limited to [Vmin, Vmax]) and the position (limited to [Pmin, Pmax]) of each particle, and go to the next iteration.
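In this flow, the fitness function measures how well the simulated Y-parameters (Y11, Y12, Y21, Y22) reproduce the measurement data. The slides do not spell the fitness out, so the sketch below shows only one plausible choice, a sum of squared errors; `simulate_y_parameters` is a hypothetical stand-in for the device model.

import numpy as np

def fitness(params, frequencies, measured_y, simulate_y_parameters):
    """Sum of squared errors between simulated and measured Y-parameters (Y11, Y12, Y21, Y22)."""
    error = 0.0
    for f, y_meas in zip(frequencies, measured_y):
        y_sim = simulate_y_parameters(params, f)   # hypothetical model: returns a 2x2 complex matrix
        error += np.sum(np.abs(y_sim - y_meas) ** 2)
    return error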
Comparison of Genetic Algorithm and PSO

Tested in a MATLAB program on a P4 1.7 GHz CPU with 256 MB RAM.
Number of particles / population size = 100; number of simulation runs = 10000.
GA crossover probability = 0.9; GA mutation probability = 0.01.

(Figure: target value of error minimization versus number of iterations (0–10000), for GA and PSO.)
Model Fitting
Inductor Optimization

maximize    Q(n, d, w, s)
subject to  (1 − tol)·Ltarget ≤ L(n, d, w, s)
            L(n, d, w, s) ≤ (1 + tol)·Ltarget
            nmin ≤ n ≤ nmax
            dmin ≤ d ≤ dmax
            wmin ≤ w ≤ dmin
            smin ≤ s ≤ smax
            d ≥ 2n(w + s) − 2s
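A common way to hand such a constrained design problem to PSO is to fold the constraints into the fitness with a penalty term. The sketch below only illustrates that idea: `quality_factor` and `inductance` are hypothetical stand-ins for Q(n, d, w, s) and L(n, d, w, s), and the penalty weight is an arbitrary assumption. The simple variable bounds (nmin ≤ n ≤ nmax and so on) would be enforced directly as position bounds inside the PSO loop, as in the earlier sketch.

def inductor_fitness(x, L_target, tol, quality_factor, inductance, penalty=1e3):
    """Penalty-based fitness for the maximize-Q inductor sizing problem; x = (n, d, w, s)."""
    n, d, w, s = x
    L_value = inductance(n, d, w, s)
    violations = [
        (1 - tol) * L_target - L_value,     # L(n, d, w, s) >= (1 - tol) * L_target
        L_value - (1 + tol) * L_target,     # L(n, d, w, s) <= (1 + tol) * L_target
        2 * n * (w + s) - 2 * s - d,        # d >= 2n(w + s) - 2s
    ]
    total_violation = sum(max(0.0, v) for v in violations)
    # PSO minimizes, so return negative Q plus a penalty for any constraint violation.
    return -quality_factor(n, d, w, s) + penalty * total_violation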
Constrained PSO Optimization
Thank You
