This is a friendly approach to particle filters: some hints, examples, and good practices to successfully apply particle filters to solve your computer vision problems.
3. Motivation. For what? To obtain estimates of a recursive/dynamic system. Let's stay in computer vision applications. [Figure: a bounding box with width W, height H and corner (x0, y0).] Marcos Nieto, PhD - mnieto@vicomtech.org 2/2/2011
5. Motivation. How? Define your target. Define your functions. Select a type of filter adapted to 1) and 2). Implement and run. Optionally: write your paper and share :)
6. Bayesian filtering. Target: xk. It evolves through time according to some dynamics, properties, interactions, etc. [Figure: the bounding box (x0, y0, W, H) at two time instants.] Prior / dynamics / transition model: p(xk|xk-1)
8. Bayesian filtering. Posterior distribution: p(xk|z1:k), a probability density function. This is all you can expect to know. Typically we want a point-estimate of this distribution: at each time instant (x*k), or at the end.
9. Bayesian filtering. Hidden Markov structure over time k-1, k, k+1: states xk-1, xk, xk+1 (hidden) and measurements zk-1, zk, zk+1 (visible). Observation model: p(zk|xk). Dynamic model: p(xk|xk-1).
10. Bayesian filtering. How? Prediction: use the dynamics p(xk|xk-1) to guess the future. Correction: obtain a new observation and apply Bayes' rule: posterior ∝ likelihood × prediction.
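The two steps above form the standard Bayesian filtering recursion, written here for completeness (xk is the state, z1:k the measurements up to time k):

```latex
\text{Prediction:}\quad p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{1:k-1})\, dx_{k-1}
```

```latex
\text{Correction:}\quad p(x_k \mid z_{1:k}) \propto p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})
```

The integral in the prediction step is what particle filters approximate with samples, since it rarely has a closed form outside the linear-Gaussian case.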
13. Particle filters. How to sample? Importance sampling, Markov chain Monte Carlo, Gibbs sampling, slice sampling, ... How many samples? As many as required to track the posterior!
16. SIR – example (I). Single object tracking.
17. SIR – example (I). Linear-Gaussian dynamics: generate N samples starting from the previous state, adding the estimated velocity and some Gaussian noise. The noise makes the samples different!
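A minimal sketch of this prediction step with NumPy; the state layout (x0, y0, W, H), the velocity estimate, and the noise level are all illustrative choices, not part of the original slide:

```python
import numpy as np

def predict(particles, velocity, noise_std, rng):
    """Propagate each particle with the estimated velocity plus Gaussian noise.

    particles: (N, D) array of state hypotheses, e.g. D = 4 for (x0, y0, W, H).
    velocity:  (D,) estimated displacement per frame (hypothetical; e.g. the
               difference of the last two point-estimates).
    """
    noise = rng.normal(0.0, noise_std, size=particles.shape)
    return particles + velocity + noise  # the noise makes the samples differ

rng = np.random.default_rng(0)
# N = 500 particles, all starting from the same previous state
particles = np.tile([100.0, 50.0, 40.0, 60.0], (500, 1))
predicted = predict(particles, velocity=np.array([2.0, 0.0, 0.0, 0.0]),
                    noise_std=3.0, rng=rng)
```

After this step the particles form a Gaussian cloud around the linearly predicted state, ready to be weighted by the likelihood.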
18. SIR – example (I). Likelihood based on segmentation or a color histogram: evaluate each predicted sample according to this value. The likelihood function should return high values for "good" hypotheses and low values for "bad" ones.
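A sketch of the weighting and resampling that complete one SIR cycle. It assumes each particle has already been scored with a scalar distance (e.g. a Bhattacharyya distance between the candidate's color histogram and a reference); the Gaussian mapping and sigma are illustrative:

```python
import numpy as np

def likelihood(distances, sigma=0.2):
    """Map a per-particle distance to a weight: small distance -> high
    likelihood ("good" hypothesis), large distance -> low likelihood."""
    return np.exp(-0.5 * (distances / sigma) ** 2)

def systematic_resample(weights, rng):
    """Draw N particle indices with probability proportional to the weights."""
    N = len(weights)
    positions = (rng.random() + np.arange(N)) / N
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(1)
distances = np.array([0.05, 0.9, 0.1, 0.8])  # hypothetical histogram distances
w = likelihood(distances)
w /= w.sum()                                 # normalize to a discrete pdf
idx = systematic_resample(w, rng)            # good hypotheses get replicated
```

Resampling concentrates the particle set on the well-scoring hypotheses, which is what keeps SIR from wasting samples on dead regions of the state space.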
19. SIR – example (II). Eye tracking: linear prediction won't work, since the projection of the eye movement on the screen is difficult to predict. Define a combination of linear-Gaussian + uniform.
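One way to realize such a combined proposal is a mixture: with some probability propagate linearly with Gaussian noise, otherwise draw uniformly over the screen to catch abrupt gaze jumps. The mixture weight, noise level, and screen size below are all assumptions for illustration:

```python
import numpy as np

def mixture_propose(particles, velocity, noise_std, screen_w, screen_h,
                    alpha, rng):
    """Mixture proposal: with probability alpha, a linear-Gaussian prediction;
    otherwise a uniform draw over the screen."""
    N = len(particles)
    gaussian = particles + velocity + rng.normal(0.0, noise_std, (N, 2))
    uniform = rng.uniform([0.0, 0.0], [screen_w, screen_h], (N, 2))
    use_gauss = rng.random(N) < alpha
    return np.where(use_gauss[:, None], gaussian, uniform)

rng = np.random.default_rng(2)
parts = np.full((1000, 2), [640.0, 360.0])   # all particles at screen center
prop = mixture_propose(parts, np.array([5.0, 0.0]), 10.0, 1280, 720, 0.8, rng)
```

The uniform component guarantees that even a sudden saccade lands near some particle, at the cost of spending a fraction of the samples on exploration.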
23. SIR. Problems: the required number of samples increases exponentially with the problem dimension. Several objects/elements? Define a multimodal posterior and generate multiple point-estimates, cluster particles, or increase the state vector dimension. Variable number of objects? Add an external handler, or include the number of objects as another variable to estimate.
24. Particle filters. MCMC is more flexible: the problem of dimension is softened, and you sample directly from the posterior. Researchers are focusing on MCMC: many excellent works propose solutions to multiple objects, interaction, entering/exiting, reduction of the number of samples, etc.
25. Particle filters. MCMC: generate a Markov chain of samples directly from the posterior.
26. MCMC. Metropolis-Hastings: start somehow; propose a movement; accept with probability equal to the ratio between the proposed value and the previous one (prob. = 1 if the proposed sample is better than the previous one, prob. = ratio if not). Metropolis-Hastings allows obtaining samples from an arbitrary distribution by building a chain which accepts or rejects movements.
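The accept/reject rule above can be sketched in a few lines. This is a generic random-walk Metropolis-Hastings sampler, not the slide's specific tracker; the target density and step size are placeholders:

```python
import numpy as np

def metropolis_hastings(target, x0, n_samples, step, rng):
    """Sample from an (unnormalized) target density with a symmetric
    Gaussian random-walk proposal.  Accept with probability
    min(1, target(proposed) / target(current)): always accept if the
    proposed sample is better, accept with the ratio otherwise."""
    chain = [x0]
    x = x0
    for _ in range(n_samples - 1):
        proposed = x + rng.normal(0.0, step)
        ratio = target(proposed) / target(x)
        if rng.random() < ratio:     # implements min(1, ratio) in one line
            x = proposed
        chain.append(x)              # a rejected move repeats the current sample
    return np.array(chain)

# Example: sample a 1-D Gaussian centered at 3 (normalization not needed).
rng = np.random.default_rng(3)
chain = metropolis_hastings(lambda x: np.exp(-0.5 * (x - 3.0) ** 2),
                            x0=0.0, n_samples=5000, step=1.0, rng=rng)
```

Note that only density ratios are used, which is why the posterior never needs to be normalized.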
27. MCMC. Multiple objects: each sample is a hypothesis of the state of all objects, so the state vector includes all the dimensions of all objects. Metropolis-Hastings: generate a chain of N samples; for each sample, use the information of all the samples at the previous time instant. After the chain is completed, we have the sample-based approximation of the posterior.
28. MCMC. Marginalized proposal moves: propose the movement of a single dimension at each new sample (e.g. don't propose a move in all dimensions for all objects); choose a dimension randomly and update it. Burn-in period: stop when the stationary distribution is reached, or when the maximum number of samples is reached.
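A minimal sketch of such a marginalized move, assuming the joint multi-object state is stored as an (n_objects, D) array; the layout and noise level are illustrative:

```python
import numpy as np

def marginalized_move(state, noise_std, rng):
    """Marginalized proposal: perturb a single randomly chosen dimension of
    the joint multi-object state instead of moving every dimension at once.

    state: (n_objects, D) joint state, e.g. D = 4 for (x, y, W, H)."""
    proposed = state.copy()
    obj = rng.integers(state.shape[0])   # choose an object at random
    dim = rng.integers(state.shape[1])   # ... and one of its dimensions
    proposed[obj, dim] += rng.normal(0.0, noise_std)
    return proposed

rng = np.random.default_rng(4)
state = np.zeros((3, 4))                 # three objects, 4-D state each
prop = marginalized_move(state, noise_std=1.0, rng=rng)
```

Because each move touches one dimension, the acceptance ratio stays informative even when the joint state is high-dimensional.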
30. MCMC. Variable number of objects: add an external detector and modify the state size, or use Reversible-Jump MCMC: define an Enter move (creates an object), an Exit move (removes an object), and an Update move (updates existing objects).
31. Discussion. What should I use? SIR? MCMC? Kalman?
32. Discussion. If the dynamics and the observation are linear with Gaussian noise, use Kalman: it is the optimal solution. If not, consider using a particle filter.
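For reference, one predict + correct cycle of the Kalman filter, which is what the particle filter generalizes; the constant-velocity model and all matrix values below are a toy illustration:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One Kalman predict + correct cycle: optimal when dynamics and
    observation are linear with Gaussian noise."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Correct
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# 1-D position + velocity toy example (illustrative matrices)
F = np.array([[1.0, 1.0], [0.0, 1.0]])  # constant-velocity dynamics
H = np.array([[1.0, 0.0]])              # we observe the position only
Q = 0.01 * np.eye(2)
R = np.array([[1.0]])
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, z=np.array([1.0]), F=F, H=H, Q=Q, R=R)
```

The same predict/correct structure reappears in the particle filter, with the closed-form Gaussian update replaced by sampling and weighting.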
33. Discussion. SIR: use it if the target dimension is low (3-5), or if you plan to parallelize processing (remember, particles are independent of one another). It would require important design decisions for managing multiple objects or a variable number of objects.
34. Discussion. MCMC: use it as the dimension increases. It cannot be parallelized (remember that particles form a chain, and each one depends on the previous one). It is well adapted to multiple objects: MRF interaction is easy to insert, and Metropolis-Hastings can be efficiently adapted to multiple objects.
35. Summary. Define your target. Determine its dynamics. Define the likelihood. Select a filter that adapts to the problem. Implement it. Run it, carefully selecting the appropriate parameters of your functions, number of particles, etc.