Piecewise Gaussian Process Modelling for
         Change-Point Detection
  Application to Atmospheric Dispersion Problems


                 Adrien Ickowicz

                      CMIS
                      CSIRO


                  February 2013
Background


     Scientific collaboration with University College London, UNSW
     and Université Lille 1:
          Atmospheric specialists;
          Informatics engineers;
          Statisticians.


     Input
          Concentration values of CBRN material at sensor locations;
          Wind field.


     Output
          Source location, time of release and strength, for fire-fighters;
          Quarantine map, for politicians and the MoD.
Statistical Modelling


  Observation modelling:

      Y_tj^obs(i) = D_tj^(i)(θ) + ζ_tj^i

  where

      D_tj^(i)(θ) = ∫_{Ω×T} C_θ(x, t) h(x, t | x_i, t_j) dx dt,      ζ_tj^i ∼ N(0, σ²)

  and C_θ is the solution of the PDE:

      ∂C/∂t + u·∇C − ∇·(K∇C) = Q(θ)
      s.t.  n·∇C = 0  at  ∂Ω

  Parameter of interest: θ ∈ (Ω × T)
Existing Techniques
Source term estimation


       The Optimization techniques.

           Gradient-based methods
             (Elbern et al [2000], Li and Niu [2005], Lushi and Stockie [2010])
           Pattern search methods
             (Zheng et al [2008])
           Genetic Algorithms
             (Haupt [2005], Allen et al [2009])



       The Bayesian techniques.

           Forward modelling and MCMC
             (Patwardhan and Small [1992])
           Backward (Adjoint) modelling and MCMC
             (Issartel et al [2002], Hourdin et al [2006], Yee [2010])
Contribution : Gaussian Process modelling
Overview


    We consider several observations of a stochastic process in space
    and time.

         Idea: Bayesian non-parametric estimation.
              Tool: Gaussian Processes (Rasmussen [2006])

        Joint distribution:  y ∼ GP(m(x), κ(x, x′))

        m ∈ L²(Ω × T, ℝ) is the prior mean function,
        and κ ∈ L²(Ω² × T², ℝ) is the prior covariance function¹

    Posterior distribution:
        L(y∗ | x∗, x, y) = N( κ(x∗, x) κ(x, x)⁻¹ y,  κ(x∗, x∗) − κ(x∗, x) κ(x, x)⁻¹ κ(x, x∗) )


     ¹ The associated matrix K should be positive semidefinite.
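As a sanity check of the posterior formulas above, here is a minimal numpy sketch; the squared-exponential kernel, hyper-parameter values and toy data are illustrative assumptions, not the slides' exact setup:

```python
import numpy as np

def sq_exp_kernel(x1, x2, alpha1=1.0, alpha2=1.0):
    # Isotropic squared-exponential kernel k(x, x') = a1 * exp(-(x - x')^2 / (2 a2))
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return alpha1 * np.exp(-d2 / (2.0 * alpha2))

def gp_posterior(x_train, y_train, x_star, noise=1e-2, alpha1=1.0, alpha2=1.0):
    # Posterior mean and covariance of a zero-mean GP given noisy observations
    K = sq_exp_kernel(x_train, x_train, alpha1, alpha2) + noise * np.eye(len(x_train))
    Ks = sq_exp_kernel(x_star, x_train, alpha1, alpha2)
    Kss = sq_exp_kernel(x_star, x_star, alpha1, alpha2)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, cov

x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
mu, Sigma = gp_posterior(x, y, np.array([0.5, 1.5]))
```

With near-zero noise, the posterior mean interpolates the training targets, which is a quick way to check the algebra.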
Contribution : Gaussian Process modelling
On the Kernel Specification


      A complex non-parametric model requires care in the choice of kernel
      shape and kernel hyper-parameters.

           Basic kernel (isotropic):  κ(x, x′) = α₁ exp( −(x − x′)² / (2α₂) )

                Hyper-parameters: α₁, α₂
 [Figure: three GP fits omitted]

 Figure: Prediction of 3 Gaussian Process models (and their corresponding 0.95 CIs) given 7
 noisy observations. On the left, α₂ = 0.1. In the middle, α₂ = 2. On the right,
 α₂ = 1000.
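The effect of α₂ in the figure can be reproduced numerically: a short length-scale leaves large posterior uncertainty between observations, a moderate one shrinks it. A minimal sketch, assuming an illustrative noisy-sine data set (seed and values are not from the slides):

```python
import numpy as np

def kernel(x1, x2, a1, a2):
    # Isotropic kernel k(x, x') = a1 * exp(-(x - x')^2 / (2 a2))
    return a1 * np.exp(-(x1[:, None] - x2[None, :]) ** 2 / (2.0 * a2))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 7)              # 7 noisy observation sites, as in the figure
y = np.sin(x) + 0.1 * rng.standard_normal(7)
xs = np.array([2.1])                      # prediction point between two observations

stds = {}
for a2 in (0.1, 2.0, 1000.0):             # the three length-scales compared in the figure
    K = kernel(x, x, 1.0, a2) + 0.01 * np.eye(7)
    ks = kernel(xs, x, 1.0, a2)
    var = kernel(xs, xs, 1.0, a2) - ks @ np.linalg.solve(K, ks.T)
    stds[a2] = float(np.sqrt(var))        # posterior std at xs for this length-scale
```

Comparing `stds[0.1]` with `stds[2.0]` shows the uncertainty collapse as the kernel correlates neighbouring points more strongly.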
Contribution : Gaussian Process modelling
Likelihood and Multiple Kernels


    The hyper-parameters are estimated through the marginal
    likelihood,

        log p(y|x) = −½ yᵀ(K + σ²Iₙ)⁻¹ y − ½ log|K + σ²Iₙ| − (n/2) log 2π


    What if the best-fitted kernel were

        κ(x, x′) = Σᵢ κᵢ(x, x′) 1_{(x,x′) ∈ Ωᵢ}


                                                 Figure: Synthetic two-phase signal.
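The marginal likelihood above is usually evaluated through a Cholesky factorisation rather than an explicit inverse and determinant; a minimal sketch with illustrative data:

```python
import numpy as np

def log_marginal_likelihood(x, y, alpha1, alpha2, sigma2):
    # log p(y|x) = -1/2 y^T (K + s^2 I)^-1 y - 1/2 log|K + s^2 I| - n/2 log(2*pi)
    n = len(x)
    d2 = (x[:, None] - x[None, :]) ** 2
    Ky = alpha1 * np.exp(-d2 / (2.0 * alpha2)) + sigma2 * np.eye(n)
    L = np.linalg.cholesky(Ky)                        # Ky = L L^T
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))   # a = Ky^{-1} y
    # log|Ky| = 2 * sum(log diag(L)), hence the -sum(log diag(L)) term below
    return float(-0.5 * y @ a
                 - np.sum(np.log(np.diag(L)))
                 - 0.5 * n * np.log(2.0 * np.pi))

x = np.linspace(0.0, 3.0, 5)
y = np.cos(x)
lml = log_marginal_likelihood(x, y, 1.0, 1.0, 0.1)
```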
Contribution : Gaussian Process modelling
Change-Point Estimation


    A. Parametric Estimation

    We assume that f is known and that there exist βᵢ such that

        (x, x′) ∈ Ωᵢ ⇔ f(x, x′, βᵢ) ≥ 0.

    Then θ = {(αᵢ, βᵢ)ᵢ}, and we have

        θ̂ = argmax_θ log p(y|x)


    Limitations:
        Knowledge of f
        Dimension of the parameter space
        Convexity of the marginal likelihood function
Contribution : Gaussian Process modelling
Change-Point Estimation


    B. Adaptive Estimation (1)

    Let X_{kNN∩Br}(i) be the sequence of observations associated with xᵢ,

        X_{kNN∩Br}(i) = { xⱼ | {xⱼ ∈ Bᵢʳ} ∩ {dⱼᵢ ≤ d₍ₖ₎ᵢ} }

    where
        k is the number of neighbours to be considered,
        r is the limiting radius.

    Justification:
        Avoids a lack of observations
        Gives an equivalent number of observations for each estimator
        Avoids hyper-parametrization of the likelihood
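The kNN-within-ball selection above can be sketched in a few lines; the points, k and r below are illustrative:

```python
import numpy as np

def knn_in_ball(points, i, k, r):
    # Indices of the k nearest neighbours of points[i] that also lie
    # inside the ball of radius r around points[i].
    d = np.linalg.norm(points - points[i], axis=1)
    order = np.argsort(d)                  # includes i itself, at distance 0
    nearest_k = set(order[:k])
    in_ball = set(np.flatnonzero(d <= r))
    return sorted(nearest_k & in_ball)

pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.2, 0.1], [3.0, 3.0], [0.05, 0.05]])
sel = knn_in_ball(pts, 0, k=3, r=0.5)
```

Intersecting the two sets is exactly what caps the sample at k observations while discarding neighbours that are too far away to be informative.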
Contribution : Gaussian Process modelling
Change-Point Estimation


    B. Adaptive Estimation (2)

    Let x_I = X_{kNN∩Br}(i) and y_I be the corresponding observations.

        α̂ᵢ = argmax_α log p(y_I | x_I)


    Idea 1:
        Cluster on the α̂ᵢ
        ... but what if dim(α̂ᵢ) ≥ 2?

    Idea 2:
        Build the Gram matrices Kᵢ = κ(x_I, α̂ᵢ)
        Let Λ^{xᵢ} = {λ₁^{xᵢ}, …, λₙ^{xᵢ}} be the eigenvalues of Kᵢ
        Cluster on μᵢ = max{Λ^{xᵢ}}
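Idea 2 can be sketched numerically: the largest eigenvalue of the local Gram matrix collapses the locally fitted kernel to one scalar, and it separates long and short length-scale regimes cleanly. The kernel form and the values below are illustrative assumptions:

```python
import numpy as np

def kernel(x1, x2, a1, a2):
    # Isotropic kernel k(x, x') = a1 * exp(-(x - x')^2 / (2 a2))
    return a1 * np.exp(-(x1[:, None] - x2[None, :]) ** 2 / (2.0 * a2))

def max_eigen_feature(x_local, a1, a2):
    # mu_i = max eigenvalue of the local Gram matrix K_i: the scalar
    # clustering feature, regardless of how many hyper-parameters were fitted.
    K = kernel(x_local, x_local, a1, a2)
    return float(np.max(np.linalg.eigvalsh(K)))

x_local = np.linspace(0.0, 1.0, 5)
mu_smooth = max_eigen_feature(x_local, 1.0, 10.0)    # long length-scale regime
mu_rough = max_eigen_feature(x_local, 1.0, 0.001)    # short length-scale regime
```

With a long length-scale the Gram matrix is nearly rank one and its top eigenvalue approaches n; with a short one the matrix is nearly the identity and the feature stays near 1, so the two regimes are easy to cluster.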
Contribution : Gaussian Process modelling
 Simulation Results




Figure:    Gaussian Process prediction with 1 classical isotropic kernel (green), 2 isotropic kernels with eigenvalue-based
change point estimation (yellow), hyper-parameter-based change point estimation (purple) and parametric estimation (blue).
Figure:    Mean of the Gaussian Process for the two-dimensional scenario. On the left, the mean is calculated with only one
kernel. On the right, the mean is calculated with two kernels.
Contribution : Gaussian Process modelling
Simulation Results

                                                         
[Figure]
Figure: Evolution of the root MSE (RMSE) of the change-point estimation as the number of
observations Ns increases from 20 to 100, in the 1D case. Methods shown: MMLE, JD, MEV.

Methods:
    MMLE, Parametric approach
    MEV, EigenValue approach
    JD, Est. approach

                  2D                2D-donut          3D
    JD            0.834 (0.0034)    0.763 (0.0015)    0.666 (0.0016)
    MEV           0.825 (0.0053)    0.817 (0.0021)    0.643 (0.0014)
    MMLE          0.858 (0.0025)    0.806 (0.0008)    0.666 (0.0002)

Table: The number of obs. is equal to 10^d, where d is the dimension of the problem. 1000
simulations are run. The variance is given in brackets.
Contribution : Gaussian Process modelling
Application to the Concentration Measurements


    We may consider the concentration measurements as observations
    of a stochastic process in space and time.

         Idea: apply the approach defined above to estimate t₀.


        Prior distribution:  C ∼ GP(m, κ)

        m ∈ L²(Ω × T, ℝ) is the prior mean function,
        and κ ∈ L²(Ω² × T², ℝ) is the prior covariance function²

    Posterior distribution:
        C | Y, m=0 ∼ GP( κ_{x∗x} κ_{xx}⁻¹ Y,  κ_{x∗x∗} − κ_{x∗x} κ_{xx}⁻¹ κ_{xx∗} )


     ² The associated matrix K should be positive semidefinite.
Contribution : Gaussian Process modelling
Kernel Specification

    Isotropic Kernel

        κ_iso(x, x′) = (1/α) exp( −‖x − x′‖² / β² )

    where α and β are hyper-parameters.

    Drift-dependent Kernel

        ẋ = u(x, t),   x(t₀) = x₀

    s_{x₀,t₀}(t) is the solution of this system.

        κ_dyn(x, x′) = (1/σ(t, t′)) exp( −d_s(x, x′) / (2σ(t, t′)²) )

    where we have:

        d_s(x, x′) = (x − s_{x′,t′}(t))² + (x′ − s_{x,t}(t′))²
        σ(t, t′)   = α × (|t₀ − min(t, t′)| + 1)^β


    Considers the influence of the wind field
    Considers the time-decreasing correlation
    Considers the evolution of the process
Contribution : Gaussian Process modelling
Two Stage estimation process: Instant of Release
                                                 
  The proposed kernel is then complex:

      κ_f = κ_iso 1_{t,t′ < t₀} + κ_dyn 1_{t,t′ ≥ t₀}

  The likelihood is not convex, so t₀ has to be estimated separately.

  Maximum Likelihood Estimation of Hyperparameters

  Method: exhaustive search over t₀, using the trace of the Gram matrix:

      t̂₀^tr = argmax_{t ∈ T} tr(K(t))
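The slides score each candidate t₀ with tr(K(t)); the toy sketch below keeps the exhaustive-grid idea but, to stay self-contained, scores each candidate with the piecewise marginal likelihood of a smooth-before / rough-after kernel instead. The signal, length-scales and grid are illustrative assumptions:

```python
import numpy as np

# Synthetic 1D signal whose smoothness changes at the true release time t0 = 6:
# slowly varying before, fast-oscillating after.
t = np.linspace(0.0, 10.0, 60)
y = np.where(t < 6.0, np.sin(t), np.sin(5.0 * t))

def seg_lml(tt, yy, ell, s2=0.01):
    # Gaussian log marginal likelihood of one segment under an SE kernel
    n = len(tt)
    if n == 0:
        return 0.0
    K = np.exp(-(tt[:, None] - tt[None, :]) ** 2 / (2.0 * ell ** 2)) + s2 * np.eye(n)
    return float(-0.5 * yy @ np.linalg.solve(K, yy)
                 - 0.5 * np.linalg.slogdet(K)[1]
                 - 0.5 * n * np.log(2.0 * np.pi))

def piecewise_lml(t0, ell_pre=2.0, ell_post=0.05):
    # Independent GPs before/after the candidate change point t0
    m = t < t0
    return seg_lml(t[m], y[m], ell_pre) + seg_lml(t[~m], y[~m], ell_post)

grid = np.linspace(1.0, 9.0, 33)          # candidate release times
t0_hat = grid[np.argmax([piecewise_lml(g) for g in grid])]
```

Misassigning a fast-oscillating point to the smooth pre-release kernel is heavily penalised, so the grid maximum lands at the true change point.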
Contribution : Gaussian Process modelling
Two Stage estimation process: Source location

  Given the time of release, we can compute the location estimate:

      x̂₀ = argmax_{x ∈ Ω} E[ C | Y, m=0 ](x, t̂₀)
          = argmax_{x ∈ Ω} κ̃_{x∗x} κ̃_{xx}⁻¹ Y

  where κ̃ = κ(·, t̂₀).

  Estimation of the source location. Comparison between the estimators (5, 20 and 50
  sensors). Target is x₀ = 115, y₀ = 10.

             sensors     x̂₀        ŷ₀       σ(x̂₀)    σ(ŷ₀)
    κ_iso       5        68.97     62.58    42.82    38.96
               20        97.13     26.37    27.64    26.08
               50       104.47     21.60    28.94    19.47
    κ_f         5       108.94     12.21    42.00    17.05
               20       120.28      8.28    12.50     4.64
               50       114.51      9.48     6.37     3.07
Contribution : Gaussian Process modelling
Zero-Inflated Poisson and Dirichlet Process³

  We can also consider the concentration as a count of particles:

      Y ∼ ZIP(p, λ)

      p ∼ DP(H, α)                          log λ ∼ GP(m, κ)


  which then defines the mixture distribution,

      Pr(Y = k | p, λ) = p_{xt} 1_{Y=0} + (1 − p_{xt}) Σₖ ( e^{−λ_{xt}} λ_{xt}^k / k! ) 1_{Y=k}


  Major issue: the tractability of the likelihood calculation relies on the distributions
  of both p and λ.



       ³ Joint work with Dr. G. Peters and Dr. I. Nevat
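The mixture probability above can be checked numerically; a minimal sketch of the ZIP pmf with illustrative parameters p = 0.3, λ = 2:

```python
import math

def zip_pmf(k, p, lam):
    # Zero-inflated Poisson: with probability p the count is a structural zero,
    # otherwise it is drawn from Poisson(lam).
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return p * (k == 0) + (1.0 - p) * poisson

# Probabilities over k = 0, 1, 2, ... sum to 1 (tail beyond 50 is negligible)
probs = [zip_pmf(k, 0.3, 2.0) for k in range(50)]
```

The zero probability exceeds the plain Poisson value e^(−λ), which is exactly the inflation the model is designed to capture.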
Contribution : Bibliography




      A. Ickowicz, F. Septier, P. Armand, Adaptive Algorithms for the
      Estimation of Source Term in a Complex Atmospheric Release.
      Submitted to Atmospheric Environment Journal

      A. Ickowicz, F. Septier, P. Armand, Estimating a CBRN atmospheric
      release in a complex environment using Gaussian Processes.
      15th international conference on information fusion, Singapore, Singapore,
      July 2012

      F. Septier, A. Ickowicz, P. Armand, Méthodes de Monte-Carlo adaptatives
      pour la caractérisation de termes de sources.
      Technical report, CEA, EOTP A-54300-05-07-AW-26, Mar. 2012

      A. Ickowicz, F. Septier, P. Armand, Statistic Estimation for Particle
      Clouds with Lagrangian Stochastic Algorithms.
      Technical report, CEA, EOTP A-24300-01-01-AW-20, Nov. 2011

Application of matrix algebra to multivariate data using standardize scoresAlexander Decker
 

Ähnlich wie YSC 2013 (20)

Assignment6
Assignment6Assignment6
Assignment6
 
Mapping Ash Tree Colonization in an Agricultural Moutain Landscape_ Investiga...
Mapping Ash Tree Colonization in an Agricultural Moutain Landscape_ Investiga...Mapping Ash Tree Colonization in an Agricultural Moutain Landscape_ Investiga...
Mapping Ash Tree Colonization in an Agricultural Moutain Landscape_ Investiga...
 
Gz3113501354
Gz3113501354Gz3113501354
Gz3113501354
 
Gz3113501354
Gz3113501354Gz3113501354
Gz3113501354
 
Gz3113501354
Gz3113501354Gz3113501354
Gz3113501354
 
Presentation cm2011
Presentation cm2011Presentation cm2011
Presentation cm2011
 
Presentation cm2011
Presentation cm2011Presentation cm2011
Presentation cm2011
 
2003 Ames.Models
2003 Ames.Models2003 Ames.Models
2003 Ames.Models
 
Numerical Linear Algebra for Data and Link Analysis.
Numerical Linear Algebra for Data and Link Analysis.Numerical Linear Algebra for Data and Link Analysis.
Numerical Linear Algebra for Data and Link Analysis.
 
On estimating the integrated co volatility using
On estimating the integrated co volatility usingOn estimating the integrated co volatility using
On estimating the integrated co volatility using
 
Holographic Cotton Tensor
Holographic Cotton TensorHolographic Cotton Tensor
Holographic Cotton Tensor
 
Social Network Analysis
Social Network AnalysisSocial Network Analysis
Social Network Analysis
 
ISI MSQE Entrance Question Paper (2010)
ISI MSQE Entrance Question Paper (2010)ISI MSQE Entrance Question Paper (2010)
ISI MSQE Entrance Question Paper (2010)
 
Nonlinear Stochastic Optimization by the Monte-Carlo Method
Nonlinear Stochastic Optimization by the Monte-Carlo MethodNonlinear Stochastic Optimization by the Monte-Carlo Method
Nonlinear Stochastic Optimization by the Monte-Carlo Method
 
11.generalized and subset integrated autoregressive moving average bilinear t...
11.generalized and subset integrated autoregressive moving average bilinear t...11.generalized and subset integrated autoregressive moving average bilinear t...
11.generalized and subset integrated autoregressive moving average bilinear t...
 
Cosmin Crucean: Perturbative QED on de Sitter Universe.
Cosmin Crucean: Perturbative QED on de Sitter Universe.Cosmin Crucean: Perturbative QED on de Sitter Universe.
Cosmin Crucean: Perturbative QED on de Sitter Universe.
 
Markov Tutorial CDC Shanghai 2009
Markov Tutorial CDC Shanghai 2009Markov Tutorial CDC Shanghai 2009
Markov Tutorial CDC Shanghai 2009
 
Pr1
Pr1Pr1
Pr1
 
Adaptive dynamic programming for control
Adaptive dynamic programming for controlAdaptive dynamic programming for control
Adaptive dynamic programming for control
 
Application of matrix algebra to multivariate data using standardize scores
Application of matrix algebra to multivariate data using standardize scoresApplication of matrix algebra to multivariate data using standardize scores
Application of matrix algebra to multivariate data using standardize scores
 

Kürzlich hochgeladen

Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...HostedbyConfluent
 
Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Paola De la Torre
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Alan Dix
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘RTylerCroy
 

Kürzlich hochgeladen (20)

Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slides
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101Salesforce Community Group Quito, Salesforce 101
Salesforce Community Group Quito, Salesforce 101
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘
 

YSC 2013

  • 1. Piecewise Gaussian Process Modelling for Change-Point Detection. Application to Atmospheric Dispersion Problems. Adrien Ickowicz, CMIS, CSIRO. February 2013.
  • 2. Background. Scientific collaboration with University College London, UNSW and Université Lille 1: atmospheric specialists, informatics engineers, statisticians. Input: concentration values of CBRN material at sensor locations; wind field. Output: source location, time of release and strength for fire-fighters; quarantine map for politicians and the MoD.
  • 3. Statistical Modelling. Observation model: Y_tj^obs(i) = D_tj^(i)(θ) + ζ_tj^i, where D_tj^(i)(θ) = ∫_{Ω×T} C_θ(x, t) h(x, t | x_i, t_j) dx dt and ζ_tj^i ∼ N(0, σ²). Here C_θ is the solution of the PDE ∂C/∂t + u·∇C − ∇·(K∇C) = Q(θ), subject to ∇C·n = 0 on ∂Ω. Parameter of interest: θ ∈ (Ω × T).
  • 4. Existing Techniques: source term estimation. Optimization techniques: gradient-based methods (Elbern et al. [2000], Li and Niu [2005], Lushi and Stockie [2010]); pattern search methods (Zheng et al. [2008]); genetic algorithms (Haupt [2005], Allen et al. [2009]). Bayesian techniques: forward modelling and MCMC (Patwardhan and Small [1992]); backward (adjoint) modelling and MCMC (Issartel et al. [2002], Hourdin et al. [2006], Yee [2010]).
  • 5. Contribution: Gaussian Process modelling. Overview: we consider several observations of a stochastic process in space and time. Idea: Bayesian non-parametric estimation. Tool: Gaussian Processes (Rasmussen [2006]). Joint distribution: y ∼ GP(m(x), κ(x, x′)), where m ∈ L²(Ω × T, ℝ) is the prior mean function and κ ∈ L²(Ω² × T², ℝ) is the prior covariance function (the associated matrix K should be positive semidefinite). Posterior distribution: L(y*|x*, x, y) = N(κ(x*, x)κ(x, x)⁻¹y, κ(x*, x*) − κ(x*, x)κ(x, x)⁻¹κ(x, x*)).
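The posterior formulas on this slide can be sketched numerically; a minimal NumPy sketch, assuming an isotropic squared-exponential kernel κ(x, x′) = α₁ exp(−(x − x′)²/(2α₂)) and i.i.d. Gaussian observation noise σ² (function names and defaults are illustrative, not from the slides):

```python
import numpy as np

def kernel(a, b, alpha1=1.0, alpha2=0.1):
    """Isotropic kernel: alpha1 * exp(-(x - x')^2 / (2 * alpha2))."""
    return alpha1 * np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * alpha2))

def gp_posterior(x, y, x_star, sigma2=1e-4):
    """Posterior mean and covariance of a zero-mean GP given noisy data (x, y)."""
    K = kernel(x, x) + sigma2 * np.eye(len(x))      # K + sigma^2 I_n
    Ks = kernel(x_star, x)                          # kappa(x*, x)
    mean = Ks @ np.linalg.solve(K, y)               # kappa(x*,x) K^{-1} y
    cov = kernel(x_star, x_star) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, cov
```

With small noise the posterior mean interpolates the observations and the posterior variance collapses near them, which is the behaviour the figures on the following slide illustrate.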
  • 6. Contribution: Gaussian Process modelling. On kernel specification: a complex non-parametric model requires great care with the kernel shape and kernel hyper-parameters. Basic isotropic kernel: κ(x, x′) = α₁ exp(−(x − x′)²/(2α₂)), with hyper-parameters α₁, α₂. Figure: prediction of 3 Gaussian Process models (and their 0.95 CIs) given 7 noisy observations; on the left α₂ = 0.1, in the middle α₂ = 2, on the right α₂ = 1000.
  • 7. Contribution: Gaussian Process modelling. Likelihood and multiple kernels. The hyper-parameter estimation is provided through the marginal likelihood, log p(y|x) = −½ yᵀ(K + σ²Iₙ)⁻¹y − ½ log|K + σ²Iₙ| − (n/2) log 2π. What if the best-fitted kernel were κ(x, x′) = Σᵢ κᵢ(x, x′) 1_{{x,x′}∈Ωᵢ}? Figure: synthetic two-phase signal.
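The marginal likelihood above is straightforward to evaluate for a given Gram matrix; a hedged Cholesky-based sketch (the function name is illustrative):

```python
import numpy as np

def log_marginal_likelihood(K, y, sigma2):
    """log p(y|x) = -1/2 y^T (K + s2 I)^-1 y - 1/2 log|K + s2 I| - n/2 log(2 pi)."""
    n = len(y)
    Ky = K + sigma2 * np.eye(n)
    L = np.linalg.cholesky(Ky)                        # log|Ky| = 2 * sum(log diag(L))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)
```

Maximizing this quantity over the hyper-parameters (and, for a piecewise kernel, over the partition) is the estimation principle used on the following slides.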
  • 8. Contribution: Gaussian Process modelling. Change-point estimation, A: parametric estimation. We assume that there exist βᵢ such that (x, x′) ∈ Ωᵢ ⇔ f(x, x′, βᵢ) ≥ 0, with f known. Then θ = {(αᵢ, βᵢ)ᵢ} and θ̂ = argmax_θ log p(y|x). Limitations: knowledge of f; dimension of the parameter space; convexity of the marginal likelihood function.
  • 9. Contribution: Gaussian Process modelling. Change-point estimation, B: adaptive estimation (1). Let X_{kNN∩Br}(i) be the sequence of observations associated with xᵢ, X_{kNN∩Br}(i) = {xⱼ | {xⱼ ∈ Bᵢʳ} ∩ {dⱼᵢ ≤ d₍ᵢₖ₎}}, where k is the number of neighbours to be considered and r is the limiting radius. Justification: avoid the lack of observations; an equivalent number of observations for each estimator; avoid the hyper-parametrization of the likelihood.
  • 10. Contribution: Gaussian Process modelling. Change-point estimation, B: adaptive estimation (2). Let x_I = X_{kNN∩Br}(i) and y_I be the corresponding observations, and α̂ᵢ = argmax_α log p(y_I|x_I). Idea 1: cluster on α̂ᵢ; but what if dim(α̂ᵢ) ≥ 2? Idea 2: build the Gram matrices Kᵢ = κ(x_I, α̂ᵢ); let Λ_{xᵢ} = {λ₁^{xᵢ}, …, λₙ^{xᵢ}} be the eigenvalues of Kᵢ, and cluster on μᵢ = max{Λ_{xᵢ}}.
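Idea 2 can be illustrated on a 1D example; a sketch assuming the per-point hyper-parameters α̂ᵢ = (α₁, α₂) have already been estimated, and using a plain k-NN neighbourhood without the radius cap (names and simplifications are illustrative):

```python
import numpy as np

def max_eigenvalue_features(x, alphas, k=5):
    """mu_i = largest eigenvalue of the local Gram matrix K_i built over the
    k nearest neighbours of x_i, using that point's hyper-parameters (a1, a2)."""
    feats = []
    for xi, (a1, a2) in zip(x, alphas):
        idx = np.argsort(np.abs(x - xi))[:k]           # k-NN neighbourhood of x_i
        xI = x[idx]
        d2 = (xI[:, None] - xI[None, :]) ** 2
        Ki = a1 * np.exp(-d2 / (2.0 * a2))             # K_i = kappa(x_I, alpha_i)
        feats.append(np.linalg.eigvalsh(Ki).max())     # mu_i = max eigenvalue
    return np.array(feats)
```

Points on either side of a change point give clearly separated μᵢ values (near 1 for a short length-scale, near k for a long one), so a simple one-dimensional clustering or threshold recovers the partition.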
  • 11. Contribution: Gaussian Process modelling. Simulation results. Figure: Gaussian Process prediction with one classical isotropic kernel (green), and two isotropic kernels with eigenvalue-based change-point estimation (yellow), hyper-parameter-based change-point estimation (purple) and parametric estimation (blue). Figure: mean of the Gaussian Process for the two-dimensional scenario; on the left, the mean is calculated with only one kernel; on the right, with two kernels.
  • 12. Contribution: Gaussian Process modelling. Simulation results. Figure: evolution of the root MSE of the change-point estimation as the number of observations increases from 20 to 100, in the 1D case. Methods: MMLE (parametric approach), MEV (eigenvalue approach), JD (estimation approach). Table (the number of observations is 10^d, where d is the dimension of the problem; 1000 simulations; variances in brackets):
    Method | 2D | 2D-donut | 3D
    JD | 0.834 (0.0034) | 0.763 (0.0015) | 0.666 (0.0016)
    MEV | 0.825 (0.0053) | 0.817 (0.0021) | 0.643 (0.0014)
    MMLE | 0.858 (0.0025) | 0.806 (0.0008) | 0.666 (0.0002)
  • 13. Contribution: Gaussian Process modelling. Application to the concentration measurements: we may consider the concentration measurements as observations of a stochastic process in space and time. Idea: apply the defined approach to estimate t₀. Prior distribution: C ∼ GP(m, κ), where m ∈ L²(Ω × T, ℝ) is the prior mean function and κ ∈ L²(Ω² × T², ℝ) is the prior covariance function (the associated matrix K should be positive semidefinite). Posterior distribution: C|Y, m=0 ∼ GP(κ_{x*x}κ_{xx}⁻¹Y, κ_{x*x*} − κ_{x*x}κ_{xx}⁻¹κ_{xx*}).
  • 14. Contribution: Gaussian Process modelling. Kernel specification. Isotropic kernel: κ_iso(x, x′) = (1/α) exp(−(x − x′)²/β²), where α and β are hyper-parameters. Drift-dependent kernel: let s_{x₀,t₀}(t) be the solution of the system ẋ = u(x, t), x(t₀) = x₀; then κ_dyn(x, x′) = (1/σ(t, t′)) exp(−d_s(x, x′)/(2σ(t, t′)²)), with d_s(x, x′) = (x − s_{x′,t′}(t))² + (x′ − s_{x,t}(t′))² and σ(t, t′) = α × (|t₀ − min(t, t′)| + 1)^β. This accounts for the influence of the wind field, the time-decreasing correlation, and the evolution of the process.
  • 15. Contribution: Gaussian Process modelling. Two-stage estimation process: instant of release. The proposed kernel is complex, κ_f = κ_iso 1_{t,t′<t₀} + κ_dyn 1_{t,t′≥t₀}, and the likelihood is not convex, so t₀ has to be estimated separately from the maximum-likelihood estimation of the hyper-parameters. Method: exhaustive search over t₀ via the trace of the Gram matrix, t̂₀ = argmax_{t∈T} tr(K(t)).
  • 16. Contribution: Gaussian Process modelling. Two-stage estimation process: source location. Given the time of release, we can calculate the location estimate: x̂₀ = argmax_{x∈Ω} E[C|Y, m=0](x, t̂₀) = argmax_{x∈Ω} κ̃_{x*x}κ̃_{xx}⁻¹Y, where κ̃ = κ(·, t̂₀). Table: estimation of the source location, comparing the estimators with 5, 20 and 50 sensors; target is x₀ = 115, y₀ = 10.
    Kernel | Sensors | x̂₀ | ŷ₀ | σ(x̂₀) | σ(ŷ₀)
    κ_iso | 5 | 68.97 | 62.58 | 42.82 | 38.96
    κ_iso | 20 | 97.13 | 26.37 | 27.64 | 26.08
    κ_iso | 50 | 104.47 | 21.60 | 28.94 | 19.47
    κ_f | 5 | 108.94 | 12.21 | 42.00 | 17.05
    κ_f | 20 | 120.28 | 8.28 | 12.50 | 4.64
    κ_f | 50 | 114.51 | 9.48 | 6.37 | 3.07
  • 17. Contribution: Gaussian Process modelling. Zero-inflated Poisson and Dirichlet process (joint work with Dr G. Peters and Dr I. Nevat). We can also consider the concentration as a count of particles: Y ∼ ZIP(p, λ), p ∼ DP(H, α), log λ ∼ GP(m, κ), which then defines the mixture distribution Pr(Y = k|p, λ) = p_{xt} 1_{k=0} + (1 − p_{xt}) e^{−λ_{xt}} λ_{xt}^k / k!. Major issue: the tractability of the likelihood calculation relies on the distributions of both p and λ.
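The ZIP mixture on this slide, written out for fixed p and λ at a single location as a quick sanity check (plain Python; the function name is illustrative):

```python
import math

def zip_pmf(k, p, lam):
    """Zero-inflated Poisson: Pr(Y=k) = p*1{k=0} + (1-p) * exp(-lam) * lam^k / k!."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return p * (1 if k == 0 else 0) + (1 - p) * poisson
```

The extra mass p at zero models sensors that report no particles at all, on top of the ordinary Poisson count; the pmf still sums to one over k.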
  • 18. Bibliography. A. Ickowicz, F. Septier, P. Armand, Adaptive Algorithms for the Estimation of Source Term in a Complex Atmospheric Release. Submitted to Atmospheric Environment Journal. A. Ickowicz, F. Septier, P. Armand, Estimating a CBRN atmospheric release in a complex environment using Gaussian Processes. 15th International Conference on Information Fusion, Singapore, July 2012. F. Septier, A. Ickowicz, P. Armand, Méthodes de Monte-Carlo adaptatives pour la caractérisation de termes de sources [Adaptive Monte Carlo methods for source-term characterization]. Technical report, CEA, EOTP A-54300-05-07-AW-26, Mar. 2012. A. Ickowicz, F. Septier, P. Armand, Statistic Estimation for Particle Clouds with Lagrangian Stochastic Algorithms. Technical report, CEA, EOTP A-24300-01-01-AW-20, Nov. 2011.