Adaptive MCMH for Bayesian inference of spatial autologistic models

          discussion by Pierre E. Jacob & Christian P. Robert

                            Université Paris-Dauphine and CREST
                           http://statisfaction.wordpress.com


                  Adap’skiii, Park City, Utah, Jan. 03, 2011




Context: intractable likelihoods


        Cases when the likelihood function f(y|θ) is unavailable and
        when the completion step

                        f(y|θ) = ∫_Z f(y, z|θ) dz

        is impossible or too costly because of the dimension of z
        ⇒ MCMC cannot be implemented!
        ⇒ adaptive MCMH

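As a side illustration (ours, not from the slides), here is a naive Monte Carlo attempt at the completion step for a toy Gaussian latent-variable model: the relative error of the plain average over z blows up with dim(z), which is the cost the slide alludes to. All model and simulation choices below are illustrative.

```python
# Naive Monte Carlo estimate of f(y|theta) = \int f(y, z|theta) dz for a toy
# Gaussian latent-variable model, showing degradation as dim(z) grows.
import numpy as np

def naive_marginal_likelihood(y, theta, n_draws=10_000, seed=0):
    """Plain Monte Carlo average of f(y|z, theta) over z ~ p(z|theta)."""
    rng = np.random.default_rng(seed)
    dim_z = y.shape[0]
    z = rng.normal(loc=theta, scale=1.0, size=(n_draws, dim_z))   # z | theta
    # f(y|z, theta): independent N(z_k, 1) coordinates
    log_f = -0.5 * np.sum((y - z) ** 2 + np.log(2 * np.pi), axis=1)
    return np.exp(log_f).mean()

for dim_z in (1, 5, 20):
    y = np.zeros(dim_z)
    est = np.array([naive_marginal_likelihood(y, 0.0, seed=s) for s in range(20)])
    print(dim_z, est.std() / est.mean())   # relative error grows with dim(z)
```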
Illustrations


  Potts model: if y takes values on a grid Y of size k^n and

                f(y|θ) ∝ exp( θ Σ_{l∼i} I(y_l = y_i) )

  where l∼i denotes a neighbourhood relation, a moderately large n
  prohibits the computation of the normalising constant

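To make the k^n cost concrete, here is a minimal Python sketch (ours, not from the discussed paper) that evaluates the unnormalised Potts likelihood on a tiny square grid with nearest-neighbour relation l∼i and brute-forces the normalising constant; the grid size, k and θ are illustrative.

```python
# Unnormalised Potts likelihood and brute-force normalising constant.
import itertools
import numpy as np

def potts_sufficient_stat(y):
    """Number of equal-valued nearest-neighbour pairs on a 2-D grid."""
    return np.sum(y[1:, :] == y[:-1, :]) + np.sum(y[:, 1:] == y[:, :-1])

def unnormalised_potts(y, theta):
    return np.exp(theta * potts_sufficient_stat(y))

def normalising_constant(theta, k, side):
    """Brute force: sums over all k**(side*side) configurations -- only
    feasible for tiny grids, which is exactly the point of the slide."""
    Z = 0.0
    for config in itertools.product(range(k), repeat=side * side):
        y = np.array(config).reshape(side, side)
        Z += unnormalised_potts(y, theta)
    return Z

theta, k, side = 0.5, 2, 3        # 2^9 = 512 terms; a 10x10 grid needs 2^100
y = np.random.default_rng(1).integers(0, k, size=(side, side))
print(unnormalised_potts(y, theta) / normalising_constant(theta, k, side))
```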
Compare ABC with adaptive MCMH?


  ABC algorithm
  For an observation y ∼ f(y|θ), under the prior π(θ), keep jointly
  simulating
                        θ′ ∼ π(θ) ,   z ∼ f(z|θ′) ,
  until the auxiliary variable z is equal to the observed value, z = y.

                                                    [Tavaré et al., 1997]

  ABC can also be applied when the normalising constant is
  unknown, hence it could be compared to adaptive MCMH, e.g. for
  a given number of y’s drawn from f, or in terms of execution time.

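For reference, a minimal rejection-ABC sketch in the spirit of the pseudo-code above; the Binomial model, uniform prior and sample sizes are illustrative choices of ours, not taken from Tavaré et al. or from the discussed paper.

```python
# Rejection ABC with exact matching z = y, feasible here because y is discrete.
import numpy as np

def abc_exact_match(y_obs, n_trials, n_samples=1_000, seed=0):
    rng = np.random.default_rng(seed)
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(0.0, 1.0)         # theta' ~ pi(theta)
        z = rng.binomial(n_trials, theta)     # z ~ f(z | theta')
        if z == y_obs:                        # exact match: z = y
            accepted.append(theta)
    return np.array(accepted)

samples = abc_exact_match(y_obs=7, n_trials=10)
print(samples.mean())   # close to the Beta(8, 4) posterior mean 8/12
```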
Some questions on the estimator R̂(θt, ϑ)


  R̂(θt, ϑ) = [ m0 + m0 Σ_{θi ∈ St\{θt}} I(‖θi − ϑ‖ ≤ η) ]⁻¹
             × [ Σ_{θi ∈ St\{θt}} I(‖θi − ϑ‖ ≤ η) Σ_{j=1}^{m0} g(z_j^(i), θt) / g(z_j^(i), ϑ)
                 + Σ_{j=1}^{m0} g(z_j^(t), θt) / g(z_j^(t), ϑ) ]


        Why does it bypass the number of repetitions of the θi’s?
        Why resample the y_j^(i)’s only to inverse-weight them later?
        Why not stick to the original sample and weight the y_j^(i)’s
        directly? This would avoid the necessity to choose m0.

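For concreteness, a sketch of the estimator as we read the formula above: the m0 fresh draws z^(t) at θt always enter, stored draws z^(i) from past θi enter only when ‖θi − ϑ‖ ≤ η, and the denominator counts the draws actually used. Function and variable names are ours, and g(z, θ) stands for the unnormalised density.

```python
# Sketch of the ratio estimator R_hat(theta_t, vartheta), under our reading
# of the slide's formula.
import numpy as np

def R_hat(theta_t, vartheta, z_t, past, g, eta):
    """z_t: m0 fresh draws at theta_t; past: list of (theta_i, z_i) with m0
    stored draws each; g(z, theta) is the unnormalised density, vectorised in z."""
    num = np.sum(g(z_t, theta_t) / g(z_t, vartheta))
    count = len(z_t)
    for theta_i, z_i in past:
        if np.linalg.norm(np.atleast_1d(theta_i - vartheta)) <= eta:
            num += np.sum(g(z_i, theta_t) / g(z_i, vartheta))
            count += len(z_i)
    return num / count

# toy check with g(z, theta) = exp(theta * z) and scalar parameters
g = lambda z, th: np.exp(th * np.asarray(z))
z_t = np.array([0.2, 0.5, 0.1])
past = [(0.9, np.array([0.3, 0.4, 0.6])), (2.5, np.array([1.0, 1.2, 0.8]))]
print(R_hat(theta_t=1.0, vartheta=0.8, z_t=z_t, past=past, g=g, eta=0.3))
```

In the exponential-family case g(z, θ) = exp(θ S(z)), as for the autologistic model, each ratio g(z_j, θt)/g(z_j, ϑ) reduces to exp((θt − ϑ) S(z_j)).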
Combine adaptations?

  Adaptive MCMH looks great, but it adds new tuning parameters
  compared to standard MCMC!
        It would be nice to be able to adapt the proposal Q(θt, ϑ),
        e.g. to control the acceptance rate, as in other adaptive
        MCMC algorithms (a sketch follows below).
        Could η then be adaptive as well, since it also depends on the
        (unknown) scale of the posterior distribution of θ?
        How would the current theoretical framework cope with these
        additional adaptations?
        In the same spirit, could m (and m0) be treated as algorithmic
        parameters and adapted as well? What criterion would be used
        to adapt them?

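As an illustration of the kind of adaptation meant in the first bullet, here is a generic Robbins-Monro sketch that scales a random-walk proposal towards a target acceptance rate. It is a stand-alone toy on a known target, not the MCMH algorithm itself, and all tuning constants are illustrative.

```python
# Adaptive random-walk Metropolis-Hastings with Robbins-Monro scale tuning.
import numpy as np

def adaptive_rw_mh(log_target, theta0, n_iter=5_000, target_acc=0.44, seed=0):
    rng = np.random.default_rng(seed)
    theta, log_sigma = theta0, 0.0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + np.exp(log_sigma) * rng.normal()
        log_alpha = log_target(prop) - log_target(theta)
        accept = np.log(rng.uniform()) < log_alpha
        if accept:
            theta = prop
        # diminishing adaptation: step size ~ t^{-0.6} vanishes with t
        log_sigma += (float(accept) - target_acc) / (t + 1) ** 0.6
        chain[t] = theta
    return chain, np.exp(log_sigma)

chain, sigma = adaptive_rw_mh(lambda x: -0.5 * x**2, theta0=0.0)
print(sigma)   # settles near the roughly optimal 1-D Gaussian scale (~2.4)
```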
Sample size and acceptance rate


        Another MH algorithm where the “true” acceptance ratio is
        replaced by a ratio based on an unbiased estimator is the
        Particle MCMC (PMCMC) algorithm.
        In PMCMC, the acceptance rate grows with the number of
        particles used to estimate the likelihood, i.e. when the
        estimation is more precise, the acceptance rate increases and
        converges towards the acceptance rate of an idealized
        standard MH algorithm (illustrated in the sketch below).
        In adaptive MCMH, is there such a link between m, m0 and
        the number of iterations on one side, and the acceptance rate
        of the algorithm on the other? Should the acceptance rate grow
        as the estimation becomes more and more precise?

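A toy pseudo-marginal experiment (ours, not from the paper) illustrating the PMCMC behaviour described above: as the number N of draws in the unbiased likelihood estimator grows, the acceptance rate rises towards that of the ideal MH sampler. The latent-variable model and proposal scale are purely illustrative.

```python
# Pseudo-marginal MH with an N-sample unbiased likelihood estimator:
# the empirical acceptance rate increases with N.
import numpy as np

def pm_mh_acceptance(N, n_iter=20_000, seed=1):
    rng = np.random.default_rng(seed)
    y = 1.0
    def lik_hat(theta):   # unbiased estimate of p(y|theta) = N(y; theta, 2)
        z = rng.normal(theta, 1.0, size=N)                   # z | theta
        return np.mean(np.exp(-0.5 * (y - z) ** 2) / np.sqrt(2 * np.pi))
    theta, L = 0.0, lik_hat(0.0)
    acc = 0
    for _ in range(n_iter):
        prop = theta + rng.normal(scale=1.0)
        L_prop = lik_hat(prop)
        if rng.uniform() < (L_prop / L):                     # flat prior on theta
            theta, L, acc = prop, L_prop, acc + 1
    return acc / n_iter

for N in (1, 10, 100):
    print(N, pm_mh_acceptance(N))   # acceptance rate increases with N
```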
