Adaptive MCMH for Bayesian inference of spatial autologistic models

          discussion by Pierre E. Jacob & Christian P. Robert

                Université Paris-Dauphine and CREST
               http://statisfaction.wordpress.com


          Adap'skiii, Park City, Utah, Jan. 03, 2011
Context: intractable likelihoods

        Cases when the likelihood function f(y|θ) is unavailable and
        when the completion step

                f(y|\theta) = \int_Z f(y, z|\theta) \, dz

        is impossible or too costly because of the dimension of z
        MCMC cannot be implemented!
        =⇒ adaptive MCMH
Illustrations

        Potts model: if y takes values on a grid Y of size k^n and

                f(y|\theta) \propto \exp\Big\{ \theta \sum_{l \sim i} \mathbb{I}_{y_l = y_i} \Big\}

        where l ∼ i denotes a neighbourhood relation, n moderately large
        prohibits the computation of the normalising constant
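To make the combinatorial blow-up concrete, here is a minimal brute-force computation of the Potts normalising constant on a toy grid; the grid size, number of colours and all helper names are illustrative choices, not taken from the slides:

```python
import itertools
import math

def neighbour_pairs(rows, cols):
    """Index pairs l ~ i on a rows x cols grid (4-neighbour relation)."""
    pairs = []
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:
                pairs.append((i, i + 1))   # right neighbour
            if r + 1 < rows:
                pairs.append((i, i + cols))  # bottom neighbour
    return pairs

def unnormalised(y, theta, pairs):
    """q(y|theta) = exp(theta * sum of indicators I{y_l = y_i})."""
    return math.exp(theta * sum(y[l] == y[i] for l, i in pairs))

def normalising_constant(k, rows, cols, theta):
    """Brute force: sum q over all k**(rows*cols) configurations."""
    pairs = neighbour_pairs(rows, cols)
    return sum(unnormalised(y, theta, pairs)
               for y in itertools.product(range(k), repeat=rows * cols))

# A 3x3 grid with k = 2 colours already has 2**9 = 512 terms;
# a modest 16x16 grid would need 2**256, hence "prohibits".
Z = normalising_constant(2, 3, 3, theta=0.5)
```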
Compare ABC with adaptive MCMH?

  ABC algorithm
  For an observation y ∼ f(y|θ), under the prior π(θ), keep jointly
  simulating

                \theta' \sim \pi(\theta), \qquad z \sim f(z|\theta'),

  until the auxiliary variable z is equal to the observed value, z = y.

                                                [Tavaré et al., 1997]

  ABC can also be applied when the normalizing constant is
  unknown, hence it could be compared to adaptive MCMH, e.g. for
  a given number of y's drawn from f, or in terms of execution time.
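The exact-match rejection scheme above can be sketched on a toy model, a Binomial likelihood with a uniform prior; this stand-in is purely illustrative and is not the autologistic model of the talk:

```python
import random

def abc_exact(y_obs, n_trials=5, max_iter=100_000, seed=1):
    """Exact-match ABC rejection for a toy Binomial model.

    Prior: theta ~ Uniform(0, 1); likelihood: z ~ Binomial(n_trials, theta).
    Only forward simulation is needed: the likelihood (and in particular
    any normalising constant) is never evaluated.
    """
    rng = random.Random(seed)
    accepted = []
    for _ in range(max_iter):
        theta = rng.random()                                     # theta' ~ prior
        z = sum(rng.random() < theta for _ in range(n_trials))   # z ~ f(.|theta')
        if z == y_obs:                                           # keep exact matches only
            accepted.append(theta)
    return accepted

draws = abc_exact(y_obs=3)
# draws are (up to Monte Carlo noise) from the posterior pi(theta | y = 3)
```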
Some questions on the estimator \hat{R}(\theta_t, \vartheta)

        \hat{R}(\theta_t, \vartheta) =
          \frac{1}{m_0 + m_0 \sum_{\theta_i \in S_t \setminus \{\theta_t\}} \mathbb{I}(\|\theta_i - \vartheta\| \le \eta)}
          \left[ \sum_{\theta_i \in S_t \setminus \{\theta_t\}} \mathbb{I}(\|\theta_i - \vartheta\| \le \eta)
                 \sum_{j=1}^{m_0} \frac{g(z_j^{(i)}, \theta_t)}{g(z_j^{(i)}, \vartheta)}
               + \sum_{j=1}^{m_0} \frac{g(z_j^{(t)}, \theta_t)}{g(z_j^{(t)}, \vartheta)} \right]

        Why does it bypass the number of repetitions of the θ_i's?
        Why resample the y_j^{(i)}'s only to inverse-weight them later?
        Why not stick to the original sample and weight the y_j^{(i)}'s
        directly? This would avoid the necessity to choose m_0.
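A rough sketch of the locality-based pooling in this estimator, for scalar θ: past auxiliary samples stored at θ_i's within η of the proposed ϑ are pooled with the current samples, each contributing a ratio of unnormalised densities. The signature of g, the data layout, and the exact reading of the slide's formula are assumptions made for illustration:

```python
import math

def r_hat(theta_t, vartheta, history, z_t, g, eta):
    """Locality-weighted importance-ratio estimator (a sketch).

    history: list of (theta_i, [z_j^(i)]) pairs from past iterations (S_t);
    z_t: auxiliary samples [z_j^(t)] drawn at the current theta_t;
    g: unnormalised density g(z, theta), a hypothetical signature;
    eta: locality radius around vartheta.
    """
    num, count = 0.0, 0
    for theta_i, z_i in history:
        if abs(theta_i - vartheta) <= eta:   # I(||theta_i - vartheta|| <= eta)
            num += sum(g(z, theta_t) / g(z, vartheta) for z in z_i)
            count += len(z_i)
    # the current chain state always contributes its own samples
    num += sum(g(z, theta_t) / g(z, vartheta) for z in z_t)
    count += len(z_t)
    return num / count
```

With theta_t equal to vartheta every ratio is 1, so the estimator returns exactly 1, a quick sanity check on the pooling logic.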
Combine adaptations?

  Adaptive MCMH looks great, but it adds new tuning parameters
  compared to standard MCMC!
        It would be nice to be able to adapt the proposal Q(θ_t, ϑ),
        e.g. to control the acceptance rate, as in other adaptive
        MCMC algorithms.
        Could η then be adaptive as well, since it also depends on the
        (unknown) scale of the posterior distribution of θ?
        How would the current theoretical framework cope with these
        additional adaptations?
        In the same spirit, could m (and m_0) be considered as
        algorithmic parameters and be adapted as well? What
        criterion would be used to adapt them?
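The kind of proposal adaptation alluded to in the first bullet can be sketched with a Robbins-Monro recursion on the proposal scale, here on a toy N(0,1) target; the target, the acceptance-rate goal of 0.44 and the step-size exponent are standard illustrative choices, not taken from the talk:

```python
import math
import random

def adaptive_rw(logpi, n_iter=20_000, target=0.44, seed=3):
    """Random-walk MH with Robbins-Monro adaptation of the proposal scale.

    After each step the log proposal scale is nudged so that the empirical
    acceptance rate drifts toward `target`, one standard way of adapting
    Q(theta_t, .) in adaptive MCMC.
    """
    rng = random.Random(seed)
    theta, log_scale, accepts = 0.0, 0.0, 0
    for t in range(1, n_iter + 1):
        prop = theta + math.exp(log_scale) * rng.gauss(0.0, 1.0)
        accept = math.log(rng.random()) < logpi(prop) - logpi(theta)
        if accept:
            theta, accepts = prop, accepts + 1
        # diminishing adaptation: step size t**-0.6 vanishes, which is the
        # usual condition for preserving ergodicity of the adaptive chain
        log_scale += t ** -0.6 * ((1.0 if accept else 0.0) - target)
    return accepts / n_iter, math.exp(log_scale)

rate, scale = adaptive_rw(lambda x: -0.5 * x * x)  # standard normal target
```

In principle η, m and m_0 could be driven by similar stochastic-approximation recursions, which is precisely what makes the question about the theoretical framework non-trivial.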
Sample size and acceptance rate

        Another MH algorithm where the "true" acceptance ratio is
        replaced by a ratio with an unbiased estimator is the Particle
        MCMC algorithm.
        In PMCMC, the acceptance rate grows with the number
        of particles used to estimate the likelihood, i.e. when the
        estimation is more precise, the acceptance rate increases and
        converges towards the acceptance rate of an idealized
        standard MH algorithm.
        In adaptive MCMH, is there such a link between m, m_0 and
        the number of iterations on one side and the acceptance rate
        of the algorithm on the other side? Should it be growing when
        the estimation becomes more and more precise?
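The PMCMC behaviour described above can be imitated with a toy pseudo-marginal MH chain, where the exact density is multiplied by an unbiased unit-mean noise average standing in for a simulated likelihood estimate; everything here (target, noise model, proposal scale) is an illustrative assumption, not the talk's algorithm:

```python
import math
import random

def pm_mh_accept_rate(n_estimates, n_iter=20_000, seed=7):
    """Acceptance rate of a toy pseudo-marginal MH chain on a N(0,1) target.

    The density is multiplied by an unbiased noisy estimate, the average of
    n_estimates unit-mean lognormal draws, mimicking a likelihood estimated
    by simulation. More draws per estimate mean less noise in the MH ratio.
    """
    rng = random.Random(seed)

    def log_pi_hat(x):
        # lognormal with mu = -0.5, sigma = 1 has mean exp(-0.5 + 0.5) = 1,
        # so the averaged noise is unbiased with unit mean
        noise = sum(rng.lognormvariate(-0.5, 1.0) for _ in range(n_estimates))
        return -0.5 * x * x + math.log(noise / n_estimates)

    x, log_val, accepts = 0.0, log_pi_hat(0.0), 0
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, 2.4)          # random-walk proposal
        log_prop = log_pi_hat(y)
        if math.log(rng.random()) < log_prop - log_val:
            # pseudo-marginal rule: keep the estimate attached to the state
            x, log_val, accepts = y, log_prop, accepts + 1
    return accepts / n_iter

# acceptance should rise as the per-step estimate becomes more precise
low, high = pm_mh_accept_rate(1), pm_mh_accept_rate(100)
```

The same experiment, run with m and m_0 in place of n_estimates, would be one way to probe empirically whether adaptive MCMH exhibits the PMCMC-style link between estimation precision and acceptance rate.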

Discussion of Faming Liang's talk
