The grouped independence Metropolis–Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms are pseudo-marginal methods used to perform Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but tends to give conservative approximations of the posterior and is still expensive. A new method is developed to accelerate the GIMH method by using a Gaussian process (GP) approximation...
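To make the mechanics above concrete, here is a minimal sketch of a GIMH-style pseudo-marginal Metropolis–Hastings sampler. Everything in it is an illustrative assumption rather than material from the paper: the toy Gaussian random-effects model (chosen so the example is self-contained; its likelihood is actually tractable, unlike the intended applications), the importance-sampling estimator, and the names log_lik_hat and gimh. The GP acceleration mentioned in the abstract is not reproduced here, since its details are truncated above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model (illustrative assumption): u_i ~ N(theta, 1),
# y_i | u_i ~ N(u_i, 1). Marginally y_i ~ N(theta, 2), so p(y | theta) is
# actually tractable here; it is treated as intractable purely for the demo.
theta_true = 1.5
y = rng.normal(loc=theta_true, scale=np.sqrt(2.0), size=20)

def log_lik_hat(theta, n_particles=50):
    """Unbiased importance-sampling estimate of p(y | theta), returned on
    the log scale (the likelihood estimate is unbiased; its log is not)."""
    # One bank of particles per observation, drawn from p(u_i | theta).
    u = rng.normal(loc=theta, scale=1.0, size=(n_particles, y.size))
    log_w = -0.5 * (y - u) ** 2 - 0.5 * np.log(2.0 * np.pi)  # log N(y_i; u, 1)
    m = log_w.max(axis=0)
    # Log of each observation's particle average, summed over observations.
    return np.sum(m + np.log(np.exp(log_w - m).mean(axis=0)))

def log_prior(theta):
    return -0.5 * theta**2  # N(0, 1) prior, up to an additive constant

def gimh(n_iter=5000, step=0.5):
    """GIMH: the noisy log-likelihood of the current state is stored and
    reused until a proposal is accepted, so the chain still targets the
    exact posterior even though the likelihood is only estimated."""
    theta, ll = 0.0, log_lik_hat(0.0)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.standard_normal()
        ll_prop = log_lik_hat(prop)
        log_alpha = ll_prop + log_prior(prop) - ll - log_prior(theta)
        if np.log(rng.uniform()) < log_alpha:
            theta, ll = prop, ll_prop  # accept, keeping the proposal's estimate
        chain[t] = theta
    return chain

samples = gimh()
print("posterior mean estimate:", samples[2500:].mean())
```

The one-line change that turns this sketch into MCWM is to re-estimate the log-likelihood at the current state (ll = log_lik_hat(theta)) at the top of every iteration: the chain then mixes better but no longer targets the exact posterior, matching the trade-off described in the abstract.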