Markov chain Monte Carlo (MCMC) algorithms have become powerful tools for Bayesian inference. However, they do not scale well to large-data problems. Divide-and-conquer strategies, which split the data into batches and, for each batch, run independent MCMC algorithms targeting the corresponding subposterior, can spread the computational burden across a number of separate computer cores. The challenge with such strategies is recombining the subposteriors to approximate the full posterior. Fitting a Gaussian-process approximation to each log-subposterior density yields a tractable approximation to the full posterior. This approximation is exploited through three methodologies: firstly a Hamiltonian Monte Carlo algorithm targeting ...
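As a rough illustration of the recombination idea described above, the sketch below splits a toy Gaussian dataset into shards, runs a simple random-walk Metropolis sampler on each subposterior, fits a Gaussian-process surrogate to each log-subposterior at the sampled points, and sums the GP mean surfaces to approximate the full log-posterior. The toy model, shard count, samplers, and the use of scikit-learn's GaussianProcessRegressor are illustrative assumptions, not the exact algorithm from the abstract.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Toy model: y_i ~ N(theta, 1) with prior theta ~ N(0, 10^2); split the data into S shards.
theta_true = 1.5
y = rng.normal(theta_true, 1.0, size=1_000)
S = 4
shards = np.array_split(y, S)

def log_subposterior(theta, shard):
    # Each subposterior takes its shard's likelihood plus a 1/S share of the prior,
    # so that the product of the S subposteriors recovers the full posterior.
    log_lik = -0.5 * np.sum((shard - theta) ** 2)
    log_prior = -0.5 * theta ** 2 / 10.0 ** 2
    return log_lik + log_prior / S

def rw_metropolis(log_target, n_iter=2_000, step=0.1):
    theta, samples = 0.0, []
    lp = log_target(theta)
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

# Step 1: independent MCMC per shard (run on separate cores in practice).
sub_samples = [rw_metropolis(lambda t, s=s: log_subposterior(t, s)) for s in shards]

# Step 2: fit a GP surrogate to each log-subposterior, evaluated at thinned sample points.
kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
gps = []
for shard, samp in zip(shards, sub_samples):
    X = samp[::20, None]
    z = np.array([log_subposterior(t, shard) for t in X.ravel()])
    gps.append(GaussianProcessRegressor(kernel=kernel, alpha=1e-6, normalize_y=True).fit(X, z))

# Step 3: approximate the full log-posterior by summing the GP mean surfaces,
# then normalise on a grid to recover an approximate posterior density.
grid = np.linspace(1.2, 1.8, 200)[:, None]
dx = grid[1, 0] - grid[0, 0]
log_post = sum(gp.predict(grid) for gp in gps)
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx
print("approximate posterior mean:", np.sum(grid.ravel() * post) * dx)
```

In practice the per-shard sampling and GP fitting are the embarrassingly parallel steps; only the cheap surrogate evaluations need to be combined on a single machine.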
Markov Chain Monte Carlo (MCMC) is a common way to do posterior inference in Bayesian methods. Hamil...
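The abstract is cut off at "Hamil...", presumably introducing Hamiltonian Monte Carlo. As a generic illustration (not this paper's specific algorithm), a minimal leapfrog-based HMC sampler for a standard-normal target might look like the following; the step size and trajectory length are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Standard-normal target: potential U(q) = 0.5 * q^2, gradient grad_U(q) = q.
def U(q):      return 0.5 * q ** 2
def grad_U(q): return q

def hmc_step(q, step=0.2, n_leapfrog=20):
    p = rng.normal()                        # fresh momentum
    q_new, p_new = q, p
    p_new -= 0.5 * step * grad_U(q_new)     # half momentum step
    for _ in range(n_leapfrog - 1):
        q_new += step * p_new               # full position step
        p_new -= step * grad_U(q_new)       # full momentum step
    q_new += step * p_new
    p_new -= 0.5 * step * grad_U(q_new)     # final half momentum step
    # Metropolis accept/reject on the change in total energy H = U + kinetic.
    dH = (U(q) + 0.5 * p ** 2) - (U(q_new) + 0.5 * p_new ** 2)
    return q_new if np.log(rng.uniform()) < dH else q

q, draws = 0.0, []
for _ in range(5_000):
    q = hmc_step(q)
    draws.append(q)
print("sample mean and variance:", np.mean(draws), np.var(draws))   # roughly 0 and 1
```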
Complex hierarchical models lead to a complicated likelihood and then, in a Ba...
Gaussian Process (GP) models are a powerful and flexible tool for non-parametric regression and clas...
The Markov Chain Monte Carlo (MCMC) technique provides a means to generate a random sequence of mode...
Divide-and-conquer strategies for Monte Carlo algorithms are an increasingly popular approach to mak...
While MCMC methods have become a main work-horse for Bayesian inference, scaling them to large distr...
Embarrassingly parallel Markov Chain Monte Carlo (MCMC) exploits parallel computing to scale Bayesia...
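A common recombination rule in the embarrassingly parallel setting is to approximate each subposterior by a Gaussian and multiply the approximations together. The sketch below assumes that parametric rule and uses synthetic draws in place of real per-worker MCMC output; it is one possible combination strategy, not necessarily the one proposed in this paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian_recombine(sub_samples):
    """Multiply Gaussian approximations of the subposteriors:
    precision_full = sum_s precision_s,
    mean_full      = cov_full @ sum_s (precision_s @ mean_s)."""
    precisions, weighted_means = [], []
    for draws in sub_samples:                          # draws: (n_draws, dim)
        mu = draws.mean(axis=0)
        prec = np.linalg.inv(np.cov(draws, rowvar=False))
        precisions.append(prec)
        weighted_means.append(prec @ mu)
    cov_full = np.linalg.inv(sum(precisions))
    mean_full = cov_full @ sum(weighted_means)
    return mean_full, cov_full

# Stand-in subposterior draws for 3 workers; in practice each array would come
# from an independent MCMC run on one machine's data shard.
sub_samples = [rng.multivariate_normal([0.9, -0.4], 0.05 * np.eye(2), size=5_000)
               for _ in range(3)]
mean_full, cov_full = gaussian_recombine(sub_samples)
print(mean_full)   # close to [0.9, -0.4], with roughly 3x tighter covariance
```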
Kernel methods have revolutionized the fields of pattern recognition and machine learning. ...
Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the gold standard technique for...
Traditional algorithms for Bayesian posterior inference require processing the entire dataset in eac...
The grouped independence Metropolis–Hastings (GIMH) and Markov chain within Metropolis (MCWM) algori...
This paper introduces a framework for speeding up Bayesian inference conducted in presence of large ...
We propose subsampling Markov chain Monte Carlo (MCMC), an MCMC framework where the likelihood fu...
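Although this abstract is truncated, the core device in subsampling MCMC is replacing the full-data log-likelihood inside a Metropolis–Hastings step with an estimate computed from a random subsample. The sketch below assumes a toy Gaussian model and simple inverse-probability weighting; because the current state's likelihood is re-estimated at every iteration, it behaves like an MCWM-style approximation rather than an exact chain.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: y_i ~ N(theta, 1), with n large enough that full-data scans are costly.
n, theta_true = 100_000, 0.3
y = rng.normal(theta_true, 1.0, size=n)

def loglik_estimate(theta, m=1_000):
    """Unbiased estimate of the full-data log-likelihood from a random subsample
    of size m, using inverse-probability weighting n / m."""
    idx = rng.choice(n, size=m, replace=False)
    return (n / m) * np.sum(-0.5 * (y[idx] - theta) ** 2)

def subsampled_mh(n_iter=5_000, step=0.01):
    theta, samples = 0.0, []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        # Both log-likelihoods are re-estimated each iteration (MCWM-style), so the
        # stationary distribution is only an approximation to the true posterior.
        lp_curr = loglik_estimate(theta)
        lp_prop = loglik_estimate(prop)
        if np.log(rng.uniform()) < lp_prop - lp_curr:   # flat prior assumed
            theta = prop
        samples.append(theta)
    return np.array(samples)

draws = subsampled_mh()
print("posterior mean estimate:", draws[1_000:].mean())
```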
Because of their multimodality, mixture posterior distributions are difficult ...