Embarrassingly parallel Markov Chain Monte Carlo (MCMC) exploits parallel computing to scale Bayesian inference to large datasets by using a two-step approach. First, MCMC is run in parallel on (sub)posteriors defined on data partitions. Then, a server combines the local results. While efficient, this framework is very sensitive to the quality of subposterior sampling. Common sampling problems, such as missing modes or misrepresentation of low-density regions, are amplified rather than corrected in the combination phase, leading to catastrophic failures. In this work, we propose a novel combination strategy to mitigate this issue. Our strategy, Parallel Active Inference (PAI), leverages Gaussian Process (GP) surrogate modeling and active le...
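As context for the abstract above, here is a minimal sketch of the generic two-step embarrassingly parallel MCMC pipeline it builds on (partition the data, sample each subposterior in parallel, combine on a server). This is not the paper's PAI combination step; for illustration the combination uses a simple consensus-style weighted average of subposterior draws, on a toy 1-D Gaussian-mean model. All names and settings here are assumptions for the sketch.

```python
import numpy as np

# Sketch of embarrassingly parallel MCMC: partition -> parallel
# subposterior sampling -> server-side combination. The combination
# shown (inverse-variance weighted averaging of draws) is a common
# baseline, NOT the PAI method from the abstract above.

rng = np.random.default_rng(0)
true_mu, sigma = 2.0, 1.0
data = rng.normal(true_mu, sigma, size=1200)
K = 4                                  # number of data partitions / workers
shards = np.array_split(data, K)

def subposterior_logpdf(mu, shard):
    # Shard likelihood plus the prior tempered to the power 1/K, so the
    # product of the K subposteriors recovers the full posterior.
    loglik = -0.5 * np.sum((shard - mu) ** 2) / sigma**2
    logprior = -0.5 * mu**2 / 10.0     # N(0, 10) prior
    return loglik + logprior / K

def metropolis(shard, n_steps=4000, step=0.1):
    # Plain random-walk Metropolis on one subposterior.
    mu, chain = 0.0, []
    lp = subposterior_logpdf(mu, shard)
    for _ in range(n_steps):
        prop = mu + step * rng.normal()
        lp_prop = subposterior_logpdf(prop, shard)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        chain.append(mu)
    return np.array(chain[n_steps // 2:])  # discard burn-in

# Step 1: sample each subposterior (in practice these run in parallel).
chains = [metropolis(s) for s in shards]

# Step 2: server combines local results; here, a weighted average of
# draws with weights given by inverse subposterior variances.
weights = np.array([1.0 / np.var(c) for c in chains])
combined = sum(w * c for w, c in zip(weights, chains)) / weights.sum()
print(f"combined posterior mean: {combined.mean():.2f}")
```

The sketch also illustrates the fragility the abstract points to: each worker only ever sees its own shard, so any mode a subposterior sampler misses is absent from the draws the server combines, and the averaging step cannot recover it.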
Computational intensity and sequential nature of estimation techniques for Bayesian methods in stati...
The advent of probabilistic programming languages has galvanized scientists to write increasingly di...
© 2018, Indian Statistical Institute. The rapid development of computing power and efficient Markov ...
Communication costs, resulting from synchronization requirements during learning, can greatly slow ...
Performing Bayesian inference via Markov chain Monte Carlo (MCMC) can be exceedingly expensive when ...
Divide-and-conquer strategies for Monte Carlo algorithms are an increasingly popular approach to mak...
While MCMC methods have become a main work-horse for Bayesian inference, scaling them to large distr...
This paper introduces the Parallel Hierarchical Sampler (PHS), a class of Markov chain Monte Carlo ...
Markov chain Monte Carlo (MCMC) algorithms have become powerful tools for Bayesian inference. Howeve...
In the big data context, traditional MCMC methods, such as Metropolis-Hastings algorithms and hybrid Mon...
We propose Subsampling MCMC, a Markov Chain Monte Carlo (MCMC) framework where the likelihood functi...
Global fits of physics models require efficient methods for exploring high-dimensional and/or multim...
Funding Information: We thank the U.S. National Science Foundation, Institute of Education Sciences,...