Bayesian coresets approximate a posterior distribution by building a small weighted subset of the data points. Any inference procedure that is too computationally expensive to run on the full posterior can instead be run inexpensively on the coreset, with results that approximate those on the full data. However, current approaches are limited either by significant run-time or by the need for the user to specify a low-cost approximation to the full posterior. We propose a Bayesian coreset construction algorithm that first selects a uniformly random subset of data, and then optimizes the weights using a novel quasi-Newton method. Our algorithm is a simple-to-implement, black-box method that does not require the user to specify a low-cost p...
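The two-stage construction described in the abstract above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: scipy's L-BFGS-B stands in for the paper's custom quasi-Newton method, and, since the true objective (a divergence between coreset and full posteriors) is intractable in general, the sketch matches the weighted coreset log-likelihood to the full-data log-likelihood over a set of parameter samples for a toy 1-D Gaussian location model.

```python
# Hypothetical sketch: (1) uniform random subsampling, then (2) quasi-Newton
# weight optimization, for a 1-D Gaussian location model where per-point
# log-likelihoods are tractable. All names here are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, M = 1000, 30                       # full data size, coreset size
x = rng.normal(2.0, 1.0, size=N)      # observed data
thetas = rng.normal(2.0, 0.5, 50)     # parameter samples to match on

# Per-point log-likelihoods (up to a constant), shape (num_thetas, N)
L = -0.5 * (thetas[:, None] - x[None, :]) ** 2
full = L.sum(axis=1)                  # full-data log-likelihood per theta

# Stage 1: select a uniformly random subset of the data
idx = rng.choice(N, size=M, replace=False)
Ls = L[:, idx]                        # (num_thetas, M)

# Stage 2: optimize nonnegative coreset weights with a quasi-Newton method,
# minimizing squared error between weighted-subset and full log-likelihoods
def obj(w):
    r = Ls @ w - full
    return 0.5 * r @ r

def grad(w):
    return Ls.T @ (Ls @ w - full)

w0 = np.full(M, N / M)                # unbiased uniform starting weights
res = minimize(obj, w0, jac=grad, method="L-BFGS-B",
               bounds=[(0.0, None)] * M)
w = res.x
print("improved over uniform weights:", obj(w) <= obj(w0))
```

The uniform subsampling stage makes the method black-box (no posterior approximation is needed to choose points), while the weight optimization corrects for the randomness of the subset.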
Traditional algorithms for Bayesian posterior inference require processing the entire dataset in eac...
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Comp...
We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterio...
Coherent uncertainty quantification is a key strength of Bayesian methods. But modern algorithms for...
A Bayesian coreset is a small, weighted subset of data that replaces the full dataset during Bayesia...
Recent advances in coreset methods have shown that a selection of representative datapoints can repl...
Standard Bayesian inference algorithms are prohibitively expensive in the regime of modern large-sca...
Abstract Scalable training of Bayesian nonparametric models is a notoriously difficult challenge. We...
This dissertation studies a general framework using spike-and-slab prior distributions to facilitate...
Sequential Monte Carlo samplers represent a compelling approach to posterior inference in Bayesian m...
We present a novel stochastic variational Gaussian process ($\mathcal{GP}$) inference method, based ...
Computational Bayesian statistics builds approximations to the posterior distribution either by sampl...
This paper introduces a framework for speeding up Bayesian inference conducted in presence of large ...
Many modern statistical applications involve inference for complicated stochastic models for which t...
Approximating probability densities is a core problem in Bayesian statistics, where the inference in...