Recent advances in coreset methods have shown that a selection of representative datapoints can replace massive volumes of data for Bayesian inference, preserving the relevant statistical information and significantly accelerating subsequent downstream tasks. Existing variational coreset constructions rely either on selecting subsets of the observed datapoints or on jointly performing approximate inference and optimizing pseudodata in the observed space, akin to inducing-point methods in Gaussian Processes. So far, both approaches are limited by the difficulty of evaluating their objectives for general-purpose models, and they require generating samples from a typically intractable posterior over the coreset throughout inference and testing. In this...
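As background for the two constructions referenced above, a weighted-subset coreset replaces the full-data posterior with a posterior conditioned on a small set of (observed or optimized) points $x_{1:M}$ with nonnegative weights $w_{1:M}$; a standard formulation, stated here only as context and not as this paper's specific construction, is
\[
\pi_w(\theta) \;\propto\; \pi_0(\theta)\,\exp\!\Big(\sum_{m=1}^{M} w_m \log p(x_m \mid \theta)\Big), \qquad M \ll N,
\]
where $\pi_0$ is the prior and the weights (and, for pseudodata approaches, the points themselves) are chosen so that $\pi_w$ approximates the full posterior $p(\theta \mid x_{1:N})$.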
Variational inference has become a widely used method to approximate posteriors in complex latent va...
Bayesian machine learning has gained tremendous attention in the machine learning community over the...
We advocate an optimization-centric view of Bayesian inference. Our inspiration is the representatio...
Variational inference (VI) or Variational Bayes (VB) is a popular alternative to MCMC, which doesn't...
We present a novel stochastic variational Gaussian process ($\mathcal{GP}$) inference method, based ...
Bayesian coresets approximate a posterior distribution by building a small weighted subset of the da...
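To make "small weighted subset of the data" concrete, below is a minimal sketch for a toy Gaussian location model; the model, the uniform $N/M$ weights, and all function names are illustrative assumptions, not the construction of any of the papers listed here.

```python
import numpy as np

def log_prior(theta):
    # Standard normal prior over a scalar location parameter (illustrative choice).
    return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

def log_lik(theta, x):
    # Per-datapoint Gaussian log-likelihood with unit variance (illustrative choice).
    return -0.5 * (x - theta)**2 - 0.5 * np.log(2 * np.pi)

def coreset_log_posterior(theta, coreset_x, weights):
    # Unnormalised coreset log-posterior: the weighted subset stands in for the full dataset.
    return log_prior(theta) + np.sum(weights * log_lik(theta, coreset_x))

def full_log_posterior(theta, data):
    # Unnormalised full-data log-posterior, for comparison.
    return log_prior(theta) + np.sum(log_lik(theta, data))

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=10_000)

# A small subset with uniform weights N / M, shown purely for illustration.
idx = rng.choice(data.size, size=20, replace=False)
coreset_x, weights = data[idx], np.full(20, data.size / 20)

theta = 1.0
print(coreset_log_posterior(theta, coreset_x, weights))
print(full_log_posterior(theta, data))
```

Uniform weights $N/M$ correspond to a plain subsampling baseline; coreset methods instead optimise the weights (and, in pseudocoreset variants, the points themselves) so that the weighted log-likelihood tracks the full-data log-likelihood more closely.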
Approximating probability densities is a core problem in Bayesian statistics, where the inference in...
Standard Bayesian inference algorithms are prohibitively expensive in the regime of modern large-sca...
We develop an optimization algorithm suitable for Bayesian learning in complex models. Our approach ...
We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterio...
Variational inference is a powerful paradigm for approximate Bayesian inference with a number of app...
Variational inference has recently emerged as a popular alternative to the classical Markov chain Mo...
A Bayesian coreset is a small, weighted subset of data that replaces the full dataset during Bayesia...
There has been considerable recent interest in Bayesian modeling of high-dimensional networks via la...
Variational Bayes methods approximate the posterior density by a family of tractable distributions a...