Traditional approaches to variational inference rely on parametric families of variational distributions, with the choice of family playing a critical role in determining the accuracy of the resulting posterior approximation. Simple mean-field families often lead to poor approximations, while rich families of distributions like normalizing flows can be difficult to optimize and usually do not incorporate the known structure of the target distribution due to their black-box nature. To expand the space of flexible variational families, we revisit Variational Rejection Sampling (VRS) [Grover et al., 2018], which combines a parametric proposal distribution with rejection sampling to define a rich non-parametric family of distributions that expl...
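The core mechanism described above, drawing from a parametric proposal and accepting or rejecting samples against the (unnormalized) target, can be illustrated with a minimal sketch. The bimodal target, the Gaussian proposal, and the threshold parameter `log_m` here are all illustrative assumptions, not the construction from the paper; the sketch only shows how a reject step turns a simple proposal into a richer implicit family.

```python
import math
import random

def log_target(z):
    # Unnormalized log-density of a toy bimodal target (illustrative only).
    return math.log(0.5 * math.exp(-0.5 * (z - 2.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (z + 2.0) ** 2))

def log_proposal(z, mu=0.0, sigma=3.0):
    # Log-density of a broad Gaussian proposal q(z).
    return -0.5 * ((z - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))

def rejection_sample(log_m, mu=0.0, sigma=3.0, rng=random):
    # Propose z ~ q and accept with probability min(1, p(z) / (e^{log_m} q(z))).
    # A smaller log_m gives a stricter test: the accepted samples track the
    # target more closely, at the cost of more rejections per draw.
    while True:
        z = rng.gauss(mu, sigma)
        log_accept = min(0.0, log_target(z) - log_m - log_proposal(z, mu, sigma))
        if math.log(rng.random()) < log_accept:
            return z

random.seed(0)
samples = [rejection_sample(log_m=2.0) for _ in range(2000)]
```

The threshold plays the role of the knob mentioned in the abstract: it trades computation for approximation quality, interpolating between the raw proposal and exact samples from the target.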
Stochastic gradient-based optimisation for discrete latent variable models is challenging due to the...
Latent Gaussian models (LGMs) are perhaps the most commonly used class of models in statistical appl...
Recent advances in coreset methods have shown that a selection of representative datapoints can repl...
Variational Bayes methods approximate the posterior density by a family of tractable distributions a...
We investigate a local reparameterization technique for greatly reducing the variance of stochastic g...
Variational inference has become a widely used method to approximate posteriors in complex latent va...
Optimization with noisy gradients has become ubiquitous in statistics and machine learning. Reparame...
Variational inference has recently emerged as a popular alternative to the classical Markov chain Mo...
Stochastic variational inference algorithms are derived for fitting various heteroskedastic time ser...
Key to effective generic, or "black-box", variational inference is the selection of an approximation...
Stochastic gradient Markov Chain Monte Carlo (SGMCMC) is considered the gold standard for Bayesian i...
Gaussian process latent variable models (GPLVM) are a flexible and non-linear approach to dimensiona...
Probability density function estimation with weighted samples is the main foundation of all adaptive...