Abstract

Stochastic variational inference makes it possible to quickly approximate posterior distributions induced by large datasets using stochastic optimization. The algorithm relies on fully factorized variational distributions, but this "mean-field" independence approximation limits the fidelity of the posterior approximation and introduces local optima. We show how to relax the mean-field approximation to allow arbitrary dependencies between global parameters and local hidden variables, producing better parameter estimates by reducing bias, sensitivity to local optima, and sensitivity to hyperparameters.
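As a notational sketch of the relaxation described above (the symbols are illustrative, not the paper's own notation): write the global parameters as $\beta$ and the local hidden variables as $z_{1:N}$. Mean-field variational inference uses a fully factorized approximation, whereas the relaxed family lets each local factor depend on the global parameters:

$$
q_{\text{mean-field}}(\beta, z_{1:N}) = q(\beta)\prod_{n=1}^{N} q(z_n),
\qquad
q_{\text{structured}}(\beta, z_{1:N}) = q(\beta)\prod_{n=1}^{N} q(z_n \mid \beta).
$$

Conditioning the local factors on $\beta$ allows the approximation to capture posterior coupling between global and local variables that the fully factorized family cannot represent.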