We present a general method for deriving collapsed variational inference algorithms for probabilistic models in the conjugate exponential family. Our method unifies many existing approaches to collapsed variational inference and leads to a new lower bound on the marginal likelihood. We exploit the information geometry of this bound to derive much faster optimization methods for these models, based on conjugate gradients. Our approach is very general and is easily applied to any model for which the mean-field update equations have been derived. Empirically, we show significant speed-ups for probabilistic inference using our bound.
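As an illustration of the two ingredients this abstract names (a collapsed bound with nuisance variables integrated out, optimized by conjugate gradients), the sketch below is a minimal toy example, not the authors' implementation: a two-component Gaussian mixture whose Dirichlet-distributed mixing weights are marginalized analytically, with the resulting bound maximized by SciPy's nonlinear conjugate-gradient routine. The fixed component means, unit variances, and the function name neg_collapsed_bound are assumptions of the sketch, and plain Euclidean CG stands in for the geometry-aware conjugate gradients the abstract alludes to.

```python
# A minimal sketch (illustrative only): a collapsed variational bound for a toy
# two-component Gaussian mixture, with the mixing weights pi integrated out
# analytically under a Dirichlet prior, maximized by conjugate gradients.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, log_softmax

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 30), rng.normal(2, 1, 30)])  # toy data
mu = np.array([-2.0, 2.0])   # fixed component means (assumption of the sketch)
alpha = np.ones(2)           # Dirichlet prior over the collapsed mixing weights

def neg_collapsed_bound(logits):
    """Negative collapsed bound L(q(z)); pi has been marginalized analytically."""
    logr = log_softmax(logits.reshape(-1, 2), axis=1)  # log responsibilities q(z)
    r = np.exp(logr)
    # Gaussian log-likelihood per point and component (unit variance)
    loglik = -0.5 * (x[:, None] - mu) ** 2 - 0.5 * np.log(2 * np.pi)
    nk = r.sum(axis=0)                                  # expected counts per component
    # log B(alpha + nk) - log B(alpha): the term left after integrating out pi
    dirichlet = (gammaln(alpha + nk).sum() - gammaln(alpha.sum() + len(x))
                 - gammaln(alpha).sum() + gammaln(alpha.sum()))
    bound = (r * (loglik - logr)).sum() + dirichlet     # likelihood + entropy + prior
    return -bound

# Nonlinear conjugate gradients over the unconstrained responsibility logits
res = minimize(neg_collapsed_bound, rng.normal(size=2 * len(x)), method="CG")
print("collapsed bound:", -res.fun)
```

The `dirichlet` term is the log-ratio of Dirichlet normalizers, log B(alpha + n) - log B(alpha), which replaces the separate prior and variational terms for pi once it is collapsed; optimizing only over q(z) is what makes the bound tighter than the standard mean-field objective.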
We consider inference in a broad class of non-conjugate probabilistic models based on minimising the...
The central objective of this thesis is to develop new algorithms for inference in probabilistic gra...
In this paper, we introduce a new form of amortized variational inference by using the forward KL di...
Variational message passing is an efficient Bayesian inference method in factorized probabilistic mo...
Variational inference is one of the tools that now lies at the heart of the modern data analysis lif...
Stochastic variational inference offers an attractive option as a defa...
Stochastic variational inference for collapsed models has recently been successfully applied to larg...
Variational Message Passing facilitates automated variational inference in factorized probabilistic ...
We propose a new variational inference method based on a proximal framework that uses the Kullback-L...
In many modern data analysis problems, the available data is not static but, instead, comes in a str...
Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for ap...
Automatic decision making and pattern recognition under uncertainty are difficult tasks that are ubi...
Variational Gaussian (VG) inference methods that optimize a lower bound to the marginal likelihood a...
One of the core problems of modern statistics is to approximate difficult-to-compute probability ...