Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, whi...
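As a rough illustration of the idea described in the abstract above (not the authors' code), the sketch below runs a single sequential Monte Carlo sweep for a toy one-dimensional linear-Gaussian state space model with a parameterized Gaussian proposal and returns log Z-hat, the log of the SMC normalizing-constant estimate. Averaging this quantity over sweeps gives the kind of stochastic lower bound on log p(y_{1:T}) that a VSMC-style method maximizes over the proposal parameters. The model, the proposal form, and all names (smc_log_zhat, proposal_means, proposal_log_vars) are illustrative assumptions, not the paper's exact setup.

import numpy as np

def log_normal_pdf(x, mean, var):
    # Log density of a univariate Gaussian, evaluated elementwise.
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def smc_log_zhat(y, proposal_means, proposal_log_vars, num_particles=8, rng=None):
    # One SMC sweep for the assumed model
    #   transition:  x_t ~ N(0.9 * x_{t-1}, 1),  observation:  y_t ~ N(x_t, 1),
    # with proposal  x_t ~ N(proposal_means[t], exp(proposal_log_vars[t])).
    # Returns log of the marginal-likelihood estimate Z-hat.
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    x = np.zeros(num_particles)
    log_zhat = 0.0
    for t in range(T):
        if t > 0:
            # Multinomial resampling of the previous particle set.
            ancestors = rng.choice(num_particles, size=num_particles, p=weights)
            x = x[ancestors]
        # Draw new particles from the variational proposal (reparameterized draw).
        prop_var = np.exp(proposal_log_vars[t])
        x_new = proposal_means[t] + np.sqrt(prop_var) * rng.standard_normal(num_particles)
        # Incremental importance weights: transition * likelihood / proposal.
        log_w = (log_normal_pdf(x_new, 0.9 * x, 1.0)
                 + log_normal_pdf(y[t], x_new, 1.0)
                 - log_normal_pdf(x_new, proposal_means[t], prop_var))
        # Accumulate log Z-hat and normalize the weights for the next resampling step.
        m = log_w.max()
        log_zhat += m + np.log(np.mean(np.exp(log_w - m)))
        weights = np.exp(log_w - m)
        weights /= weights.sum()
        x = x_new
    return log_zhat

# Usage: evaluate the surrogate objective once; in practice its expectation is
# maximized over the proposal parameters with stochastic gradients.
rng = np.random.default_rng(0)
y = rng.standard_normal(10)
print(smc_log_zhat(y, proposal_means=np.zeros(10), proposal_log_vars=np.zeros(10), rng=rng))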
Variational inference (VI) or Variational Bayes (VB) is a popular alternative to MCMC, which doesn't ...
We propose a new class of learning algorithms that combines variational approximation and Markov cha...
Stochastic variational inference (SVI) uses stochastic optimization to scale up Bayesian computation...
Recent advances in stochastic gradient variational inference have made it possible to perform variat...
Automatic decision making and pattern recognition under uncertainty are difficult tasks that are ubi...
A new transdimensional Sequential Monte Carlo (SMC) algorithm called SMCVB is proposed. In an SMC ap...
A core problem in statistics and probabilistic machine learning is to compute probability distributi...
Variational Bayesian Monte Carlo (VBMC) is a recently introduced framework that uses Gaussian proces...
Variational Inference (VI) has become a popular technique to approximate difficult-to-compute poster...
One of the core problems of modern statistics is to approximate difficult-to-compute probability ...
The central objective of this thesis is to develop new algorithms for inference in probabilistic gra...
We develop a method to combine Markov chain Monte Carlo (MCMC) and variational inference (VI), lever...