Variational inference relies on flexible approximate posterior distributions. Normalizing flows provide a general recipe to construct flexible variational posteriors. We introduce Sylvester normalizing flows, which can be seen as a generalization of planar flows. Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible. We compare the performance of Sylvester normalizing flows against planar flows and inverse autoregressive flows, and demonstrate that they compare favorably on several datasets.
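To make the bottleneck concrete, a minimal sketch of the two transformations (the symbols \(u, w, A, B, b\) and the nonlinearity \(h\) follow the standard planar-flow notation and are introduced here for illustration). A planar flow applies

\[ z' = z + u\, h(w^\top z + b), \qquad u, w \in \mathbb{R}^D,\ b \in \mathbb{R}, \]

whose flexibility is limited because \(w^\top z\) compresses \(z\) to a single scalar (the single-unit bottleneck). A Sylvester flow replaces the vectors \(u\) and \(w\) with matrices,

\[ z' = z + A\, h(Bz + b), \qquad A \in \mathbb{R}^{D \times M},\ B \in \mathbb{R}^{M \times D},\ b \in \mathbb{R}^M,\ M \le D, \]

while the Jacobian determinant remains cheap by Sylvester's determinant identity,

\[ \det\!\left(I_D + A\,\operatorname{diag}\!\big(h'(Bz + b)\big)\,B\right) = \det\!\left(I_M + \operatorname{diag}\!\big(h'(Bz + b)\big)\,B A\right), \]

which reduces a \(D \times D\) determinant to an \(M \times M\) one. In the paper, \(A\) and \(B\) are further parameterized via orthogonal and upper-triangular factors so that this determinant reduces to a product of diagonal entries and the transformation is invertible.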