The framework of normalizing flows provides a general strategy for flexible variational inference of posteriors over latent variables. We propose a new type of normalizing flow, inverse autoregressive flow (IAF), that, in contrast to earlier published flows, scales well to high-dimensional latent spaces. The proposed flow consists of a chain of invertible transformations, where each transformation is based on an autoregressive neural network. In experiments, we show that IAF significantly improves upon diagonal Gaussian approximate posteriors. In addition, we demonstrate that a novel type of variational autoencoder, coupled with IAF, is competitive with neural autoregressive models in terms of attained log-likelihood on natural images, while allowing significantly faster synthesis.
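To make the construction above concrete, here is a minimal NumPy sketch of a chain of IAF steps applied to a diagonal-Gaussian sample. The strictly lower-triangular weight matrices stand in for the autoregressive neural network of each transformation (the paper uses MADE-style masked networks conditioned on an extra context from the encoder, which is omitted here); the dimensionality, initialisation, and number of steps are illustrative assumptions, not the authors' settings.

# Minimal sketch of inverse autoregressive flow (IAF) steps in NumPy.
# Assumption: a single strictly lower-triangular linear map plays the role
# of the autoregressive network; real implementations use a separate masked
# network per step and typically reverse the variable ordering between steps.
import numpy as np

rng = np.random.default_rng(0)
D = 4  # latent dimensionality (illustrative)

# Strictly lower-triangular weights: output i depends only on z_{<i},
# which is exactly the autoregressive structure IAF relies on.
mask = np.tril(np.ones((D, D)), k=-1)
W_m, b_m = rng.normal(size=(D, D)) * mask, np.zeros(D)
W_s, b_s = rng.normal(size=(D, D)) * mask, np.full(D, 2.0)  # bias > 0 keeps sigma near 1 at init

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def iaf_step(z, log_q):
    """One IAF transformation: z -> sigma(z) * z + (1 - sigma(z)) * m(z)."""
    m = W_m @ z + b_m
    s = W_s @ z + b_s
    sigma = sigmoid(s)
    z_new = sigma * z + (1.0 - sigma) * m
    # The Jacobian is lower triangular with diagonal sigma, so the density
    # update only needs a sum of log sigma terms.
    log_q_new = log_q - np.sum(np.log(sigma))
    return z_new, log_q_new

# Start from a diagonal-Gaussian sample, as in a standard VAE posterior.
eps = rng.normal(size=D)
mu0, log_sigma0 = np.zeros(D), np.zeros(D)
z = mu0 + np.exp(log_sigma0) * eps
log_q = -0.5 * np.sum(eps**2 + 2.0 * log_sigma0 + np.log(2.0 * np.pi))

for _ in range(3):  # a chain of invertible transformations
    z, log_q = iaf_step(z, log_q)
print(z, log_q)

Because each output dimension depends only on earlier input dimensions, every step has a triangular Jacobian, so evaluating the transformed density stays cheap even for high-dimensional latent spaces, which is the scaling property the abstract emphasises.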
Bayesian methods were studied in this paper using deep neural networks. We are...
Normalizing flows are a promising avenue in both density estimation and variational inference, which ...
Time series are widely used in applications such as finance, robotics, telecommunications, astronomy...
Variational auto-encoders (VAE) are scalable and powerful generative models. However, the choice of ...
We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment th...
Many models can generate text conditioned on some context, but those approaches don’t provide us w...
Natural language processing (NLP) has pervasive applications in everyday life, and has recently witn...
The two key characteristics of a normalizing flow are that it is invertible (in particular, dimension...
Variational inference relies on flexible approximate posterior distributions. Normalizing flows prov...
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative ...
The automation of probabilistic reasoning is one of the primary aims of machine learning. Recently, ...
Flow-based generative models are an important class of exact inference models that admit efficient i...
Normalizing flows have emerged as an important family of deep neural networks for modelling complex ...
Statistical learning methods often embed the data in a latent space where the classification or regr...
The choice of approximate posterior distribution is one of the core problems in variational inference...