Variational auto-encoders (VAEs) are scalable and powerful generative models. However, the choice of the variational posterior determines the tractability and flexibility of the VAE. Commonly, latent variables are modeled with a normal distribution with a diagonal covariance matrix. This is computationally efficient but typically not flexible enough to match the true posterior distribution. One way of enriching the variational posterior is to apply normalizing flows, i.e., a series of invertible transformations to latent variables drawn from a simple posterior. In this paper, we follow this line of thinking and propose a volume-preserving flow that uses a series of Householder transformations. We show empirically...
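Since the abstract only sketches the Householder flow at a high level, a minimal NumPy sketch of the core idea may help; the function names and the randomly drawn vectors below are illustrative assumptions, not the authors' implementation (in the paper the Householder vectors would be produced by the encoder network). A Householder reflection H = I - 2 v v^T / ||v||^2 is orthogonal, so |det H| = 1 and composing such reflections yields a volume-preserving flow: the log-density of the latent variable is unchanged.

    import numpy as np

    def householder_transform(z, v):
        # Apply the Householder reflection H = I - 2 v v^T / ||v||^2 to z.
        # H is orthogonal, so the transformation is volume-preserving.
        v = v / np.linalg.norm(v)
        return z - 2.0 * v * (v @ z)

    def householder_flow(z0, vs):
        # Compose a series of reflections: z_T = H_T ... H_1 z_0.
        z = z0
        for v in vs:
            z = householder_transform(z, v)
        return z

    # Usage: a 4-dimensional latent sample passed through a 3-step flow.
    # (Random vectors stand in for encoder outputs; this is an assumption.)
    rng = np.random.default_rng(0)
    z0 = rng.standard_normal(4)
    vs = [rng.standard_normal(4) for _ in range(3)]
    zT = householder_flow(z0, vs)
    print(np.linalg.norm(z0), np.linalg.norm(zT))  # reflections preserve the norm

Because each reflection is orthogonal, no log-det-Jacobian term has to be computed or stored, which is what makes this family of flows cheap compared to general invertible transformations.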
Density estimation, compression, and data generation are crucial tasks in artificial intelligence. V...
Normalizing flow (NF) has gained popularity over traditional maximum likelihood based methods due to...
An essential prerequisite for random generation of good quality samples in Variational Autoen...
The framework of normalizing flows provides a general strategy for flexible variational inference of...
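For context (this identity is standard and not taken from any single abstract above): a normalizing flow transforms a sample z_0 from a simple posterior q_0 through invertible maps f_1, ..., f_T, and the density of the result follows from the change-of-variables formula:

    \ln q_T(z_T) = \ln q_0(z_0) - \sum_{t=1}^{T} \ln \left| \det \frac{\partial f_t}{\partial z_{t-1}} \right|

For a volume-preserving flow such as the Householder flow above, |det(∂f_t/∂z_{t-1})| = 1, so the correction term vanishes entirely.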
Variational inference relies on flexible approximate posterior distributions. Normalizing flows prov...
The choice of approximate posterior distribution is one of the core problems in variational inference...
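The variational objective these abstracts have in common is the standard evidence lower bound (ELBO); writing it out makes clear why the choice of the approximate posterior q(z|x) matters:

    \ln p(x) \ge \mathbb{E}_{q(z|x)}\left[ \ln p(x|z) \right] - \mathrm{KL}\left( q(z|x) \,\|\, p(z) \right)

The bound is tight exactly when q(z|x) equals the true posterior p(z|x), which is why a more flexible q, e.g., one enriched by a normalizing flow, can tighten it.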
Variational auto-encoders (VAEs) are a powerful approach to unsupervised learning. They enable scala...
We propose a new structure for the variational auto-encoder (VAE) prior, with the weakly informati...
Many models can generate text conditioned on some context, but those approaches don't provide us w...
The automation of probabilistic reasoning is one of the primary aims of machine learning. Recently, ...
The variational autoencoder (VAE) is a powerful generative model that can estimate the probability o...
Variational autoencoders (VAEs) are a strong family of deep generative models based on variational in...
This paper explores two useful modifications of the recent variational autoencoder (VAE), a popular ...
Variational autoencoders (VAEs), as an important aspect of generative models, have received a lot of...