We introduce a deep, generative autoencoder capable of learning hierarchies of distributed representations from data. Successive deep stochastic hidden layers are equipped with autoregressive connections, which enable the model to be sampled from quickly and exactly via ancestral sampling. We derive an efficient approximate parameter estimation method based on the minimum description length (MDL) principle, which can be seen as maximising a variational lower bound on the log-likelihood, with a feedforward neural network implementing approximate inference. We demonstrate state-of-the-art generative performance on a number of classic data sets: several UCI data sets, MNIST and Atari 2600 games.
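The abstract above describes exact ancestral sampling through stochastic hidden layers with autoregressive connections: each binary unit is drawn conditioned only on units sampled before it. A minimal sketch of sampling one such binary layer, assuming logistic (sigmoid) conditionals; the function name and the random weights/biases are illustrative, not the paper's implementation:

```python
import math
import random

def sample_autoregressive_layer(weights, biases, rng):
    """Ancestrally sample a binary vector h.

    Unit h[j] is Bernoulli with a logit that depends only on the
    already-sampled units h[0..j-1], so one left-to-right pass
    yields an exact sample (no MCMC needed).
    """
    h = []
    for j, b in enumerate(biases):
        # weights[j] holds the j connections from earlier units to unit j
        logit = b + sum(w * v for w, v in zip(weights[j], h))
        p = 1.0 / (1.0 + math.exp(-logit))  # sigmoid
        h.append(1 if rng.random() < p else 0)
    return h

rng = random.Random(0)
n = 8
# Strictly "lower-triangular" connectivity: unit j sees units 0..j-1 only.
weights = [[rng.gauss(0, 1) for _ in range(j)] for j in range(n)]
biases = [rng.gauss(0, 1) for _ in range(n)]
h = sample_autoregressive_layer(weights, biases, rng)
```

Because the conditionals factorise along the unit ordering, the same pass that samples the layer can also evaluate its exact log-probability, which is what makes the variational/MDL bound tractable to estimate.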
Despite the recent progress in machine learning and deep learning, unsupervised learning still remai...
We focus on generative autoencoders, such as variational or adversarial autoencoders, which jointly ...
A deep latent variable model is a powerful tool for modelling complex distributions. However, in ord...
We consider the problem of learning deep generative models from data. We formulate a method that ge...
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised...
There has been a lot of recent interest in designing neural network models to estimate a distributio...
When using deep, multi-layered architectures to build generative models of data, it is difficult to ...
Advances in variational inference enable parameterisation of probabilistic models by deep neural net...
Deep generative models allow us to learn hidden representations of data and generate new examples. T...
© ICLR 2016: San Juan, Puerto Rico. All Rights Reserved. We develop a scalable deep non-parametric g...
A deep generative model is characterized by a representation space, its distribution, and a neural n...
Advances in deep generative models are at the forefront of deep learning research because of the pro...
We introduce a new approach to learning in hierarchical latent-variable generative models called the...