This study investigates the effects of Markov chain Monte Carlo (MCMC) sampling in unsupervised maximum likelihood (ML) learning. We restrict our attention to the family of unnormalized probability densities whose negative log density (or energy function) is a ConvNet. We find that many of the techniques used to stabilize training in previous studies are unnecessary: ML learning with a ConvNet potential requires only a few hyperparameters and no regularization. Using this minimal framework, we identify a variety of ML learning outcomes that depend solely on the implementation of MCMC sampling. On one hand, we show that it is easy to train an energy-based model that can sample realistic images with short-run Langevin. ML can be ...
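The short-run Langevin sampling mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation, only the standard unadjusted Langevin update x_{t+1} = x_t - (eps^2/2) ∇U(x_t) + eps·N(0, I) applied for a fixed, small number of steps; the function names and the toy Gaussian energy are illustrative assumptions.

```python
import numpy as np

def langevin_sample(grad_energy, x0, n_steps=100, step_size=0.1):
    """Short-run unadjusted Langevin dynamics.

    grad_energy: gradient of the energy U (negative log density).
    Returns the state after n_steps noisy gradient-descent steps.
    """
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = np.random.randn(*x.shape)
        # Langevin update: gradient step on U plus Gaussian noise.
        x = x - 0.5 * step_size**2 * grad_energy(x) + step_size * noise
    return x

# Toy example: U(x) = ||x||^2 / 2, i.e. a standard Gaussian target.
grad_U = lambda x: x
samples = np.stack([
    langevin_sample(grad_U, np.zeros(2), n_steps=500, step_size=0.1)
    for _ in range(200)
])
```

In EBM training, `grad_energy` would be the gradient of the ConvNet potential with respect to the image, obtained by backpropagation, and the chain would typically be initialized from noise or a persistent buffer.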
Monte Carlo methods have become essential tools to solve complex Bayesian inference problems in diff...
Bayesian learning in undirected graphical models—computing posterior distributions over parameters a...
In models that define probabilities via energies, maximum likelihood learning typically involves usi...
This work presents strategies to learn an Energy-Based Model (EBM) according to the desired length o...
Energy-based models are a powerful and flexible tool for studying emergent properties in systems wit...
Due to the intractable partition function, training energy-based models (EBMs) by maximum likelihood...
Restricted Boltzmann Machines are simple and powerful generative models that c...
In this paper we propose STANLEY, a STochastic gradient ANisotropic LangEvin dYnamics, for sampling...
Carefully injected noise can speed the average convergence of Markov chain Monte Carlo (MCMC...
Generating random samples from a prescribed distribution is one of the most important and challengin...
We exhibit examples of high-dimensional unimodal posterior distributions arising in non-linear regre...
Many recent and often (Adaptive) Markov Chain Monte Carlo (A)MCMC methods are associated in practice...
Many recent and often adaptive Markov Chain Monte Carlo (MCMC) methods are associated in practice to...
Restricted Boltzmann Machines are simple and powerful generative models that can encode any complex ...
Drawing samples from a known distribution is a core computational challenge common to many disciplin...