We propose a general-purpose Bayesian inference algorithm for expensive likelihoods, replacing the stochastic term in the Langevin equation with a deterministic density gradient term. The particle density is evaluated from the current particle positions using a Normalizing Flow (NF), which is differentiable and has good generalization properties in high dimensions. We take advantage of NF preconditioning and NF-based Metropolis-Hastings updates for faster and unbiased convergence. We show on various examples that the method is competitive against state-of-the-art sampling methods.
We propose in this paper, STANLEY, a STochastic gradient ANisotropic LangEvin dYnamics, for sampling...
We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT), a method that combines a ...
The first part of this thesis concerns the inference of un-normalized statistical models. We study t...
We propose an adaptively weighted stochastic gradient Langevin dynamics algorithm (SGLD), so-called ...
This work consists of two separate parts. In the first part we extend the work on exact simulation o...
The paper proposes Metropolis adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined...
Traditional algorithms for Bayesian posterior inference require processing the entire dataset in eac...
Normalizing flows model a complex target distribution in terms of a bijective transform operating on...
Stochastic gradient Markov Chain Monte Carlo algorithms are popular samplers for approximate inferen...
In this paper, we study the computational complexity of sampling from a Bayesian posterior (or pseud...
We propose a computational method (with acronym ALDI) for sampling from a given target distribution ...
One of the many things we like about this paper is that it forces us to change our perspective on Me...
The computational cost of usual Monte Carlo methods for sampling a posteriori laws in Bayesian infer...
This paper introduces the Langevin Monte Carlo Filter (LMCF), a particle filter with a Markov chain ...
Stochastic Gradient Langevin Dynamics (SGLD) has emerged as a key MCMC algorit...