We extend the Langevin Monte Carlo (LMC) algorithm to compactly supported measures via a projection step, akin to projected Stochastic Gradient Descent (SGD). We show that (projected) LMC makes it possible to sample in polynomial time from a log-concave distribution with a smooth potential. This yields a new Markov chain for sampling from a log-concave distribution. In particular, our main result shows that when the target distribution is uniform, LMC mixes in O(n^7) steps (where n is the dimension). We also provide preliminary experimental evidence that LMC performs at least as well as hit-and-run, for which a better mixing time of O(n^4) was proved by Lovász and Vempala.
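The projected LMC update described in this abstract can be sketched as a gradient step plus Gaussian noise, followed by a projection back onto the support. This is a minimal illustrative sketch, not the authors' implementation; the function names `projected_lmc` and `proj_ball` and all parameter values are assumptions chosen for the example.

```python
import numpy as np

def projected_lmc(grad_f, project, x0, step, n_steps, seed=None):
    """Projected Langevin Monte Carlo (illustrative sketch):
    x_{k+1} = Proj_K( x_k - step * grad_f(x_k) + sqrt(2*step) * xi_k ),
    where xi_k is standard Gaussian noise and Proj_K maps onto the support K."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = project(x - step * grad_f(x) + np.sqrt(2.0 * step) * noise)
    return x

# Example: approximately sample from the uniform distribution on the unit
# Euclidean ball, where the potential is constant so its gradient is zero.
def proj_ball(y):
    return y / max(1.0, np.linalg.norm(y))

sample = projected_lmc(lambda x: np.zeros_like(x), proj_ball,
                       x0=np.zeros(3), step=0.01, n_steps=1000, seed=0)
```

The projection keeps every iterate inside the support, which is what allows the chain to target a compactly supported (here uniform) measure; for a non-uniform log-concave target one would pass the gradient of its potential instead of the zero map.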
Markov Chain Monte Carlo sampling algorithms are efficient Bayesian tools to e...
Many posteriors can be factored into a product of a conditional density which is easy to sample dire...
We consider a family of unadjusted HMC samplers, which includes standard position HMC samplers and d...
This paper presents a detailed theoretical analysis of the Langevin Monte Carl...
This paper presents two new Langevin Markov chain Monte Carlo methods that use convex analysis to s...
In this paper, two new algorithms to sample from possibly non-smooth log-conca...
A well-known first-order method for sampling from log-concave probability distributions is the Unadj...
For sampling from a log-concave density, we study implicit integrators resulting from θ-method disc...
The class of logconcave functions in R^n is a common generalization of Gaussians and of indicator fu...
We obtain quantitative bounds on the mixing properties of the Hamiltonian Monte Carlo (HMC) algorith...
The first part of this thesis concerns the inference of un-normalized statistical models. We study t...
We study the mixing time of the Metropolis-adjusted Langevin algorithm (MALA) for sampling from a lo...
Sampling from probability distributions is a problem of significant importance in Statistics and Mac...
Generating samples from multivariate distributions efficiently is an important task in Monte Carlo i...