Gradient-descent-based algorithms and their stochastic versions have widespread applications in machine learning and statistical inference. In this work we perform an analytic study of the performance of one of them, the Langevin algorithm, in the context of noisy high-dimensional inference. We employ the Langevin algorithm to sample the posterior probability measure of the spiked matrix-tensor model. The typical behaviour of this algorithm is described by a system of integro-differential equations that we call the Langevin state evolution, whose solution is compared with that of the state evolution of approximate message passing (AMP). Our results show that, remarkably, the algorithmic threshold of the Langevin a...
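The Langevin algorithm referred to above can be illustrated with a minimal sketch: overdamped Langevin dynamics discretized with a fixed step size, sampling a toy one-dimensional Gaussian target rather than the spiked matrix-tensor posterior. The step size, target, and iteration counts here are illustrative choices, not taken from the paper.

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=1e-2, n_steps=20000, rng=None):
    """Unadjusted Langevin algorithm:
    x <- x + step * grad log p(x) + sqrt(2 * step) * standard Gaussian noise.
    Returns the full trajectory of iterates."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_p(x) + np.sqrt(2 * step) * noise
        samples.append(x.copy())
    return np.array(samples)

# Toy target: standard Gaussian, so grad log p(x) = -x.
samples = langevin_sample(lambda x: -x, x0=np.zeros(1))
# After burn-in the iterates should hover near mean 0, std 1.
```

Without a Metropolis correction the chain samples from a slightly biased stationary distribution; the bias vanishes as the step size goes to zero.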
The Metropolis-adjusted Langevin (MALA) algorithm is a sampling algorithm which makes local moves by...
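A sketch of one possible MALA implementation follows: the Langevin proposal from the previous example is corrected by a Metropolis accept/reject step that accounts for the asymmetric proposal density. The Gaussian target and step size are illustrative assumptions.

```python
import numpy as np

def mala(log_p, grad_log_p, x0, step=0.5, n_steps=5000, rng=None):
    """Metropolis-adjusted Langevin algorithm: a Langevin proposal
    accepted or rejected so the chain targets p exactly."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)

    def log_q(xp, xc):
        # Log density (up to a constant) of proposing xp from xc:
        # proposal is N(xc + step * grad log p(xc), 2 * step * I).
        mu = xc + step * grad_log_p(xc)
        return -np.sum((xp - mu) ** 2) / (4.0 * step)

    samples = []
    for _ in range(n_steps):
        prop = x + step * grad_log_p(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        # Metropolis-Hastings log acceptance ratio with asymmetric proposal.
        log_alpha = log_p(prop) - log_p(x) + log_q(x, prop) - log_q(prop, x)
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples.append(x.copy())
    return np.array(samples)

# Standard Gaussian target: log p(x) = -||x||^2 / 2, grad log p(x) = -x.
samples = mala(lambda x: -0.5 * np.sum(x ** 2), lambda x: -x, np.zeros(1))
```

Unlike the unadjusted algorithm, the accept/reject step makes the stationary distribution exact for any step size, at the cost of an extra target evaluation per iteration.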
We propose an adaptively weighted stochastic gradient Langevin dynamics algorithm (SGLD), so-called ...
Stochastic gradient Markov Chain Monte Carlo algorithms are popular samplers for approximate inferen...
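The stochastic gradient Langevin dynamics variants discussed in these entries can be sketched as follows: Langevin updates driven by a rescaled minibatch estimate of the log-posterior gradient. The toy model (Gaussian data with known variance, Gaussian prior on the mean) and all hyperparameters are illustrative assumptions, not taken from any of the papers above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy problem: infer the mean theta of Gaussian data with prior N(0, 10).
data = rng.normal(2.0, 1.0, size=1000)

def sgld(data, theta0=0.0, step=1e-3, n_steps=4000, batch=50):
    """Stochastic gradient Langevin dynamics: each step uses a minibatch
    gradient of the log posterior, rescaled by n / batch, plus Gaussian noise."""
    n = len(data)
    theta = theta0
    samples = []
    for _ in range(n_steps):
        idx = rng.choice(n, size=batch, replace=False)
        grad_prior = -theta / 10.0
        # Unbiased minibatch estimate of the full log-likelihood gradient.
        grad_lik = (n / batch) * np.sum(data[idx] - theta)
        theta += 0.5 * step * (grad_prior + grad_lik) + np.sqrt(step) * rng.standard_normal()
        samples.append(theta)
    return np.array(samples)

samples = sgld(data)
# After burn-in, the chain should concentrate near the data mean (about 2 here).
```

In practice SGLD schedules the step size toward zero so that the minibatch gradient noise is eventually dominated by the injected Langevin noise; a fixed step, as here, leaves a small bias.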
We introduce a novel and efficient algorithm called the stochastic approximate gradient descent (SAG...
Stochastic Gradient Langevin Dynamics (SGLD) has emerged as a key MCMC algorit...
In this work we analyse quantitatively the interplay between the loss landscap...
Best Paper Award. One way to avoid overfitting in machine learning is to use mod...
In this paper we propose a new framework for learning from large scale datasets based on iterative l...
Applying standard Markov chain Monte Carlo (MCMC) algorithms to large data sets is computationally e...
In this paper, we study the computational complexity of sampling from a Bayesian posterior (or pseud...
We introduce a novel geometry-informed irreversible perturbation that accele...
In this paper we investigate how gradient-based algorithms such as gradient descent (GD), (multi-pas...