Year after year, the amount of data we continuously generate keeps increasing. When this trend began, the main challenge was finding a way to store such huge quantities of information. Nowadays, with the growing availability of storage, that problem is largely solved, but it leaves us a new one: finding tools that allow us to learn from these large data sets. In this thesis, a framework for Bayesian learning that scales to large data sets is studied. We present the Stochastic Gradient Langevin Dynamics (SGLD) framework and show that in some cases its approximation of the posterior distribution is quite poor. A reason for this can be that SGLD estimates the gradient of the log-likelihood with high variabil...
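Since the abstracts in this collection center on the SGLD update rule, a minimal sketch may help fix ideas. The toy example below (all names, the model, and the parameter values are illustrative assumptions, not taken from any of the works cited here) runs SGLD on the mean of a unit-variance Gaussian: each step combines the prior gradient, a minibatch estimate of the likelihood gradient rescaled by N/n, and injected Gaussian noise whose variance equals the step size. The minibatch rescaling factor N/n is exactly the source of the high gradient variability discussed in the thesis abstract above.

```python
import math
import random

random.seed(0)

# Synthetic data: N observations from a unit-variance Gaussian with true mean 2.0.
N = 1000
data = [random.gauss(2.0, 1.0) for _ in range(N)]

def sgld_sample(data, n_iters=5000, batch_size=32, step=1e-3, prior_var=10.0):
    """Minimal SGLD for the mean of a unit-variance Gaussian (illustrative sketch).

    Model: x_i ~ Normal(theta, 1), prior theta ~ Normal(0, prior_var).
    Update: theta += (step/2) * grad_log_posterior + Normal(0, sqrt(step)).
    """
    n_data = len(data)
    theta = 0.0
    samples = []
    for _ in range(n_iters):
        batch = random.sample(data, batch_size)
        # Stochastic gradient of the log posterior: prior term plus the
        # minibatch likelihood term rescaled by N/n to stay unbiased.
        grad = -theta / prior_var + (n_data / batch_size) * sum(x - theta for x in batch)
        # Langevin step: half-step along the gradient plus Gaussian noise
        # with variance equal to the step size.
        theta += 0.5 * step * grad + random.gauss(0.0, math.sqrt(step))
        samples.append(theta)
    return samples

samples = sgld_sample(data)
# Discard the first half as burn-in and average the rest.
burn_in = samples[len(samples) // 2:]
post_mean = sum(burn_in) / len(burn_in)
```

With this step size the injected noise has standard deviation sqrt(1e-3) ≈ 0.032, on the order of the true posterior standard deviation 1/sqrt(N); the minibatch gradient noise, however, contributes comparably or more per step, which illustrates why the SGLD posterior approximation can be poor when the gradient estimator has high variance.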
Stochastic gradient MCMC methods, such as stochastic gradient Langevin dynamics (SGLD), employ fast ...
It is well known that Markov chain Monte Carlo (MCMC) methods scale poorly with dataset size. A popu...
Despite the powerful advantages of Bayesian inference such as quantifying uncertainty, accurate av...
Stochastic Gradient Langevin Dynamics (SGLD) has emerged as a key MCMC algorit...
In this paper we propose a new framework for learning from large scale datasets based on iterative l...
Best Paper Award. One way to avoid overfitting in machine learning is to use mod...
Applying standard Markov chain Monte Carlo (MCMC) algorithms to large data sets is computationally e...
We propose an adaptively weighted stochastic gradient Langevin dynamics algorithm (SGLD), so-called ...