This thesis presents work that applies Machine Learning techniques to a variety of sampling problems which appear in physics. The first Chapter presents background on Machine Learning, laying the foundations required for the later Chapters. Next we look at how a specific Machine Learning model, the Restricted Boltzmann Machine, can be trained to approximate a target distribution from data that has already been sampled from that distribution. We estimate observables on states sampled from the trained models and compare them to observables estimated directly from the training data. We also present a technique for estimating the likelihood function of the model using annealed importance sampling. Finally we presen...
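To make the mention of annealed importance sampling concrete, the following is a minimal NumPy sketch (not the implementation used in the thesis) of an AIS estimate of the partition function Z of a binary Restricted Boltzmann Machine, from which the model log-likelihood of a state v follows as log p*(v) - log Z. The particular annealing path, which scales the weights and hidden biases by an inverse temperature while keeping the visible biases fixed, as well as all function names and parameter values, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def log_p_star(v, W, b, c, beta):
    """Log unnormalised marginal of the tempered RBM with weights beta*W,
    hidden biases beta*c and fixed visible biases b:
        log p*_beta(v) = v.b + sum_j softplus(beta * (v W + c)_j)."""
    return v @ b + np.sum(np.logaddexp(0.0, beta * (v @ W + c)), axis=-1)


def ais_log_z(W, b, c, n_chains=100, n_betas=1000):
    """Annealed importance sampling estimate of log Z for a binary RBM.

    At beta = 0 the model factorises over the visible units, so its
    partition function is exact:
        log Z_0 = sum_i softplus(b_i) + n_hid * log 2.
    """
    n_vis, n_hid = W.shape
    betas = np.linspace(0.0, 1.0, n_betas)

    log_z_base = np.sum(np.logaddexp(0.0, b)) + n_hid * np.log(2.0)

    # Exact samples from the base model: independent Bernoulli visible units.
    v = (rng.random((n_chains, n_vis)) < sigmoid(b)).astype(float)
    log_w = np.zeros(n_chains)

    for beta_prev, beta in zip(betas[:-1], betas[1:]):
        # Accumulate importance weights from the ratio of unnormalised
        # marginals at consecutive inverse temperatures.
        log_w += log_p_star(v, W, b, c, beta) - log_p_star(v, W, b, c, beta_prev)

        # One block-Gibbs sweep that leaves the tempered RBM invariant.
        h = (rng.random((n_chains, n_hid)) < sigmoid(beta * (v @ W + c))).astype(float)
        v = (rng.random((n_chains, n_vis)) < sigmoid(beta * (h @ W.T) + b)).astype(float)

    # log Z  ~=  log Z_0 + log( mean over chains of the importance weights ).
    return log_z_base + np.logaddexp.reduce(log_w) - np.log(n_chains)
```

With such an estimate in hand, the log-likelihood of a training state v would be approximated as log_p_star(v, W, b, c, 1.0) - ais_log_z(W, b, c); in practice many more chains and intermediate temperatures are used, and the variance of the estimator must be monitored.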