This thesis presents five contributions to machine learning, with themes of differentiability and Bayesian inference. We present Firefly Monte Carlo, an auxiliary variable Markov chain Monte Carlo algorithm that only queries a potentially small subset of data at each iteration yet simulates from the exact posterior distribution. We describe the design and implementation of Autograd, a software package for efficiently computing derivatives of functions written in Python/Numpy using reverse accumulation mode differentiation. Using Autograd, we develop a convolutional neural network that takes arbitrary graphs, such as organic molecules, as input. This generalizes standard molecular feature representations and allows end-to-end adaptation o...
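The reverse accumulation mode mentioned above can be illustrated with a minimal, self-contained sketch. This is not Autograd's actual implementation; it is a toy tracer (the `Var` class, `exp`, and `backward` are hypothetical names chosen for this example) that records local derivatives during the forward pass and accumulates gradients in a single reverse sweep over the computation graph.

```python
import math

class Var:
    """A value in the computation graph, tracking how to backpropagate."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents      # list of (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def exp(x):
    e = math.exp(x.value)
    return Var(e, [(x, e)])        # d/dx exp(x) = exp(x)

def backward(out):
    """Reverse pass: accumulate d(out)/d(node) into each node's .grad."""
    out.grad = 1.0
    order, seen = [], set()
    def visit(v):                   # topological order via depth-first search
        if id(v) not in seen:
            seen.add(id(v))
            for p, _ in v.parents:
                visit(p)
            order.append(v)
    visit(out)
    for v in reversed(order):       # chain rule, applied in reverse
        for p, local in v.parents:
            p.grad += v.grad * local

# d/dx of f(x) = x*x + exp(x) at x = 2 is 2x + e^x
x = Var(2.0)
y = x * x + exp(x)
backward(y)
print(x.grad)   # 2*2 + e^2 ≈ 11.389
```

A real system like Autograd works on the same principle but traces NumPy operations transparently, so existing Python/Numpy code can be differentiated without modification.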
Monte Carlo methods are a ubiquitous tool in modern statistics. Under the Bayesian paradigm, th...
Probabilistic modeling lets us infer, predict and make decisions based on incomplete or noisy data. ...
We propose Pathfinder, a variational method for approximately sampling from differentiable probabili...
Automatic decision making and pattern recognition under uncertainty are difficult tasks that are ubi...
A variety of machine learning problems can be viewed in a unified way as optimizing a set of variables that...
Differentiable programming has emerged as a key programming paradigm empowering rapid developments o...
While modern machine learning and deep learning seem to dominate the areas where scalability and mod...
A general approach to Bayesian learning revisits some classical results, which study which functiona...
This thesis develops new methods for efficient approximate inference in probabilistic models. Such m...
Bayesian machine learning has gained tremendous attention in the machine learning community over the...
This tutorial text gives a unifying perspective on machine learning by covering both probabilistic a...
Bayesian optimization has proven to be a highly effective methodology for the global optimization of...
Discrete expectations arise in various machine learning tasks, and we often need to backpropagate th...
Variational inference is one of the tools that now lies at the heart of the modern data analysis lif...
The deep learning community has devised a diverse set of methods to make gradient optimization, usin...