We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterior linearisation (PL) as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution. This viewpoint explicitly casts inference algorithms under the framework of numerical optimisation. We show that common approximations to Newton's method from the optimisation literature, namely Gauss-Newton and quasi-Newton methods (e.g., the BFGS algorithm), are still valid under this 'Bayes-Newton' framework. This leads to a suite of novel algorithms which are guaranteed to result in positive semi-definite (PSD) covariance matrices, unlike standard VI and EP. Our unifying viewpoint provides new insights into the connections between various inference schemes.
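To make the Gauss-Newton point concrete, here is a minimal numpy sketch (ours, not the paper's implementation) for a toy nonlinear-Gaussian likelihood with a Gaussian prior: the exact negative log-likelihood Hessian can go negative, while its Gauss-Newton surrogate is nonnegative by construction, so the resulting precision stays PSD. The model, kernel, and all names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i = g(f_i) + noise, with a Gaussian prior f ~ N(0, K).
g, dg, d2g = np.sin, np.cos, lambda f: -np.sin(f)
s2, n = 0.1, 20
t = np.arange(n)
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 3.0) ** 2) + 1e-6 * np.eye(n)
f_true = rng.multivariate_normal(np.zeros(n), K)
y = g(f_true) + np.sqrt(s2) * rng.normal(size=n)

Kinv = np.linalg.inv(K)
m = np.zeros(n)                      # mean of the approximation q(f) = N(m, C)
for _ in range(20):
    r = y - g(m)
    grad = dg(m) * r / s2            # gradient of log p(y | f) at m
    # Exact negative Hessian of log p(y | f): indefinite whenever the
    # residual term r * g''(m) dominates.
    W_newton = (dg(m) ** 2 - r * d2g(m)) / s2
    # Gauss-Newton surrogate: drops the residual term, so W_gn >= 0 and the
    # precision K^-1 + diag(W_gn) is PSD by construction.
    W_gn = dg(m) ** 2 / s2
    P = Kinv + np.diag(W_gn)
    m = np.linalg.solve(P, W_gn * m + grad)   # Newton-style update of the mean

C = np.linalg.inv(P)                 # PSD covariance of the approximation
print("min exact-Newton weight:", W_newton.min())   # can be negative
print("min Gauss-Newton weight:", W_gn.min())       # always >= 0
```

In the scalar weights above, PSD-ness reduces to nonnegativity; with full Jacobians the same argument applies, since a Gauss-Newton term of the form J.T @ W @ J is PSD for any J when W is.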
Mean-field variational inference is a method for approximate Bayesian posterior inference. It approximates...
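For readers new to the idea, a minimal coordinate-ascent (CAVI) sketch on the textbook unknown-mean, unknown-precision Gaussian model shows what a factorised approximation q(mu)q(tau) looks like in practice; the model and priors are the standard illustrative choice, not taken from the paper behind this abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.5, size=200)       # observed data
N, xbar = x.size, x.mean()

# Priors: mu ~ N(mu0, (lam0 * tau)^-1), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

E_tau = a0 / b0                          # initial guess for E_q[tau]
for _ in range(100):
    # q(mu) = N(mu_n, 1/lam_n): depends on the current E_q[tau]
    mu_n = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_n = (lam0 + N) * E_tau
    # q(tau) = Gamma(a_n, b_n): depends on the current moments of q(mu)
    E_mu, E_mu2 = mu_n, mu_n ** 2 + 1.0 / lam_n
    a_n = a0 + (N + 1) / 2.0
    b_n = b0 + 0.5 * (np.sum(x ** 2) - 2 * E_mu * np.sum(x) + N * E_mu2
                      + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0 ** 2))
    E_tau = a_n / b_n

print("posterior mean of mu ~", mu_n, "; of tau ~", E_tau)
```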
The Bayesian Conjugate Gradient method (BayesCG) is a probabilistic generalization of the Conjugate Gradient method (CG)...
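As a reference point, the deterministic CG iteration being generalised looks as follows; BayesCG additionally places a Gaussian prior on the solution and, after m iterations, returns a posterior over it rather than a point estimate. This sketch is plain CG, not BayesCG itself.

```python
import numpy as np

def cg(A, b, x0=None, tol=1e-8, maxiter=None):
    """Solve A x = b for symmetric positive-definite A."""
    n = b.size
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(maxiter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p      # new direction, A-conjugate to the old
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(cg(A, b))   # ~ [0.0909, 0.6364]
```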
The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent...
The results in this thesis are based on applications of the expectation propagation algorithm to approximate...
We develop a fast and accurate approach to approximate posterior distributions in the Bayesian empirical...
Variational Bayes (VB) has been proposed as a method to facilitate calculations of the posterior distribution...
This manuscript proposes a probabilistic framework for algorithms that iteratively solve unconstrained...
Variational Inference (VI) has become a popular technique to approximate difficult-to-compute posterior...
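As a concrete instance of the technique, here is a minimal reparameterisation-trick sketch: fit q(z) = N(m, s^2) to an unnormalised target by stochastic gradient ascent on the ELBO. The target density, step size, and sample size are illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(3)
log_p = lambda z: -0.25 * z ** 4           # unnormalised target log-density
dlog_p = lambda z: -z ** 3

m, ls = 1.0, 0.0                           # variational params: mean, log-std
lr = 1e-2
for _ in range(2000):
    eps = rng.normal(size=64)
    z = m + np.exp(ls) * eps               # reparameterised samples z ~ q
    # Monte Carlo ELBO gradients; the Gaussian entropy contributes
    # d/d(ls) [ls + const] = 1 to the log-std gradient.
    g_m = dlog_p(z).mean()
    g_ls = (dlog_p(z) * np.exp(ls) * eps).mean() + 1.0
    m, ls = m + lr * g_m, ls + lr * g_ls

print("fitted q: mean", m, "std", np.exp(ls))
```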
We propose Pathfinder, a variational method for approximately sampling from differentiable probability densities...
Variational methods for approximate Bayesian inference provide fast, flexible, deterministic alternatives...
Particle Metropolis-Hastings enables Bayesian parameter inference in general nonlinear state space models...
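To show the mechanics, here is a skeletal particle-marginal Metropolis-Hastings (PMMH) sketch for a toy AR(1)-plus-noise model: a bootstrap particle filter supplies an unbiased likelihood estimate, which is plugged into a random-walk Metropolis-Hastings acceptance ratio (a flat prior on the parameter is assumed). The model and tuning constants are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(phi, q, r, T):
    """Toy model: x_t = phi x_{t-1} + N(0, q), y_t = x_t + N(0, r)."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + np.sqrt(q) * rng.normal()
    return x, x + np.sqrt(r) * rng.normal(size=T)

def pf_loglik(y, phi, q, r, n_part=200):
    """Bootstrap particle filter estimate of log p(y | phi, q, r)."""
    xp = np.zeros(n_part)                     # particles
    ll = 0.0
    for obs in y:
        xp = phi * xp + np.sqrt(q) * rng.normal(size=n_part)   # propagate
        logw = -0.5 * (obs - xp) ** 2 / r - 0.5 * np.log(2 * np.pi * r)
        mx = logw.max()
        w = np.exp(logw - mx)
        ll += mx + np.log(w.mean())           # log of the average weight
        xp = xp[rng.choice(n_part, n_part, p=w / w.sum())]     # resample
    return ll

_, y = simulate(0.8, 0.5, 0.5, T=100)         # data generated with phi = 0.8
phi, ll = 0.5, pf_loglik(y, 0.5, 0.5, 0.5)
chain = []
for _ in range(500):
    phi_prop = phi + 0.05 * rng.normal()      # random-walk proposal
    ll_prop = pf_loglik(y, phi_prop, 0.5, 0.5)
    if np.log(rng.uniform()) < ll_prop - ll:  # flat prior on phi assumed
        phi, ll = phi_prop, ll_prop
    chain.append(phi)
print("posterior mean of phi ~", np.mean(chain[100:]))
```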
We present a novel method for approximate inference. Using some of the constructs from expectation propagation...