Learning is the ability to generalise beyond training examples; but because many generalisations are consistent with a given set of observations, all machine learning methods rely on inductive biases to select certain generalisations over others. This thesis explores how model structure and priors affect the inductive biases of probabilistic models, and our ability to learn and make inferences from data. Specifically, we present theoretical analyses alongside algorithmic and modelling advances in three areas of probabilistic machine learning: sparse Gaussian process approximations and invariant covariance functions, learning flexible priors for variational autoencoders, and probabilistic approaches for few-shot learning. As inference i...
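To make the first of these topics concrete, below is a minimal sketch of a sparse Gaussian process approximation: the collapsed variational lower bound of Titsias (2009) for GP regression with inducing inputs. This is illustrative only and not code from the thesis; the kernel choice, helper names (rbf, sparse_gp_elbo), and toy data are assumptions made for the example.

```python
# Minimal sketch (assumed, not the thesis code) of the Titsias (2009) collapsed
# variational bound for sparse GP regression with a squared-exponential kernel.
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance k(a, b) = s^2 exp(-||a - b||^2 / (2 l^2))."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_elbo(X, y, Z, noise_var=0.1, **kern):
    """Collapsed bound: log N(y | 0, Qnn + s2 I) - tr(Knn - Qnn) / (2 s2),
    where Qnn = Knm Kmm^{-1} Kmn and Z are the m inducing inputs."""
    n, m = X.shape[0], Z.shape[0]
    Kmm = rbf(Z, Z, **kern) + 1e-6 * np.eye(m)   # jitter for stability
    Kmn = rbf(Z, X, **kern)
    knn_diag = np.full(n, kern.get("variance", 1.0))
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Kmn)                  # A.T @ A = Qnn
    B = np.eye(m) + A @ A.T / noise_var          # matrix-inversion-lemma factor
    LB = np.linalg.cholesky(B)
    c = np.linalg.solve(LB, A @ y) / noise_var
    # log N(y | 0, Qnn + s2 I) via the determinant and matrix-inversion lemmas
    log_det = 2 * np.sum(np.log(np.diag(LB))) + n * np.log(noise_var)
    quad = (y @ y) / noise_var - c @ c
    log_marg = -0.5 * (n * np.log(2 * np.pi) + log_det + quad)
    trace_term = 0.5 * (np.sum(knn_diag) - np.sum(A**2)) / noise_var
    return log_marg - trace_term

# Toy usage: 200 noisy sine observations, 15 inducing inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 15)[:, None]
print(sparse_gp_elbo(X, y, Z))
```

Maximising this bound with respect to the inducing inputs Z, kernel hyperparameters, and noise variance recovers the standard sparse variational GP approximation at O(nm^2) cost rather than the O(n^3) cost of exact GP regression.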