Gaussian process classification is a popular method with a number of appealing properties. We show how to scale the model within a variational inducing point framework, outperforming the state of the art on benchmark datasets. Importantly, the variational formulation can be exploited to allow classification in problems with millions of data points, as we demonstrate in experiments. JH was supported by an MRC fellowship, AM and ZG by EPSRC grant EP/I036575/1, and a Google Focussed Research award. This is the final version of the article. It was first available from JMLR via http://jmlr.org/proceedings/papers/v38/hensman15.pd
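The variational inducing point framework mentioned in this abstract can be illustrated with a minimal NumPy sketch. For simplicity this sketch uses the Gaussian-likelihood (regression) collapsed bound of Titsias rather than the non-conjugate classification bound the abstract refers to; the function names, the RBF kernel choice, and the unit signal variance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def inducing_point_bound(X, y, Z, noise=0.1):
    # Collapsed variational lower bound on log p(y) with inducing inputs Z
    # (Titsias-style, Gaussian likelihood). The classification setting
    # replaces the Gaussian likelihood with a Bernoulli one and optimizes
    # an uncollapsed bound stochastically.
    n = X.shape[0]
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))      # jitter for stability
    Kmn = rbf(Z, X)
    Qnn = Kmn.T @ np.linalg.solve(Kmm, Kmn)      # Nystrom approximation of Knn
    cov = Qnn + noise * np.eye(n)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_gauss = -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
    # Trace correction: penalizes variance the inducing points fail to explain
    # (diagonal of Knn is the unit signal variance here).
    trace_term = -0.5 / noise * (n * 1.0 - np.trace(Qnn))
    return log_gauss + trace_term
```

Because this is a lower bound on the exact log marginal likelihood, placing an inducing point on every training input recovers the exact value (up to jitter), while a smaller set of inducing points trades tightness for O(nm²) cost.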
Variational methods have been recently considered for scaling the training process of Gaussian proce...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
We propose a scalable stochastic variational approach to GP classification building on Pólya-Gamma d...
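The Pólya-Gamma augmentation this abstract builds on rests on an integral identity that turns the logistic likelihood into a conditionally Gaussian form. A quick Monte Carlo sanity check of the core identity, E[exp(-omega * psi^2 / 2)] = 1 / cosh(psi / 2) for omega ~ PG(1, 0), can be sketched as follows; the truncated-series sampler is a simple illustrative approximation, not the sampler used in practice, and the function names are made up for this sketch.

```python
import numpy as np

def sample_pg1(rng, size, terms=100):
    # Approximate PG(1, 0) draws via the truncated infinite-sum representation
    # omega = (1 / (2 pi^2)) * sum_k g_k / (k - 1/2)^2, g_k ~ Exponential(1).
    k = np.arange(1, terms + 1)
    g = rng.exponential(size=(size, terms))
    return (g / (k - 0.5) ** 2).sum(axis=1) / (2 * np.pi ** 2)

def check_pg_identity(psi, n=50_000, seed=0):
    # Returns (Monte Carlo estimate, exact value) for the Laplace-transform
    # identity underlying the augmentation.
    rng = np.random.default_rng(seed)
    omega = sample_pg1(rng, n)
    return np.exp(-omega * psi ** 2 / 2).mean(), 1.0 / np.cosh(psi / 2)
```

Conditioned on omega, the psi-dependence of the likelihood is Gaussian, which is what makes conjugate variational updates possible in the augmented model.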
This paper introduces a novel Gaussian process (GP) classification method that combines advantages o...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
We address the limitations of Gaussian processes for multiclass classification in the setting where ...
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filteri...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
© ICLR 2016: San Juan, Puerto Rico. We develop a scalable deep non-parametric g...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points ...
We explore ways to scale Gaussian processes (GP) to large datasets. Two methods with different theor...