Gaussian process classification is a popular method with a number of appealing properties. We show how to scale the model within a variational inducing point framework, outperforming the state of the art on benchmark datasets. Importantly, the variational formulation can be exploited to allow classification in problems with millions of data points, as we demonstrate in experiments.
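A minimal sketch of the bound the abstract refers to, written in the standard sparse variational GP (SVGP) notation rather than symbols defined in the abstract itself: u = f(Z) denotes M inducing variables at inducing inputs Z, and q(u) = N(m, S) is the variational distribution.

\mathcal{L} \;=\; \sum_{n=1}^{N} \mathbb{E}_{q(f_n)}\!\left[\log p(y_n \mid f_n)\right] \;-\; \mathrm{KL}\!\left[\, q(u) \,\|\, p(u) \,\right] \;\le\; \log p(\mathbf{y})

The bound is maximised with respect to m, S, Z and the kernel hyperparameters; because it decomposes as a sum over data points, it admits stochastic (minibatch) optimisation, which is what allows classification with millions of observations.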
We develop a scalable deep non-parametric g...
High dimensional time series are endemic in applications of machine learning such as robotics (senso...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
Gaussian process classification is a popular method with a number of appealing properties. We show h...
Gaussian process classification is a popular method with a number of appealing properties. We show ...
We propose a scalable stochastic variational approach to GP classification building on Pólya-Gamma d...
This paper introduces a novel Gaussian process (GP) classification method that combines advantages o...
We address the limitations of Gaussian processes for multiclass classification in the setting where ...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
Variational methods have been recently considered for scaling the training process of Gaussian proce...
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filteri...
We explore ways to scale Gaussian processes (GP) to large datasets. Two methods with different theor...
Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points ...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...