We introduce stochastic variational inference for Gaussian process (GP) models, enabling the application of GP models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables, which factorize the model in the manner necessary to perform variational inference. Our approach extends readily to models with non-Gaussian likelihoods and to latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real-world data sets.
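The key property described in this abstract is that, once the model is conditioned on a small set of global inducing variables, the variational bound becomes a sum over individual data points plus a single KL regularizer, so it can be estimated without bias from minibatches. A minimal numpy sketch of that structure is given below; the toy data, kernel choice, and all variational parameter values are hypothetical illustrations, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

# Toy data: N points, summarized through M << N inducing inputs Z.
N, M, sigma2 = 200, 10, 0.1
X = rng.uniform(-3.0, 3.0, size=(N, 1))
y = np.sin(X[:, 0]) + np.sqrt(sigma2) * rng.standard_normal(N)
Z = np.linspace(-3.0, 3.0, M)[:, None]

# Variational posterior q(u) = N(m, S) over the inducing variables
# (arbitrary illustrative values; in practice these are optimized).
m = 0.1 * rng.standard_normal(M)
S = 0.1 * np.eye(M)

Kmm = rbf(Z, Z) + 1e-6 * np.eye(M)
Knm = rbf(X, Z)
A = Knm @ np.linalg.solve(Kmm, np.eye(M))      # Knm Kmm^{-1}

def expected_loglik(idx):
    """Per-point E_q[log p(y_i | f_i)] for a Gaussian likelihood.

    Each term depends only on (x_i, y_i) and the global (m, S), so the
    bound is a plain sum over points and admits minibatch estimates."""
    mu = A[idx] @ m
    # Marginal variance of q(f_i): k_ii - k_i^T Kmm^{-1} k_i + a_i^T S a_i.
    var_f = (1.0 - np.einsum('nm,nm->n', A[idx], Knm[idx])
             + np.einsum('nm,mk,nk->n', A[idx], S, A[idx]))
    return (-0.5 * np.log(2 * np.pi * sigma2)
            - 0.5 * ((y[idx] - mu) ** 2 + var_f) / sigma2)

def kl_qu_pu():
    """KL( N(m, S) || N(0, Kmm) ): the single global regularizer."""
    Kinv = np.linalg.solve(Kmm, np.eye(M))
    return 0.5 * (np.trace(Kinv @ S) + m @ Kinv @ m - M
                  + np.linalg.slogdet(Kmm)[1] - np.linalg.slogdet(S)[1])

full_bound = expected_loglik(np.arange(N)).sum() - kl_qu_pu()

# Unbiased stochastic estimate: rescale a batch of size B by N / B.
B = 20
batch = rng.choice(N, size=B, replace=False)
stoch_bound = (N / B) * expected_loglik(batch).sum() - kl_qu_pu()
```

Because the likelihood term decomposes point-wise, each SVI step can touch only a minibatch of the data while updating the global variational parameters (m, S), which is what makes scaling to millions of points feasible.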
Gaussian process classification is a popular method with a number of appealing properties. We show h...
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filteri...
stitute two of the most important foci of modern machine learning research. In this preliminary work...
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have b...
A natural extension to standard Gaussian process (GP) regression is the use of non-stationary Gaussi...
Gaussian processes (GPs) are a good choice for function approximation as they are flexible, robust t...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dime...
In this article, we propose a scalable Gaussian process (GP) regression method that combines the adv...
This paper presents a novel variational inference framework for deriving a family of Bayesian sparse...
Temporal data modeling plays a vital role in various research including finance, environmental scien...
Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but inference in these models...
We introduce a variational inference framework for training the Gaussian process latent variable mod...