Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have been applied to both regression and non-linear dimensionality reduction, and offer desirable properties such as uncertainty estimates, robustness to over-fitting, and principled ways for tuning hyper-parameters. However, the scalability of these models to big datasets remains an active topic of research. We introduce a novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm. This is done by exploiting the decoupling of the data given the inducing points to re-formulate the evidence lower bound in a Map-Reduce setting. We show that the inference scal...
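The decomposition described in this abstract (the data decouple given the inducing points, so the collapsed evidence lower bound depends on the data only through a handful of summed statistics) is what makes a Map step over data partitions and a Reduce step that sums the partial statistics possible. Below is a minimal NumPy sketch of that idea, assuming a squared-exponential kernel and illustrative names (rbf, map_step, reduce_step, collapsed_bound); it is not the authors' implementation, only a sketch of a Titsias-style collapsed bound evaluated from pooled per-partition statistics with fixed, default hyper-parameters.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def map_step(X_part, y_part, Z):
    """Per-partition sufficient statistics; each worker only touches its own shard."""
    Knm = rbf(X_part, Z)                                  # cross-covariances for this shard
    return {
        "A": Knm.T @ Knm,                                 # sum_i k_m(x_i) k_m(x_i)^T
        "b": Knm.T @ y_part,                              # sum_i k_m(x_i) y_i
        "yy": float(y_part @ y_part),                     # sum_i y_i^2
        "tr": float(np.trace(rbf(X_part, X_part))),       # sum_i k(x_i, x_i)
        "n": len(y_part),
    }

def reduce_step(stats):
    """Combine partition statistics by summation (the Reduce step)."""
    out = stats[0]
    for s in stats[1:]:
        out = {k: out[k] + s[k] for k in out}
    return out

def collapsed_bound(stats, Z, sigma2):
    """Titsias-style collapsed lower bound evaluated from the pooled statistics."""
    m = Z.shape[0]
    Kmm = rbf(Z, Z) + 1e-6 * np.eye(m)                    # jitter for numerical stability
    C = Kmm + stats["A"] / sigma2
    L = np.linalg.cholesky(C)
    Lmm = np.linalg.cholesky(Kmm)
    alpha = np.linalg.solve(L, stats["b"]) / sigma2
    n = stats["n"]
    logdet = (2.0 * np.sum(np.log(np.diag(L)))
              - 2.0 * np.sum(np.log(np.diag(Lmm)))
              + n * np.log(sigma2))
    quad = stats["yy"] / sigma2 - alpha @ alpha
    trace_term = (stats["tr"] - np.trace(np.linalg.solve(Kmm, stats["A"]))) / sigma2
    return -0.5 * (n * np.log(2.0 * np.pi) + logdet + quad + trace_term)

# Example driver: in a real deployment each map_step call would run on a separate worker.
rng = np.random.default_rng(0)
Z = rng.standard_normal((10, 2))                          # inducing inputs
shards = [(rng.standard_normal((50, 2)), rng.standard_normal(50)) for _ in range(4)]
pooled = reduce_step([map_step(X, y, Z) for X, y in shards])
print(collapsed_bound(pooled, Z, sigma2=0.1))
```

A full implementation would thread the kernel hyper-parameters through map_step and collapsed_bound and differentiate the bound with respect to them, the inducing inputs Z, and the noise variance sigma2; only the summed statistics ever need to leave the workers.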
While much research effort has been dedicated to scaling up sparse Gaussian process (GP) models base...
In this paper a sparse approximation of inference for multi-output Gaussian Pr...
Statistical inference for functions is an important topic for regression and classification problems...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
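The property that makes stochastic (minibatch) optimisation possible here is that, when the distribution over the inducing variables q(u) is kept explicit rather than collapsed out, the bound becomes a sum over individual observations. A standard way to write this uncollapsed bound (stated from the usual formulation, not recovered from the truncated abstract) is

\[
\mathcal{L} \;=\; \sum_{i=1}^{n} \mathbb{E}_{q(f_i)}\!\big[\log p(y_i \mid f_i)\big] \;-\; \mathrm{KL}\!\big[q(\mathbf{u}) \,\|\, p(\mathbf{u})\big],
\qquad
q(f_i) = \int p(f_i \mid \mathbf{u})\, q(\mathbf{u})\, d\mathbf{u},
\]

so the first term can be estimated without bias from a random minibatch of data points, which is what permits stochastic gradient updates of the variational and kernel parameters.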
In this article, we propose a scalable Gaussian process (GP) regression method that combines the adv...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
This paper presents a novel variational inference framework for deriving a family of Bayesian sparse...
We develop a fast deterministic variational approximation scheme for Gaussian process (GP) reg...
A natural extension to standard Gaussian process (GP) regression is the use of non-stationary Gaussi...
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dime...
Heteroscedastic regression, which accounts for varying noise levels across observations, has many applications i...