Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits. In this paper, we improve their computational efficiency by using a dual parameterization in which each data example is assigned dual parameters, analogous to the site parameters used in expectation propagation. Our dual parameterization speeds up inference via natural gradient descent and provides a tighter evidence lower bound for hyperparameter learning. The approach has the same memory cost as current SVGP methods, but it is faster and more accurate.
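The abstract summarizes the construction without spelling it out, so the following is a minimal sketch of what a per-example dual parameterization can look like. It assumes an EP-style Gaussian-site form, q(u) ∝ p(u) ∏_i exp(α_i a_iᵀu − ½ β_i (a_iᵀu)²) with a_i = K_zz⁻¹ k_z(x_i), which is one standard way to tie two scalar parameters (α_i, β_i) to each data example. The function names, the RBF kernel choice, and this exact site form are illustrative assumptions, not necessarily the paper's parameterization.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix (illustrative kernel choice)."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def q_from_dual(X, Z, alpha, beta, jitter=1e-6):
    """Recover q(u) = N(m, S) from per-example dual parameters.

    Assumed site construction (hypothetical, for illustration):
        q(u) ∝ p(u) * prod_i exp(alpha_i * a_i^T u - 0.5 * beta_i * (a_i^T u)^2),
    with a_i = Kzz^{-1} kz(x_i). Collecting natural parameters gives
        S^{-1} = Kzz^{-1} + A diag(beta) A^T,   S^{-1} m = A alpha,
    where A = Kzz^{-1} Kzf holds the a_i as columns.
    """
    M = Z.shape[0]
    Kzz = rbf(Z, Z) + jitter * np.eye(M)
    Kzf = rbf(Z, X)                                  # M x N cross-covariance
    A = np.linalg.solve(Kzz, Kzf)                    # columns a_i, shape M x N
    S_inv = np.linalg.inv(Kzz) + (A * beta) @ A.T    # adds sum_i beta_i a_i a_i^T
    S = np.linalg.inv(S_inv)
    m = S @ (A @ alpha)
    return m, S

# Toy usage: 2N scalars (alpha, beta) determine the full M-dimensional posterior.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))    # data inputs
Z = np.linspace(-3.0, 3.0, 15)[:, None]      # inducing inputs
alpha = rng.normal(size=200)                 # first-order dual parameters
beta = rng.uniform(0.5, 2.0, size=200)       # second-order duals (kept positive)
m, S = q_from_dual(X, Z, alpha, beta)
print(m.shape, S.shape)                      # (15,) (15, 15)
```

Under a construction of this kind, q(u) = N(m, S) need not be stored explicitly: the two scalars attached to each data example suffice, and m and S are recovered on demand, mirroring the site-parameter bookkeeping of expectation propagation that the abstract alludes to.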