Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points to form a low-rank approximation to the covariance matrix. In this work, we instead exploit a sparse approximation of the precision matrix. We propose the variational nearest neighbor Gaussian process (VNNGP), which introduces a prior that only retains correlations within the K nearest-neighboring observations, thereby inducing sparse precision structure. Using the variational framework, VNNGP's objective can be factorized over both observations and inducing points, enabling stochastic optimization with a time complexity of $O(K^3)$. Hence, we can arbitrarily scale the number of inducing points, even to the point of putting an inducing point at every observed location.
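The sparse precision structure described above is the same device used in Vecchia-style nearest-neighbor GP approximations: each point is conditioned only on a small set of nearby points, so each conditional requires only a small $O(K^3)$ solve. A minimal NumPy sketch of that idea (the RBF kernel choice, the sorted-order predecessor rule, and all function names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def nn_conditionals(x, K=3, jitter=1e-6):
    """For each point (taken in sorted order), condition only on its K
    nearest predecessors. The resulting weights and conditional variances
    define a sparse Cholesky factor of the precision matrix; each step
    costs O(K^3), independent of the total number of points."""
    x = np.sort(np.asarray(x, dtype=float))
    neighbors, weights, cond_vars = [], [], []
    for i in range(len(x)):
        nb = np.argsort(np.abs(x[:i] - x[i]))[:K]  # K nearest earlier points
        if len(nb) == 0:
            w = np.zeros(0)
            var = 1.0  # unconditioned prior variance, k(x_i, x_i) = 1
        else:
            Knn = rbf_kernel(x[nb], x[nb]) + jitter * np.eye(len(nb))
            kin = rbf_kernel(x[nb], x[i:i + 1])    # cross-covariance, (|nb|, 1)
            w = np.linalg.solve(Knn, kin).ravel()  # the O(K^3) solve
            var = 1.0 - float(kin.ravel() @ w)     # conditional variance
        neighbors.append(nb)
        weights.append(w)
        cond_vars.append(var)
    return x, neighbors, weights, cond_vars
```

Because every point touches at most K neighbors, both storage and per-point compute stay constant as the dataset grows, which is what lets the inducing-point count scale to the full data size.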
This work brings together two powerful concepts in Gaussian processes: the variational approach to s...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
Approximations to Gaussian processes based on inducing variables, combined with variational ...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable resear...
Gaussian process classification is a popular method with a number of appealing properties. We show h...
We develop a fast deterministic variational approximation scheme for Gaussian process (GP) reg...
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filteri...
Excellent variational approximations to Gaussian process posteriors have been developed which avoid ...
While much research effort has been dedicated to scaling up sparse Gaussian process (GP) models base...