While much research effort has been dedicated to scaling up sparse Gaussian process (GP) models based on inducing variables for big data, little attention is afforded to the other less explored class of low-rank GP approximations that exploit the sparse spectral representation of a GP kernel. This paper presents such an effort to advance the state of the art of sparse spectrum GP models to achieve competitive predictive performance for massive datasets. Our generalized framework of stochastic variational Bayesian sparse spectrum GP (sVBSSGP) models addresses their shortcomings by adopting a Bayesian treatment of the spectral frequencies to avoid overfitting, modeling these frequencies jointly in its variational distribution to enable their ...
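The abstract above builds on the sparse spectrum approximation of a GP kernel. What follows is a minimal illustrative sketch of that underlying idea only (a trigonometric/random Fourier feature expansion with point-estimated spectral frequencies), not the paper's sVBSSGP algorithm, which instead places a Bayesian treatment on the frequencies and models them jointly in a variational distribution. Function and parameter names (e.g., ssgp_fit_predict, num_freqs) are illustrative assumptions.

```python
import numpy as np

def ssgp_fit_predict(X, y, X_star, num_freqs=50, lengthscale=1.0,
                     signal_var=1.0, noise_var=0.1, seed=0):
    """Sketch of sparse spectrum GP regression via random Fourier features.

    Frequencies are sampled once from the spectral density of an RBF kernel
    and held fixed (a point estimate), unlike the Bayesian/variational
    treatment of frequencies described in the abstract above.
    """
    rng = np.random.default_rng(seed)
    m, d = num_freqs, X.shape[1]
    # Spectral frequencies for an RBF kernel: rows of S ~ N(0, lengthscale^{-2} I)
    S = rng.normal(scale=1.0 / lengthscale, size=(m, d))

    def features(Z):
        # Trigonometric basis: phi(x) = sqrt(signal_var / m) [cos(Sx); sin(Sx)]
        proj = Z @ S.T
        return np.sqrt(signal_var / m) * np.hstack([np.cos(proj), np.sin(proj)])

    Phi = features(X)                                  # n x 2m design matrix
    A = Phi.T @ Phi + noise_var * np.eye(2 * m)        # regularized Gram matrix
    w_mean = np.linalg.solve(A, Phi.T @ y)             # posterior mean of weights
    Phi_star = features(X_star)
    f_mean = Phi_star @ w_mean                          # predictive mean
    # Latent-function predictive variance (observation noise not added)
    f_var = noise_var * np.einsum('ij,ji->i', Phi_star,
                                  np.linalg.solve(A, Phi_star.T))
    return f_mean, f_var

# Toy usage on 1-D synthetic data
X = np.linspace(0, 1, 100)[:, None]
y = np.sin(6 * X[:, 0]) + 0.1 * np.random.randn(100)
X_star = np.linspace(0, 1, 20)[:, None]
mean, var = ssgp_fit_predict(X, y, X_star)
```

The cost of this point-estimate construction is O(nm^2) in the number of frequencies m rather than O(n^3), which is what makes the sparse spectrum family attractive for large n; the paper's contribution concerns how the frequencies themselves are inferred, not this basic expansion.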
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Gaussian processes (GPs) are a powerful tool for probabilistic inference over functions. They have b...
Gaussian Process (GP) inference is a probabilistic kernel method where the GP is treated as a latent...
This paper presents a novel variational inference framework for deriving a family of Bayesian sparse...
We develop a fast deterministic variational approximation scheme for Gaussian process (GP) reg...
This paper presents a variational Bayesian kernel selection (VBKS) algorithm for sparse Gaussian pro...
In this article, we propose a scalable Gaussian process (GP) regression method that combines the adv...
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Statistical inference for functions is an important topic for regression and classification problems...
We present a new sparse Gaussian Process (GP) model for regression. The key novel idea is to sparsif...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
Most existing sparse Gaussian process (g.p.) models seek computational advantages by basing their co...