This work brings together two powerful concepts in Gaussian processes: the variational approach to sparse approximation and the spectral representation of Gaussian processes. This gives rise to an approximation that inherits the benefits of the variational approach but with the representational power and computational scalability of spectral representations. The work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances. We derive these expressions for Matérn kernels in one dimension, and generalize to more dimensions using kernels with specific structures. Under the assumption of additive Gaussian noise, our method requires only a single pas...
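The abstract above concerns variational spectral features; as background, the following is a minimal illustrative sketch of the simpler, related idea of approximating a stationary kernel with Monte Carlo spectral (random Fourier) features. It is not the paper's variational construction, and all names here (`rff_features`, `lengthscale`, `n_features`) are ours, chosen for illustration:

```python
import numpy as np

# Sketch only: random Fourier features (RFF) approximating the
# squared-exponential kernel via its spectral density. This merely
# illustrates how spectral features can approximate a GP kernel;
# it is not the variational Fourier feature method of the abstract.

rng = np.random.default_rng(0)

def rff_features(X, n_features=100, lengthscale=1.0):
    """Map inputs X of shape (n, d) to spectral features of shape (n, 2*n_features)."""
    n, d = X.shape
    # Frequencies drawn from the kernel's spectral density
    # (a Gaussian, for the squared-exponential kernel).
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    Z = X @ W
    # cos/sin pairs give an unbiased Monte Carlo estimate of the kernel:
    # phi(x) @ phi(x') ~ E[cos(w^T (x - x'))] = k(x, x').
    return np.hstack([np.cos(Z), np.sin(Z)]) / np.sqrt(n_features)

X = rng.normal(size=(5, 1))
phi = rff_features(X, n_features=5000)
K_approx = phi @ phi.T                      # approximate kernel matrix
K_exact = np.exp(-0.5 * (X - X.T) ** 2)     # exact SE kernel, lengthscale 1
```

With enough features, `K_approx` matches `K_exact` to within Monte Carlo error of order `1/sqrt(n_features)`; the variational approach summarized above instead chooses the features and their (almost-independent) covariances analytically on a finite domain.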
Most existing sparse Gaussian process (g.p.) models seek computational advantages by basing their co...
Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable resear...
This is the final version of the article. It first appeared at http://jmlr.org/proceedings/papers/v3...
We develop a fast deterministic variational approximation scheme for Gaussian process (GP) reg...
Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning,...
While much research effort has been dedicated to scaling up sparse Gaussian process (GP) models base...
A Gaussian Process (GP) is a prominent mathematical framework for stochastic function approximation ...
Gaussian processes are flexible distributions over functions, which provide a nonparametric nonlinea...
Gaussian Processes (GPs) provide an extremely powerful mechanism to model a variety of problems but ...
We introduce the convolutional spectral kernel (CSK), a novel family of non-stationary, nonparametri...
Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points ...