Gaussian Processes (GPs) provide powerful probabilistic frameworks for interpolation, forecasting, and smoothing, but have been hampered by computational scaling issues. Here we investigate data sampled on one dimension (e.g., a scalar or vector time series sampled at arbitrarily spaced intervals), for which state-space models are popular due to their linearly scaling computational costs. It has long been conjectured that state-space models are general, able to approximate any one-dimensional GP. We provide the first general proof of this conjecture, showing that any stationary GP on one dimension with vector-valued observations governed by a Lebesgue-integrable continuous kernel can be approximated to any desired precision using a specific...
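The linear-time state-space route described above can be made concrete in the simplest case: the Matérn-1/2 (Ornstein–Uhlenbeck) kernel k(t, t') = σ² exp(−|t − t'| / ℓ) has an exact one-dimensional state-space representation, so Kalman filtering followed by RTS smoothing recovers the GP posterior on n arbitrarily spaced inputs in O(n) rather than O(n³). A minimal sketch (the function name and hyperparameter choices are illustrative, not from the paper):

```python
import numpy as np

def ou_kalman_gp(t, y, sigma2=1.0, ell=1.0, noise=0.1):
    """O(n) GP regression with the OU (Matern-1/2) kernel
    k(t, t') = sigma2 * exp(-|t - t'| / ell), via Kalman filtering
    and RTS smoothing on the equivalent 1-D state-space model.
    t must be sorted; returns posterior means and variances at t."""
    n = len(t)
    m = np.zeros(n); P = np.zeros(n)      # filtered means / variances
    mp = np.zeros(n); Pp = np.zeros(n)    # one-step-ahead predictions
    m_pred, P_pred = 0.0, sigma2          # stationary prior N(0, sigma2)
    for k in range(n):
        mp[k], Pp[k] = m_pred, P_pred
        S = P_pred + noise                # innovation variance
        K = P_pred / S                    # Kalman gain
        m[k] = m_pred + K * (y[k] - m_pred)
        P[k] = (1.0 - K) * P_pred
        if k < n - 1:
            A = np.exp(-(t[k + 1] - t[k]) / ell)  # transition over gap
            Q = sigma2 * (1.0 - A**2)             # process noise
            m_pred = A * m[k]
            P_pred = A**2 * P[k] + Q
    # Backward (RTS) smoothing pass
    ms = m.copy(); Ps = P.copy()
    for k in range(n - 2, -1, -1):
        A = np.exp(-(t[k + 1] - t[k]) / ell)
        G = P[k] * A / Pp[k + 1]          # smoother gain
        ms[k] = m[k] + G * (ms[k + 1] - mp[k + 1])
        Ps[k] = P[k] + G**2 * (Ps[k + 1] - Pp[k + 1])
    return ms, Ps
```

For this kernel the state-space representation is exact, so the smoothed means coincide (up to floating-point error) with the standard O(n³) GP posterior mean K(K + σₙ²I)⁻¹y; more general stationary kernels are where the approximation result above comes into play.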
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filteri...
In this paper we study the accuracy and convergence of state-space approximations of Gaussian proces...
Making predictions and quantifying their uncertainty when the input data is sequential is a fundamen...
Deep generative models are widely used for modelling high-dimensional time series, such as video ani...
Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while c...
We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, ...
Gaussian processes (GPs) produce good probabilistic models of functions, but most GP kernels require...
Gaussian processes are a powerful and flexible class of nonparametric models that use covariance fun...
The use of Gaussian processes (GPs) is supported by efficient sampling algorithms, a rich methodolog...
Gaussian process latent variable models (GPLVM) are a flexible and non-linear approach to dimensiona...
Gaussian processes provide a flexible framework for forecasting, removing noise, and interpreting lo...
State-space inference and learning with Gaussian processes (GPs) is an unsolved problem. We propose ...
Stochastic gradient descent (SGD) and its variants have established themselves as the go-to algorith...