Making predictions and quantifying their uncertainty when the input data is sequential is a fundamental learning challenge, recently attracting increasing attention. We develop SigGPDE, a new scalable sparse variational inference framework for Gaussian Processes (GPs) on sequential data. Our contribution is twofold. First, we construct inducing variables underpinning the sparse approximation so that the resulting evidence lower bound (ELBO) does not require any matrix inversion. Second, we show that the gradients of the GP signature kernel are solutions of a hyperbolic partial differential equation (PDE). This theoretical insight allows us to build an efficient back-propagation algorithm to optimize the ELBO. We showcase the significant computational gains of SigGPDE compared to existing methods, while achieving state-of-the-art performance for classification tasks on large datasets of multivariate time series.
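For context on the PDE mentioned in the abstract: the signature kernel of two paths is known to solve a Goursat-type hyperbolic PDE, and the paper's result extends this picture to the kernel's gradients. The sketch below is a minimal, hypothetical NumPy illustration of a first-order finite-difference solver for the signature kernel PDE \( \partial^2 k / \partial s \, \partial t = \langle \dot{x}_s, \dot{y}_t \rangle \, k(s,t) \) with boundary conditions \( k(0,\cdot) = k(\cdot,0) = 1 \); the function name `signature_kernel` and the discretisation choices are assumptions for illustration, not the authors' implementation.

```python
import numpy as np


def signature_kernel(x, y):
    """Approximate the signature kernel of two piecewise-linear paths by
    solving the Goursat PDE  d^2 k / (ds dt) = <x'(s), y'(t)> k(s, t)
    with k(0, .) = k(., 0) = 1 on the grid induced by the path samples.

    x : array of shape (len_x, d)
    y : array of shape (len_y, d)
    Hypothetical sketch; a first-order explicit scheme, not the paper's solver.
    """
    # Path increments: <dx_i, dy_j> approximates <x'(s), y'(t)> ds dt on each cell.
    dx = np.diff(x, axis=0)            # (len_x - 1, d)
    dy = np.diff(y, axis=0)            # (len_y - 1, d)
    inner = dx @ dy.T                  # (len_x - 1, len_y - 1)

    m, n = inner.shape
    k = np.ones((m + 1, n + 1))        # boundary condition k(0, .) = k(., 0) = 1
    for i in range(m):
        for j in range(n):
            # Explicit first-order update of the Goursat problem on cell (i, j).
            k[i + 1, j + 1] = k[i + 1, j] + k[i, j + 1] - k[i, j] * (1.0 - inner[i, j])
    return k[-1, -1]


if __name__ == "__main__":
    # Toy usage: kernel value between two short 2-dimensional paths.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(50, 2)).cumsum(axis=0) * 0.1
    y = rng.normal(size=(60, 2)).cumsum(axis=0) * 0.1
    print(signature_kernel(x, y))
```

Because the gradient of the kernel with respect to the inducing variables satisfies a PDE of the same hyperbolic type, back-propagation through the ELBO can reuse a solver of this form rather than differentiating through every grid update, which is the source of the computational gains the abstract refers to.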