We provide guarantees for approximate Gaussian Process (GP) regression under two common low-rank kernel approximations: one based on random Fourier features and one based on truncating the kernel's Mercer expansion. In particular, we bound the Kullback–Leibler divergence between an exact GP and one resulting from either of these low-rank approximations to its kernel, as well as between their corresponding predictive densities, and we also bound the error between the predictive mean vectors and between the predictive covariance matrices computed using the exact versus the approximate GP. We provide experiments on both simulated data and standard benchmarks to evaluate the effectiveness of our theoretical bounds.
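As a concrete illustration of the first approximation mentioned above, here is a minimal sketch of a random-Fourier-feature (RFF) approximation to the squared-exponential (RBF) kernel. This is a generic textbook construction, not code from the paper; the function name and parameters are illustrative.

```python
import numpy as np

def rff_features(X, n_features=500, lengthscale=1.0, seed=None):
    """Map inputs X (n, d) to random Fourier features Z (n, D) such that
    Z @ Z.T approximates the RBF kernel k(x, y) = exp(-||x - y||^2 / (2 l^2))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Frequencies sampled from the kernel's spectral density (Gaussian for RBF).
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the exact RBF Gram matrix with its low-rank RFF approximation.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = rff_features(X, n_features=5000, lengthscale=1.0, seed=1)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / 2.0)
max_err = np.abs(K_exact - K_approx).max()
```

Replacing the exact Gram matrix with `Z @ Z.T` is what makes the downstream GP solves cheap (rank D instead of n), and the kernel-approximation error `max_err`, which shrinks roughly as 1/sqrt(D), is the quantity that bounds of the kind described in the abstract propagate to the GP posterior.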
11 pages. International audience. A fundamental drawback of kernel-based statistical models is their lim...
This report aims to provide details on how to perform predictions using Gaussian process regression...
This paper examines experimental design procedures used to develop surrogates of computational model...
We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, ...
A wealth of computationally efficient approximation methods for Gaussian process regression have bee...
Gaussian processes are distributions over functions that are versatile and mathematically convenient...
While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performan...
Gaussian process (GP) predictors are an important component of many Bayesian approaches to machine l...
The expressive power of a Gaussian process (GP) model comes at a cost of poor scalability in the dat...
Gaussian process (GP) regression is a fundamental tool in Bayesian statistics. It is also known as k...
Kernel-based models such as kernel ridge regression and Gaussian processes are ubiquitous in machine...
Gaussian processes are a powerful and flexible class of nonparametric models that use covariance fun...