Kernel-based models such as kernel ridge regression and Gaussian processes are ubiquitous in machine learning applications for regression and optimization. It is well known that a major downside of kernel-based models is their high computational cost: given a dataset of $n$ samples, the cost grows as $\mathcal{O}(n^3)$. Existing sparse approximation methods can yield a significant reduction in this cost, bringing it down to as low as $\mathcal{O}(n)$ in certain cases. Despite this remarkable empirical success, significant gaps remain in the existing analytical bounds on the error introduced by the approximation. In this work, we provide novel confidence intervals for the Nystr\"om method and the sparse variational Gaussian process approximation.
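To make the cost reduction concrete, the following is a minimal sketch (not taken from the paper) of the Nystr\"om, or subset-of-regressors, approximation to kernel ridge regression: the exact $n \times n$ solve is replaced by an $m \times m$ solve built from $m$ inducing points, reducing the cost from $\mathcal{O}(n^3)$ to $\mathcal{O}(nm^2)$. The function names (`rbf_kernel`, `nystrom_ridge`) and all parameter choices here are illustrative assumptions, not part of the original work.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    """Squared-exponential kernel matrix between row sets X (n x d) and Y (m x d)."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

def nystrom_ridge(X, y, m=50, lam=1e-2, rng=None):
    """Kernel ridge regression with a rank-m Nystrom (subset-of-regressors) approximation.

    Instead of solving the n x n system (K + lam * I) alpha = y at O(n^3) cost,
    we work with the n x m cross-kernel K_nm and the m x m block K_mm,
    so the dominant cost is O(n m^2).
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=min(m, n), replace=False)  # inducing subset
    Z = X[idx]
    K_nm = rbf_kernel(X, Z)   # n x m
    K_mm = rbf_kernel(Z, Z)   # m x m
    # Minimize ||K_nm w - y||^2 + lam * w^T K_mm w; predictions are K(x*, Z) @ w.
    A = K_nm.T @ K_nm + lam * K_mm + 1e-10 * np.eye(len(idx))
    w = np.linalg.solve(A, K_nm.T @ y)
    return Z, w

# Toy usage on 1-D data
X = np.random.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(2000)
Z, w = nystrom_ridge(X, y, m=100)
X_test = np.linspace(-3, 3, 5)[:, None]
print(rbf_kernel(X_test, Z) @ w)  # approximate predictive mean at test points
```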