In this paper we propose a new basis selection criterion for building sparse GP regression models that provides promising gains in both accuracy and efficiency over previous methods. Our algorithm is much faster than that of Smola and Bartlett, while in generalization it greatly outperforms the information gain approach proposed by Seeger et al., especially in the quality of predictive distributions.
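The abstracts collected here all concern choosing a small set of basis (inducing) points for sparse GP regression. As a minimal sketch of the subset-of-regressors approximation these selection criteria build on, the snippet below computes the predictive mean from m basis points; a random subset stands in for the selection criterion itself, and all function names and parameters are illustrative, not taken from any of the cited papers.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def sor_predict(X, y, Xstar, m=20, noise=0.1, seed=0):
    """Subset-of-regressors predictive mean using m basis points.

    A random subset stands in for the basis-selection criteria
    compared in the abstracts above (hypothetical sketch).
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    Xm = X[idx]                      # chosen basis vectors
    Kmn = rbf(Xm, X)                 # m x n cross-covariance
    Kmm = rbf(Xm, Xm)                # m x m basis covariance
    # Solve (Kmn Knm + noise^2 Kmm) alpha = Kmn y; jitter for stability.
    A = Kmn @ Kmn.T + noise**2 * Kmm + 1e-8 * np.eye(m)
    alpha = np.linalg.solve(A, Kmn @ y)
    return rbf(Xstar, Xm) @ alpha    # predictive mean at Xstar
```

Because only an m x m system is solved, training costs O(n m^2) rather than the O(n^3) of exact GP regression, which is the computational advantage these papers pursue.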
We develop an approach for a sparse representation for Gaussian Process (GP) models in order to over...
Gaussian processes; Non-parametric regression; System identification. Abstract: We provide a method ...
This paper proposes an approach for online training of a sparse multi-output Gaussian process (GP) m...
Abstract — This paper considers the basis vector selection issue involved in forward selection algor...
We propose an efficient optimization algorithm for selecting a subset of training data to induce sp...
We provide a new unifying view, including all existing proper probabilistic sparse approximations fo...
Most existing sparse Gaussian process (g.p.) models seek computational advantages by basing their co...
We present a new sparse Gaussian Process (GP) model for regression. The key novel idea is to sparsif...
We present a new Gaussian process (GP) regression model whose covariance is parameterized by the th...
While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performan...
Abstract—We propose an efficient optimization algorithm to select a subset of training data as the i...