We explore ways to scale Gaussian processes (GPs) to large datasets. Two methods with different theoretical and practical motivations are proposed. The first method solves the open problem of efficient discrete inducing set selection in the context of inducing-point approximations to full GPs. When inducing points must be chosen from the training set, the proposed method is the only principled approach to date for jointly tuning the inducing set and the GP hyperparameters while scaling linearly in the training set size during learning. Empirically it achieves a trade-off between speed and accuracy comparable to other state-of-the-art inducing-point GP methods. The second method is a novel framework for building flexible proba...
Abstract—Exact Gaussian process (GP) regression has O(N^3) runtime for data size N, making it intract...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Abstract. The expressive power of Gaussian process (GP) models comes at a cost of poor scalability i...
Gaussian process (GP) models are widely used to perform Bayesian nonlinear regression and classifica...
Gaussian process classification is a popular method with a number of appealing properties. We show h...
Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while c...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
Sparse pseudo-point approximations for Gaussian process (GP) models provide a suite of methods that ...
Gaussian processes (GPs) are natural generalisations of multivariate Gaussian random variables to in...
Exact Gaussian process (GP) regression is not available for n ≳ 10,000 (O(n^3) for learning and O(n) ...
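The recurring bottleneck across these abstracts is the O(N^3) cost of exact GP regression, which sparse inducing-point methods reduce to O(NM^2) for M ≪ N inducing points. A minimal numpy sketch of the idea, using the DTC/SoR predictive mean with an inducing set drawn from the training inputs; the kernel, data, and inducing-set choice here are illustrative assumptions, not any single paper's method:

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row sets A (n, d) and B (m, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def exact_gp_mean(X, y, Xs, noise=0.1):
    """Exact GP posterior mean: solves an N x N system, O(N^3)."""
    Kff = rbf(X, X) + noise ** 2 * np.eye(len(X))
    return rbf(Xs, X) @ np.linalg.solve(Kff, y)

def sparse_gp_mean(X, y, Z, Xs, noise=0.1, jitter=1e-8):
    """DTC/SoR predictive mean with M inducing inputs Z: only an M x M solve, O(N M^2)."""
    Kuu = rbf(Z, Z) + jitter * np.eye(len(Z))   # jitter keeps Kuu well-conditioned
    Kuf = rbf(Z, X)
    A = Kuu + (Kuf @ Kuf.T) / noise ** 2        # M x M system instead of N x N
    return rbf(Xs, Z) @ np.linalg.solve(A, Kuf @ y) / noise ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Xs = np.linspace(-3, 3, 50)[:, None]

# Discrete inducing set: 20 points drawn from the training inputs themselves.
Z = X[rng.choice(len(X), size=20, replace=False)]
mu_sparse = sparse_gp_mean(X, y, Z, Xs)
mu_exact = exact_gp_mean(X, y, Xs)
print("max |sparse - exact| =", np.abs(mu_sparse - mu_exact).max())
```

When Z equals the full training set the DTC/SoR mean recovers the exact posterior mean, so shrinking M trades accuracy for the linear-in-N cost the abstracts describe.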