Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of reach of previous EP implementations. More precisely, it supports (i) distributed training, in which data instances are sent to different nodes where the required computations are carried out, and (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving ...
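To make the second point concrete, the sketch below illustrates the general pattern of maximizing a sum-decomposable objective with a stochastic approximation of the gradient: each step draws a random minibatch, computes the minibatch gradient of the objective with respect to a hyperparameter, and rescales it by N/|B| so it estimates the full-data gradient. This is only a structural illustration under stated assumptions, not the method of the abstract: the per-instance terms here are a placeholder probit log-likelihood rather than the EP estimate of the log marginal likelihood, and all names (`per_instance_objective_and_grad`, `theta`, the toy data) are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data standing in for a large dataset: 1-D inputs and binary labels.
N = 10_000
x = rng.normal(size=N)
y = np.sign(x + 0.3 * rng.normal(size=N))

def per_instance_objective_and_grad(theta, xb, yb):
    """Per-instance contributions to a sum-decomposable objective and its
    gradient w.r.t. a scalar hyperparameter theta.

    In the setting of the abstract these would be the per-instance terms of
    the EP estimate of the log marginal likelihood; here a probit
    log-likelihood of the linear score theta * x is used purely as a
    placeholder so the loop runs end to end.
    """
    z = yb * theta * xb
    log_phi = norm.logcdf(z)
    # d/dtheta log Phi(y * theta * x) = y * x * phi(z) / Phi(z)
    grad = yb * xb * np.exp(norm.logpdf(z) - log_phi)
    return log_phi.sum(), grad.sum()

theta = 0.1        # initial hyperparameter value
step = 1e-3        # fixed step size; a decaying Robbins-Monro schedule also works
batch_size = 200

for it in range(500):
    idx = rng.choice(N, size=batch_size, replace=False)
    obj_b, grad_b = per_instance_objective_and_grad(theta, x[idx], y[idx])
    # Rescale so the minibatch gradient is an unbiased estimate of the
    # full-data gradient, then take an ascent step.
    theta += step * (N / batch_size) * grad_b
    if it % 100 == 0:
        print(f"iter {it}: minibatch objective = {obj_b:.2f}, theta = {theta:.3f}")

print("final theta:", theta)
```

The same loop structure applies when the per-instance terms come from an EP approximation: only a minibatch of sites needs to be visited per gradient step, which is what makes training on millions of instances feasible.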