A method for large-scale Gaussian process classification based on expectation propagation (EP) has recently been proposed. This method allows Gaussian process classifiers to be trained on very large datasets that were out of the reach of previous EP deployments, and it has been shown to be competitive with related techniques based on stochastic variational inference. Nevertheless, unlike in variational methods, the required memory scales linearly with the dataset size, which is a severe limitation when the number of instances is very large. Here we show that this problem is avoided when stochastic EP is used to train the model.
This paper considers probabilistic multinomial probit classification using Gaussian process (GP) pri...
stitute two of the most important foci of modern machine learning research. In this preliminary work...
Gaussian processes are non-parametric models that can be used to carry out supervised and unsupervi...
Variational methods have been recently considered for scaling the training process of Gaussian proce...
We introduce stochastic variational inference for Gaussian process models. This enables the applicat...
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (G...
Gaussian processes are attractive models for probabilistic classification but unfortunately exact in...
We address the limitations of Gaussian processes for multiclass classification in the setting where ...
Rich and complex time-series data, such as those generated from engineering systems, financial marke...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
This is the final version of the article. It first appeared from Neural Information Processing Syste...
This paper introduces a novel Gaussian process (GP) classification method that combines advantages o...