Multi-class Gaussian Process Classifiers (MGPCs) are often affected by overfitting problems when labeling errors occur far from the decision boundaries. To prevent this, we investigate a robust MGPC (RMGPC) that accounts for labeling errors independently of their distance to the decision boundaries. Expectation propagation is used for approximate inference. Experiments on several datasets in which noise is injected into the labels illustrate the benefits of RMGPC. This method performs better than other Gaussian process alternatives based on latent Gaussian noise or heavy-tailed processes. When no noise is injected into the labels, RMGPC still performs as well as or better than the other methods. Finally, we show how RMGPC can be used for...
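The experimental setup described above can be illustrated with a minimal sketch. This is not the paper's RMGPC method; it only reproduces the label-noise-injection protocol with a standard multi-class GP classifier (scikit-learn's `GaussianProcessClassifier`, an assumed stand-in) to show how such corruption is applied and measured. Dataset, noise rate, and split are illustrative choices.

```python
# Sketch of the label-noise experiment (standard GPC, not RMGPC):
# flip a fraction of training labels uniformly at random, independent of
# each point's distance to any decision boundary, then compare test accuracy.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Corrupt 20% of training labels; a flipped label may coincide with the
# original class, so the effective noise rate is slightly below 20%.
y_noisy = y_tr.copy()
flip = rng.random(len(y_noisy)) < 0.2
y_noisy[flip] = rng.integers(0, 3, size=flip.sum())

clean = GaussianProcessClassifier().fit(X_tr, y_tr).score(X_te, y_te)
noisy = GaussianProcessClassifier().fit(X_tr, y_noisy).score(X_te, y_te)
print(f"test accuracy: clean={clean:.2f} noisy={noisy:.2f}")
```

A robust model in the spirit of RMGPC would replace the likelihood so that each label carries an input-independent probability of being wrong, limiting the influence of the flipped points on the posterior.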
A method for large scale Gaussian process classification has been recently proposed based on expecta...
2007 I, Edward Snelson, confirm that the work presented in this thesis is my own. Where information ...
It is a common practice in the machine learning community to assume that the observed data are noise...
Abstract. Gaussian process classifiers (GPCs) are a fully statistical model for kernel classificatio...
We investigate adversarial robustness of Gaussian Process classification (GPC) models. Specifically,...
We investigate adversarial robustness of Gaussian Process classification (GPC) models. Specifically,...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
We address the limitations of Gaussian processes for multiclass classification in the setting where ...
Many real-world classification tasks involve the prediction of multiple, inter-dependent class label...
This paper introduces a novel Gaussian process (GP) classification method that combines advantages o...
This paper studies the problem of deriving fast and accurate classification algorithms with uncertai...
We present new methods for fast Gaussian process (GP) inference in large-scale scenarios including e...
We present a novel multi-output Gaussian process model for multi-class classification. We build on t...
Variational methods have been recently considered for scaling the training process of Gaussian proce...
In binary Gaussian process classification the prior class membership probabilities are obtained by t...