We propose an algorithm to predict the leave-one-out (LOO) error for kernel-based classifiers. To achieve this goal with computational efficiency, we cast the LOO error approximation task as a classification problem: we learn to classify whether or not a given training sample, if left out of the data set, would be misclassified. For this learning task, we propose simple data-dependent features inspired by geometrical intuition. Our approach allows us to reliably select a good model, as demonstrated in simulations on Support Vector and Linear Programming Machines. Comparisons to existing learning-theoretical bounds, e.g. the span bound, are given for various model selection scenarios.
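To make the idea concrete, here is a minimal sketch in Python with scikit-learn. The abstract does not name the features or the meta-classifier, so everything below is an illustrative assumption: the functional margin y_i f(x_i) and the dual coefficients alpha_i stand in for the paper's geometrically motivated features, logistic regression stands in for the meta-classifier, and ground-truth LOO labels are computed by brute force on a small set where that is still affordable.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

def exact_loo_labels(X, y, **svm_params):
    """Ground-truth labels: 1 if sample i is misclassified when left out."""
    n = len(y)
    labels = np.empty(n, dtype=int)
    for i in range(n):
        mask = np.arange(n) != i
        clf = SVC(**svm_params).fit(X[mask], y[mask])
        labels[i] = int(clf.predict(X[i:i + 1])[0] != y[i])
    return labels

def geometric_features(X, y, **svm_params):
    """Per-sample features from a single fit on the full training set.
    These particular features are an assumption, not the paper's exact choice."""
    clf = SVC(**svm_params).fit(X, y)
    margin = y * clf.decision_function(X)                  # functional margin y_i * f(x_i)
    alpha = np.zeros(len(y))
    alpha[clf.support_] = np.abs(clf.dual_coef_.ravel())   # |alpha_i|, 0 for non-SVs
    return np.column_stack([margin, alpha])

# Fit the meta-classifier once, where exact LOO is still affordable ...
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
y = 2 * y - 1                                              # labels in {-1, +1}
F = geometric_features(X, y, kernel="rbf", C=1.0)
L = exact_loo_labels(X, y, kernel="rbf", C=1.0)            # assumes both outcomes occur in L
meta = LogisticRegression().fit(F, L)

# ... then predict the LOO error for a new parameter setting from a single
# fit, instead of n refits, and use it for model selection.
F_new = geometric_features(X, y, kernel="rbf", C=10.0)
predicted_loo_error = meta.predict(F_new).mean()
print(f"predicted LOO error at C=10: {predicted_loo_error:.3f}")
```

The point of the construction is the cost profile: exact LOO requires n refits per candidate model, while the learned predictor needs only one fit per candidate once the meta-classifier is trained.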