We derive in this work new upper bounds for estimating the generalization error of kernel classifiers, that is, the misclassification rate that the models will incur on new and previously unseen data. Although this paper is targeted primarily at the error-estimation problem, the generalization error can, in practice, obviously be exploited for model selection purposes as well. The derived bounds are based on Rademacher complexity and prove particularly useful when a set of unlabeled samples is available in addition to the (labeled) training examples: we show that, by exploiting the additional unlabeled patterns, the confidence term of the conventional Rademacher complexity bound can be reduced by a factor of three. Moreover, the availab...
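For context, a common data-dependent form of the conventional Rademacher complexity bound referred to above is sketched below; the notation is illustrative and the exact constants depend on how the loss is normalized to [0, 1]. With probability at least 1 - \delta over the draw of the n labeled training examples,

L(f) \le \hat{L}_n(f) + \hat{\mathfrak{R}}_n(\mathcal{F}) + 3\sqrt{\frac{\ln(2/\delta)}{2n}},

where L(f) denotes the true risk of the classifier f, \hat{L}_n(f) its empirical risk on the training set, and \hat{\mathfrak{R}}_n(\mathcal{F}) the empirical Rademacher complexity of the hypothesis class \mathcal{F}. The coefficient 3 multiplying the confidence term \sqrt{\ln(2/\delta)/(2n)} is the factor that the availability of unlabeled patterns allows to reduce.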