We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
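As a rough illustration of the quantity described above (not part of the abstract itself), the following Python sketch estimates a local empirical Rademacher average for a finite class of ±1-valued predictors by Monte Carlo, restricting attention to functions whose empirical error on the sample is at most a radius r. The names functions, X, y, and r are hypothetical placeholders introduced only for this sketch.

import numpy as np

def local_rademacher_average(functions, X, y, r, n_draws=1000, seed=0):
    # Monte Carlo estimate of the local empirical Rademacher average:
    # E_sigma [ sup_{f in F_r} (1/n) sum_i sigma_i f(x_i) ],
    # where F_r is the subset of `functions` with empirical error <= r.
    rng = np.random.default_rng(seed)
    n = len(y)
    # Keep only functions with small empirical error (the "local" subset).
    local = [f for f in functions if np.mean(f(X) != y) <= r]
    if not local:
        return 0.0
    preds = np.stack([f(X) for f in local])       # shape: (|F_r|, n), entries in {-1, +1}
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.max(preds @ sigma) / n        # sup over the local subset
    return total / n_draws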
We show a Talagrand-type concentration inequality for Multi-Task Learning (MTL), with which we estab...
We derive in this work new upper bounds for estimating the generalization error of kernel classifier...
We investigate the behaviour of global and local Rademacher averages. We present new error bounds wh...
We use the notion of local Rademacher complexity to design new algorithms for learning kernels. Our ...
When dealing with kernel methods, one has to decide which kernel and which values for the hyperparam...
In this paper we develop a novel probabilistic generalization bound for regularized kernel learning...
We derive here new generalization bounds, based on Rademacher Complexity theory, for model selection...
Sequential algorithms of active learning based on the estimation of the level sets of the empirical ...