We survey recent results on efficient margin-based algorithms for adaptive sampling in binary classification tasks. Using the so-called Mammen-Tsybakov low noise condition to parametrize the distribution of covariates, and assuming linear label noise, we state bounds on the rate at which the risk of the adaptive sampler converges to the Bayes risk. These bounds show that, excluding logarithmic factors, the average risk converges to the Bayes risk at rate $N^{-\frac{(1+a)(2+a)}{2(3+a)}}$, where $N$ denotes the number of queried labels and $a$ is the nonnegative exponent in the low noise condition. For all $a > \sqrt{3}-1$ this convergence rate is asymptotically faster than the rate $N^{-\frac{1+a}{2+a}}$ achieved by the fully supervised version of the b...
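To see where the threshold $a > \sqrt{3}-1$ comes from (a short check, not spelled out in the abstract itself), compare the two exponents: the adaptive rate dominates the fully supervised rate exactly when
$$\frac{(1+a)(2+a)}{2(3+a)} > \frac{1+a}{2+a} \;\Longleftrightarrow\; (2+a)^2 > 2(3+a) \;\Longleftrightarrow\; a^2 + 2a - 2 > 0 \;\Longleftrightarrow\; a > \sqrt{3}-1 \approx 0.73.$$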