We develop unified information-theoretic machinery for deriving lower bounds for passive and active learning schemes. Our bounds involve the so-called Alexander's capacity function. The supremum of this function has recently been rediscovered by Hanneke in the context of active learning under the name of the disagreement coefficient. For passive learning, our lower bounds match the upper bounds of Gine and Koltchinskii up to constants and generalize analogous results of Massart and Nedelec. For active learning, we provide the first known lower bounds based on the capacity function rather than the disagreement coefficient.
We study the problem of active learning in a stream-based setting, allowing the distribution of the ...
We present a polynomial-time noise-robust margin-based active learning algorithm to find homogene...
Active learning is a type of sequential design for supervised machine learning, in which the learnin...
This paper analyzes the potential advantages and theoretical challenges of active learning algorit...
In many settings in practice it is expensive to obtain labeled data while unlabeled data is abundant...
The original and most widely studied PAC model for learning assumes a passive learner in the sense t...
We study pool-based active learning in the presence of noise, that is, the agnostic setting. It is k...
This paper presents a rigorous statistical analysis characterizing regimes in which active learning ...
Traditional models of active learning assume a learner can directly manipulate or query a covariate ...
We investigate a topic at the interface of machine learning and cognitive science. Human active lear...
This dissertation develops and analyzes active learning algorithms for binary classification problem...