The goal of active learning is to achieve the same accuracy achievable by passive learning while using far fewer labels. Exponential savings in label complexity have been proved in very special cases, but fundamental lower bounds show that such improvements are impossible in general. This suggests a need to explore alternative goals for active learning. Learning with abstention is one such alternative. In this setting, the active learning algorithm may abstain from prediction and incur an error that is marginally smaller than random guessing. We develop the first computationally efficient active learning algorithm with abstention. Our algorithm provably achieves $\mathsf{polylog}(\frac{1}{\varepsilon})$ label complexity, without ...
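As a rough illustration of the abstention mechanism described above (a minimal sketch, not the paper's algorithm), the following Python snippet implements a Chow-style rule: it predicts a label only when an estimated conditional probability is bounded away from 1/2, and otherwise abstains at a cost of $\frac{1}{2} - \gamma$, i.e. marginally below the 1/2 error of random guessing. The estimator `eta_hat`, the margin parameter `gamma`, and the cost accounting are illustrative assumptions.

```python
import numpy as np

ABSTAIN = None  # sentinel meaning "no prediction"


def chow_abstaining_rule(eta_hat, x, gamma=0.05):
    """Predict a binary label, or abstain when the example looks too noisy.

    eta_hat(x) is assumed to estimate P(y = 1 | x); gamma is the abstention
    margin. Both names are illustrative, not taken from the paper.
    """
    p = eta_hat(x)
    if abs(p - 0.5) <= gamma:        # hard example: label close to a fair coin
        return ABSTAIN
    return 1 if p > 0.5 else 0


def error_with_abstention(prediction, y, gamma=0.05):
    """Abstaining costs 1/2 - gamma, marginally less than random guessing;
    a committed prediction costs 1 if wrong and 0 if correct."""
    if prediction is ABSTAIN:
        return 0.5 - gamma
    return float(prediction != y)


if __name__ == "__main__":
    # Tiny synthetic usage example (illustrative only).
    rng = np.random.default_rng(0)
    eta_hat = lambda x: 1.0 / (1.0 + np.exp(-x))   # stand-in probability model
    xs = rng.normal(size=5)
    ys = (rng.random(5) < np.array([eta_hat(x) for x in xs])).astype(int)
    for x, y in zip(xs, ys):
        pred = chow_abstaining_rule(eta_hat, x)
        print(f"x={x:+.2f}  pred={pred}  cost={error_with_abstention(pred, y):.2f}")
```

Under this error model, an example whose true label distribution is close to a fair coin can safely be abstained on, which is the sense in which abstention sidesteps the lower bounds for ordinary active learning.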