Methods that learn from prior information about input features, such as generalized expectation (GE), have been used to train accurate models with very little effort. In this paper, we propose an active learning approach in which the machine solicits labels on features rather than instances. In both simulated and real user experiments on two sequence labeling tasks, we show that our active learning method outperforms passive learning with features as well as traditional active learning with instances. Preliminary experiments suggest that novel interfaces which intelligently solicit labels on multiple features facilitate more efficient annotation.
Active learning is a technique that helps to minimize the annotation budget required for the creatio...
Most classic machine learning methods depend on the assumption that humans can annotate all the d...
Conventional active learning algorithms assume a single labeler that produces noiseless labels at a g...
A common obstacle preventing the rapid deployment of supervised machine learning algorithms is the l...
In recent decades, the availability of a large amount of data has propelled the field of machine lea...
We extend the traditional active learning framework to include feedback on features in addition to l...
Abstract. An improved active learning method taking advantage of feature selection technique is prop...
Active learning reduces annotation costs for supervised learning by concentrating labelling efforts ...
Active learning typically focuses on training a model on few labeled examples alone, while unlabeled...
Active learning (AL) consists of asking human annotators to annotate automatically selected data tha...
In many settings in practice it is expensive to obtain labeled data while unlabeled data is abundant...
Active learning seeks to train the best classifier at the lowest annotation cost by intelligently pi...
Traditional supervised machine learning algorithms are expected to have access to a large corpus of ...
Obtaining labels can be expensive or time-consuming, but unlabeled data is often abundant and easier...
Recent decades have witnessed great success of machine learning, especially for tasks where large an...