Crowdsourcing platforms empower individuals and businesses to rapidly gather large amounts of human input. The challenge is how to trust the results obtained from crowdsourced laypersons. Our work contributes to the growing exploration of how to successfully leverage large groups of crowdsourced laypersons to collect high-quality image annotation results.
Various stakeholders have called for human oversight of algorithmic processes, as a means to mitigat...
Labeled data is a prerequisite for successfully applying machine learning techniques to a wide range...
The creation of gold standard datasets is a costly business. Optimally more than one judgment per ...
Crowdsourcing is leveraged to rapidly and inexpensively collect annotations, but concerns have been ...
While traditional approaches to image analysis have typically relied upon either manual annotation b...
Collecting high quality annotations to construct an evaluation dataset is essential for assessing th...
We introduce a method to greatly reduce the amount of redundant annotations required when crowdsourc...
High quality segmentations must be captured consistently for applications such as biomedical image a...
Validating user tags helps to refine them, making them more useful for finding...
In this work, we propose two ensemble methods lever...
We propose a screening approach to find reliable and effectively expert crowd workers in image quali...
This paper explores processing techniques to deal with noisy data in crowdsour...
In this paper, we introduce the CrowdTruth open-source software framework for machine-hum...
Some complex problems, such as image tagging and natural language processing, are very challenging ...
Crowdsourcing systems empower individuals and companies to outsource labor-intensive tasks that cann...