Experts or (crowd of) non-experts? The question of the annotators' expertise viewed from crowdsourcing. Manual corpus annotation is increasingly performed using crowdsourcing: produced by a crowd of people, through the Web, for free or for very small remuneration. Our experiments question the commonly accepted vision: crowdsourcing a task is not having a crowd of non-experts perform it, but rather identifying experts of the (annotation) task in the crowd. These experiments therefore contribute to the reflection on corpus annotators' expertise.
Crowdsourcing is an emerging technique that makes it possible to involve humans in inf...
Crowdsourcing platforms make it possible to propose simple human intelligence tasks to a...
This paper explores the application of sensemaking theory to support non-expert crowds in intricate ...
"Crowdsourcing" today represents a very popul...
Crowdsourcing consists in the outsourcing of tasks to a crowd of peopl...
In a spirit of open innovation, organisations mobilise the crowd in order to c...
We develop an NLP method for inferring potential contributors among a multitude of users within crowds...
One of the foremost challenges for information technology over the last few years has been to explor...
Although at the heart of “crowdsourcing” the definition of the word “crowd” is...
In a logic of open innovation, mobilising the crowd through a crowdsourcing approach enables ...
The results of our exploratory study provide new insights into crowdsourcing knowledge-intensive tasks...
Tasks that require users to have expert knowledge are difficult to crowdsource. They are mostly to...
With crowdsourcing, many organisations use the crowd’s resources and skills. T...