Crowdsourcing has gained considerable attention as a viable approach for conducting IR evaluations. This paper shows, through a series of experiments on INEX data, that crowdsourcing can be a good alternative for relevance assessment in the context of XML retrieval.