We consider the problem of ranking Web documents within a multicriteria framework and propose a novel approach for this purpose. We focus on the design of a set of criteria that capture complementary aspects of relevance, while acknowledging that each criterion has limited precision. We also provide algorithmic solutions for aggregating these criteria into a ranking of relevant documents, taking into account the specificities of the Web information retrieval problem. We report results of preliminary experiments that give a first indication that the proposed approach improves retrieval effectiveness.
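As a rough illustration of the aggregation step, the sketch below ranks documents by pairwise comparison of normalized criterion scores, using an indifference threshold to reflect the limited precision of each criterion. The criterion names, weights, threshold value, and weighted-majority rule are illustrative assumptions for this sketch, not the exact criteria or algorithm proposed in the paper.

# Minimal sketch of multicriteria aggregation for document ranking.
# Criterion names, weights, and the indifference threshold are assumed for
# illustration; they are not the paper's exact design.
from typing import Dict, List

Scores = Dict[str, float]  # criterion name -> normalized score in [0, 1]

CRITERIA = ["content", "authority", "freshness"]           # assumed criteria
WEIGHTS = {"content": 0.5, "authority": 0.3, "freshness": 0.2}
INDIFFERENCE = 0.05  # score differences below this are treated as ties

def outranks(a: Scores, b: Scores) -> bool:
    """True when a is not noticeably worse than b on a weighted majority of criteria."""
    support = sum(WEIGHTS[c] for c in CRITERIA if a[c] >= b[c] - INDIFFERENCE)
    return support >= 0.5

def rank(docs: Dict[str, Scores]) -> List[str]:
    """Order documents by how many competitors they outrank."""
    strength = {
        d: sum(1 for o in docs if o != d and outranks(docs[d], docs[o]))
        for d in docs
    }
    return sorted(docs, key=lambda d: strength[d], reverse=True)

if __name__ == "__main__":
    candidates = {
        "doc1": {"content": 0.82, "authority": 0.40, "freshness": 0.90},
        "doc2": {"content": 0.80, "authority": 0.75, "freshness": 0.30},
        "doc3": {"content": 0.55, "authority": 0.60, "freshness": 0.60},
    }
    print(rank(candidates))  # e.g. ['doc1', 'doc2', 'doc3']

An outranking-style rule of this kind keeps small score differences from being over-interpreted and avoids letting a large deficit on one criterion be fully compensated by minor advantages on others, which is one common motivation for multicriteria aggregation over a single weighted sum.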
Classical information retrieval (IR) methods often lose valuable information w...
In this paper, we consider the problem of document ranking in a non-traditional retrieval task, cal...
Offline evaluation of information retrieval systems typically focuses on a single effectiveness meas...
In this paper, we report our experiments in the mixed query task of the Web track for TREC 2004. We d...
Research in Information Retrieval shows performance improvement when many sources of evidence are co...
A new model for aggregating multiple criteria evaluations for relevance assessment is proposed. An I...
Classical information retrieval methods often lose valuable information when a...
Carterette, Benjamin A. Information retrieval (IR) is the process of obtaining relevant information f...
Maximizing only the relevance between queries and documents will not satisfy users if they want the ...
Over the last three decades, research in Information Retrieval (IR) shows performance improvement wh...
When using Information Retrieval Systems (IRS), users often present search queries made of ad-hoc ke...
In plain, uncomplicated language, and using detailed examples to explain the key concepts, models, a...
Information retrieval (IR) is an important research area that studies how to find the most useful in...
This paper is a detailed comparative analysis of different document ranking algorithms, focusing on ...
In this thesis, the author designed three sets of preference based ranking algorithms for informatio...