This technical report collects three years of experimentation in interactive cross-language information retrieval by SICS in the annual Cross-language Evaluation Forum (CLEF) evaluation campaigns of 2003, 2004, and 2005. We varied the simulated task context and measured user performance in a document assessment task, and found that the choice of language and the task context indeed affect the amount of effort users need to expend to complete the task.
This paper reports on the participation of ITC-irst in the Cross Language Evaluation Forum 2003; in ...
This paper describes the official runs of our team for CLEF 2002. We took part in the monolingual tas...
The problem of finding documents written in a language that the searcher cannot read is perhaps the ...
This technical report collects three years of experimentation in interactive cross-language informat...
An experiment on how users assess relevance in a foreign language they know well is reported. Result...
Abstract. The problem of finding documents written in a language that the searcher cannot read is pe...
Purpose – This paper aims to investigate how readers assess relevance of retrieved documents in a fo...
Abstract. The problem of finding documents that are written in a language that the searcher cannot re...
Abstract. The CLEF 2003 Interactive Track (iCLEF) was the third year of a shared experiment design t...
Topic creation and relevance assessment are considered as crucial components of the evaluation proce...
The Cross-Language Evaluation Forum (CLEF) provides an infrastructure aimed at supporting the develo...
This paper reports on the participation of ITC-irst in the Cross Language Evaluation Forum (CLEF) of...
The objective of the Cross-Language Evaluation Forum (CLEF) is to promote research in the multilingu...
An experiment on how users assess relevance in a foreign language they know well is reported. Result...
Abstract. Improvement in cross-language information retrieval results can come from a variety of sou...