A Novelty-based Evaluation Method for Information Retrieval
Proceedings of the Second International Conference on Language Resources and Evaluation (LREC 2000)
Abstract
In information retrieval (IR) research, precision and recall have long been used to evaluate IR systems. However, given that a number of mutually similar retrieval systems are already publicly available, it is valuable to retrieve novel relevant documents, i.e., relevant documents that none of those existing systems can retrieve. In view of this problem, we propose an evaluation method that favors systems retrieving as many novel documents as possible. We also applied our method to evaluate systems that participated in the IREX workshop.
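The idea described above can be sketched in code. This is a minimal illustrative sketch, not the paper's actual metric: it assumes "novel relevant" documents are simply the relevant documents that no baseline system retrieved, and computes precision and recall restricted to that novel set. All function and variable names here are hypothetical.

```python
def novelty_scores(retrieved, relevant, baseline_runs):
    """Hypothetical sketch of a novelty-based evaluation.

    retrieved     -- documents returned by the system under evaluation (iterable of ids)
    relevant      -- the full set of relevant document ids
    baseline_runs -- list of sets, each the retrieval result of an existing system

    A document counts as "novel relevant" if it is relevant but was not
    retrieved by any of the existing (baseline) systems.
    """
    retrieved = set(retrieved)
    seen_by_baselines = set().union(*baseline_runs) if baseline_runs else set()
    novel_relevant = set(relevant) - seen_by_baselines

    hits = retrieved & novel_relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(novel_relevant) if novel_relevant else 0.0
    return precision, recall
```

Under this sketch, a system is rewarded only for relevant documents that the baselines missed, so two systems with identical conventional precision and recall can differ sharply in novelty-based scores.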