Fine-grained Linguistic Evaluation of Question Answering Systems
Proceedings of the Seventh International Conference on Language Resources and Evaluation (LREC 2010)
Abstract
Question answering systems are complex systems that rely on natural language processing. Evaluation campaigns are organized to assess such systems and to rank them according to their final results (the number of correct answers). However, teams need to evaluate their systems' results more precisely if they want to perform a diagnostic evaluation, and no tools or methods exist for doing so systematically. We present REVISE, a glass-box evaluation tool based on the diagnosis of question answering system results.