
Proposal for Evaluating Ontology Refinement Methods

Proceedings of the Third International Conference on Language Resources and Evaluation (LREC 2002)

DOI:10.63317/4vykzxym8yba

Abstract

Ontologies are a widely used tool for Knowledge Representation, but the effort required to build an ontology remains high. A few automatic and semi-automatic methods exist for extending ontologies with domain-specific information, but they use different training and test data and different evaluation metrics. The work described in this paper is an attempt to build a benchmark corpus for comparing these systems. We provide standard evaluation metrics as well as two annotated corpora: one in which every unknown word has been labelled with the places where it should be added to the ontology, and another in which only the high-frequency unknown terms have been annotated.

Details

Paper ID
lrec2002-main-038
Pages
N/A
BibKey
alfonseca-manandhar-2002-proposal
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
N/A
Conference
Third International Conference on Language Resources and Evaluation
Location
Las Palmas, Spain
Date
29–31 May 2002

Authors

  • Enrique Alfonseca
  • Suresh Manandhar
