SUMMARY: Session O12-W Tokenization and Tagging


Title: The importance of precise tokenizing for deep grammars
Authors: M. Forst, R. Kaplan
Abstract: We present a non-deterministic finite-state transducer that acts as a tokenizer and normalizer for free text that is input to a broad-coverage LFG grammar of German. We compare the basic tokenizer used in an earlier version of the grammar with the more sophisticated tokenizer that we now use. The revised tokenizer increases the coverage of the grammar, in terms of full parses, from 68.3% to 73.4% on sentences 8,001 through 10,000 of the TiGer Corpus.
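The abstract's central design point is that the transducer is non-deterministic: where the input is genuinely ambiguous (for instance, a period that may belong to an abbreviation, end a sentence, or both), the tokenizer emits every reading and leaves disambiguation to the grammar. The sketch below mimics that behavior in plain Python; it is not the authors' finite-state implementation, and the abbreviation list, function names, and example sentence are all hypothetical illustrations.

```python
from itertools import product

# Hypothetical abbreviation lexicon; the paper's transducer encodes such
# lexical knowledge in finite-state rules rather than a Python set.
ABBREVIATIONS = {"bzw", "usw", "Dr", "Nr"}

def analyses(chunk):
    """Return all readings of one whitespace-delimited chunk."""
    if chunk.endswith(".") and chunk[:-1] in ABBREVIATIONS:
        # Non-deterministic case: keep both the abbreviation reading
        # and the reading with a separate sentence-final period.
        return [(chunk,), (chunk[:-1], ".")]
    if len(chunk) > 1 and chunk[-1] in ".,;:?!":
        # Ordinary trailing punctuation is split off deterministically.
        return [(chunk[:-1], chunk[-1])]
    return [(chunk,)]

def tokenizations(text):
    """Yield every token sequence licensed by the per-chunk analyses."""
    options = [analyses(c) for c in text.split()]
    for combo in product(*options):
        yield [tok for reading in combo for tok in reading]

for toks in tokenizations("Sie kauften Obst bzw. Gemüse."):
    print(toks)
```

On the example sentence, the sketch prints one tokenization treating "bzw." as a single abbreviation token and one treating it as "bzw" followed by a sentence-final period, exactly the kind of ambiguity a deterministic tokenizer would be forced to resolve before the grammar could weigh in.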
Keywords: finite-state transducers, tokenization, German, LFG, parsing, hand-crafted grammars
Full paper: The importance of precise tokenizing for deep grammars