<?xml version="1.0" standalone="yes"?>
<Paper uid="C02-1042">
<Title>Using Knowledge to Facilitate Factoid Answer Pinpointing</Title>
<Section position="5" start_page="0" end_page="2" type="evalu">
<SectionTitle>4. Performance Evaluation</SectionTitle>
<Paragraph position="0">In TREC-10's QA track, Webclopedia received an overall Mean Reciprocal Rank (MRR) score of 0.435, placing it among the top 4 performers of the 68 entrants (the average MRR score for the main QA task was about 0.234).</Paragraph>
<Paragraph position="1">The pinpointing heuristics are fairly accurate: when Webclopedia finds an answer, it usually ranks it first (1st place: 35.5%; not found: 41.87%).</Paragraph>
<Paragraph position="2">We measured the impact of each knowledge source on system performance on the TREC-10 test corpus, using standard MRR scoring. We applied the system to the questions of each knowledge type separately, with and without its specific knowledge source/algorithm. Results are shown in Table 1, columns A (without) and B (with). To indicate overall effect, we also show (in columns C and D) the percentage of questions of each knowledge type in TREC-10 and TREC-9, respectively.</Paragraph>
</Section>
</Paper>