<?xml version="1.0" standalone="yes"?> <Paper uid="H01-1058"> <Title>On Combining Language Models : Oracle Approach</Title> <Section position="8" start_page="0" end_page="0" type="concl"> <SectionTitle> 6. CONCLUSIONS </SectionTitle> <Paragraph position="0"> We have presented our recent work on combining language models. We have shown that although simple interpolation of LMs improves performance, it fails to reach the performance of an oracle. We have proposed a method for LM combination that mimics the behavior of the oracle. Although our work is not complete without a neural network that mimics the oracle, we argue that the universal approximation theorem ensures that such a network can, in principle, learn the oracle's mapping. However, extensive experiments are required to reach this goal, with the main focus on the selection of features. At present, the features we are considering are the number of concepts, the number of filler classes, and the number of 3-gram hits in a sentence (all normalized by sentence length), as well as the behavior of n-grams in context. We have also observed that the oracle's performance is still far from the best possible performance. This is due partly to the very small number of LMs used in rescoring, partly to the oracle's hard-decision combination strategy, and partly to the static combination with the acoustic model. Work is in progress toward closing this performance gap.</Paragraph> </Section></Paper>