<?xml version="1.0" standalone="yes"?>
<Paper uid="W06-2711">
<Title>The SAMMIE Multimodal Dialogue Corpus Meets the Nite XML Toolkit</Title>
<Section position="2" start_page="0" end_page="0" type="intro">
<SectionTitle>1 Introduction</SectionTitle>
<Paragraph position="0">In the TALK project we are developing a multimodal dialogue system for an MP3 application for in-car and in-home use. The system should support natural, flexible interaction and collaborative behavior. To achieve this, it needs to provide advanced adaptive multimodal output.</Paragraph>
<Paragraph position="1">To determine the interaction strategies and the range of linguistic behavior naturally occurring in this scenario, we conducted two WOZ experiments: SAMMIE-1 involved spoken interaction only, while SAMMIE-2 was multimodal, with speech and screen input and output. We have been annotating the corpus on several layers, representing linguistic, multimodal and context information. The annotated corpus will be used (i) to investigate multimodal presentation and interaction strategies both within and across the annotation layers, and (ii) to design an initial policy for reinforcement learning of multimodal clarifications. We use the Nite XML Toolkit (NXT) (Carletta et al., 2003) to represent and browse the data and to develop annotation tools.</Paragraph>
<Paragraph position="2">Below we briefly describe our experiment setup, the collected data and the annotation layers; we comment on methods and tools for data representation and annotation, and then present our NXT data model.</Paragraph>
</Section>
</Paper>
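A minimal sketch of the kind of stand-off, multi-layer annotation NXT supports, roughly following its linking style with nite:id identifiers and nite:child pointers between files. The file names, element names and IDs below are hypothetical and serve only to illustrate the general mechanism; they are not taken from the SAMMIE corpus or its metadata, and namespace declarations are omitted for brevity.

<!-- sammie.words.xml: a base layer of transcribed words (hypothetical) -->
<words>
  <word nite:id="w1">please</word>
  <word nite:id="w2">play</word>
  <word nite:id="w3">madonna</word>
</words>

<!-- sammie.acts.xml: a higher annotation layer pointing into the word layer -->
<acts>
  <dialogue-act nite:id="da1" type="request">
    <nite:child href="sammie.words.xml#id(w1)..id(w3)"/>
  </dialogue-act>
</acts>

Because each annotation layer lives in its own file and only references the layers below it, new layers (e.g. multimodal or contextual annotations) can be added or revised without touching the transcriptions themselves.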