<?xml version="1.0" standalone="yes"?>
<Paper uid="P92-1045">
<Title>INFORMATION STATES AS FIRST CLASS CITIZENS</Title>
<Section position="3" start_page="0" end_page="303" type="intro">
<SectionTitle> INTRODUCTION </SectionTitle>
<Paragraph position="0"> Classical first-order logic (hereafter called elementary logic) is often used as a logical representation language. For instance, elementary logic has proven very useful when formalising mathematical structures, as in axiomatic set theory, number theory, etc. Also, in natural language processing (NLP) systems, &quot;toy&quot; examples are easily formalised in elementary logic: Every man lies. John is a man.</Paragraph>
<Paragraph position="1"> So, John lies. (1)</Paragraph>
<Paragraph position="3"> The formalisation is judged adequate since the model theory of elementary logic is in correspondence with intuitions (when some logical maturity is gained and some logical innocence is lost); moreover, the proof theory gives a reasonable notion of entailment for the &quot;toy&quot; examples.</Paragraph>
<Paragraph position="4"> Extending this success story to linguistically more complicated cases is difficult. Two problematic topics are: Anaphora: It must be explained how, in a text, a dependent manages to pick up a referent that was introduced by its antecedent.</Paragraph>
<Paragraph position="5"> Every man lies. John is a man.</Paragraph>
<Paragraph position="6"> So, he lies. (3) Attitude reports: Propositional attitudes involve reports about cognition (belief/knowledge), perception, etc. Mary believes that every man lies.</Paragraph>
<Paragraph position="7"> John is a man.</Paragraph>
<Paragraph position="8"> So, Mary believes that John lies. (4) Characteristically, if one starts from the &quot;toy&quot; examples in elementary logic, it is very difficult to make progress on the above-mentioned problematic topics. Much of the work on the first three topics comes from the last decade; in the case of the last topic, pioneering work by Hintikka, Kripke and Montague started in the sixties. The aim of this paper is to show that by taking an abstract notion of information states as the starting point, the &quot;toy&quot; examples and the limitations of elementary logic are better understood. We argue that information states are to be taken seriously in logic-based approaches to NLP. Furthermore, we think that information states can be regarded as sets of possibilities (structural aspects can be added, but should not be taken as stand-alone).</Paragraph>
<Paragraph position="9"> Information states are at the meta-level only when elementary logic is used. Information states are still mainly at the meta-level when intensional logics (e.g. modal logic) are used, but some manipulations are available at the object level. This limited access is problematic in connection with (extensional) notions such as (standard) identity, variables, etc. Information states can be put at the object level by using a so-called simple type theory (a classical higher-order logic based on the simply typed λ-calculus); this gives a very elegant framework for NLP applications.</Paragraph>
<Paragraph position="10"> The point is not that elementary logic or the various intensional logics are wrong (on the contrary, they include many important ideas), but that for the purpose of understanding, integrating and implementing a formalisation one is better off with a simple type theory (stronger type theories are possible, of course).</Paragraph>
</Section>
</Paper>
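For concreteness, a minimal sketch of how the "toy" argument (1) can be rendered in elementary logic; this is a standard first-order formalisation for illustration, not one quoted from the paper:

\[ \forall x\,(\mathit{man}(x) \rightarrow \mathit{lie}(x)),\quad \mathit{man}(\mathit{john}) \;\vdash\; \mathit{lie}(\mathit{john}) \]

No equally direct rendering is available for (3), where the pronoun must pick up a referent introduced in an earlier sentence, or for (4), where the belief operator is not a first-order connective; this is the gap the information-state perspective is intended to address.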