<?xml version="1.0" standalone="yes"?> <Paper uid="H86-1013"> <Title>COMMONSENSE METAPHYSICS AND LEXICAL SEMANTICS</Title> <Section position="1" start_page="0" end_page="0" type="metho"> <SectionTitle> COMMONSENSE METAPHYSICS AND LEXICAL SEMANTICS </SectionTitle> <Paragraph position="0"/> </Section> <Section position="2" start_page="0" end_page="127" type="metho"> <SectionTitle> SRI International 1 Introduction </SectionTitle> <Paragraph position="0"> In the TACITUS project for using commonsense knowledge in the understanding of texts about mechanical devices and their failures, we have been developing various commonsense theories that are needed to mediate between the way we talk about the behavior of such devices and causal models of their operation. Of central importance in this effort is the axiomatization of what might be called &quot;commonsense metaphysics&quot;. This includes a number of areas that figure in virtually every domain of discourse, such as scalar notions, granularity, time, space, material, physical objects, causality, functionality, force, and shape.</Paragraph> <Paragraph position="1"> Our approach to lexical semantics is then to construct core theories of each of these areas, and then to define, or at least characterize, a large number of lexical items in terms provided by the core theories. In the TACITUS system, processes for solving pragmatics problems posed by a text will use the knowledge base consisting of these theories in conjunction with the logical forms of the sentences in the text to produce an interpretation. In this paper we do not stress these interpretation processes; they are another, important aspect of the TACITUS project and will be described in subsequent papers.</Paragraph> <Paragraph position="2"> This work represents a convergence of research in lexical semantics in linguistics and efforts in AI to encode commonsense knowledge. 
Lexical semanticists over the years have developed formalisms of increasing adequacy for encoding word meaning, progressing from simple sets of features (Katz and Fodor, 1963) to notations for predicate-argument structure (Lakoff, 1972; Miller and Johnson-Laird, 1976), but these early attempts made only limited use of world knowledge and assumed only very restricted sorts of processing. Workers in computational linguistics introduced inference (Rieger, 1974; Schank, 1975) and other complex cognitive processes (Herskovits, 1982) into our understanding of the role of word meaning. Recently, linguists have given greater attention to the cognitive processes that would operate on their representations (e.g., Talmy, 1983; Croft, 1986). Independently, in AI an effort arose to encode large amounts of commonsense knowledge (Hayes, 1979; Hobbs and Moore, 1985; Hobbs et al.</Paragraph> <Paragraph position="3"> 1985). The research reported here represents a convergence of these various developments. By developing core theories of several fundamental phenomena and defining lexical items within these theories, using the full power of predicate calculus, we are able to cope with complexities of word meaning that have hitherto escaped lexical semanticists, within a framework that gives full scope to the planning and reasoning processes that manipulate representations of word meaning.</Paragraph> <Paragraph position="4"> In constructing the core theories we are attempting to adhere to several methodological principles.</Paragraph> <Paragraph position="5"> 1. One should aim for characterization of concepts,</Paragraph> <Paragraph position="6"> rather than definition. One cannot generally expect to find necessary and sufficient conditions for a concept. The most we can hope for is to find a number of necessary conditions and a number of sufficient conditions. 
This amounts to saying that a great many predicates are primitive, but primitives that are highly interrelated with the rest of the knowledge base.</Paragraph> <Paragraph position="7"> 2. One should determine the minimal structure necessary for a concept to make sense. In efforts to axiomatize some area, there are two positions one may take, exemplified by set theory and by group theory. In axiomatizing set theory, one attempts to capture exactly some concept one has strong intuitions about. If the axiomatization turns out to have unexpected models, this exposes an inadequacy. In group theory, by contrast, one characterizes an abstract class of structures. If there turn out to be unexpected models, this is a serendipitous discovery of a new phenomenon that we can reason about using an old theory. The pervasive character of metaphor in natural language discourse shows that our commonsense theories of the world ought to be much more like group theory than set theory. By seeking minimal structures in axiomatizing concepts, we optimize the possibilities of using the theories in metaphorical and analogical contexts. This principle is illustrated below in the section on regions. One consequence of this principle is that our approach will seem more syntactic than semantic. We have concentrated more on specifying axioms than on constructing models. Our view is that the chief role of models in our effort is for proving the consistency and independence of sets of axioms, and for showing their adequacy. As an example of the last point, many of the spatial and temporal theories we construct are intended at least to have Euclidean space or the real numbers as one model, and a subclass of graph-theoretical structures as other models.</Paragraph> <Paragraph position="8"> 3. A balance must be struck between attempting to cover all cases and aiming only for the prototypical cases. 
In general, we have tried to cover as many cases as possible with an elegant axiomatization, in line with the two previous principles, but where the formalization begins to look baroque, we assume that higher processes will suspend some inferences in the marginal cases. We assume that inferences will be drawn in a controlled fashion. Thus, every outré, highly context-dependent counterexample need not be accounted for, and to a certain extent, definitions can be geared specifically for a prototype.</Paragraph> <Paragraph position="9"> 4. Where competing ontologies suggest themselves in a domain, one should attempt to construct a theory that accommodates both. Rather than commit oneself to adopting one set of primitives rather than another, one should show how each set of primitives can be characterized in terms of the other. Generally, each of the ontologies is useful for different purposes, and it is convenient to be able to appeal to both. Our treatment of time illustrates this.</Paragraph> <Paragraph position="10"> 5. The theories one constructs should be richer in axioms than in theorems. In mathematics, one expects to state half a dozen axioms and prove dozens of theorems from them. In encoding commonsense knowledge it seems to be just the opposite. The theorems we seek to prove on the basis of these axioms are theorems about specific situations which are to be interpreted, in particular, theorems about a text that the system is attempting to understand.</Paragraph> <Paragraph position="11"> 6. One should avoid falling into &quot;black holes&quot;. There are a few &quot;mysterious&quot; concepts which crop up repeatedly in the formalization of commonsense metaphysics. Among these are &quot;relevant&quot; (that is, relevant to the task at hand) and &quot;normative&quot; (or conforming to some norm or pattern). 
To insist upon giving a satisfactory analysis of these before using them in analyzing other concepts is to cross the event horizon that separates lexical semantics from philosophy.</Paragraph> <Paragraph position="12"> On the other hand, our experience suggests that to avoid their use entirely is crippling; the lexical semantics of a wide variety of other terms depends upon them. Instead, we have decided to leave them minimally analyzed for the moment and use them without scruple in the analysis of other commonsense concepts. This approach will allow us to accumulate many examples of the use of these mysterious concepts, and in the end, contribute to their successful analysis. The use of these concepts appears below in the discussions of the words &quot;immediately&quot;, &quot;sample&quot;, and &quot;operate&quot;.</Paragraph> <Paragraph position="13"> We chose as an initial target problem to encode the commonsense knowledge that underlies the concept of &quot;wear&quot;, as in a part of a device wearing out. Our aim was to define &quot;wear&quot; in terms of predicates characterized elsewhere in the knowledge base and to infer consequences of wear. For something to wear, we decided, is for it to lose imperceptible bits of material from its surface due to abrasive action over time. One goal, which we have not yet achieved, is to be able to prove as a theorem that since the shape of a part of a mechanical device is often functional and since loss of material can result in a change of shape, wear of a part of a device can result in the failure of the device as a whole.</Paragraph> <Paragraph position="14"> In addition, as we have proceeded, we have characterized a number of words found in a set of target texts, as it has become possible.</Paragraph> <Paragraph position="15"> We are encoding the knowledge as axioms in what is for the most part a first-order logic, described in Hobbs (1985a), although quantification over predicates is sometimes convenient. 
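The intended inference behind &quot;wear&quot;, that many individually imperceptible losses aggregate, by the Archimedean relation between grains, into a perceptible one, can be sketched in a toy numerical model. This is our own illustration; the grain threshold and loss values are invented for the example.

```python
# Toy numerical sketch (our own illustration; the threshold and loss
# values are invented) of the Archimedean property behind "wear":
# each abrasive event removes an amount of material imperceptible at
# the coarse grain, but the aggregate loss eventually becomes
# perceptible at that grain.

COARSE_GRAIN = 1.0   # smallest loss perceptible at the coarse grain

def total_wear(initial_amount, loss_per_event, n_events):
    """Material lost after n individually imperceptible abrasive events."""
    assert loss_per_event < COARSE_GRAIN   # each single event is imperceptible
    return min(initial_amount, loss_per_event * n_events)

loss = total_wear(100.0, 0.125, 16)
print(loss)                      # 2.0
print(loss >= COARSE_GRAIN)      # True: the aggregate loss is perceptible
```

This is exactly the Heap Paradox structure mentioned below in the section on granularity: no single event crosses the perceptibility threshold, yet enough of them do.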
In the formalism there is a nominalization operator &quot; ' &quot; for reifying events and conditions, as expressed in the following axiom schema: (∀ x) p(x) ≡ (∃ e) p'(e, x) ∧ Exist(e)</Paragraph> <Paragraph position="17"> That is, p is true of x if and only if there is a condition e of p being true of x and e exists in the real world.</Paragraph> <Paragraph position="18"> In our implementation so far, we have been proving simple theorems from our axioms using the CG5 theorem-prover developed by Mark Stickel (1982), but we are only now beginning to use the knowledge base in text processing.</Paragraph> </Section> <Section position="3" start_page="127" end_page="128" type="metho"> <SectionTitle> 2 Requirements on Arguments of </SectionTitle> <Paragraph position="0"/> <Section position="1" start_page="127" end_page="128" type="sub_section"> <SectionTitle> Predicates </SectionTitle> <Paragraph position="0"> There is a notational convention used below that deserves some explanation. It has frequently been noted that relational words in natural language can take only certain types of words as their arguments. These are usually described as selectional constraints. The same is true of predicates in our knowledge base. They are expressed below by rules of the form p(x, y) : r(x, y) This means that for p even to make sense applied to x and y, it must be the case that r is true of x and y. The logical import of this rule is that wherever there is an axiom of the form (∀ x, y) p(x, y) ⊃ q(x, y) this is really to be read as</Paragraph> <Paragraph position="2"> (∀ x, y) p(x, y) ∧ r(x, y) ⊃ q(x, y). The checking of selectional constraints, therefore, falls out as a by-product of other logical operations: the constraint r(x, y) must be verified if anything else is to be proven from p(x, y).</Paragraph> <Paragraph position="3"> The simplest example of such an r(x, y) is a conjunction of sort constraints r₁(x) ∧ r₂(y). Our approach is a generalization of this, because much more complex requirements can be placed on the arguments. 
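The way a selectional constraint gets checked as a by-product of any other proof from the predication can be sketched procedurally. This is a minimal illustration of our own; the predicate names and the toy scale are invented.

```python
# Minimal procedural sketch (our own; predicate names and the toy
# scale are invented) of selectional constraints as guards: an axiom
# p(x, y) => q(x, y) with constraint r on p's arguments behaves like
# p(x, y) & r(x, y) => q(x, y), so r is verified as a by-product of
# using the axiom at all.

def make_axiom(p, r, q):
    """Build a rule concluding q(x, y) from p(x, y), guarded by r."""
    def rule(facts, x, y):
        if (p, x, y) in facts and r(x, y):
            return (q, x, y)
        return None
    return rule

# Toy constraint: both arguments must lie on a (finite) scale.
scale = {1, 2, 3, 4, 5}
on_scale = lambda y, z: y in scale and z in scale

rule = make_axiom("range", on_scale, "spans")
facts = {("range", 1, 5), ("range", 1, 99)}
print(rule(facts, 1, 5))    # ('spans', 1, 5)
print(rule(facts, 1, 99))   # None: constraint fails, nothing is provable
```

The guard generalizes simple sort checking: r can be any computable requirement on the argument tuple, not just a conjunction of unary sorts.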
Consider, for example, the verb &quot;range&quot;. If x ranges from y to z, there must be a scale s that includes y and z, and x must be a set of entries that are located at various places on the scale. This can be represented as follows:</Paragraph> <Paragraph position="5"> range(x, y, z) : (∃ s) scale(s) ∧ y ∈ s ∧ z ∈ s ∧ set(x) ∧ (∀ w)(w ∈ x ⊃ (∃ u)(u ∈ s ∧ at(w, u))) </Paragraph> </Section> </Section> <Section position="4" start_page="128" end_page="132" type="metho"> <SectionTitle> 3 The Knowledge Base </SectionTitle> <Paragraph position="0"/> <Section position="1" start_page="128" end_page="128" type="sub_section"> <SectionTitle> 3.1 Sets and Granularity </SectionTitle> <Paragraph position="0"> At the foundation of the knowledge base is an axiomatization of set theory. It follows the standard Zermelo-Fraenkel approach, except that there is no Axiom of Infinity.</Paragraph> <Paragraph position="1"> Since so many concepts used in discourse are grain-dependent, a theory of granularity is also fundamental (see Hobbs 1985b). A grain is defined in terms of an indistinguishability relation, which is reflexive and symmetric, but not necessarily transitive. One grain can be a refinement of another, with the obvious definition. The most refined grain is the identity grain, i.e., the one in which every two distinct elements are distinguishable. One possible relationship between two grains, one of which is a refinement of the other, is what we call an &quot;Archimedean relation&quot;, after the Archimedean property of real numbers. Intuitively, if enough events occur that are imperceptible at the coarser grain g₂ but perceptible at the finer grain g₁, then the aggregate will eventually be perceptible at the coarser grain. This is an important property in phenomena</Paragraph> <Paragraph position="2"> subject to the Heap Paradox. 
Wear, for instance, eventually has significant consequences.</Paragraph> </Section> <Section position="2" start_page="128" end_page="129" type="sub_section"> <SectionTitle> 3.2 Scales </SectionTitle> <Paragraph position="0"> A great many of the most common words in English have scales as their subject matter. This includes many prepositions, the most common adverbs, comparatives, and many abstract verbs. When spatial vocabulary is used metaphorically, it is generally the scalar aspect of space that carries over to the target domain. A scale is defined as a set of elements, together with a partial ordering and a</Paragraph> <Paragraph position="1"> granularity (or an indistinguishability relation). The partial ordering and the indistinguishability relation are consistent with each other.</Paragraph> <Paragraph position="2"> (∀ x, y, z) x < y ∧ y ≈ z ⊃ x < z ∨ x ≈ z It is useful to have an adjacency relation between points on a scale, and there are a number of ways we could introduce it. We could simply take it to be primitive; in a scale having a distance function, we could define two points to be adjacent when the distance between them is less than some ε; finally, we could define adjacency in terms of the grain-size: (∀ x, y, s) adj(x, y, s) ≡ (∃ z) x ≈ z ∧ z ≈ y ∧ ¬[x ≈ y] Two important possible properties of scales are connectedness and denseness. We can say that two elements of a scale are connected by a chain of adj relations: (∀ x, y, s) connected(x, y, s) ≡ adj(x, y, s) ∨ (∃ z) adj(x, z, s) ∧ connected(z, y, s) A scale is connected (sconnected) if all pairs of elements are connected. A scale is dense if between any two points there is a third point, until the two points are so close together that the grain-size won't let us tell what the situation is. Cranking up the magnification could well resolve the continuous space into a discrete set, as objects into atoms.</Paragraph> <Paragraph position="3"> (∀ s) dense(s) ≡ (∀ x, y, <) x ∈ s ∧ y ∈ s ∧ order(<, s) ∧ x < y ⊃ (∃ z)(x < z ∧ z < y) ∨ (∃ z)(x ≈ z ∧ z ≈ y) This captures the commonsense notion of continuity. A subscale of a scale has as its elements a subset of the elements of the scale and has as its partial ordering and its grain the partial ordering and the grain of the scale. (∀ s₁, <, ≈) order(<, s₁) ∧ grain(≈, s₁) ⊃ (∀ s₂)[subscale(s₂, s₁) ≡ subset(s₂, s₁) ∧ order(<, s₂) ∧ grain(≈, s₂)] An interval can be defined as a connected subscale: (∀ i, s) interval(i, s) ≡ subscale(i, s) ∧ sconnected(i)</Paragraph> <Paragraph position="5"> The relations between time intervals that Allen and Kautz (1985) have defined can be defined in a straightforward manner in the approach presented here, applied to intervals in general.</Paragraph> <Paragraph position="6"> A concept closely related to scales is that of a &quot;cycle&quot;. This is a system which has a natural ordering locally but contains a loop globally. Examples include the color wheel, clock times, and geographical locations ordered by &quot;east of&quot;. We have axiomatized cycles in terms of a ternary between relation, whose axioms parallel the axioms for a partial ordering.</Paragraph> <Paragraph position="7"> The figure-ground relationship is of fundamental importance in language. We encode this with the primitive predicate at. The minimal structure that seems to be necessary for something to be a ground is that of a scale; hence, this is a selectional constraint on the arguments of at. at(x, y) : (∃ s) y ∈ s ∧ scale(s) At this point, we are already in a position to define some fairly complex words. As an illustration, we give the example of &quot;range&quot; as in &quot;x ranges from y to z&quot;:</Paragraph> <Paragraph position="10"/> A very important scale is the linearly ordered scale of numbers. We do not plan to reason axiomatically about numbers, but it is useful in natural language processing to have encoded a few facts about numbers. 
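The scale machinery above, an indistinguishability relation that is reflexive and symmetric but not transitive, adjacency via the grain-size, and connectedness as a chain of adj links, can be given a toy encoding. This is our own illustration, with a numeric grain standing in for the abstract indistinguishability relation.

```python
# Illustrative encoding (our own; the numeric grain is an assumption)
# of the scale definitions: indistinguishability is reflexive and
# symmetric but not transitive; x and y are adjacent when they are
# distinguishable but some z is indistinguishable from both; and
# connectedness is a chain of adj links.

EPS = 1.0  # grain size

def indist(x, y):
    return abs(x - y) < EPS

def adj(x, y, scale):
    # distinguishable, yet some z is indistinguishable from both
    return (not indist(x, y)) and any(indist(x, z) and indist(z, y) for z in scale)

def connected(x, y, scale):
    """Breadth-first search for a chain of adj links from x to y."""
    frontier, seen = [x], {x}
    while frontier:
        u = frontier.pop()
        if u == y or adj(u, y, scale):
            return True
        for v in scale:
            if v not in seen and adj(u, v, scale):
                seen.add(v)
                frontier.append(v)
    return False

scale = [0.0, 0.6, 1.2, 1.8, 2.4, 10.0]
print(indist(0.0, 0.6), indist(0.6, 1.2), indist(0.0, 1.2))  # True True False
print(adj(0.0, 1.2, scale))         # True: 0.6 is indistinguishable from both
print(connected(0.0, 2.4, scale))   # True
print(connected(0.0, 10.0, scale))  # False: 10.0 is isolated at this grain
```

The first line of output shows the non-transitivity of the indistinguishability relation, which is what makes the grain-size definition of adjacency non-trivial.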
For example, a set has a cardinality which is an element of the number scale.</Paragraph> <Paragraph position="11"> Verticality is a concept that would be most properly analyzed in the section on space, but it is a property that many other scales have acquired metaphorically, for whatever reason. The number scale is one of these. Even in the absence of an analysis of verticality, it is a useful property to have as a primitive in lexical semantics.</Paragraph> <Paragraph position="12"> The word &quot;high&quot; is a vague term that asserts an entity is in the upper region of some scale. It requires that the scale be a vertical one, such as the number scale. The verticality requirement distinguishes &quot;high&quot; from the more general term &quot;very&quot;; we can say &quot;very hard&quot; but not &quot;highly hard&quot;. The phrase &quot;highly planar&quot; sounds all right because the high register of &quot;planar&quot; suggests a quantifiable, scientific accuracy, whereas the low register of &quot;flat&quot; makes &quot;highly flat&quot; sound much worse.</Paragraph> <Paragraph position="13"> The test of any definition is whether it allows one to draw the appropriate inferences. In our target texts, the phrase &quot;high usage&quot; occurs. Usage is a set of using events, and the verticality requirement on &quot;high&quot; forces us to coerce the phrase into &quot;a high or large number of using events&quot;. Combining this with an axiom that says that the use of a mechanical device involves the likelihood of abrasive events, as defined below, and with the definition of &quot;wear&quot; in terms of abrasive events, we should be able to conclude the likelihood of wear.</Paragraph> </Section> <Section position="3" start_page="129" end_page="130" type="sub_section"> <SectionTitle> 3.3 Time: Two Ontologies </SectionTitle> <Paragraph position="0"> There are two possible ontologies for time. 
In the first, the one most acceptable to the mathematically minded, there is a time line, which is a scale having some topological structure. We can stipulate the time line to be linearly ordered (although it is not in approaches that build ignorance of relative times into the representation of time (e.g., Hobbs, 1974), nor in approaches using branching futures (e.g., McDermott, 1985)), and we can stipulate it to be dense (although it is not in the situation calculus). We take before to be the ordering on the time line.</Paragraph> <Paragraph position="2"> We allow both instants and intervals of time. Most events occur at some instant or during some interval. In this approach, nearly every predicate takes a time argument.</Paragraph> <Paragraph position="3"> In the second ontology, the one that seems to be more deeply rooted in language, the world consists of a large number of more or less independent processes, or histories, or sequences of events. There is a primitive relation change between conditions. Thus, change(e₁, e₂) ∧ p'(e₁, x) ∧ q'(e₂, x) says that there is a change from the condition e₁ of p being true of x to the condition e₂ of q being true of x.</Paragraph> <Paragraph position="4"> The time line in this ontology is then an artificial construct, a regular sequence of imagined abstract events--think of them as ticks of a clock in the National Bureau of Standards--to which other events can be related. The change ontology seems to correspond to the way we experience the world. We recognize relations of causality, change of state, and copresence among events and conditions. 
When events are not related in these ways, judgments of relative time must be mediated by copresence relations between the events and events on a clock and change of state relations on the clock.</Paragraph> <Paragraph position="5"> The predicate change possesses a limited transitivity.</Paragraph> <Paragraph position="6"> There has been a change from Reagan being an actor to Reagan being President, even though he was governor in between. But we probably do not want to say there has been a change from Reagan being an actor to Margaret Thatcher being Prime Minister, even though the second comes after the first.</Paragraph> <Paragraph position="7"> We can say that times, viewed in this ontology as events, always have a change relation between them.</Paragraph> <Paragraph position="8"> (∀ t₁, t₂) before(t₁, t₂) ⊃ change(t₁, t₂) The predicate change is related to before by the axiom</Paragraph> <Paragraph position="10"> This does not allow us to derive change of state from temporal succession. For this, we need axioms of the form (∀ x, t₁, t₂) p(x, t₁) ∧ q(x, t₂) ∧ before(t₁, t₂) ⊃ (∃ e₁, e₂) change(e₁, e₂) ∧ p'(e₁, x) ∧ q'(e₂, x)</Paragraph> <Paragraph position="12"> That is, if x is p at time t₁ and q at a later time t₂, then there has been a change of state from one to the other.</Paragraph> <Paragraph position="13"> Time arguments in predications can be viewed as abbreviations: p(x, t) ≡ (∃ e) p'(e, x) ∧ at(e, t)</Paragraph> <Paragraph position="15"> The word &quot;move&quot;, or the predicate move (as in &quot;x moves from y to z&quot;), can then be defined equivalently in terms of change (∀ x, y, z) move(x, y, z) ≡ (∃ e₁, e₂) change(e₁, e₂) ∧ at'(e₁, x, y) ∧ at'(e₂, x, z) or in terms of the time line (∀ x, y, z) move(x, y, z) ≡ (∃ t₁, t₂) at(x, y, t₁) ∧ at(x, z, t₂) ∧ before(t₁, t₂) In English and apparently all other natural languages, both ontologies are represented in the lexicon. The time line ontology is found in clock and calendar terms, tense systems of verbs, and in the deictic temporal locatives such as &quot;yesterday&quot;, &quot;today&quot;, &quot;tomorrow&quot;, &quot;last night&quot;, and so on. 
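The equivalence between the two definitions of move can be sketched in a toy encoding of our own, with tuples standing in for the reified at' conditions: time-line facts translate into change-ontology facts, and move is derivable either way.

```python
# Toy sketch (our own encoding; tuples stand in for reified at'
# conditions) of the two equivalent definitions of "move": from
# time-line facts at(x, y, t), or from a change relation between the
# conditions at'(x, y) and at'(x, z).

at_facts = {("ball", "shelf", 1), ("ball", "floor", 2)}  # at(x, y, t)

def move_timeline(x, y, z, at_facts):
    return any(t1 < t2
               for (x1, y1, t1) in at_facts if (x1, y1) == (x, y)
               for (x2, z2, t2) in at_facts if (x2, z2) == (x, z))

def to_change_facts(at_facts):
    """Translate time-line facts into change relations between conditions."""
    return {((x1, y1), (x2, y2))
            for (x1, y1, t1) in at_facts
            for (x2, y2, t2) in at_facts
            if x1 == x2 and t1 < t2}

def move_change(x, y, z, change_facts):
    return ((x, y), (x, z)) in change_facts

change_facts = to_change_facts(at_facts)
print(move_timeline("ball", "shelf", "floor", at_facts))    # True
print(move_change("ball", "shelf", "floor", change_facts))  # True
```

The translation function is the point of methodological principle 4: neither set of primitives is privileged, and each can be characterized in terms of the other.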
The change ontology is exhibited in most verbs, and in temporal clausal connectives. The universal presence of both classes of lexical items and grammatical markers in natural languages requires a theory which can accommodate both ontologies, illustrating the importance of methodological principle 4.</Paragraph> <Paragraph position="16"> Among temporal connectives, the word &quot;while&quot; presents interesting problems. In &quot;e₁ while e₂&quot;, e₂ must be an event occurring over a time interval; e₁ must be an event and may occur either at a point or over an interval. One's first guess is that the point or interval for e₁ must be included in the interval for e₂. However, there are cases, such as It rained while I was in Philadelphia.</Paragraph> <Paragraph position="17"> or The electricity should be off while the switch is being repaired.</Paragraph> <Paragraph position="18"> which suggest the reading &quot;e₂ is included in e₁&quot;. We came to the conclusion that one can infer no more than that e₁ and e₂ overlap, and any tighter constraints result from implicatures from background knowledge.</Paragraph> <Paragraph position="19"> The word &quot;immediately&quot; also presents a number of problems. It requires its argument e to be an ordering relation between two entities x and y on some scale s.</Paragraph> <Paragraph position="20"> immediate(e) : (∃ x, y, s) less-than'(e, x, y, s) It is not clear what the constraints on the scale are. Temporal and spatial scales are okay, as in &quot;immediately after the alarm&quot; and &quot;immediately to the left&quot;, but the size scale isn't: * John is immediately larger than Bill.</Paragraph> <Paragraph position="21"> Etymologically, it means that there are no intermediate entities between x and y on s. Thus, (∀ e, x, y, s) immediate(e) ∧ less-than'(e, x, y, s) ⊃ ¬(∃ z) less-than(x, z, s) ∧ less-than(z, y, s) However, this will only work if we restrict z to be a relevant entity. 
For example, in the sentence We disengaged the compressor immediately after the alarm.</Paragraph> <Paragraph position="22"> the implication is that no event that could damage the compressor occurred between the alarm and the disengagement, since the text is about equipment failure.</Paragraph> </Section> <Section position="4" start_page="130" end_page="131" type="sub_section"> <SectionTitle> 3.4 Spaces and Dimension: The Minimal Structure </SectionTitle> <Paragraph position="0"> The notion of dimension has been made precise in linear algebra. Since the concept of a region is used metaphorically as well as in the spatial sense, however, we were concerned to determine the minimal structure that a system requires for it to make sense to call it a space of more than one dimension. For a two-dimensional space, there must be a scale, or partial ordering, for each dimension. Moreover, the two scales must be independent, in that the order of elements on one scale cannot be determined from their order on the other. Formally, (∃ x, y)(x <₁ y ∧ x <₂ y) ∧ (∃ u, v)(u <₁ v ∧ v <₂ u)</Paragraph> <Paragraph position="2"> Note that this does not allow <₂ to be simply the reverse of <₁. An unsurprising consequence of this definition is that the minimal example of a two-dimensional space consists of three points (three points determine a plane), e.g., the points A, B, and C, where A <₁ B, A <₁ C, C <₂ A, A <₂ B.</Paragraph> <Paragraph position="3"> This is illustrated in Figure 1.</Paragraph> <Paragraph position="4"> The dimensional scales are apparently found in all natural languages in relevant domains. 
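The independence condition on the two scales can be checked mechanically on the minimal three-point example from the text. The formulation below is our own: some pair must be ordered the same way on both scales and some pair oppositely, so neither scale is the other or its simple reverse.

```python
# Sketch (our own formulation) of independence for two dimensional
# scales: there is a pair ordered the same way on both scales and a
# pair ordered oppositely, so neither scale is recoverable from the
# other.

def independent(lt1, lt2):
    elems = {a for pair in lt1 | lt2 for a in pair}
    same = any((a, b) in lt1 and (a, b) in lt2 for a in elems for b in elems)
    opposite = any((a, b) in lt1 and (b, a) in lt2 for a in elems for b in elems)
    return same and opposite

# The minimal two-dimensional space from the text:
# A <1 B, A <1 C, C <2 A, A <2 B.
lt1 = {("A", "B"), ("A", "C")}
lt2 = {("C", "A"), ("A", "B")}
print(independent(lt1, lt2))                       # True
print(independent(lt1, lt1))                       # False: identical scales
print(independent(lt1, {("B", "A"), ("C", "A")}))  # False: simple reverse
```

The three-point configuration is thus the smallest structure the check accepts, matching the claim that three points suffice for a two-dimensional space.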
The familiar three-dimensional space of common sense is defined by the three scale pairs &quot;up-down&quot;, &quot;front-back&quot;, and &quot;left-right&quot;; the two-dimensional plane of the commonsense conception of the earth's surface is represented by the two scale pairs &quot;north-south&quot; and &quot;east-west&quot;.</Paragraph> <Paragraph position="5"> The simplest, although not the only, way to define adjacency in the space is as adjacency on both scales:</Paragraph> <Paragraph position="7"> A region is a subset of a space. The surface and interior of a region can be defined in terms of adjacency, in a manner paralleling the definition of a boundary in point-set topology. In the following, s is the boundary or surface of a two-dimensional region.</Paragraph> <Paragraph position="9"> Finally, we can define the notion of &quot;contact&quot; in terms of points in different regions being adjacent.</Paragraph> <Paragraph position="10"> (∀ r₁, r₂, sp) contact(r₁, r₂, sp) ≡ disjoint(r₁, r₂) ∧ (∃ x, y)(x ∈ r₁ ∧ y ∈ r₂ ∧ adj(x, y, sp)) By picking the scales and defining adjacency right, we can talk about points of contact between communicational networks, systems of knowledge, and other metaphorical domains. By picking the scales to be the real line and defining adjacency in terms of ε-neighborhoods, we get Euclidean space and can talk about contact between physical objects.</Paragraph> </Section> <Section position="5" start_page="131" end_page="132" type="sub_section"> <SectionTitle> 3.5 Material </SectionTitle> <Paragraph position="0"> Physical objects and materials must be distinguished, just as they are apparently distinguished in every natural language, by means of the count noun - mass noun distinction. A physical object is not a bit of material, but rather is comprised of a bit of material at any given time. Thus, rivers and human bodies are physical objects, even though their material constitution changes over time. 
This distinction also allows us to talk about an object losing material through wear and still being the same object.</Paragraph> <Paragraph position="1"> We will say that an entity b is a bit of material by means of the expression material(b). Bits of material are characterized by both extension and cohesion. The primitive predication occupies(b, r, t) encodes extension, saying that a bit of material b occupies a region r at time t. The topology of a bit of material is then parasitic on the topology of the region it occupies. A part b₁ of a bit of material b is a bit of material whose occupied region is always a subregion of the region occupied by b. Point-like particles (particle) are defined in terms of points in the occupied region, disjoint bits (disjoint-bit) in terms of disjointness of regions, and contact between bits in terms of contact between their regions. We can then state as follows the Principle of Non-Joint-Occupancy that two bits of material cannot occupy the same place at the same time: (∀ b₁, b₂, r, t) disjoint-bit(b₁, b₂) ∧ occupies(b₁, r, t) ⊃ ¬occupies(b₂, r, t)</Paragraph> <Paragraph position="3"> At some future point in our work, this may emerge as a consequence of a richer theory of cohesion and force.</Paragraph> <Paragraph position="4"> The cohesion of materials is also a primitive property, for we must distinguish between a bump on the surface of an object and a chip merely lying on the surface. Cohesion depends on a primitive relation bond between particles of material, paralleling the role of adj in regions. The relation attached is defined as the transitive closure of bond. A topology of cohesion is built up in a manner analogous to the topology of regions. In addition, we have encoded the relation that bond bears to motion, i.e., that bonded bits remain adjacent and that one moves when the other does, and the relation of bond to force, i.e. 
that there is a characteristic force that breaks a bond in a given material.</Paragraph> <Paragraph position="5"> Different materials react in different ways to forces of various strengths. Materials subjected to force exhibit or fail to exhibit several invariance properties, proposed by Hager (1985). If the material is shape-invariant with respect to a particular force, its shape remains the same.</Paragraph> <Paragraph position="6"> If it is topologically invariant, particles that are adjacent remain adjacent. Shape invariance implies topological invariance. Subject to forces of a certain strength or degree d₁, a material ceases being shape-invariant. At a force of strength d₂ > d₁, it ceases being topologically invariant, and at a force of strength d₃ ≥ d₂, it simply breaks. Metals exhibit the full range of possibilities, that is, 0 < d₁ < d₂ < d₃ < ∞. For forces of strength d < d₁, the material is &quot;hard&quot;; for forces of strength d where d₁ < d < d₂, it is &quot;flexible&quot;; for forces of strength d where d₂ < d < d₃, it is &quot;malleable&quot;. Words such as &quot;ductile&quot; and &quot;elastic&quot; can be defined in terms of this vocabulary, together with predicates about the geometry of the bit of material. Words such as &quot;brittle&quot; (d₁ = d₂ = d₃) and &quot;fluid&quot; (d₁ = 0, d₃ = ∞) can also be defined in these terms. While we should not expect to be able to define various material terms, like &quot;metal&quot; and &quot;ceramic&quot;, we can certainly characterize many of their properties with this vocabulary.</Paragraph> <Paragraph position="7"> Because of its invariance properties, material interacts with containment and motion. The word &quot;clog&quot; illustrates this. The predicate clog is a three-place relation: x clogs y against the flow of z. It is the obstruction by x of z's motion through y, but with the selectional restriction that z must be something that can flow, such as a liquid, gas, or powder. 
If a rope is passing through a hole in a board, and a knot in the rope prevents it from going through, we do not say that the hole is clogged. On the other hand, there do not seem to be any selectional constraints on x.</Paragraph> <Paragraph position="8"> In particular, x can be identical with z: glue, sand, or molasses can clog a passageway against its own flow. We can speak of clogging where the obstruction of flow is not complete, but it must be thought of as &quot;nearly&quot; complete.</Paragraph> </Section> <Section position="6" start_page="132" end_page="132" type="sub_section"> <SectionTitle> 3.6 Other Domains 3.6.1 Causal Connection </SectionTitle> <Paragraph position="0"> Attachment within materials is one variety of causal connection. In general, if two entities x and y are causally connected with respect to some behavior p of x, then whenever p happens to x, there is some corresponding behavior q that happens to y. In the case of attachment, p and q are both move. A particularly common variety of causal connection between two entities is one mediated by the motion of a third entity from one to the other. (This might be called a &quot;vector boson&quot; connection.) Photons mediating the connection between the sun and our eyes, rain drops connecting a state of the clouds with the wetness of our skin and clothes, a virus being transmitted from one person to another, and utterances passing between people are all examples of such causal connections. 
Barriers, openings, and penetration are all with respect to paths of causal connection.</Paragraph> </Section> </Section> <Section position="5" start_page="132" end_page="133" type="metho"> <SectionTitle> 3.6.2 Force </SectionTitle> <Paragraph position="0"> The concept of &quot;force&quot; is axiomatized, in a way consistent with Talmy's treatment (1985), in terms of the predications force(a, b, d1) and resist(b, a, d2)--a forces against b with strength d1 and b resists a's action with strength d2.</Paragraph> <Paragraph position="1"> We can infer motion from facts about relative strength.</Paragraph> <Paragraph position="2"> This treatment can also be specialized to Newtonian force, where we have not merely movement, but acceleration. In addition, in spaces in which orientation is defined, forces can have an orientation, and a version of the Parallelogram of Forces Law can be encoded. Finally, force interacts with shape in ways characterized by words like &quot;stretch&quot;, &quot;compress&quot;, &quot;bend&quot;, &quot;twist&quot;, and &quot;shear&quot;. An important concept is the notion of a &quot;system&quot;, which is a set of entities, a set of their properties, and a set of relations among them. A common kind of system is one in which the entities are events and conditions and the relations are causal and enabling relations. A mechanical device can be described as such a system--in a sense, in terms of the plan it executes in its operation. The function of various parts and of conditions of those parts is then the role they play in this system, or plan.</Paragraph> <Paragraph position="3"> The intransitive sense of &quot;operate&quot;, as in The diesel was operating.</Paragraph> <Paragraph position="4"> involves systems and functionality. If an entity x operates, then there must be a larger system s of which x is a part. The entity x itself is a system with parts.
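The force and resist predications introduced earlier in this section also admit a simple computational sketch. This is an illustrative reading, not the paper's axioms: motion is inferred when the forcing strength exceeds the resisting strength, and the Newtonian specialization yields an acceleration; the function names are hypothetical.

```python
def moves(d1, d2):
    """Infer motion from relative strength: a forces against b with
    strength d1, b resists with strength d2 (illustrative reading)."""
    return d1 > d2

def newtonian_acceleration(d1, d2, mass):
    """Newtonian specialization (sketch): net force over mass,
    zero when the resistance is not overcome."""
    return max(d1 - d2, 0.0) / mass

print(moves(10.0, 3.0))                        # True: resistance overcome
print(newtonian_acceleration(10.0, 3.0, 2.0))  # 3.5
```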
These parts undergo normative state changes, thereby causing x to undergo normative state changes, thereby causing x to produce an effect with a normative function in the larger system s. The concept of &quot;normative&quot; is discussed below.</Paragraph> <Paragraph position="5"> We have been approaching the problem of characterizing shape from a number of different angles. The classical treatment of shape is via the notion of &quot;similarity&quot; in Euclidean geometry, and in Hilbert's formal reconstruction of Euclidean geometry (Hilbert, 1902) the key primitive concept seems to be that of &quot;congruent angles&quot;. Therefore, we first sought to develop a theory of &quot;orientation&quot;. The shape of an object can then be characterized in terms of changes in orientation of a tangent as one moves about on the surface of the object, as is done in vision research (e.g., Zahn and Roskies, 1972). In all of this, since &quot;shape&quot; can be used loosely and metaphorically, one question we are asking is whether some minimal, abstract structure can be found in which the notion of &quot;shape&quot; makes sense. Consider, for instance, a graph in which one scale is discrete, or even unordered. Accordingly, we have been examining a number of examples, asking when it seems right to say two structures have different shapes.</Paragraph> <Paragraph position="6"> We have also examined the interactions of shape and functionality (cf. Davis, 1984). What seems to be crucial is how the shape of an obstacle constrains the motion of a substance or of an object of a particular shape (cf. Shoham, 1985). Thus, a funnel concentrates the flow of a liquid, and similarly, a wedge concentrates force.
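The idea of characterizing shape by changes in tangent orientation can be sketched discretely for a polygon, in the spirit of Zahn and Roskies (1972); the helper below is hypothetical and only illustrates the approach.

```python
import math

def turning_angles(points):
    """Exterior (turning) angle at each vertex of a closed polygon:
    a discrete analogue of tracking changes in tangent orientation
    as one moves along the boundary (hypothetical helper)."""
    n = len(points)
    angles = []
    for i in range(n):
        x0, y0 = points[i - 1]          # previous vertex
        x1, y1 = points[i]              # current vertex
        x2, y2 = points[(i + 1) % n]    # next vertex
        incoming = math.atan2(y1 - y0, x1 - x0)
        outgoing = math.atan2(y2 - y1, x2 - x1)
        # normalize the change of orientation into [-pi, pi)
        turn = (outgoing - incoming + math.pi) % (2 * math.pi) - math.pi
        angles.append(turn)
    return angles

# The turning angles are scale-invariant: a unit square and a scaled
# square yield the same sequence of pi/2 turns -- the same "shape".
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(sum(turning_angles(square)))   # 2*pi for a convex CCW polygon
```

The sequence of turning angles captures shape independently of size and position, which is one reason this representation has been attractive in vision research.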
A box pushed against a ridge in the floor will topple, and a wheel is a limiting case of continuous toppling.</Paragraph> <Paragraph position="8"> 3.7 Hitting, Abrasion, Wear, and Related Concepts For x to hit y is for x to move into contact with y with some force.</Paragraph> <Paragraph position="9"> The basic scenario for an abrasive event is that there is an impinging bit of material m which hits an object o and by doing so removes a pointlike bit of material b0 from the surface of o: abr-event'(e, m, o, b0) ⊃ material(m) ∧ topologically-invariant(o) (∀ e, m, o, b0) abr-event'(e, m, o, b0) ≡ (∃ t, b, s, e1, e2, e3) at(e, t) ∧ consists-of(o, b, t) ∧ surface(s, b) ∧ particle(b0, s) ∧ change'(e, e1, e2) ∧ attached'(e1, b0, b) ∧ not'(e2, e1) ∧ cause(e3, e) ∧ hit'(e3, m, b0) After the abrasive event, the pointlike bit b0 is no longer a part of the object o: (∀ e, m, o, b0, b, b2, e1, e2, t2) abr-event'(e, m, o, b0) ∧ change'(e, e1, e2) ∧ attached'(e1, b0, b) ∧ not'(e2, e1) ∧ at(e2, t2) ∧ consists-of(o, b2, t2) ⊃ ¬part(b0, b2) It is necessary to state this explicitly since objects and bits of material can be discontinuous.</Paragraph> <Paragraph position="10"> An abrasion is a large number of abrasive events widely distributed through some nonpointlike region on the surface of an object: (∀ e, m, o) abrade'(e, m, o) ≡</Paragraph> <Paragraph position="12"> Wear can occur by means of a large collection of abrasive events distributed over time as well as space (so that there may be no time at which enough abrasive events occur to count as an abrasion). Thus, the link between wear and abrasion is via the common notion of abrasive events, not via a definition of wear in terms of abrasion.</Paragraph> <Paragraph position="13"> (∀ e, m, o) wear'(e, m, o) ≡</Paragraph> <Paragraph position="15"> The concept &quot;widely distributed&quot; concerns systems. If x is distributed in y, then y is a system and x is a set of entities which are located at components of y.
For the distribution to be wide, most of the elements of a partition of y determined independently of the distribution must contain components which have elements of x at them.</Paragraph> <Paragraph position="16"> The word &quot;wear&quot; is one of a large class of other events involving cumulative, gradual loss of material - events described by words like &quot;chip&quot;, &quot;corrode&quot;, &quot;file&quot;, &quot;erode&quot;, &quot;rub&quot;, &quot;sand&quot;, &quot;grind&quot;, &quot;weather&quot;, &quot;rust&quot;, &quot;tarnish&quot;, &quot;eat away&quot;, &quot;rot&quot;, and &quot;decay&quot;. All of these lexical items can now be defined as variations on the definition of &quot;wear&quot;, since we have built up the axiomatizations underlying &quot;wear&quot;. We are now in a position to characterize the entire class. We will illustrate this by defining two different types of variants of &quot;wear&quot; - &quot;chip&quot; and &quot;corrode&quot;.</Paragraph> <Paragraph position="17"> &quot;Chip&quot; differs from &quot;wear&quot; in three ways: the bit of material removed in one abrasive event is larger (it need not be pointlike), it need not happen because of a material hitting against the object, and &quot;chip&quot; does not require (though it does permit) a large collection of such events: one can say that some object is chipped if there is only one chip in it. Thus, we slightly alter the definition of abr-event to accommodate these changes: (∀ e, m, o, b0) chip'(e, m, o, b0) ≡</Paragraph> <Paragraph position="19"> &quot;Corrode&quot; differs from &quot;wear&quot; in that the bit of material is chemically transformed as well as being detached by the contact event; in fact, in some way the chemical transformation causes the detachment. This can be captured by adding a condition to the abrasive event which renders it a (single) corrode event: corrode-event(m, o, b0) ⊃ fluid(m) ∧ contact(m, b0) (∀ e, m, o, b0) corrode-event'(e, m, o, b0) ≡
</Paragraph> <Paragraph position="20"> (∃ t, b, s, e1, e2, e3) at(e, t) ∧ consists-of(o, b, t) ∧ surface(s, b) ∧ particle(b0, s) ∧ change'(e, e1, e2) ∧ attached'(e1, b0, b) ∧ not'(e2, e1) ∧ cause(e3, e) ∧ chemical-change'(e3, m, b0) &quot;Corrode&quot; itself may be defined in a parallel fashion to &quot;wear&quot;, substituting corrode-event for abr-event. All of this suggests the generalization that abrasive events, chipping and corrode events all detach the bit in question, and that we may describe all of these as detaching events. We can then generalize the above axiom about abrasive events resulting in loss of material to the following axiom about detaching: (∀ e, m, o, b0, b, b2, e1, e2, t2) detach'(e, m, o, b0) ∧ change'(e, e1, e2) ∧ attached'(e1, b0, b) ∧ not'(e2, e1) ∧ at(e2, t2) ∧ consists-of(o, b2, t2) ⊃ ¬part(b0, b2)</Paragraph> </Section> </Paper>