<?xml version="1.0" standalone="yes"?> <Paper uid="C88-2137"> <Title>Application of the Direct Memory Access paradigm to natural language interfaces to knowledge-based systems</Title> <Section position="2" start_page="0" end_page="664" type="abstr"> <SectionTitle> Abstract </SectionTitle> <Paragraph position="0"> This paper describes the use of the Direct Memory Access (DMA) paradigm in a practical natural language interface. Advantages and disadvantages of DMA in such applications are discussed. The DMA natural language interface DM-COMMAND described in this paper is be-</Paragraph> <Paragraph position="1"> ing used for development of a knowledge-based machine translation system at the Center for Machine Translation \[1987a\]. In this paradigm, natural language understanding is viewed as an effort to recognize input sentences by using pre-existing knowledge in memory, which is often experiential and episodic. It is contrasted with traditional models of parsing in which syntactic and semantic representations are built as the result of parsing and are normally lost after each parse. In the DMA model, input sentences are identified with the memory structure which represents the input, and are instantiated to represent that specific input. Since understanding is performed as recognition through the memory network, the result of understanding is not lost after each sentence is processed. Also, since parsing and memory-based inferences are integrated, various memory-based activities can be triggered directly through natural language understanding without separate inferential processes.</Paragraph> <Paragraph position="2"> As one application of DMA, at the Center for Machine Translation (CMT) at Carnegie Mellon University, we have developed a natural language interface for our large-scale knowledge-based machine translation system1 called DM-COMMAND.
This application of DMA demonstrates the power of this model, since direct access to memory during parsing allows dynamic evaluation of input commands and question answering without running separate inferential processes, while dynamically utilizing the MT system's already existing domain knowledge sources. The implementation of the DMA 1The CMU-MT system which is the target system for the DM-COMMAND system described in this paper is described in detail in Tomita&Carbonell\[1987\] and Mitamura, et al\[1988\].</Paragraph> <Paragraph position="3"> natural language system has been completed and is used for development of actual grammars, domain knowledge-bases, and syntax/semantic mapping rules by the researchers at CMT.</Paragraph> <Paragraph position="4"> This system has been demonstrated to be highly effective as an MT developmental support system, since researchers who develop these individual knowledge sources are otherwise unknowledgeable about the internal implementation of the MT system. The DMA natural language interface can provide access (currently English and Japanese) to the system's internal functions through natural language command and query inputs. This use of the DMA model for natural language interfaces demonstrates that it is an effective alternative to other natural language interface schemes.</Paragraph> <Paragraph position="5"> II. A background of DMA The Direct Memory Access method of parsing originated in Quillian's\[1968\] notion of semantic memory, which was used in his TLC (Quillian\[1969\]), which led to further research in semantic network-based processing2. TLC used breadth-first spreading marker-passing as an intersection search of two lexically pointed nodes in a semantic memory, leaving interpretation of text as an intersection of the paths. Thus, interpretation of input text was directly performed on semantic memory.
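The TLC-style intersection search described above can be sketched in a few lines of Python (the historical systems were written in Lisp; the semantic memory below is an invented toy network, not TLC's actual data): markers spread breadth-first from two lexically pointed nodes, and the closest node reached by both is taken as the interpretation.

```python
from collections import deque

def spread(start, links):
    """Breadth-first spreading activation: map each reachable node to its hop distance."""
    seen = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, ()):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1
                queue.append(neighbor)
    return seen

def intersect(word_a, word_b, links):
    """TLC-style intersection search between two lexically activated nodes."""
    reach_a = spread(word_a, links)
    reach_b = spread(word_b, links)
    common = set(reach_a) & set(reach_b)
    # take the closest intersection node as the interpretation of the path
    return min(common, key=lambda n: reach_a[n] + reach_b[n]) if common else None

# toy semantic memory (isa links only); node names are illustrative assumptions
links = {
    "lawyer": ["professional"],
    "client": ["person"],
    "professional": ["person"],
    "person": ["animate"],
}
print(intersect("lawyer", "client", links))  # -> person
```

The interpretation of "lawyer" and "client" together is found directly in memory, without building a separate parse structure.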
Although TLC was the first DMA system, DMA had not been explored as a model of parsing until the DMAP0 system of Riesbeck&Martin, except as a scheme for disambiguations.</Paragraph> <Paragraph position="6"> DMAP0 used a guided marker-passing algorithm to avoid the problem of an explosion of search paths, from which a dumb3 (not guided) marker passing mechanism inherently suffers.</Paragraph> <Paragraph position="8"> DMAP0 employed P-markers (Prediction markers) and A-markers (Activation markers) as markers passed around in memory, adopting the notion of concept sequence to represent linear ordering of concepts as linguistic knowledge, which guides linear predictions of concepts sending P-markers in memory.</Paragraph> <Paragraph position="9"> 2Such as Fahlman\[1979\], Hirst&Charniak\[1982\], Charniak\[1983\], Hahn&Reimer\[1983\], Hirst\[1984\], Charniak\[1986\], Norvig\[1987\], and connectionist and distributed parallel models including Small, et al\[1982\], Granger&Eiselt\[1984\], Waltz&Pollack\[1984\], Waltz&Pollack\[1985\], Berg\[1987\], and Bookman\[1987\].</Paragraph> <Paragraph position="10"> 3We call it 'dumb' when markers are passed everywhere (through all links) from a node. In a 'guided' scheme, markers are passed through specific links only.</Paragraph> <Paragraph position="11"> Concept sequences, which encompass phrasal patterns, are attached to nodes in memory that represent some specific experiential memory structure. In DMAP0, A-markers are sent above in the abstraction hierarchy from the lexically activated node in memory, and a P-marker is sent to the next element of the concept sequence only after the A-marker from below collides with the element that is already P-marked. Concept refinement is performed using concept refinement links (ref-links) when a whole concept sequence is activated. Concept refinement locates the most specific node in memory, below the activated root node, which represents the specific instance of the input text.
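The guided A-/P-marker scheme just described can be illustrated with a minimal Python sketch (a toy stand-in for the Lisp systems; the hierarchy, lexicon, and concept sequence below are invented example data, not DMAP0's memory): an A-marker climbs the abstraction hierarchy from each lexically activated node, a P-marker sits on the predicted sequence element, and each collision advances the prediction; acceptance of a whole sequence activates its root concept.

```python
# Toy abstraction hierarchy (node -> parent). Invented example data.
ISA = {
    "milk": "beverage", "beverage": "food",
    "drank": "ingest-word",
    "John": "human",
}

SEQUENCES = {
    # root concept -> concept sequence attached to it (toy linguistic knowledge)
    "ingest-event": ("human", "ingest-word", "food"),
}

def a_marker_path(node):
    """Guided passing: A-markers travel only up isa links (the node itself included)."""
    path = [node]
    while path[-1] in ISA:
        path.append(ISA[path[-1]])
    return path

def recognize(words):
    """Advance a P-marker along each concept sequence on A-/P-marker collisions."""
    for root, seq in SEQUENCES.items():
        pos, fillers = 0, []          # P-marker sits on seq[pos]
        for word in words:
            if seq[pos] in a_marker_path(word):   # collision of A- and P-marker
                fillers.append(word)              # record the specific activation
                pos += 1
                if pos == len(seq):               # whole sequence accepted
                    return root, tuple(fillers)   # simplified concept refinement
    return None

print(recognize(["John", "drank", "milk"]))
# -> ('ingest-event', ('John', 'drank', 'milk'))
```

Because markers are passed only up isa links and along the predicted sequence, the search-path explosion of "dumb" marker passing does not arise.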
DMTRANS (Tomabechi\[1987a\]) evolved the DMA into a theory of cross-linguistic translations and added mechanisms of explanatory generation, C-Marker passing (for further con-</Paragraph> <Paragraph position="12"> textual disambiguations), and a revised scheme of concept refinement while performing English/Japanese translations.</Paragraph> <Paragraph position="13"> III. The DM-Command The DM-COMMAND system which we describe in this paper is a natural language interface developed for grammar, knowledge-base, and syntax/semantic mapping rule writers at CMT, which enables these researchers to access the MT sys-</Paragraph> <Paragraph position="14"> tem's internal functions for their development and debugging purposes. The DM-COMMAND parser borrows the basic algo-</Paragraph> <Paragraph position="15"> rithm from the DMTRANS machine translation system, which performs recognition of input via the guided spreading activa-</Paragraph> <Paragraph position="16"> tion marker-passing of A-markers, P-markers and C-markers4 in memory.</Paragraph> <Paragraph position="17"> As a brief example, let us consider the processing of the input command &quot;show me *HAVE-A-PAIN&quot;, where *HAVE-A-PAIN is an actual name of a concept definition in our frame system (FRAMEKIT; Nyberg\[1988\]). Independent of the semantic network of domain knowledge used by the MT sys-</Paragraph> <Paragraph position="18"> tem, the DM-COMMAND has a separate memory network representing concepts involved in performing various actions in the MT system. Among such concepts is the concept 'show-frame', which represents the action of pretty-printing FRAMEKIT definitions stored as domain knowledge. This concept has the concept sequence <mtrans-word person *CONCEPT> attached to it.
This concept sequence predicts that the first input word may point to an instance of 'mtrans-word' (such as 'show'), followed by an instance of person, followed by some concept in the form of a FRAMEKIT name. When the first input word &quot;Show&quot; comes in, it activates (puts an A-marker on) the lexical node 'show', which in turn sends ac-4C-markers (Contextual markers) were introduced in DMTRANS, and are propagated to mark contextually highlighted concepts in memory. DMTRANS used C-markers for word-sense disambiguations through contextual marking. DMTRANS also added an explanatory generation mechanism which generates sentences in the target language for concepts that did not have a lexical entry in the target language, by explaining the concept in that target language.</Paragraph> <Paragraph position="19"> tivation (an A-marker) above in the abstraction hierarchy and hits 'mtrans-word'. At the very beginning of parsing, all the first elements of concept sequences are predicted (P-marked); therefore, when an A-marker is sent from &quot;show&quot;, the A-marker and P-marker collide at 'mtrans-word'. When this collision of two markers happens, the P-marker is sent to the next element of the concept sequence, which is 'person'. Then &quot;me&quot; comes in and activates 'person' (an A-marker is sent above in the abstraction hierarchy). Since 'person' was P-marked at the previous marker collision at 'mtrans-word', another collision occurs here.
Therefore, a P-marker is again sent to the next element of the concept sequence, which is '*CONCEPT'. Finally, &quot;*HAVE-A-PAIN&quot; comes in. Now, the spreading activation occurs not in the command memory network, but in the domain knowledge network (doctor/patient dialog domain), activating '*HAVE-A-PAIN' initially5 and then activating the concepts above it (e.g., '*HAVE-A-SYMPTOM') until the activation hits the concept '*CONCEPT' which was P-marked at the previous collision. Since it is the final element of the concept sequence <mtrans-word person *CONCEPT>, this concept sequence is accepted when this collision of A-marker and P-marker happens. When a whole concept sequence is accepted, we activate the root node for the sequence, which in this case is the concept 'show-frame'. Also, in addition to activating this concept, we perform concept refinement6, which searches for a specific node in the command network that represents our input sentence. Since it does not exist in this first parse, DM-COMMAND creates that concept7. This newly created concept is an instance of 'mtrans-frame', and its object slot is now filled not by generic '*CONCEPT' but instead by '*HAVE-A-PAIN', specific to our input sentence. This final concept-refined concept is the result of the parse8. 5One thing to note here is that the concept '*HAVE-A-PAIN' that is activated by the input &quot;*HAVE-A-PAIN&quot; is not part of the memory network for DM-COMMAND's MT system commanding concepts; instead it is a memory unit that is a part of the MT system's domain knowledge. In other words, '*HAVE-A-PAIN' belongs to a different memory network from 'show-frame', 'mtrans-word', and 'person'. This does not cause a problem for DM-COMMAND, and actually, it can utilize any number of independent semantic networks simultaneously, as long as concept sequences guide passing of
P-markers from one network to another. For example, the '*PERSON' in the domain knowledge semantic network represents some generic person, whereas 'person' in the DM-COMMAND command knowledge network represents persons involved in the use of the DM-COMMAND system. 6Lytinen\[1984\] has a discussion of 'concept-refinement' with his MOPTRANS parser.</Paragraph> <Paragraph position="20"> 7In DMTRANS, when such creation of concepts occurred, the user was asked to provide the vocabulary, and this served as a model for vocabulary acquisition as well as concept creation. In DM-COMMAND, we randomly generate names for such newly created instances, and the user does not supply names for the newly created concepts.</Paragraph> <Paragraph position="21"> 8Actual inputs to DM-COMMAND are normally much longer and accompany multiple concept sequences; however, the basic mechanism for recognition of input is as explained here. Also, DM-COMMAND handles pretty-printing (FRAMEKIT's function for pretty-printing a frame definition); the *HAVE-A-PAIN concept is a subclass of *CONCEPT and is the object of printing in our example input. IV. DMA and natural language interfaces Under DMA, natural language understanding is performed by identifying the input with the specific concept sequences that represent known concepts, and the performance of the natural language interface for triggering system functions is thus integrated into the memory search activity under the DMA paradigm; in this way, inference is integrated into natural language understanding. A. Processing DMA tries to recognize the input based on what it already knows as domain-specific knowledge in the area
of translation and the system's own implementation. When some action is requested, the interface must understand the request and respond according to what is requested, and therefore it is necessary to recognize and process the request and to trigger the system's internal functions appropriately. For example, a knowledge-base developer inputs &quot;show me all the mapping rules on *FLIP-DOWN-LEVER&quot; in order to debug some conceptual bug. DM-COMMAND also handles pronoun reference resolution, ellipses, and some types of anaphora (examples are included in the Appendix). Also, DM-COMMAND utilizes C-marker propagation to disambiguate some of the contextually difficult sentences. Tomabechi\[1987b\] gives a detailed description of this disambiguation method. What is essential in DMA for the interface is that the concept which represents the request for action is directly connected to the concept that represents the action that is requested. Likewise, the direct memory access recognition of a question means that the concept which is identified by the input is directly connected to the concept that represents the answer, as long as the system knows (or potentially knows) the answer. In other words, in the DMA model, recognition of a request for action is a triggering of the action requested and recognition of a question is knowing the answer (i.e., as soon as we understand the question, either we know the answer, or we know the inferences to be performed (or functions to be evaluated) to get the answer) as long as memory contains the action and the answer. To reiterate the literature on the DMA paradigm, in this model, memory is organized in the hierarchical network of concepts which are related by links that define the concepts.
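The direct connection between a recognized concept and its action (or answer) can be sketched as follows (a hypothetical Python illustration, not the CMT implementation; the node names, the toy knowledge base, and the attached functions are all invented for this sketch): recognizing a command concept simply evaluates the function stored at that node, and recognizing a question concept follows a direct link to its answer.

```python
# Toy domain knowledge base (invented): frame name -> definition
KB = {"*HAVE-A-PAIN": {"isa": "*HAVE-A-SYMPTOM", "slots": {"body-part": "head"}}}

MEMORY = {
    # command/question concepts are directly connected to their actions/answers;
    # no separate inferential module sits between recognition and response
    "show-frame": lambda f: f"{f}: {KB[f]['slots']}",  # action: pretty-print a frame
    "what-is":    lambda f: KB[f]["isa"],              # question node linked to answer
}

def recognize_and_trigger(command_concept, argument):
    """Recognition of a request for action IS a triggering of that action."""
    action = MEMORY[command_concept]
    return action(argument)

print(recognize_and_trigger("show-frame", "*HAVE-A-PAIN"))
print(recognize_and_trigger("what-is", "*HAVE-A-PAIN"))   # -> *HAVE-A-SYMPTOM
```

As soon as the input is identified with a concept in memory, the answer (or action) is already reachable over a direct link.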
Thus, as soon as we identify the input with a certain concept in the memory, we can trigger the action (if this is a concept that represents some action (or request for action)), or answer the question (if the concept represents some knowledge (or request for some knowledge)). Thus, parsing and inference are integrated in the memory search process, and no separate inferential modules are necessary. It should be understood, however, that it is not our claim that we can eliminate inference altogether. Our claim is that 1) the memory search through concept refinement itself is an inference which is normally performed by separate inference modules (such as contextual inference and discourse analysis modules) in other parsing paradigms; and 2) whenever further inference is necessary, such inference can be directly triggered after concept refinement from the result of the parse (for example, as a daemon stored in the abstraction of the refined concept) and therefore, the inference is integrated in the memory activity.</Paragraph> <Paragraph position="22"> C. Ellipsis and anaphora In a practical natural language interface, the capacity to handle elliptic and anaphoric expressions is important. DM-COMMAND is capable of handling these phenomena, because under the DMA paradigm (which is typically called the &quot;recognize-and-record&quot; paradigm), the result of each parse is not lost after each sentence, but instead remains as part of the contextual knowledge in the memory network. On the other hand, in the traditional parsing paradigm (we call it the &quot;build-and-store&quot; paradigm), since the result of the parse is lost after each sentence, the parsers can at best handle indexicality within a sentence.
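The recognize-and-record idea — each parse leaving its instantiated concepts in the persistent memory network — can be sketched like this (a simplified Python illustration of the ellipsis recovery discussed here; the real system records FRAMEKIT memory units, and the instance representation below is an assumption):

```python
class Memory:
    """Recognize-and-record: parse results persist across sentences."""
    def __init__(self):
        self.instances = []            # survives from one parse to the next

    def record(self, inst):
        self.instances.append(inst)
        return inst

    def last_of(self, concept_type):
        """Most recently recorded instance of a given concept type."""
        for inst in reversed(self.instances):
            if inst["type"] == concept_type:
                return inst
        return None

mem = Memory()
# Sentence 1: "jgr92.gra o uchidase" (Print jgr92.gra) -- recorded, not discarded
mem.record({"type": "print-command", "object": "jgr92.gra"})
# Sentence 2: "sem.tst mo" (sem.tst also) -- the command element is elided;
# it is recoverable because the previous parse still sits in memory
prior = mem.last_of("print-command")
resolved = mem.record(dict(prior, object="sem.tst"))
print(resolved)  # -> {'type': 'print-command', 'object': 'sem.tst'}
```

Under a build-and-store regime the first instance would have been discarded after the parse, and the elided command element of the second sentence would be unrecoverable.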
Specifically, 1) ellipses are handled by DM-COMMAND, since ellipses are characterized as the lack of elements in a concept sequence, and these are recoverable as long as the elements or their descendants had been activated in previous parses14; 2) anaphoric and pronoun references are 14For example, with the input &quot;jgr92.gra o uchidase, sem.tst mo.&quot; (Print jgr92.gra. Sem.tst also), the second sentence has the object dropped; however,</Paragraph> <Paragraph position="23"> resolved by utilization of both semantic knowledge (represented as restrictions on possible types of resolutions) and also by the context left from the previous parses in memory, similar to the way that the elliptic expressions are handled. Finding a contextually salient NP corresponding to some NP means, in DMA, searching for a concept in memory which was previously activated and can be contextually substituted for the currently active concept sequence15. D. DMA and syntax One weakness of current implementations of the DMA paradigm is that the concept sequence is the sole syntactic knowledge for parsing16. Therefore, a DMA system needs deliberate preparation of concept sequences to handle syntactically complex sentences (such as deeply embedded clauses, small clauses, many types of sentential adjuncts, etc.). This does not mean that it is incapable of handling syntactically complex sentences; instead it means that concept sequences at some level of abstraction (at the syntactic template level down to the phrasal lexicon (Becker\[1975\]) level) must be prepared for each type of complex sentence. In other words, although such sentences can be handled by the combination of concept sequences, designing such sequences can be complex and less general than using external syntactic knowledge17. Thus, current reliance upon a linear sequence of concepts causes limitations on the types of sentences that can be realistically handled in DM-COMMAND.
Of course, there is nothing to prevent the DMA paradigm from integrating syntactic knowledge other than a linear sequence of concepts. Actually, we have already implemented two alternative schemes for integrating phrase-structure rules into DMA. One method we used was having syntactic nodes as part of the memory and writing phrase-structure rules as concept sequences18. Another method was to integrate the DMA memory activity into an augmented context-free grammar unification in a generalized LR parsing.</Paragraph> <Paragraph position="24"> The second method, used in continuous speech understanding, is described in Tomabechi&Tomita\[ms\]. We will not discuss these schemes in this paper.</Paragraph> <Paragraph position="25"> While handling syntactically complex sentences is rather expensive for DM-COMMAND, since it relies solely on linear concept sequences, natural language interfaces are one appli- this can be supplied since the memory activity after the first sentence is not lost and the memory can supply the missing object.</Paragraph> <Paragraph position="26"> 15For example, in &quot;Pretty-print dm.lisp. Send it to mt@.nr&quot;, &quot;it&quot; can be identified with the concept in memory that represents dm.lisp, which was activated in memory during the understanding of the first sentence.</Paragraph> <Paragraph position="27"> 16Although generation is normally helped by external syntactic knowledge, such as in the case of DMTRANS.</Paragraph> <Paragraph position="28"> 17Also, pronoun and anaphora resolution is based upon contextual knowledge alone; however, use of syntactic knowledge (such as the governing category of an anaphor) would help such efforts.</Paragraph> <Paragraph position="29"> cation area where the capacity to handle phenomena such as ellipsis, anaphora, pronoun resolution, and contextual disambiguation is more valuable than handling syntactically complex sentences.
It seems that DMA is one ideal paradigm in this area. This is evident if we consider the fact that input to a natural language interface is normally in the form of a dialog, and users tend to input short, elliptic, ambiguous and even ungrammatical sentences to the interface. Our experience shows that the increase in the size and complexity of the system required in order to integrate full syntactic processing, enhancing the DMA's capacity to handle syntactically complex sentences, has so far outweighed the need for such capacity19.</Paragraph> <Paragraph position="30"> E. Multiple semantic networks and portability DM-COMMAND utilizes two types of semantic networks. One is the semantic network that is developed under the MT system as domain knowledge that DM-COMMAND utilizes. The other is the network of memory which is unique to DM-COMMAND.</Paragraph> <Paragraph position="31"> This memory represents a hierarchy of concepts involved in the commanding and question-answering necessary for the development of machine translation systems. This memory network is written with generic concepts for development of MT systems, so that this memory we have developed at CMT should be portable to other systems20.</Paragraph> <Paragraph position="32"> The control mechanism (i.e., the spreading activation guided marker-passing algorithm) and the actual functions for performing actions are separate (actual functions are integrated into the DM-COMMAND memory network).
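This separation can be sketched as follows (an illustrative Python sketch under assumed, invented names; the real control mechanism is the guided marker-passing recognizer, abbreviated here to a lookup): the control loop contains no system-specific code, and porting only means swapping the memory network whose nodes carry the action functions.

```python
def control(memory, recognized_node, *args):
    """Generic control mechanism: no system-specific functions live here."""
    node = memory[recognized_node]
    # the action is stored in the memory node itself and is triggered directly
    return node["action"](*args) if "action" in node else node

# Porting to another target system means replacing only the memory network:
mt_memory = {
    "show-frame": {"action": lambda f: f"pretty-printing {f}"},
}
other_memory = {
    "show-frame": {"action": lambda f: f"displaying {f} differently"},
}

print(control(mt_memory, "show-frame", "*HAVE-A-PAIN"))
print(control(other_memory, "show-frame", "*HAVE-A-PAIN"))
```

The same `control` function serves both networks unchanged, which is the sense in which virtually no change in the control mechanism is needed when transporting the interface.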
This separation makes the system highly portable, first because virtually no change is necessary in the control mechanism for transporting to other systems, and second because the size of the whole system can be trimmed or expanded according to the machine's available virtual memory space simply by changing the size of the DM-COMMAND memory network21.</Paragraph> <Paragraph position="33"> Thus, under DMA, a natural language interface can 1) directly spread markers on the target system's already existing semantic network22, utilizing the existing knowledge 19Although, we have seen that it is effective in parsing noisy continuous speech input (Tomabechi&Tomita\[ms\]).</Paragraph> <Paragraph position="34"> 20Of course, we will need to change the specific functions that are stored in some of the nodes, and perhaps some of the specific (lower in the hierarchy) concepts need to be modified for each specific system.</Paragraph> <Paragraph position="35"> 21If only a basic command natural language interface is required, then we can trim the parts of memory used for the advanced interface and question-answering. On the other hand, if the machine's memory is of no concern, we can write memory-nets and concept-sequences for all the system functions of the target MT system. Also, note that due to the spreading activation guided marker-passing algorithm of the DM-COMMAND recognizer, the speed of the system is minimally affected by an increase in the size of the memory for commanding and question-answering. It is because spreading activation is local to each concept and its packaged nodes under guided marker-passing that even if the size of the whole memory network increased, the amount of computation for each concept should not increase accordingly.</Paragraph> <Paragraph position="36"> 22As long as the semantic nets are implemented in a general frame language or object-oriented system.</Paragraph> <Paragraph position="37"> for understanding input texts;
2) utilize a command and query conceptual network developed elsewhere (such as DM-COMMAND), with minimum modifications in the functions stored in the root nodes that trigger the actions; 3) be ported to different systems with virtually no change in the control mechanism, since it is a guided spreading activation marker-passing mechanism and no system-specific functions are included (those functions are included in the command/query semantic net).</Paragraph> <Paragraph position="38"> V. Conclusion DM-COMMAND is the first practical application of the DMA paradigm of natural language understanding, in which parsing and memory-based inference are integrated. This system has been proven to be highly effective in knowledge-based MT development. It is due to the complexity of system implementations in a large-scale MT project that grammar and knowledge-base writers are not expected to have expertise on the internals of the translation system, whereas it is necessary for such a group of project members to access the system's internal functions. DM-COMMAND makes this access possible through a natural language command and question answering interface. Since DM-COMMAND uses the spreading activation guided marker-passing algorithm in a memory access parser which directly accesses the MT system's already existing network of concepts, inference is integrated into memory activity. Since there is a separate memory network for concepts representing commanding and question-answering that are generic to MT system development, the system is highly portable.
The DM-COMMAND system demonstrates the power of the direct memory access paradigm as a model for a natural language interface, since understanding in this model is a recognition of the input sentence with the existing knowledge in memory, and as soon as such understanding is done, the desired command can be directly triggered (or the question directly answered).</Paragraph> <Paragraph position="39"> With DMA's ability to handle extra-sentential phenomena (including ellipsis, anaphora, pronoun reference, and word-sense ambiguity), which are typical in practical natural language command/query inputs, DMA is one ideal paradigm for natural language interfaces, as shown in our DM-COMMAND system. Also, DMA's integration of parsing and inference into a unified semantic memory search has proven to be highly effective in this application.</Paragraph> <Paragraph position="40"> Appendix: Implementation The DM-COMMAND system has been implemented on the IBM-RT23 and HP9000 AI workstations, both running 23Due to the space limitation, the actual sample output of the system is not included in this proceedings paper. The technical report from CMU- CommonLisp. The system directly utilizes the FRAMEKIT-represented domain knowledge (currently in the areas of computer manuals and doctor/patient conversations) of the CMU-
MT knowledge-based large-scale machine translation system. It handles inputs in both English and Japanese. The current size of the DM-COMMAND system is roughly 5,000 lines of Lisp code (this does not include the MT system functions and the FRAMEKIT frame system, parts of which must also be loaded into memory) and is not expected to increase, since the future variety in types of commands and questions that the system will handle will be integrated into the network of mem-</Paragraph> <Paragraph position="41"> ory that represents concepts for commanding and question-answering, and not into the system code itself24. Compiled code on IBM-RTs and HP9000s is fast enough that parsing and performing the commanded action happens virtually in real-time. We are expecting to increase the variety in types of system functions and grammar/rule development functions; however, as noted above, since such increases will occur in the memory network, as a system implementation, DM-COMMAND is a completed system.</Paragraph> <Paragraph position="42"> Acknowledgments The authors would like to thank members of the Center for Machine Translation for fruitful discussions. Eric Nyberg and Teruko Mitamura were especially helpful in preparing the final version of this paper. CMT under the same title contains the sample runs of DM-COMMAND on an IBM-RT running CMU-CommonLisp for development of the CMU-MT project's conceptual entity definitions and syntax/semantic mapping rules. The example input sentences in Japanese include some of the ellipsis handling in discourse that is typically problematic for natural language interfaces. The system also accepts English as the input language. Some of the input sentences are &quot;*have-a-pain no zenbu no mapping rule o misenasai&quot;; &quot;sono oya mo&quot;; &quot;koremade
no o zenbu misenasai&quot;; and &quot;sono shuturyoku o takeda san ni okure&quot;.</Paragraph> <Paragraph position="43"> 24One advantage of DM-COMMAND is that the whole system is only 5,000 lines long, and we need not load the whole MT system (which is quite large) for developing grammar and concept entity definitions and writing syntax/semantics mapping rules.</Paragraph> </Section> </Paper>