<?xml version="1.0" standalone="yes"?> <Paper uid="W93-0103"> <Title>Lexical Concept Acquisition From Collocation Map 1</Title> <Section position="5" start_page="27" end_page="29" type="concl"> <SectionTitle> 4 Conclusion </SectionTitle>
<Paragraph position="0"> We have introduced a representation for encoding lexical knowledge from which an arbitrary conditional probability can be computed, thereby rendering the acquisition of lexical concepts automatic.</Paragraph>
<Paragraph position="2"> [Figure caption: the index of tree in the Map is 23; di(19) is an index into the dictionary; ctr(92) says tree occurred 92 times; mi(19) indicates that the index of inch in the Map is 19; c(4) of shrub says shrub occurred 4 times in the back list.]</Paragraph>
<Paragraph position="3"> The representation, named the Collocation Map, is a variation of a Belief Net that uses a sigmoid function to sum the conditioning evidence. The dependency is not as strong as that of an ordinary Belief Net; rather, it is one of event occurrence.</Paragraph>
<Paragraph position="4"> The potential power of the Collocation Map will be fully appreciated only when its computational overhead is further reduced. Several options for alleviating the computational burden are being studied along two lines: one is a parallel algorithm for Gibbs sampling, and the other is to localize or optimize the sampling itself. A preliminary test on a Map built from 100 texts shows a promising outlook, and we are currently conducting a large-scale test on 75,000 Korean text units (a two-million-word corpus) and the Penn Treebank. The aims of the test include the accuracy of the modified sampling, sampling cost versus accuracy, comparison with a Boltzmann machine implementation of the Collocation Map, lexical concept acquisition, thesaurus construction, and sense disambiguation problems such as PP attachment and homonym resolution.</Paragraph>
</Section></Paper>