<?xml version="1.0" standalone="yes"?>
<Paper uid="N06-1043">
  <Title>Cross-Entropy and Estimation of Probabilistic Context-Free Grammars</Title>
  <Section position="1" start_page="0" end_page="0" type="abstr">
    <SectionTitle>Abstract</SectionTitle>
    <Paragraph position="0">We investigate the problem of training probabilistic context-free grammars on the basis of a distribution defined over an infinite set of trees, by minimizing the cross-entropy. This problem can be seen as a generalization of the well-known maximum likelihood estimator on (finite) treebanks. We prove an unexpected theoretical property of grammars trained in this way: the derivational entropy of the grammar takes the same value as the cross-entropy between the input distribution and the grammar itself. We show that this result also holds for the widely applied maximum likelihood estimator on treebanks.</Paragraph>
  </Section>
</Paper>
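The maximum likelihood estimator that the abstract generalizes is, for a finite treebank, the familiar relative-frequency estimator: each rule A -> alpha gets probability count(A -> alpha) / count(A). A minimal sketch of that estimator and of the cross-entropy between the empirical tree distribution and the resulting grammar is below; the toy treebank, rule names, and representation of a tree as the multiset of rules in its derivation are all illustrative assumptions, not taken from the paper.

```python
from collections import Counter
import math

# Toy treebank: each tree is represented by the sequence of CFG rules
# used in its (unique) derivation. Rule names are made up for illustration.
treebank = [
    ("S->NP VP", "NP->John", "VP->runs"),
    ("S->NP VP", "NP->Mary", "VP->runs"),
    ("S->NP VP", "NP->John", "VP->sleeps"),
]

# Maximum likelihood (relative frequency) estimation:
# P(A -> alpha) = count(A -> alpha) / total count of rules with LHS A.
rule_counts = Counter(r for tree in treebank for r in tree)
lhs_counts = Counter()
for rule, c in rule_counts.items():
    lhs_counts[rule.split("->")[0]] += c
prob = {r: c / lhs_counts[r.split("->")[0]] for r, c in rule_counts.items()}

def tree_logprob(tree):
    # A tree's probability is the product of its rule probabilities.
    return sum(math.log(prob[r]) for r in tree)

# Cross-entropy (in nats) between the empirical distribution over trees
# and the estimated grammar: the average negative log-probability that
# the grammar assigns to the treebank trees.
cross_entropy = -sum(tree_logprob(t) for t in treebank) / len(treebank)
```

On this toy treebank the estimator yields, e.g., P(NP->John) = 2/3, and `cross_entropy` is the quantity whose relation to the grammar's derivational entropy the paper's theorem addresses.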