
Neural generative rhetorical structure parsing

Published version
Peer-reviewed

Type

Conference Object

Change log

Authors

Mabona, A 
Rimell, L 
Clark, S 

Abstract

© 2019 Association for Computational Linguistics. Rhetorical structure trees have been shown to be useful for several document-level tasks including summarization and document classification. Previous approaches to RST parsing have used discriminative models; however, these are less sample efficient than generative models, and RST parsing datasets are typically small. In this paper, we present the first generative model for RST parsing. Our model is a document-level RNN grammar (RNNG) with a bottom-up traversal order. We show that, for our parser's traversal order, previous beam search algorithms for RNNGs have a left-branching bias which is ill-suited for RST parsing. We develop a novel beam search algorithm that keeps track of both structure- and word-generating actions without exhibiting this branching bias and results in absolute improvements of 6.8 and 2.9 on unlabelled and labelled F1 over previous algorithms. Overall, our generative model outperforms a discriminative model with the same features by 2.6 F1 points and achieves performance comparable to the state-of-the-art, outperforming all published parsers from a recent replication study that do not use additional training data.
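
As a rough illustration of the beam-search idea described in the abstract, the sketch below groups hypotheses by how many words and how many structure-building actions they have generated, so that neither action type crowds the other out of the beam. This is a minimal sketch under stated assumptions, not the paper's algorithm: all names (Hypothesis, beam_search, action_scores, toy_scores) and the scoring function are illustrative, and "completing a parse" is simplified to having generated all words.

```python
# Illustrative word-synchronous beam search over interleaved word-generating
# and structure-building actions. Hypotheses are bucketed by
# (words generated, structure actions taken); each bucket is pruned to the
# beam size before expansion, so hypotheses with few structure actions
# cannot dominate the beam (a rough stand-in for the bias fix the abstract
# describes). All identifiers here are assumptions for the sketch.
from dataclasses import dataclass, field
from typing import Callable, List, Sequence, Tuple
import heapq

WORD, STRUCT = "word", "struct"  # the two broad action types


@dataclass(order=True)
class Hypothesis:
    score: float                                   # log-probability of the action sequence
    actions: Tuple[str, ...] = field(compare=False, default=())

    @property
    def n_words(self) -> int:
        return sum(a == WORD for a in self.actions)

    @property
    def n_structs(self) -> int:
        return sum(a == STRUCT for a in self.actions)


def beam_search(action_scores: Callable[[Sequence[str]], List[Tuple[str, float]]],
                sentence_len: int, beam_size: int = 8,
                max_structs: int = 50) -> Hypothesis:
    """Search over interleaved word/structure actions, bucketed by
    (words generated, structure actions taken)."""
    buckets = {(0, 0): [Hypothesis(0.0)]}
    completed: List[Hypothesis] = []

    for n_words in range(sentence_len + 1):
        for n_structs in range(max_structs + 1):
            # Keep only the top beam_size hypotheses in this bucket.
            for hyp in heapq.nlargest(beam_size, buckets.pop((n_words, n_structs), [])):
                if hyp.n_words == sentence_len:
                    completed.append(hyp)          # simplification: all words generated
                    continue
                # Expand with every scored next action; each expansion moves the
                # hypothesis to a later bucket, so iteration order is consistent.
                for action, logp in action_scores(hyp.actions):
                    new = Hypothesis(hyp.score + logp, hyp.actions + (action,))
                    buckets.setdefault((new.n_words, new.n_structs), []).append(new)

    return max(completed, key=lambda h: h.score)


def toy_scores(history: Sequence[str]) -> List[Tuple[str, float]]:
    # Toy next-action scorer that mildly prefers alternating action types,
    # just to exercise the search; a real parser would use an RNNG here.
    last = history[-1] if history else None
    return [(WORD, -0.5 if last == STRUCT else -1.0),
            (STRUCT, -0.5 if last == WORD else -1.0)]


if __name__ == "__main__":
    best = beam_search(toy_scores, sentence_len=3)
    print(best.actions, best.score)
```

The bucketing by both counts is the point of the sketch: a beam keyed only on word count (as in earlier word-synchronous search for RNNGs) can let hypotheses that delay structure actions fill the beam, which is the kind of branching bias the abstract reports.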

Description

Keywords

cs.CL

Journal Title

Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing

Conference Name

The 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing

Journal ISSN

Volume Title

Publisher

Association for Computational Linguistics

Sponsorship

Engineering and Physical Sciences Research Council (EP/R021643/2)