Recent advances in RST discourse parsing have focused on two modeling paradigms: (a) high-order parsers, which jointly predict the tree structure of the discourse and the relations it encodes; and (b) linear-time parsers, which are efficient but mostly based on local features. In this work, we propose a linear-time parser with a novel neural-network-based representation of discourse constituents that takes global contextual information into account and can capture long-distance dependencies. Experimental results show that our parser obtains state-of-the-art performance on benchmark datasets while being efficient (with time complexity linear in the number of sentences in the document) and requiring minimal feature engineering.
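The abstract does not spell out the parsing algorithm, but a minimal sketch can illustrate why transition-based parsers of this kind run in linear time: a shift-reduce loop over a sequence of n elementary discourse units (EDUs) performs exactly 2n − 1 transitions. The toy action policy, the `"Elaboration"` placeholder relation, and the function name below are illustrative assumptions, not the paper's actual model, which would choose actions with a learned classifier over contextual representations.

```python
# Hedged sketch: a generic shift-reduce loop, one common way linear-time
# discourse parsers bound their work. Not the paper's actual method.

def shift_reduce_parse(edus):
    """Greedily build a binary discourse tree over a list of EDU strings.

    Each step either SHIFTs the next EDU onto the stack or REDUCEs the
    top two stack items into one constituent. A real parser would pick
    each action (and the relation label) with a trained scorer.
    """
    stack, queue = [], list(edus)
    steps = 0
    while queue or len(stack) > 1:
        steps += 1
        if queue:
            stack.append(queue.pop(0))  # SHIFT the next EDU
        else:
            right = stack.pop()
            left = stack.pop()
            # REDUCE with a placeholder relation label (illustrative only)
            stack.append(("Elaboration", left, right))
    return stack[0], steps

tree, steps = shift_reduce_parse(["e1", "e2", "e3", "e4"])
# n EDUs always take exactly 2n - 1 transitions (n shifts, n - 1 reduces),
# hence parsing time linear in the number of units.
print(steps)  # 7 for 4 EDUs
```

The toy policy here always shifts first and then reduces, yielding a right-branching tree; the linear transition count holds for any shift-reduce policy, since every EDU is shifted once and every reduce consumes one stack item.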
Title of host publication: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Place of publication: Copenhagen, Denmark
Publisher: Association for Computational Linguistics
Number of pages: 10
Publication status: Published - 1 Sep 2017
Event: EMNLP 2017: Conference on Empirical Methods in Natural Language Processing - Copenhagen, Denmark
Duration: 7 Sep 2017 → 11 Sep 2017
Conference: EMNLP 2017: Conference on Empirical Methods in Natural Language Processing
Abbreviated title: EMNLP 2017
Period: 7 Sep 2017 → 11 Sep 2017
Title: Learning Contextually Informed Representations for Linear-Time Discourse Parsing
Related project (finished): TransModal: Translating from Multiple Modalities into Text, 1 Sep 2016 → 31 Aug 2022