Please use this identifier to cite or link to this item: doi:10.22028/D291-42313
Title: Incorporating Distributions of Discourse Structure for Long Document Abstractive Summarization
Author(s): Pu, Dongqi
Wang, Yifan
Demberg, Vera
Editor(s): Rogers, Anna
Language: English
Title: The 61st Annual Meeting of the Association for Computational Linguistics : July 9-14, 2023 : ACL 2023 : Volume 1: Long Papers
Pages: 5574-5590
Publisher/Platform: ACL
Year of Publication: 2023
Place of publication: Stroudsburg, PA
Place of the conference: Toronto, Canada
DDC notations: 004 Computer science, internet
400 Language, linguistics
Publication type: Conference Paper
Abstract: For text summarization, discourse structure is pivotal in discerning the core content of a text. Regrettably, prior studies on incorporating Rhetorical Structure Theory (RST) into transformer-based summarization models consider only the nuclearity annotation, thereby overlooking the variety of discourse relation types. This paper introduces 'RSTformer', a novel summarization model that comprehensively incorporates both the types and the uncertainty of rhetorical relations. The proposed RST-attention mechanism, rooted in document-level rhetorical structure, extends the recently devised Longformer framework. Rigorous evaluation shows that the proposed model significantly outperforms state-of-the-art models, as evidenced by its notable performance on several automatic metrics and in human evaluation.
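Since this record carries no files, the following is only a minimal sketch of the general idea the abstract describes: folding a distribution over RST relation types into transformer attention as a learned bias. The class name, tensor shapes, the `rst_probs` input, and the choice of 18 relation classes are illustrative assumptions, not the paper's implementation; for clarity the sketch also uses dense self-attention rather than Longformer's sliding-window attention.

```python
# Sketch: self-attention whose logits receive an expected bias under a
# soft distribution over RST relation types between token pairs.
# Assumed, not from the paper: all names, shapes, and the bias parameterization.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RSTBiasedSelfAttention(nn.Module):
    def __init__(self, hidden_size: int, num_heads: int, num_relations: int):
        super().__init__()
        assert hidden_size % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.qkv = nn.Linear(hidden_size, 3 * hidden_size)
        self.out = nn.Linear(hidden_size, hidden_size)
        # One learned scalar per (relation type, head); the soft relation
        # probabilities are mixed into the attention logits through this table.
        self.relation_bias = nn.Parameter(torch.zeros(num_relations, num_heads))

    def forward(self, x: torch.Tensor, rst_probs: torch.Tensor) -> torch.Tensor:
        # x:         (batch, seq_len, hidden_size)
        # rst_probs: (batch, seq_len, seq_len, num_relations), a *distribution*
        #            over relation types per token pair, so parser uncertainty
        #            is retained instead of being collapsed to an argmax label.
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        scores = q @ k.transpose(-1, -2) / self.head_dim ** 0.5  # (b, h, n, n)
        # Expected relation bias: (b, n, n, r) x (r, h) -> (b, h, n, n)
        bias = torch.einsum('bijr,rh->bhij', rst_probs, self.relation_bias)
        attn = F.softmax(scores + bias, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)

# Toy usage with random inputs:
layer = RSTBiasedSelfAttention(hidden_size=64, num_heads=4, num_relations=18)
x = torch.randn(2, 16, 64)
rst = torch.softmax(torch.randn(2, 16, 16, 18), dim=-1)
y = layer(x, rst)  # (2, 16, 64)
```

Because `rst_probs` is a distribution rather than a one-hot label, the added term is the expected bias under the discourse parser's uncertainty, which mirrors the abstract's point about incorporating both the types and the uncertainty of rhetorical relations.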
Link to this record: urn:nbn:de:bsz:291--ds-423132
hdl:20.500.11880/37985
http://dx.doi.org/10.22028/D291-42313
ISBN: 978-1-959429-72-2
Date of registration: 1-Jul-2024
Faculty: MI - Fakultät für Mathematik und Informatik
Department: MI - Informatik
Professorship: MI - Prof. Dr. Vera Demberg
Collections: SciDok - Der Wissenschaftsserver der Universität des Saarlandes

Files for this record:
There are no files associated with this item.


Items in SciDok are protected by copyright, with all rights reserved, unless otherwise indicated.