Please use this identifier to cite or link to this item: doi:10.22028/D291-30793
Title: Using Explicit Discourse Connectives in Translation for Implicit Discourse Relation Classification
Author(s): Shi, Wei
Yung, Frances Pikyu
Rubino, Raphael
Demberg, Vera
Editor(s): Kondrak, Greg
Watanabe, Taro
Language: English
Title of the Proceedings: Proceedings of the Eighth International Joint Conference on Natural Language Processing
Startpage: 484
Endpage: 495
Publisher/Platform: Asian Federation of Natural Language Processing
Year of Publication: 2017
Title of the Conference: IJCNLP 2017
Place of the conference: Taipei, Taiwan
Publication type: Conference Paper
Abstract: Implicit discourse relation recognition is an extremely challenging task due to the lack of indicative connectives. Various neural network architectures have been proposed for this task recently, but most of them suffer from a shortage of labeled data. In this paper, we address this problem by procuring additional training data from parallel corpora: when humans translate a text, they sometimes add connectives (a process known as explicitation). We automatically back-translate each such explicitated connective into an English connective and use it to infer a relation label with high confidence. We show that a training set several times larger than the original one can be generated this way. With the extra labeled instances, even a simple bidirectional Long Short-Term Memory network can outperform the current state of the art.
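The labeling idea described in the abstract can be sketched in a few lines: when a human translation contains a connective that is absent from the English source (explicitation), map it back to an English connective and read off a discourse relation label. The following is a minimal illustrative Python sketch; the tiny French–English connective dictionary and the connective-to-sense mapping are hypothetical stand-ins, not the resources used in the paper (which back-translates via parallel corpora at scale).

```python
# Illustrative sketch of explicitation-based labeling (not the paper's pipeline).

# Tiny French -> English connective dictionary (hypothetical example entries).
FR_EN_CONNECTIVES = {
    "donc": "therefore",
    "mais": "but",
    "parce que": "because",
}

# Hypothetical mapping from English connectives to PDTB-style relation senses.
CONNECTIVE_TO_SENSE = {
    "therefore": "Contingency.Cause.Result",
    "but": "Comparison.Contrast",
    "because": "Contingency.Cause.Reason",
}

def infer_label(source_en: str, translation_fr: str):
    """If the translation contains a connective that is absent from the
    English source (explicitation), back-translate it and return a
    relation label; otherwise return None."""
    fr = translation_fr.lower()
    en = source_en.lower()
    for fr_conn, en_conn in FR_EN_CONNECTIVES.items():
        if fr_conn in fr and en_conn not in en:
            return CONNECTIVE_TO_SENSE.get(en_conn)
    return None  # no explicitated connective found

# The translator inserted "donc" even though the English source has no
# explicit connective, so a Cause.Result label can be inferred.
label = infer_label(
    "It rained. The match was cancelled.",
    "Il a plu. Le match a donc été annulé.",
)
```

In the paper itself, such automatically labeled pairs are harvested at corpus scale and added to the training data for the neural classifier; this sketch only shows the per-sentence labeling step.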
URL of the first publication: https://www.aclweb.org/anthology/I17-1049/
Link to this record: hdl:20.500.11880/29714
http://dx.doi.org/10.22028/D291-30793
ISBN: 978-1-948087-00-1
Date of registration: 23-Sep-2020
Notes: Volume 1: Long Papers
Faculty: MI - Fakultät für Mathematik und Informatik
Department: MI - Informatik
Professorship: MI - Prof. Dr. Vera Demberg
Collections: UniBib – Die Universitätsbibliographie

Files for this record:
There are no files associated with this item.


Items in SciDok are protected by copyright, with all rights reserved, unless otherwise indicated.