Please use this reference to cite or link to this resource: doi:10.22028/D291-38851
Title: Design Choices in Crowdsourcing Discourse Relation Annotations: The Effect of Worker Selection and Training
Author(s): Scholman, Merel Cleo Johanna
Pyatkin, Valentina
Yung, Frances Pikyu
Dagan, Ido
Tsarfaty, Reut
Demberg, Vera
Editor(s): Calzolari, Nicoletta
Language: English
Published in: Language Resources and Evaluation Conference, LREC 2022, 20-25 June 2022 : Palais du Pharo, Marseille, France : conference proceedings
Pages: 2148-2156
Publisher/Platform: European Language Resources Association
Year of publication: 2022
Place of publication: Paris
Conference location: Marseille, France
Free keywords: discourse annotations
crowdsourcing
training
participant selection
DDC subject group: 004 Computer science
400 Language, linguistics
Document type: Conference paper (contribution published in a conference volume / InProceedings)
Abstract: Obtaining linguistic annotation from novice crowdworkers is far from trivial. A case in point is the annotation of discourse relations, which is a complicated task. Recent methods have obtained promising results by extracting relation labels from either discourse connectives (DCs) or question-answer (QA) pairs that participants provide. The current contribution studies the effect of worker selection and training on the agreement between workers' implicit relation labels and gold labels, for both the DC and the QA method. In Study 1, workers were not specifically selected or trained, and the results show that there is much room for improvement. Study 2 shows that a combination of selection and training does lead to improved results, but the method is cost- and time-intensive. Study 3 shows that a selection-only approach is a viable alternative; it results in annotations of quality comparable to those from trained participants. The results generalize over both the DC and QA method and therefore indicate that a selection-only approach could also be effective for other crowdsourced discourse annotation tasks.
URL of first publication: https://aclanthology.org/2022.lrec-1.231/
Link to this record: urn:nbn:de:bsz:291--ds-388514
hdl:20.500.11880/35058
http://dx.doi.org/10.22028/D291-38851
ISBN: 979-10-95546-72-6
Date of record entry: 31-Jan-2023
Faculty: MI - Faculty of Mathematics and Computer Science
Department: MI - Computer Science
Professorship: MI - Prof. Dr. Vera Demberg
Collection: SciDok - The Science Server of Saarland University

Files for this record:
There are no files associated with this resource.


All resources in this repository are protected by copyright.