Please use this identifier to cite or link to this item: doi:10.22028/D291-30966
Title: A Hybrid Model for Globally Coherent Story Generation
Author(s): Zhai, Fangzhou
Demberg, Vera
Shkadzko, Pavel
Shi, Wei
Sayeed, Asad
Editor(s): Ferraro, Francis
Language: English
Title: Storytelling - proceedings of the second workshop
Startpage: 34
Endpage: 45
Publisher/Platform: ACL
Year of Publication: 2019
Place of publication: Stroudsburg, PA
Title of the Conference: Storytelling Workshop 2019
Place of the conference: Florence, Italy
Publication type: Conference Paper
Abstract: Automatically generating globally coherent stories is a challenging problem. Neural text generation models have been shown to perform well at generating fluent sentences from data, but they usually fail to keep track of the overall coherence of the story after a couple of sentences. Existing work that incorporates a text planning module has succeeded in generating recipes and dialogues, but appears quite data-demanding. We propose a novel story generation approach that generates globally coherent stories from a fairly small corpus. The model exploits a symbolic text planning module to produce text plans, thus reducing the demand for data; a neural surface realization module then generates fluent text conditioned on the text plan. Human evaluation showed that our model outperforms various baselines by a wide margin and generates stories that are fluent as well as globally coherent.
DOI of the first publication: 10.18653/v1/W19-3404
URL of the first publication: https://www.aclweb.org/anthology/W19-3404/
Link to this record: hdl:20.500.11880/29737
http://dx.doi.org/10.22028/D291-30966
ISBN: 978-1-950737-44-4
Date of registration: 24-Sep-2020
Faculty: MI - Fakultät für Mathematik und Informatik
Department: MI - Informatik
Professorship: MI - Prof. Dr. Vera Demberg
Collections: UniBib – Die Universitätsbibliographie

Files for this record:
There are no files associated with this item.


Items in SciDok are protected by copyright, with all rights reserved, unless otherwise indicated.