Please use this identifier to cite or link to this item: doi:10.22028/D291-25310
Title: Syntactic-prosodic labeling of large spontaneous speech data-bases
Author(s): Batliner, Anton
Kießling, Andreas
Kompe, Ralf
Niemann, Heinrich
Nöth, Elmar
Language: English
Year of Publication: 1996
SWD key words: Künstliche Intelligenz
Free key words: artificial intelligence
DDC notations: 004 Computer science, internet
Publication type: Report
Abstract: In automatic speech understanding, segmenting continuously running speech into syntactic chunks is a major problem. Syntactic boundaries are often marked by prosodic means, and large databases are necessary for training statistical models of prosodic boundaries. For the German Verbmobil project (automatic speech-to-speech translation), we developed a syntactic-prosodic labeling scheme in which two main types of boundaries (major syntactic boundaries and syntactically ambiguous boundaries), along with a few other special boundary types, are labeled for a large Verbmobil spontaneous speech corpus. We compare the results of classifiers (multilayer perceptrons and language models) trained on these syntactic-prosodic boundary labels with classifiers trained on perceptual-prosodic and purely syntactic labels. The main advantage of the rough syntactic-prosodic labels presented in this paper is that large amounts of data can be labeled within a short time. As a result, the classifiers trained on these labels turned out to be superior (recognition rates of up to 96%).
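
To make the classification setup in the abstract concrete, the sketch below shows a multilayer perceptron trained to classify word boundaries as prosodically marked or not from prosodic feature vectors. It is a minimal illustration under stated assumptions, not the report's actual system: the feature set, data, network size, and hyperparameters are all hypothetical stand-ins, and random data is used in place of the Verbmobil corpus.

```python
# Minimal sketch (assumptions, not the report's setup): an MLP boundary
# classifier over prosodic features, in the spirit of the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical prosodic feature vector per word-final position, e.g.
# pause duration, normalized syllable duration, F0 offset and reset,
# energy slope -- here just random stand-ins for real measurements.
n_samples, n_features = 2000, 6
X = rng.normal(size=(n_samples, n_features))

# Hypothetical labels: 1 = major syntactic boundary, 0 = no boundary.
y = rng.integers(0, 2, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Network size and training length are illustrative choices only.
clf = MLPClassifier(hidden_layer_sizes=(40, 20), max_iter=300, random_state=0)
clf.fit(X_train, y_train)

print(f"boundary recognition rate: {clf.score(X_test, y_test):.2%}")
```

With random features and labels this of course yields chance-level accuracy; the point is only the shape of the pipeline (feature vectors per candidate boundary, binary boundary labels, a small MLP). The report's second classifier type, an n-gram language model over word/boundary sequences, would replace the acoustic-prosodic features with textual context.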
Link to this record: urn:nbn:de:bsz:291-scidok-53244
Series name: Vm-Report / Verbmobil, Verbundvorhaben, [Deutsches Forschungszentrum für Künstliche Intelligenz]
Series volume: 131
Date of registration: 13-Jun-2013
Faculty: SE - Sonstige Einrichtungen
Department: SE - DFKI Deutsches Forschungszentrum für Künstliche Intelligenz
Collections: SciDok - Der Wissenschaftsserver der Universität des Saarlandes

Files for this record:
File: report_131_96.pdf (176,66 kB, Adobe PDF)

Items in SciDok are protected by copyright, with all rights reserved, unless otherwise indicated.