Title: Improving parsing of spontaneous speech with the help of prosodic boundaries
Author: Block, H. U.
Year of publication: 1997
SWD key words: Künstliche Intelligenz
Free key words: artificial intelligence
DDC notation: 004 Computer science, Internet
Abstract: Parsing in automatic speech understanding can be improved if prosodic boundary marking is taken into account, because syntactic boundaries are often marked by prosodic means. Since large databases are needed to train statistical models for prosodic boundaries, we developed a labeling scheme for syntactic-prosodic boundaries within the German VERBMOBIL project (automatic speech-to-speech translation). We compare the results of classifiers (multi-layer perceptrons and language models) trained on these syntactic-prosodic boundary labels with classifiers trained on perceptual-prosodic and purely syntactic labels. Recognition rates of up to 96% were achieved. The turns we need to parse consist of 20 words on average and frequently contain sequences of partial sentence equivalents due to restarts, ellipses, etc. For this material, the boundary scores computed by our classifiers can be integrated successfully into the syntactic parsing of word graphs; currently, they reduce the parse time by 92% and the number of parse trees by 96%. This is achieved by introducing a special Prosodic Syntactic Clause Boundary symbol (PSCB) into our grammar and guiding the search for the best word chain with the prosodic boundary scores.
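The PSCB mechanism the abstract describes can be illustrated with a minimal sketch. This is not the VERBMOBIL implementation; the function name, the threshold, and the example scores are hypothetical. The idea shown is only the final step: given per-position boundary scores from a classifier, a PSCB symbol is inserted after each word whose boundary score is high enough, segmenting the turn into partial sentence equivalents before parsing.

```python
# Hedged sketch: inserting PSCB symbols from prosodic boundary scores.
# Scores, threshold, and the "<PSCB>" token spelling are illustrative only.

def insert_pscb(words, boundary_scores, threshold=0.5):
    """Insert a PSCB symbol after word i whenever the classifier's
    boundary score for that position exceeds the threshold."""
    assert len(boundary_scores) == len(words)
    out = []
    for word, score in zip(words, boundary_scores):
        out.append(word)
        if score > threshold:
            out.append("<PSCB>")
    return out

# A turn with restarts, segmented at high-scoring prosodic boundaries:
words = ["well", "let", "us", "meet", "on", "monday", "does", "that", "suit", "you"]
scores = [0.9, 0.1, 0.1, 0.2, 0.1, 0.8, 0.1, 0.1, 0.2, 0.7]
print(insert_pscb(words, scores))
```

In the paper's setting the scores do more than hard segmentation: they also guide the search for the best word chain through the word graph, which is where the reported parse-time and parse-tree reductions come from.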
Link to this record: urn:nbn:de:bsz:291-scidok-54848
Series name: Vm-Report / Verbmobil, Verbundvorhaben, [Deutsches Forschungszentrum für Künstliche Intelligenz]
Date of registration: 10-Sep-2013
Faculty: SE - Sonstige Einrichtungen
Department: SE - DFKI Deutsches Forschungszentrum für Künstliche Intelligenz
Collections: SciDok - Der Wissenschaftsserver der Universität des Saarlandes
Items in SciDok are protected by copyright, with all rights reserved, unless otherwise indicated.