82 | "What can I actually do? Which talents lie dormant in me?" The story of the many talents in each of us. A review of the children's book Vielleicht by Kobi Yamada
83 |
Inklusion mit Hilfe digitaler Medien – Die Märchenapp StoryToys‘ Schneewittchen und ihr Potenzial zur Leseförderung in heterogenen Lerngruppen
|
|
|
|
BASE
|
|
Show details
|
|
84 | Editorial: Perspectives of diversity research on literature, media, and their didactics
85 | Das schwarze Buch der Farben: sensitizing the senses for inclusive coexistence
86 | Gender diversity and transgender identity in children's and young adult literature: the graphic novel Nennt mich Nathan by Catherine Castro & Quentin Zuttion
87 | Saures: Backgrounds make the person: on backgrounds of, and insights into, identity constructions in online teaching
88 | Hero(ine), "model cripple", or "just" disabled? A systematic analysis of depictions of disability in a historical comparison of selected literary works
89 | Revolution, Fortschritt, Evolution [Revolution, Progress, Evolution]: on the conceptual history of the evolutionist semantics of history and future in German social democracy
Deus, Fabian. Universität Siegen, Germanistisches Seminar, 2021.
In: Siegen: universi - Universitätsverlag Siegen, 2021. ISBN 978-3-96182-116-7.
91 | Vorlesung and Hörsaal as symptomatic words for the history and present of the German language
92 | What do linguistic laypeople think about (German) grammar?
93 | Opening, closing, crossings: body images in contemporary German-language literature
94 | Sentence patterns as signs: the semantic roles of German in theory and practice
95 | Uzbek personal names: etymology, grammar, pragmatics
96 | FALKE: Experiences From Transdisciplinary Educational Research by Fourteen Disciplines
97 | German in Ukraine: history, present, and future potential
98 | Studies on the word geography of northeastern Bavaria: a language atlas and dialectometric study of the influence of confessional affiliation on the lexicon
99 | Using Deep Learning for Emotion Analysis of 18th and 19th Century German Plays

Abstract:
We present first results of the project "Emotions in Drama", in which we explore the annotation of emotions and the application of computational emotion analysis, predominantly deep-learning-based methods, in the context of historical German plays from around 1800. We performed a pilot annotation study with five plays, generating over 6,500 annotations for up to 13 sub-emotions structured in a hierarchical scheme. This emotion scheme includes common types like joy, anger, or hate, but also concepts that are specifically important for German literary criticism of this period, like friendship, compassion, or Schadenfreude. We evaluate the performance of various methods of emotion-based text sequence classification, including lexicon-based methods, traditional machine learning, fastText as a static word embedding, various transformer models based on BERT or ELECTRA architectures pretrained on contemporary language, transformer-based methods pretrained or fine-tuned for historical and/or poetic language, as well as the fine-tuning of BERT models on our own corpora and plays. We achieve state-of-the-art results for the hierarchical levels with two or three classes, i.e. the classification of valence (positive/negative). The best models are the transformer-based models gbert-large and gelectra-large by deepset, pretrained on large corpora of contemporary German, which achieve accuracy values of up to 83%. Lexicon-based methods, traditional machine learning, and static word embeddings are consistently outperformed by transformer-based models. Models trained on historical texts show small and inconsistent improvements. Performance drops significantly for settings with multiple sub-emotions (6 or 13 classes), where the models achieve 57% and 47% respectively, owing to the greater difficulty of the task and to class imbalances. We discuss how we intend to continue our annotations and how to improve the prediction results via various optimization techniques in future work.
Keywords: 004 Computer science; 400 Language; 430 German; 792 Theater; 800 Literature; 830 German literature; ddc:004; ddc:400; ddc:430; ddc:792; ddc:800; ddc:830; Linguistics; Literary studies; Rhetoric; Dance
URLs:
https://epub.uni-regensburg.de/50827/
https://epub.uni-regensburg.de/50827/1/vDhd-Beitrag-Schmidt-UsingDeepLearning.pdf
https://epub.uni-regensburg.de/50827/7/vDhd-Beitrag-Schmidt-UsingDeepLearning-PublishedVersion_workingLinks.pdf
https://www.melusinapress.lu/read/10-26298-melusina-8f8w-y749-udlf/section/8d0fefff-384c-4798-b5d7-032809de2430
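
The abstract compares several method families, from lexicon-based baselines up to transformer models. As a rough illustration of the simplest of these families (not the project's actual pipeline), a minimal lexicon-based valence classifier can be sketched in Python; the tiny word lists here are invented placeholders, not the project's lexical resources:

```python
# Minimal lexicon-based valence (positive/negative) classifier,
# the weakest method family compared in the abstract.
# POSITIVE/NEGATIVE are illustrative placeholder lexicons.

POSITIVE = {"freude", "liebe", "freund", "glück", "hoffnung"}
NEGATIVE = {"zorn", "hass", "leid", "schmerz", "furcht"}

def classify_valence(text: str) -> str:
    """Count lexicon hits per class and return the majority valence."""
    tokens = [t.strip(".,;:!?") for t in text.lower().split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return "positive" if pos >= neg else "negative"

print(classify_valence("Welche Freude, welches Glück!"))  # positive
print(classify_valence("Zorn und Hass erfüllen ihn."))    # negative
```

Per the abstract, such lexicon baselines are consistently outperformed by fine-tuned transformer models (e.g. gbert-large at up to 83% valence accuracy), which classify from context rather than from isolated word lists.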
100 | German in Ukraine: history, present, and future potential