
Search in the Catalogues and Directories

Hits 1 – 20 of 79

1
Wissensrohstoff Text : Eine Einführung in das Text Mining
Biemann, Chris (author); Heyer, Gerhard (author). - 2nd, substantially revised edition, 2022. - Wiesbaden : Springer Fachmedien Wiesbaden GmbH, 2022
IDS Mannheim
2
Language Models Explain Word Reading Times Better Than Empirical Predictability ...
BASE
3
SCoT: Sense Clustering over Time: a tool for the analysis of lexical change ...
BASE
4
Language Models Explain Word Reading Times Better Than Empirical Predictability
In: Front Artif Intell (2022)
Abstract: Though there is a strong consensus that word length and frequency are the most important single-word features determining visual-orthographic access to the mental lexicon, there is less agreement about how best to capture syntactic and semantic factors. The traditional approach in cognitive reading research assumes that word predictability from sentence context is best captured by cloze completion probability (CCP) derived from human performance data. We review recent research suggesting that probabilistic language models provide deeper explanations for syntactic and semantic effects than CCP. We then compare CCP with three probabilistic language models for predicting word viewing times in an English and a German eye-tracking sample: (1) Symbolic n-gram models consolidate syntactic and semantic short-range relations by computing the probability that a word occurs given the two preceding words. (2) Topic models rely on subsymbolic representations to capture long-range semantic similarity through word co-occurrence counts in documents. (3) In recurrent neural networks (RNNs), the subsymbolic units are trained to predict the next word, given all preceding words in the sentence. To examine lexical retrieval, these models were used to predict single fixation durations and gaze durations, which capture rapidly successful and standard lexical access, and total viewing time, which captures late semantic integration. The linear item-level analyses showed greater correlations of all language models with all eye-movement measures than CCP. We then examined non-linear relations between the different types of predictability and the reading times using generalized additive models. N-gram and RNN probabilities of the present word predicted reading performance more consistently than topic models or CCP. For the effects of last-word probability on current-word viewing times, we obtained the best results with n-gram models. Such count-based models seem to best capture short-range access that is still underway when the eyes move on to the subsequent word. The prediction-trained RNN models, in contrast, better predicted early preprocessing of the next word. In sum, our results demonstrate that the different language models account for differential cognitive processes during reading. We discuss these algorithmically concrete blueprints of lexical consolidation as theoretically deep explanations for human reading.
Keyword: Artificial Intelligence
URL: https://doi.org/10.3389/frai.2021.730570
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8847793/
BASE
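The count-based n-gram idea in the abstract can be made concrete: a trigram model scores each word by the smoothed relative frequency of its two-word context. The following is a minimal Python sketch under illustrative assumptions; the toy corpus, function names, add-alpha smoothing, and vocabulary size are hypothetical choices, not details taken from the paper.

    from collections import defaultdict

    def train_trigram_counts(sentences):
        # Count (w1, w2) -> w3 continuations over tokenized sentences,
        # padding each sentence with start/end markers.
        counts = defaultdict(lambda: defaultdict(int))
        for tokens in sentences:
            padded = ["<s>", "<s>"] + tokens + ["</s>"]
            for w1, w2, w3 in zip(padded, padded[1:], padded[2:]):
                counts[(w1, w2)][w3] += 1
        return counts

    def trigram_prob(counts, w1, w2, w3, alpha=0.1, vocab_size=10_000):
        # Add-alpha smoothed estimate of P(w3 | w1, w2); alpha and
        # vocab_size are illustrative values, not the paper's settings.
        context = counts.get((w1, w2), {})
        total = sum(context.values())
        return (context.get(w3, 0) + alpha) / (total + alpha * vocab_size)

    # Toy corpus for illustration only.
    corpus = [
        ["the", "eyes", "move", "on", "to", "the", "next", "word"],
        ["the", "eyes", "fixate", "the", "word"],
    ]
    counts = train_trigram_counts(corpus)
    print(trigram_prob(counts, "the", "eyes", "move"))   # seen continuation, higher score
    print(trigram_prob(counts, "the", "eyes", "blink"))  # unseen, smoothed non-zero score

Because the score depends only on the two preceding tokens, such a model can encode exactly the short-range relations the abstract credits count-based models with capturing.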
5
Probing Pre-trained Language Models for Semantic Attributes and their Values ...
BASE
6
WebAnno-MM: EXMARaLDA meets WebAnno
Remus, Steffen [author]; Hedeland, Hanna [author]; Ferger, Anne [author]. - Mannheim : Leibniz-Institut für Deutsche Sprache (IDS), Library, 2020
DNB Subject Category Language
7
Comparison of Different Lexical Resources With Respect to the Tip-of-the-Tongue Problem
In: Journal of Cognitive Science 21(2), pp. 193-252. Institute for Cognitive Science, Seoul National University, 2020. ISSN: 1598-2327; EISSN: 1976-6939. DOI: 10.17791/jcs.2020.21.2.193. https://hal.archives-ouvertes.fr/hal-03168850
BASE
8
Introducing various Semantic Models for Amharic: Experimentation and Evaluation with multiple Tasks and Datasets ...
BASE
9
Word Sense Disambiguation for 158 Languages using Word Embeddings Only ...
BASE
10
Individual corpora predict fast memory retrieval during reading ...
BASE
11
Individual corpora predict fast memory retrieval during reading ...
BASE
12
Token-based spelling variant detection in Middle Low German texts [Journal]
Barteld, Fabian [author]; Biemann, Chris [author]; Zinsmeister, Heike [author]
DNB Subject Category Language
13
Unsupervised Induction of Domain Dependency Graphs - Extracting, Understanding and Visualizing Domain Knowledge
Kohail, Sarah [author]; Biemann, Chris [academic supervisor]. - Hamburg : Staats- und Universitätsbibliothek Hamburg, 2019
DNB Subject Category Language
14
Making Fast Graph-based Algorithms with Graph Metric Embeddings ...
BASE
15
On the Compositionality Prediction of Noun Phrases using Poincaré Embeddings ...
BASE
16
Every child should have parents: a taxonomy refinement algorithm based on hyperbolic term embeddings ...
BASE
17
Datasets for Watset: Local-Global Graph Clustering with Applications in Sense and Frame Induction ...
BASE
18
Datasets for Watset: Local-Global Graph Clustering with Applications in Sense and Frame Induction ...
BASE
19
Adaptive Approaches to Natural Language Processing in Annotation and Application
Yimam, Seid Muhie. - Staats- und Universitätsbibliothek Hamburg Carl von Ossietzky, 2019
BASE
20
HHMM at SemEval-2019 Task 2: Unsupervised frame induction using contextualized word embeddings
Arefyev, Nikolay; Panchenko, Alexander; Anwar, Saba. - Association for Computational Linguistics (ACL), 2019
BASE

