1. How Efficiency Shapes Human Language. In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)

2. A verb-frame frequency account of constraints on long-distance dependencies in English (2022)

3. Dependency locality as an explanatory principle for word order (2022)

4. When classifying grammatical role, BERT doesn't care about word order... except when it matters ...

5. Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...

6. Learning Constraints on Wh-Dependencies by Learning How to Efficiently Represent Wh-Dependencies: A Developmental Modeling Investigation With Fragment Grammars. In: Proceedings of the Society for Computation in Linguistics (2022)

7. When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters. In: Proceedings of the Society for Computation in Linguistics (2022)

8. Word order affects the frequency of adjective use across languages. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

9. Syntactic dependencies correspond to word pairs with high mutual information. In: Association for Computational Linguistics (2021)

10. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations. In: Association for Computational Linguistics (2021)

11. Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models. In: Association for Computational Linguistics (2021)

12. Structural Supervision Improves Learning of Non-Local Grammatical Dependencies. In: Association for Computational Linguistics (2021)

13. Maze Made Easy: Better and easier measurement of incremental processing difficulty (2021)

   Abstract: Behavioral measures of incremental language comprehension difficulty form a crucial part of the empirical basis of psycholinguistics. The two most common methods for obtaining these measures have significant limitations: eye tracking studies are resource-intensive, and self-paced reading can yield noisy data with poor localization. These limitations are even more severe for web-based crowdsourcing studies, where eye tracking is infeasible and self-paced reading is vulnerable to inattentive participants. Here we make a case for broader adoption of the Maze task, involving sequential forced choice between each successive word in a sentence and a contextually inappropriate distractor. We leverage natural language processing technology to automate the most researcher-laborious part of Maze – generating distractor materials – and show that the resulting A(uto)-Maze method has dramatically superior statistical power and localization for well-established syntactic ambiguity resolution phenomena. We make our code freely available online for widespread adoption of A-Maze by the psycholinguistics community.

   URL: https://hdl.handle.net/1721.1/138282

14. An Information-Theoretic Characterization of Morphological Fusion ...

15. Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...

16. Multilingual BERT, Ergativity, and Grammatical Subjecthood ...

17. Sensitivity as a Complexity Measure for Sequence Classification Tasks ...

18. What do RNN Language Models Learn about Filler–Gap Dependencies? In: Association for Computational Linguistics (2021)

19. Language Learning and Processing in People and Machines. In: Association for Computational Linguistics (2021)