1. Dependency locality as an explanatory principle for word order
   In: Prof. Levy (2022)
   BASE

2. Granularity in the Semantics of Comparison
   In: Semantics and Linguistic Theory; Proceedings of SALT 31; 550-569; 2163-5951 (2022)

3. Competition from novel features drives scalar inferences in reference games
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

4. Using the Interpolated Maze Task to Assess Incremental Processing in English Relative Clauses
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

5. Child-directed Listening: How Caregiver Inference Enables Children's Early Verbal Communication
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

6. On Factors Influencing Typing Time: Insights from a Viral Online Typing Game
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

7. Eye Movement Traces of Linguistic Knowledge
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

8. A Systematic Assessment of Syntactic Generalization in Neural Language Models
   In: Association for Computational Linguistics (2021)

9. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
   In: Association for Computational Linguistics (2021)

10. SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
    In: Association for Computational Linguistics (2021)

11. Comparing Models of Associative Meaning: An Empirical Investigation of Reference in Simple Language Games
    In: Association for Computational Linguistics (2021)

12. Cognitive Science Honors the Memory of Jeffrey Elman
    In: MIT Press (2021)

13. Investigating Novel Verb Learning in BERT: Selectional Preference Classes and Alternation-Based Syntactic Generalization
    In: Association for Computational Linguistics (2021)

14. Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study
    In: Association for Computational Linguistics (2021)

15. SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
    In: Association for Computational Linguistics (2021)

16. Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
    In: Association for Computational Linguistics (2021)

17. Neural language models as psycholinguistic subjects: Representations of syntactic state
    In: Association for Computational Linguistics (2021)

18. Linking artificial and human neural representations of language
    In: Association for Computational Linguistics (2021)

19. Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
    In: Association for Computational Linguistics (2021)

20. Maze Made Easy: Better and easier measurement of incremental processing difficulty
    In: Other repository (2021)
    Abstract: © 2019 Elsevier Inc. Behavioral measures of incremental language comprehension difficulty form a crucial part of the empirical basis of psycholinguistics. The two most common methods for obtaining these measures have significant limitations: eye tracking studies are resource-intensive, and self-paced reading can yield noisy data with poor localization. These limitations are even more severe for web-based crowdsourcing studies, where eye tracking is infeasible and self-paced reading is vulnerable to inattentive participants. Here we make a case for broader adoption of the Maze task, involving sequential forced choice between each successive word in a sentence and a contextually inappropriate distractor. We leverage natural language processing technology to automate the most researcher-laborious part of Maze – generating distractor materials – and show that the resulting A(uto)-Maze method has dramatically superior statistical power and localization for well-established syntactic ambiguity resolution phenomena. We make our code freely available online for widespread adoption of A-Maze by the psycholinguistics community.
    URL: https://hdl.handle.net/1721.1/138282