1. How Efficiency Shapes Human Language. In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)
|
2. A verb-frame frequency account of constraints on long-distance dependencies in English. In: Prof. Gibson (2022)
|
3. Dependency locality as an explanatory principle for word order. In: Prof. Levy (2022)
|
4. When classifying grammatical role, BERT doesn't care about word order... except when it matters ...
|
5. Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...
|
6. Learning Constraints on Wh-Dependencies by Learning How to Efficiently Represent Wh-Dependencies: A Developmental Modeling Investigation With Fragment Grammars. In: Proceedings of the Society for Computation in Linguistics (2022)
|
7. When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters. In: Proceedings of the Society for Computation in Linguistics (2022)
|
8. Word order affects the frequency of adjective use across languages. In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)
|
9. Syntactic dependencies correspond to word pairs with high mutual information. In: Association for Computational Linguistics (2021)
|
10. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations. In: Association for Computational Linguistics (2021)
|
11. Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models. In: Association for Computational Linguistics (2021)
|
|
13. Structural Supervision Improves Learning of Non-Local Grammatical Dependencies. In: Association for Computational Linguistics (2021)

   Abstract: © 2019 Association for Computational Linguistics. State-of-the-art LSTM language models trained on large corpora learn sequential contingencies in impressive detail and have been shown to acquire a number of non-local grammatical dependencies with some success. Here we investigate whether supervision with hierarchical structure enhances learning of a range of grammatical dependencies, a question that has previously been addressed only for subject-verb agreement. Using controlled experimental methods from psycholinguistics, we compare the performance of word-based LSTM models against two models that represent hierarchical structure and deploy it in left-to-right processing: Recurrent Neural Network Grammars (RNNGs) (Dyer et al., 2016) and an incrementalized version of the Parsing-as-Language-Modeling configuration from Charniak et al. (2016). Models are tested on a diverse range of configurations for two classes of non-local grammatical dependencies in English: Negative Polarity licensing and Filler-Gap Dependencies. Using the same training data across models, we find that structurally supervised models outperform the LSTM, with the RNNG demonstrating the best results on both types of grammatical dependencies and even learning many of the Island Constraints on the filler-gap dependency. Structural supervision thus provides data-efficiency advantages over purely string-based training of neural language models in acquiring human-like generalizations about non-local grammatical dependencies.

   URL: https://hdl.handle.net/1721.1/137340.2
|
14. Maze Made Easy: Better and easier measurement of incremental processing difficulty. In: Other repository (2021)
|
15. An Information-Theoretic Characterization of Morphological Fusion ...
|
16. Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...
|
|
17. Multilingual BERT, Ergativity, and Grammatical Subjecthood ...
|
|
18. Sensitivity as a Complexity Measure for Sequence Classification Tasks ...
|
|
19. What do RNN Language Models Learn about Filler–Gap Dependencies? In: Association for Computational Linguistics (2021)
|
|
20. Language Learning and Processing in People and Machines. In: Association for Computational Linguistics (2021)