
Search in the Catalogues and Directories

Hits 1 – 16 of 16

1
Using the Interpolated Maze Task to Assess Incremental Processing in English Relative Clauses
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol 43, iss 43 (2021)
BASE
2
Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
In: Association for Computational Linguistics (2021)
BASE
3
SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
In: Association for Computational Linguistics (2021)
BASE
4
Investigating Novel Verb Learning in BERT: Selectional Preference Classes and Alternation-Based Syntactic Generalization
In: Association for Computational Linguistics (2021)
BASE
5
Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study
In: Association for Computational Linguistics (2021)
BASE
6
SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
In: Association for Computational Linguistics (2021)
BASE
7
Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
In: Association for Computational Linguistics (2021)
Abstract: Humans can learn structural properties about a word from minimal experience, and deploy their learned syntactic representations uniformly in different grammatical contexts. We assess the ability of modern neural language models to reproduce this behavior in English and evaluate the effect of structural supervision on learning outcomes. First, we assess few-shot learning capabilities by developing controlled experiments that probe models’ syntactic nominal number and verbal argument structure generalizations for tokens seen as few as two times during training. Second, we assess invariance properties of learned representation: the ability of a model to transfer syntactic generalizations from a base context (e.g., a simple declarative active-voice sentence) to a transformed context (e.g., an interrogative sentence). We test four models trained on the same dataset: an n-gram baseline, an LSTM, and two LSTM variants trained with explicit structural supervision (Dyer et al., 2016; Choe and Charniak, 2016). We find that in most cases, the neural models are able to induce the proper syntactic generalizations after minimal exposure, often from just two examples during training, and that the two structurally supervised models generalize more accurately than the LSTM model. All neural models are able to leverage information learned in base contexts to drive expectations in transformed contexts, indicating that they have learned some invariance properties of syntax.
URL: https://hdl.handle.net/1721.1/138280
BASE
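(See the code sketch after the results list for a minimal illustration of the surprisal-based evaluation paradigm these papers share.)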
8
Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
In: Association for Computational Linguistics (2021)
BASE
9
What do RNN Language Models Learn about Filler–Gap Dependencies?
In: Association for Computational Linguistics (2021)
BASE
10
A Targeted Assessment of Incremental Processing in Neural Language Models and Humans ...
BASE
11
A Targeted Assessment of Incremental Processing in Neural Language Models and Humans ...
BASE
12
Which Presuppositions are Subject to Contextual Felicity Constraints?
In: Semantics and Linguistic Theory: Proceedings of SALT 31, pp. 345–364; ISSN 2163-5951 (2021)
BASE
13
A Systematic Assessment of Syntactic Generalization in Neural Language Models ...
BASE
14
Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations ...
BASE
15
Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study ...
An, Aixiu; Qian, Peng; Wilcox, Ethan. arXiv (2019)
BASE
16
RNNs as psycholinguistic subjects: Syntactic state and grammatical dependency ...
BASE
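
The papers above share a common targeted-evaluation paradigm: a language model passes a test item if it assigns lower surprisal (higher probability) to the grammatical variant of a controlled minimal pair than to the ungrammatical one. The sketch below illustrates the idea under stated assumptions: GPT-2 via the Hugging Face transformers library stands in for the LSTM and RNNG models actually tested in these papers, and the subject–verb agreement pair is an illustrative example, not an item from their test suites. This is not the authors' evaluation code; the SyntaxGym platform listed above is their own tooling for such tests.

    # Minimal sketch of surprisal-based targeted evaluation.
    # Assumptions: GPT-2 stands in for the papers' LSTM/RNNG models;
    # the minimal pair below is illustrative only.
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    def surprisal_bits(sentence: str) -> float:
        """Total surprisal -log2 p(sentence), summed over tokens."""
        ids = tokenizer(sentence, return_tensors="pt").input_ids
        with torch.no_grad():
            logits = model(ids).logits
        # Log-probability of each actual next token given its prefix.
        log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
        token_lp = log_probs[torch.arange(ids.size(1) - 1), ids[0, 1:]]
        # Convert nats to bits.
        return (-token_lp.sum() / torch.log(torch.tensor(2.0))).item()

    # The model "passes" the item if the grammatical variant is
    # less surprising than the ungrammatical one.
    grammatical = "The keys to the cabinet are on the table."
    ungrammatical = "The keys to the cabinet is on the table."
    s_good, s_bad = surprisal_bits(grammatical), surprisal_bits(ungrammatical)
    print(f"grammatical: {s_good:.2f} bits, ungrammatical: {s_bad:.2f} bits")
    print("passes" if s_good < s_bad else "fails")

In the papers themselves, surprisal is typically compared at a critical region (here, the verb) rather than summed over the whole sentence; whole-sentence totals are used above only to keep the sketch short.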

Results by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 16