
Search in the Catalogues and Directories

Hits 1 – 17 of 17

1
Syntactic dependencies correspond to word pairs with high mutual information
In: Association for Computational Linguistics (2021)
2
A Systematic Assessment of Syntactic Generalization in Neural Language Models
In: Association for Computational Linguistics (2021)
3
SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
In: Association for Computational Linguistics (2021)
4
Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study
In: Association for Computational Linguistics (2021)
5
SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
In: Association for Computational Linguistics (2021)
6
Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
In: Association for Computational Linguistics (2021)
7
Neural language models as psycholinguistic subjects: Representations of syntactic state
In: Association for Computational Linguistics (2021)
8
Syntactic dependencies correspond to word pairs with high mutual information
In: Association for Computational Linguistics (2021)
9
Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
In: Association for Computational Linguistics (2021)
10
Controlled Evaluation of Grammatical Knowledge in Mandarin Chinese Language Models ...
Abstract: Prior work has shown that structural supervision helps English language models learn generalizations about syntactic phenomena such as subject-verb agreement. However, it remains unclear if such an inductive bias would also improve language models' ability to learn grammatical dependencies in typologically different languages. Here we investigate this question in Mandarin Chinese, which has a logographic, largely syllable-based writing system; different word order; and sparser morphology than English. We train LSTMs, Recurrent Neural Network Grammars, Transformer language models, and Transformer-parameterized generative parsing models on two Mandarin Chinese datasets of different sizes. We evaluate the models' ability to learn different aspects of Mandarin grammar that assess syntactic and semantic relationships. We find suggestive evidence that structural supervision helps with representing syntactic state across intervening content and improves performance in low-data settings, suggesting that the benefits ...
Comment: To appear in the Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)
Keyword: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2109.11058
https://dx.doi.org/10.48550/arxiv.2109.11058
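
The targeted-evaluation paradigm described in the abstract above (and in several other hits here, e.g. SyntaxGym) reduces to comparing a language model's surprisal on minimal pairs of grammatical and ungrammatical sentences. Below is a minimal Python sketch of that comparison, assuming a HuggingFace-style causal LM; the model name "gpt2" and the English agreement pair are illustrative stand-ins, not the paper's Mandarin models or test suites.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model; the paper trains its own LSTM/RNNG/Transformer models.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def total_surprisal(sentence: str) -> float:
    """Sum of per-token surprisals (negative log-probabilities, in nats)."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    # Logits at position t predict token t+1, so shift by one position.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    token_log_probs = log_probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)
    return -token_log_probs.sum().item()

# A subject-verb agreement minimal pair (an English stand-in for the
# Mandarin test suites the abstract describes).
s_good = total_surprisal("The keys to the cabinet are on the table.")
s_bad = total_surprisal("The keys to the cabinet is on the table.")

# The model "passes" the item if the grammatical variant is less surprising.
print(f"grammatical: {s_good:.2f} nats, ungrammatical: {s_bad:.2f} nats")
print("pass" if s_good < s_bad else "fail")

Whole-sentence surprisal is the simplest criterion; the SyntaxGym-style evaluations in these papers typically compare surprisal only within a critical region of the sentence, which is a stricter and more diagnostic test.
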
11
Controlled Evaluation of Grammatical Knowledge in Mandarin Chinese Language Models ...
12
What if This Modified That? Syntactic Interventions with Counterfactual Embeddings ...
13
Structural Guidance for Transformer Language Models ...
14
Structural Guidance for Transformer Language Models ...
15
A Systematic Assessment of Syntactic Generalization in Neural Language Models ...
16
Composition is the core driver of the language-selective network
In: MIT Press (2019)
17
Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study ...
An, Aixiu; Qian, Peng; Wilcox, Ethan
In: arXiv (2019)

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 17