
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...
2
Multilingual BERT, Ergativity, and Grammatical Subjecthood ...
Papadimitriou, Isabel; Chi, Ethan A.; Futrell, Richard. University of Massachusetts Amherst, 2021
3
Multilingual BERT, Ergativity, and Grammatical Subjecthood
In: Proceedings of the Society for Computation in Linguistics (2021)
4
Finding Universal Grammatical Relations in Multilingual BERT ...
Abstract: Recent work has found evidence that Multilingual BERT (mBERT), a transformer-based multilingual masked language model, is capable of zero-shot cross-lingual transfer, suggesting that some aspects of its representations are shared cross-lingually. To better understand this overlap, we extend recent work on finding syntactic trees in neural networks' internal representations to the multilingual setting. We show that subspaces of mBERT representations recover syntactic tree distances in languages other than English, and that these subspaces are approximately shared across languages. Motivated by these results, we present an unsupervised analysis method that provides evidence mBERT learns representations of syntactic dependency labels, in the form of clusters which largely agree with the Universal Dependencies taxonomy. This evidence suggests that even without explicit supervision, multilingual masked language models learn certain linguistic universals.
Note: To appear in ACL 2020; Farsi typo corrected.
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; I.2.7; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.2005.04511
https://arxiv.org/abs/2005.04511
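The structural-probe idea summarized in the abstract can be sketched as follows: a linear map B projects contextual word vectors into a subspace where squared L2 distances between projected vectors approximate syntactic tree distances. This is a minimal illustration only; the word vectors and the probe matrix below are random stand-ins (in the paper, B is trained against gold parse-tree distances over real mBERT representations).

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, probe_rank, n_words = 768, 64, 5

# Stand-in contextual word representations for one sentence
# (in practice these would come from an mBERT layer).
H = rng.normal(size=(n_words, hidden_dim))

# Stand-in probe matrix; the paper learns this from treebank supervision,
# then finds the resulting subspaces are approximately shared across languages.
B = rng.normal(size=(hidden_dim, probe_rank))

def probe_distances(H, B):
    """Pairwise squared L2 distances between words in the probed subspace."""
    P = H @ B                                 # project into the probe subspace
    diff = P[:, None, :] - P[None, :, :]      # all pairwise difference vectors
    return (diff ** 2).sum(axis=-1)           # squared L2 distance matrix

D = probe_distances(H, B)
```

A trained probe would be evaluated by how well D correlates with the number of edges between word pairs in the sentence's dependency tree; D is symmetric with a zero diagonal, as a tree-distance approximation must be.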

Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 4
© 2013 – 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings