1. Simple, Interpretable and Stable Method for Detecting Words with Usage Change across Corpora
   In: ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Jul 2020, Seattle / Virtual, United States. pp. 538-555. ⟨10.18653/v1/2020.acl-main.51⟩. https://hal.inria.fr/hal-03161637
   Source: BASE
4. It's not Greek to mBERT: Inducing Word-Level Translations from Multilingual BERT ...
Abstract:
Recent works have demonstrated that multilingual BERT (mBERT) learns rich cross-lingual representations that allow for transfer across languages. We study the word-level translation information embedded in mBERT and present two simple methods that expose remarkable translation capabilities with no fine-tuning. The results suggest that most of this information is encoded in a non-linear way, while some of it can also be recovered with purely linear tools. As part of our analysis, we test the hypothesis that mBERT learns representations which contain both a language-encoding component and an abstract, cross-lingual component, and explicitly identify an empirical language-identity subspace within mBERT representations. ... (BlackboxNLP 2020)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2010.08275 ; https://arxiv.org/abs/2010.08275
5. The Extraordinary Failure of Complement Coercion Crowdsourcing ...
6. Amnesic Probing: Behavioral Explanation with Amnesic Counterfactuals ...
8. Unsupervised Domain Clusters in Pretrained Language Models ...
11. Unsupervised Distillation of Syntactic Information from Contextualized Word Representations ...
12. pyBART: Evidence-based Syntactic Transformations for IE ...