4. The Impact of Positional Encodings on Multilingual Compression
   Source: BASE
6. Attention Can Reflect Syntactic Structure (If You Let It)
9. The Sensitivity of Language Models and Humans to Winograd Schema Perturbations (ACL 2020)
   Abstract: Large-scale pretrained language models are the major driving force behind recent improvements in performance on the Winograd Schema Challenge, a widely employed test of commonsense reasoning ability. We show, however, with a new diagnostic dataset, that these models are sensitive to linguistic perturbations of the Winograd examples that minimally affect human understanding. Our results highlight interesting differences between humans and language models: language models are more sensitive to number or gender alternations and synonym replacements than humans, whereas humans are more stable and consistent in their predictions, maintain a much higher absolute performance, and perform better on non-associative instances than associative ones. Overall, humans are correct more often than out-of-the-box models, and the models are sometimes right for the wrong reasons. Finally, we show that fine-tuning on a large, task-specific dataset can offer a solution to these issues.
   Keywords: Computation and Language (cs.CL); Machine Learning (cs.LG); FOS: Computer and information sciences
   URL: https://arxiv.org/abs/2005.01348 ; DOI: https://dx.doi.org/10.48550/arxiv.2005.01348
10. From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers
12. Do Neural Language Models Show Preferences for Syntactic Formalisms?
14. From Zero to Hero: On the Limitations of Zero-Shot Language Transfer with Multilingual Transformers
17. Probing Multilingual Sentence Representations With X-Probe
18. The WMT'18 Morpheval Test Suites for English-Czech, English-German, English-Finnish and Turkish-English
    In: Proceedings of the Third Conference on Machine Translation (WMT 2018), Oct 2018, Brussels, Belgium, pp. 550-564. DOI: ⟨10.18653/v1/W18-64060⟩. https://hal.archives-ouvertes.fr/hal-01910244 ; http://www.statmt.org/wmt18/ (2018)
19. Universal Dependencies 2.2
    In: https://hal.archives-ouvertes.fr/hal-01930733 (2018)