
Search in the Catalogues and Directories

Hits 1 – 20 of 41

1
MasakhaNER: Named entity recognition for African languages
In: Transactions of the Association for Computational Linguistics, The MIT Press, 2021. EISSN: 2307-387X. DOI: ⟨10.1162/tacl⟩. https://hal.inria.fr/hal-03350962
BASE
2
Phoneme Recognition through Fine Tuning of Phonetic Representations: a Case Study on Luhya Language Varieties ...
BASE
3
Few-shot Language Coordination by Modeling Theory of Mind ...
BASE
4
Systematic Inequalities in Language Technology Performance across the World's Languages ...
BASE
5
Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...
BASE
6
Multi-view Subword Regularization ...
BASE
7
MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning ...
BASE
8
XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation ...
BASE
9
When Does Translation Require Context? A Data-driven, Multilingual Exploration ...
BASE
10
Breaking Down Multilingual Machine Translation ...
BASE
11
Efficient Test Time Adapter Ensembling for Low-resource Language Varieties ...
Abstract: Adapters are light-weight modules that allow parameter-efficient fine-tuning of pretrained models. Specialized language and task adapters have recently been proposed to facilitate cross-lingual transfer of multilingual pretrained models (Pfeiffer et al., 2020b). However, this approach requires training a separate language adapter for every language one wishes to support, which can be impractical for languages with limited data. An intuitive solution is to use a related language adapter for the new language variety, but we observe that this solution can lead to sub-optimal performance. In this paper, we aim to improve the robustness of language adapters to uncovered languages without training new adapters. We find that ensembling multiple existing language adapters makes the fine-tuned model significantly more robust to other language varieties not included in these adapters. Building upon this observation, we propose Entropy Minimized Ensemble of Adapters (EMEA), a method that optimizes the ensemble weights ...
Comment: EMNLP 2021 Findings
Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2109.04877
https://arxiv.org/abs/2109.04877
BASE
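The abstract above describes the core idea of EMEA: reuse several existing language adapters and tune only the ensemble weights at test time so that the model's predictions become low-entropy. Below is a minimal, hypothetical Python/PyTorch sketch of that weight-tuning step, assuming each adapter's output has already been reduced to class logits for the same batch; the names adapter_logits, ensemble_predict, and learn_ensemble_weights are illustrative and not taken from the paper's code.

# Hedged sketch of entropy-minimized adapter ensembling (cf. EMEA, entry 11 above).
# Assumption: each language adapter already yields class logits for the same batch;
# only the mixture weights over adapters are learned, by minimizing prediction entropy.
import torch
import torch.nn.functional as F

def entropy(p, eps=1e-12):
    # Mean Shannon entropy of a batch of categorical distributions, shape (batch, classes).
    return -(p * (p + eps).log()).sum(dim=-1).mean()

def ensemble_predict(adapter_logits, weights):
    # adapter_logits: (num_adapters, batch, classes); weights: (num_adapters,), unnormalized.
    alpha = F.softmax(weights, dim=0)               # normalized ensemble weights
    probs = F.softmax(adapter_logits, dim=-1)       # per-adapter predictive distributions
    return torch.einsum("a,abc->bc", alpha, probs)  # weighted mixture over adapters

def learn_ensemble_weights(adapter_logits, steps=10, lr=0.1):
    # Test-time optimization: choose weights that make the ensembled prediction confident.
    weights = torch.zeros(adapter_logits.shape[0], requires_grad=True)
    opt = torch.optim.SGD([weights], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = entropy(ensemble_predict(adapter_logits, weights))
        loss.backward()
        opt.step()
    return F.softmax(weights.detach(), dim=0)

Note that in the paper the ensembling is applied to the adapter modules inside the model rather than to output distributions, and the weights are tuned per test batch; this sketch only illustrates the entropy-minimization objective.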
12
Distributionally Robust Multilingual Machine Translation ...
Zhou, Chunting; Levy, Daniel; Li, Xian. arXiv, 2021
BASE
13
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages ...
BASE
14
Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
BASE
15
Meta Back-translation ...
Pham, Hieu; Wang, Xinyi; Yang, Yiming. arXiv, 2021
BASE
16
Measuring and Increasing Context Usage in Context-Aware Machine Translation ...
BASE
17
On The Ingredients of an Effective Zero-shot Semantic Parser ...
BASE
18
Findings of the AmericasNLP 2021 Shared Task on Open Machine Translation for Indigenous Languages of the Americas ...
Mager, Manuel; Oncevay, Arturo; Ebrahimi, Abteen. Association for Computational Linguistics, 2021
BASE
19
Explorations in Transfer Learning for OCR Post-Correction ...
BASE
20
Detecting Hallucinated Content in Conditional Neural Sequence Generation ...
BASE


Facets (of 41 hits): Catalogues 0, Bibliographies 0, Linked Open Data catalogues 0, Online resources 0, Open access documents 41
© 2013 - 2024 Lin|gu|is|tik