
Search in the Catalogues and Directories

Hits 1 – 20 of 73 (page 1 of 4)

1. AUTOLEX: An Automatic Framework for Linguistic Exploration ...
2. MCoNaLa: A Benchmark for Code Generation from Multiple Natural Languages ...
3. A Systematic Evaluation of Large Language Models of Code ...
4. Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation ...
5. Phoneme Recognition through Fine Tuning of Phonetic Representations: a Case Study on Luhya Language Varieties ...
6. Few-shot Language Coordination by Modeling Theory of Mind ...
7. Systematic Inequalities in Language Technology Performance across the World's Languages ...
8. Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...
9. Multi-view Subword Regularization ...
10. MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning ...
11. XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation ...
12. When Does Translation Require Context? A Data-driven, Multilingual Exploration ...
13. Breaking Down Multilingual Machine Translation ...
Abstract: While multilingual training is now an essential ingredient in machine translation (MT) systems, recent work has demonstrated that it has different effects in different multilingual settings, such as many-to-one, one-to-many, and many-to-many learning. These training settings expose the encoder and the decoder of a machine translation model to different data distributions. In this paper, we examine how different varieties of multilingual training contribute to learning these two components of the MT model. Specifically, we compare bilingual models with encoders and/or decoders initialized by multilingual training. We show that multilingual training is beneficial to encoders in general, while it only benefits decoders for low-resource languages (LRLs). We further identify the important attention heads for each language pair and compare their correlations during inference. Our analysis sheds light on how multilingual translation models work and enables us to propose methods to improve performance by training with ... (ACL 2022 Findings)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2110.08130
https://arxiv.org/abs/2110.08130
A minimal sketch of the encoder/decoder initialization this abstract describes appears after this results list.
14. Efficient Test Time Adapter Ensembling for Low-resource Language Varieties ...
15. Distributionally Robust Multilingual Machine Translation ...
Zhou, Chunting; Levy, Daniel; Li, Xian. arXiv, 2021
16. AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages ...
17. Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
18. Meta Back-translation ...
Pham, Hieu; Wang, Xinyi; Yang, Yiming. arXiv, 2021
19. On The Ingredients of an Effective Zero-shot Semantic Parser ...
20. DEEP: DEnoising Entity Pre-training for Neural Machine Translation ...
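
Below is a minimal, hypothetical sketch (Python/PyTorch) of the setup described in the abstract of entry 13: a fresh bilingual translation model whose encoder and/or decoder is initialized from a multilingually trained model before bilingual training. This is not the paper's code; the toy Seq2Seq model, its dimensions, and the helper init_from_multilingual are illustrative assumptions.

import torch.nn as nn

class Seq2Seq(nn.Module):
    # Toy encoder-decoder translation model; all sizes are illustrative.
    def __init__(self, vocab_size=1000, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        memory = self.encoder(self.embed(src))  # encode source tokens
        return self.out(self.decoder(self.embed(tgt), memory))

def init_from_multilingual(bilingual, multilingual,
                           copy_encoder=True, copy_decoder=False):
    # Copy the selected components from the multilingually trained model;
    # the bilingual model is then trained as usual on one language pair.
    if copy_encoder:
        bilingual.encoder.load_state_dict(multilingual.encoder.state_dict())
    if copy_decoder:
        bilingual.decoder.load_state_dict(multilingual.decoder.state_dict())
    return bilingual

multilingual = Seq2Seq()  # stands in for a model trained on many language pairs

# Conditions the abstract compares: encoder-only vs. decoder-only transfer
# (plus neither/both), isolating what multilingual training teaches each part.
enc_init = init_from_multilingual(Seq2Seq(), multilingual, copy_encoder=True)
dec_init = init_from_multilingual(Seq2Seq(), multilingual,
                                  copy_encoder=False, copy_decoder=True)

Per the abstract's finding, the encoder-initialized condition would be expected to help in general, while the decoder-initialized one would mainly help low-resource language pairs.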

