
Search in the Catalogues and Directories

Hits 1–11 of 11

1. Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...

2. XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation ...

3. AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages ...

4. DEEP: DEnoising Entity Pre-training for Neural Machine Translation ...

5. Explicit Alignment Objectives for Multilingual Bidirectional Encoders ...
   NAACL 2021; Firat, Orhan; Hu, Junjie. - : Underline Science Inc., 2021

6. Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...
   NAACL 2021; Hauptmann, Alexander; Hu, Junjie. - : Underline Science Inc., 2021

7. Phrase-level Active Learning for Neural Machine Translation ...

8. Explicit Alignment Objectives for Multilingual Bidirectional Encoders ...
   Abstract: Pre-trained cross-lingual encoders such as mBERT (Devlin et al., 2019) and XLM-R (Conneau et al., 2020) have proven to be impressively effective at enabling transfer learning of NLP systems from high-resource languages to low-resource languages. This success comes despite the fact that there is no explicit objective to align the contextual embeddings of words/sentences with similar meanings across languages in the same space. In this paper, we present a new method for learning multilingual encoders, AMBER (Aligned Multilingual Bidirectional EncodeR). AMBER is trained on additional parallel data using two explicit alignment objectives that align the multilingual representations at different granularities. We conduct experiments on zero-shot cross-lingual transfer learning for different tasks, including sequence tagging, sentence retrieval, and sentence classification. Experimental results show that AMBER obtains gains of up to 1.1 average F1 score on sequence tagging and up to 27.3 average accuracy on ... (published at NAACL 2021)
   Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences
   URL: https://dx.doi.org/10.48550/arxiv.2010.07972
   https://arxiv.org/abs/2010.07972

9. XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization ...

10. Domain Adaptation of Neural Machine Translation by Lexicon Induction ...

11. Rapid Adaptation of Neural Machine Translation to New Languages ...
    Neubig, Graham; Hu, Junjie. - : arXiv, 2018

Results by source: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 11