
Search in the Catalogues and Directories

Hits 21 – 40 of 116

21
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages ...
22
Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
23
Meta Back-translation ...
Pham, Hieu; Wang, Xinyi; Yang, Yiming. - : arXiv, 2021
24
Measuring and Increasing Context Usage in Context-Aware Machine Translation ...
25
On The Ingredients of an Effective Zero-shot Semantic Parser ...
26
Findings of the AmericasNLP 2021 Shared Task on Open Machine Translation for Indigenous Languages of the Americas ...
Mager, Manuel; Oncevay, Arturo; Ebrahimi, Abteen. - : Association for Computational Linguistics, 2021
27
Explorations in Transfer Learning for OCR Post-Correction ...
28
Detecting Hallucinated Content in Conditional Neural Sequence Generation ...
29
CitationIE: Leveraging the Citation Graph for Scientific Information Extraction ...
30
DEEP: DEnoising Entity Pre-training for Neural Machine Translation ...
Abstract: It has been shown that machine translation models usually generate poor translations for named entities that are infrequent in the training corpus. Earlier named entity translation methods mainly focus on phonetic transliteration, which ignores the sentence context for translation and is limited in domain and language coverage. To address this limitation, we propose DEEP, a DEnoising Entity Pre-training method that leverages large amounts of monolingual data and a knowledge base to improve named entity translation accuracy within sentences. In addition, we investigate a multi-task learning strategy that finetunes a pre-trained neural machine translation model on both entity-augmented monolingual data and parallel data to further improve entity translation. Experimental results on three language pairs demonstrate that DEEP yields significant improvements over strong denoising auto-encoding baselines, with a gain of up to 1.3 BLEU and up to 9.2 entity accuracy points for English-Russian translation. (13 pages)
Keyword: Artificial Intelligence cs.AI; Computation and Language cs.CL; FOS Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2111.07393
https://arxiv.org/abs/2111.07393
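The DEEP abstract above describes a denoising pre-training setup: named entities in monolingual text are perturbed with the help of a knowledge base, and the model learns to reconstruct the clean sentence. A toy sketch of one plausible way to build such training pairs follows; the entity dictionary, the function name, and the choice of noising (swapping in source-language entity forms) are illustrative assumptions, not the paper's actual implementation:

```python
# Toy sketch (not the authors' code): build a denoising pre-training pair
# by swapping named entities in a monolingual target-language sentence with
# their source-language forms from a small bilingual entity dictionary.
# The model would then be trained to recover the original clean sentence.

# Hypothetical slice of a knowledge-base entity dictionary (Russian -> English).
ENTITY_DICT = {"Москва": "Moscow", "Пушкин": "Pushkin"}

def make_denoising_pair(sentence: str) -> tuple[str, str]:
    """Return (noised_source, clean_target) for denoising pre-training."""
    noised = sentence
    for tgt_entity, src_entity in ENTITY_DICT.items():
        # Replace each known entity with its cross-lingual form (the "noise").
        noised = noised.replace(tgt_entity, src_entity)
    return noised, sentence

src, tgt = make_denoising_pair("Пушкин жил в городе Москва.")
# src is the entity-noised input, tgt the clean reconstruction target.
```

In practice the entity spans would come from linking against a large knowledge base rather than exact string matching, and the pairs would feed a standard sequence-to-sequence denoising objective.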
31
Efficient Test Time Adapter Ensembling for Low-resource Language Varieties ...
32
Explicit Alignment Objectives for Multilingual Bidirectional Encoders ...
NAACL 2021; Firat, Orhan; Hu, Junjie. - : Underline Science Inc., 2021
33
Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
34
Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...
NAACL 2021; Hauptmann, Alexander; Hu, Junjie. - : Underline Science Inc., 2021
35
Distributionally Robust Multilingual Machine Translation ...
36
Multi-view Subword Regularization ...
NAACL 2021; ., Sebastian; Neubig, Graham. - : Underline Science Inc., 2021
37
Word Alignment by Fine-tuning Embeddings on Parallel Corpora ...
Dou, Zi-Yi; Neubig, Graham. - : arXiv, 2021
38
Data Augmentation for Sign Language Gloss Translation ...
39
Do Context-Aware Translation Models Pay the Right Attention? ...
40
Lexically Aware Semi-Supervised Learning for OCR Post-Correction ...


Hits by source: Catalogues 3 · Bibliographies 1 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 113
© 2013 - 2024 Lin|gu|is|tik