21 |
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages ...
22 |
Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
24 |
Measuring and Increasing Context Usage in Context-Aware Machine Translation ...
25 |
On The Ingredients of an Effective Zero-shot Semantic Parser ...
26 |
Findings of the AmericasNLP 2021 Shared Task on Open Machine Translation for Indigenous Languages of the Americas ...
27 |
Explorations in Transfer Learning for OCR Post-Correction ...
28 |
Detecting Hallucinated Content in Conditional Neural Sequence Generation ...
29 |
CitationIE: Leveraging the Citation Graph for Scientific Information Extraction ...
30 |
DEEP: DEnoising Entity Pre-training for Neural Machine Translation ...
Abstract:
It has been shown that machine translation models usually generate poor translations for named entities that are infrequent in the training corpus. Earlier named entity translation methods mainly focus on phonetic transliteration, which ignores the sentence context for translation and is limited in domain and language coverage. To address this limitation, we propose DEEP, a DEnoising Entity Pre-training method that leverages large amounts of monolingual data and a knowledge base to improve named entity translation accuracy within sentences. In addition, we investigate a multi-task learning strategy that fine-tunes a pre-trained neural machine translation model on both entity-augmented monolingual data and parallel data to further improve entity translation. Experimental results on three language pairs demonstrate that DEEP yields significant improvements over strong denoising auto-encoding baselines, with a gain of up to 1.3 BLEU and up to 9.2 entity accuracy points for English-Russian translation. (13 pages)
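The denoising-entity objective described in this abstract can be sketched in a few lines: named-entity spans in a monolingual sentence are corrupted, and a sequence-to-sequence model is pre-trained to reconstruct the original sentence. The sketch below is an assumption-laden illustration, not the paper's implementation; in particular, the entity list is hard-coded here, whereas DEEP obtains entity mentions via a knowledge base.

```python
# Minimal sketch of denoising entity pre-training data creation.
# Assumption: entity spans are given; DEEP derives them from a knowledge
# base, which is stubbed out here with a fixed list of mentions.

MASK = "<mask>"

def noise_entities(sentence: str, entities: list[str]) -> str:
    """Replace each known entity mention in the sentence with a mask token."""
    noised = sentence
    for ent in entities:
        noised = noised.replace(ent, MASK)
    return noised

def make_training_pair(sentence: str, entities: list[str]) -> tuple[str, str]:
    """Build a (noised source, original target) pair for seq2seq denoising."""
    return noise_entities(sentence, entities), sentence

src, tgt = make_training_pair(
    "Barack Obama visited Berlin in 2013.",
    ["Barack Obama", "Berlin"],  # assumed output of KB-linked entity tagging
)
# src == "<mask> visited <mask> in 2013."
# tgt == "Barack Obama visited Berlin in 2013."
```

A pre-trained translation model fine-tuned on such pairs (alongside parallel data, as in the multi-task variant the abstract mentions) learns to regenerate entity mentions from context.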

Keyword:
Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://dx.doi.org/10.48550/arxiv.2111.07393 https://arxiv.org/abs/2111.07393

31 |
Efficient Test Time Adapter Ensembling for Low-resource Language Varieties ...
32 |
Explicit Alignment Objectives for Multilingual Bidirectional Encoders ...
34 |
Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...
35 |
Distributionally Robust Multilingual Machine Translation ...
37 |
Word Alignment by Fine-tuning Embeddings on Parallel Corpora ...
39 |
Do Context-Aware Translation Models Pay the Right Attention? ...
40 |
Lexically Aware Semi-Supervised Learning for OCR Post-Correction ...