22 | Low Resource Neural Machine Translation: A Benchmark for Five African Languages ... (BASE)
24 | Breeding Gender-aware Direct Speech Translation Systems ... (BASE)
25 | Machine-oriented NMT Adaptation for Zero-shot NLP tasks: Comparing the Usefulness of Close and Distant Languages ... (BASE)
26 | APE Shared Task WMT18: Human Post-edits and References Test Data EN-DE PBSMT (BASE)
27 | Identification of bilingual terms from monolingual documents for statistical machine translation (BASE)
28 | Knowledge portability with semantic expansion of ontology labels (BASE)
29 | Enhancing statistical machine translation with bilingual terminology in a CAT environment (BASE)
30 | Multilingual Neural Machine Translation for Zero-Resource Languages ... (BASE)
32 | Adapting Multilingual Neural Machine Translation to Unseen Languages ... (BASE)

33 | Adapting Multilingual Neural Machine Translation to Unseen Languages ... (BASE)

34 | Adapting Multilingual Neural Machine Translation to Unseen Languages ... (BASE)
35 | MuST-C: A Multilingual Speech Translation Corpus (Conference slides) ... (BASE)
39 | Transfer Learning in Multilingual Neural Machine Translation with Dynamic Vocabulary ... (BASE)

Abstract:
We propose a method to transfer knowledge across neural machine translation (NMT) models by means of a shared dynamic vocabulary. Our approach makes it possible to extend an initial model for a given language pair to cover new languages by adapting its vocabulary as new data become available (i.e., introducing new vocabulary items if they are not included in the initial model). The parameter transfer mechanism is evaluated in two scenarios: i) adapting a trained single-language-pair NMT system to work with a new language pair, and ii) continuously adding new language pairs to grow into a multilingual NMT system. In both scenarios our goal is to improve translation performance while minimizing training convergence time. Preliminary experiments spanning five languages with different training data sizes (i.e., 5k and 50k parallel sentences) show a significant performance gain, ranging from +3.85 up to +13.63 BLEU, in different language directions. Moreover, when compared with training an NMT model from scratch, ... Published at the International Workshop on Spoken Language Translation (IWSLT), 2018.

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://dx.doi.org/10.48550/arxiv.1811.01137 | https://arxiv.org/abs/1811.01137
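The abstract's core mechanism — extending a trained model's vocabulary with new tokens while keeping existing entries (and their trained parameters) intact — can be illustrated with a minimal sketch. This is an illustrative assumption of how such a dynamic vocabulary could work, not the paper's actual implementation; all function names and the toy data are hypothetical.

```python
import random

def extend_vocab(initial_vocab, new_corpus_tokens):
    """Extend a token->index vocabulary with unseen tokens, keeping the
    original indices stable so shared tokens keep their trained parameters
    (the core of the dynamic-vocabulary idea; hypothetical helper)."""
    vocab = dict(initial_vocab)
    for tok in new_corpus_tokens:
        if tok not in vocab:
            vocab[tok] = len(vocab)
    return vocab

def transfer_embeddings(old_embeddings, old_vocab, new_vocab, dim, seed=0):
    """Build an embedding table for the extended vocabulary: rows for tokens
    already present are copied from the old table; rows for new tokens are
    randomly initialised (a stand-in for the paper's parameter transfer)."""
    rng = random.Random(seed)
    table = [None] * len(new_vocab)
    for tok, idx in new_vocab.items():
        if tok in old_vocab:
            table[idx] = old_embeddings[old_vocab[tok]]
        else:
            table[idx] = [rng.gauss(0.0, 0.01) for _ in range(dim)]
    return table

# Toy usage: a trained vocabulary extended with tokens from a new language pair.
old_vocab = {"<unk>": 0, "hello": 1, "world": 2}
old_emb = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
new_vocab = extend_vocab(old_vocab, ["hallo", "world", "wereld"])
new_emb = transfer_embeddings(old_emb, old_vocab, new_vocab, dim=2)
print(new_vocab)  # shared tokens keep indices 0-2; new tokens are appended
```

The design choice sketched here is what lets training continue rather than restart: only the rows for genuinely new tokens start from scratch, which is consistent with the convergence-time gains the abstract reports.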
40 | Improving Zero-Shot Translation of Low-Resource Languages ... (BASE)