1 | NMTScore: A Multilingual Analysis of Translation-based Text Similarity Measures
2 | Improving Zero-shot Cross-lingual Transfer between Closely Related Languages by Injecting Character-level Noise
4 | Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution
6 | Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation
7 | Edinburgh’s End-to-End Multilingual Speech Translation System for IWSLT 2021
10 | Understanding the Properties of Minimum Bayes Risk Decoding in Neural Machine Translation
11 | Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation
12 | Vision Matters When It Should: Sanity Checking Multimodal Machine Translation Models
13 | Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution
14 | Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT
15 | Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT
16 | Contrastive Conditioning for Assessing Disambiguation in MT: A Case Study of Distilled Bias
17 | Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT
Abstract:
Anthology paper link: https://aclanthology.org/2021.emnlp-main.667/
Differently from traditional statistical MT, which decomposes the translation task into distinct, separately learned components, neural machine translation uses a single neural network to model the entire translation process. Despite neural machine translation being the de-facto standard, it is still not clear how NMT models acquire different competences over the course of training, and how this mirrors the different models in traditional SMT. In this work, we look at the competences related to three core SMT components and find that during training, NMT first focuses on learning target-side language modeling, then improves translation quality, approaching word-by-word translation, and finally learns more complicated reordering patterns. We show that this behavior holds for several models and language pairs. Additionally, we explain how such an understanding of the training process can be useful in practice and, as an example, show how ...
Keywords:
Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Machine translation; Natural Language Processing
URL: https://dx.doi.org/10.48448/g7q7-y885
https://underline.io/lecture/37691-language-modeling,-lexical-translation,-reordering-the-training-process-of-nmt-through-the-lens-of-classical-smt
20 | Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation
Zhang, Biao; Bapna, Ankur; Sennrich, Rico; Firat, Orhan (2021). Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation. In: International Conference on Learning Representations (ICLR), Virtual, 3 May 2021 - 7 May 2021.