1. Observation of new excited $B^0_s$ states
   In: Eur. Phys. J. C 81 (7), 601 (2021). ⟨10.1140/epjc/s10052-021-09305-3⟩; https://hal.archives-ouvertes.fr/hal-03010999
   Source: BASE
3. End-to-end Speech Translation via Cross-modal Progressive Training ...
4. Locate then Segment: A Strong Pipeline for Referring Image Segmentation ...
6. Learning Language Specific Sub-network for Multilingual Machine Translation ...
7. Contrastive Learning for Many-to-many Multilingual Neural Machine Translation ...
8. Multilingual Translation via Grafting Pre-trained Language Models ...
9. Counter-Interference Adapter for Multilingual Machine Translation ...
10. MTG: A Benchmarking Suite for Multilingual Text Generation ...
11. Language Tags Matter for Zero-Shot Neural Machine Translation
    Abstract: Multilingual Neural Machine Translation (MNMT) has attracted widespread interest due to its efficiency. An appealing property of MNMT models is that they can also translate between unsupervised (zero-shot) language directions. Language tag (LT) strategies are often adopted to indicate the translation direction in MNMT. In this paper, we demonstrate that LTs are not only indicators of translation direction but also crucial to zero-shot translation quality. Unfortunately, previous work has tended to ignore the importance of LT strategies. We show that a proper LT strategy can enhance the consistency of semantic representations and alleviate the off-target issue in zero-shot directions. Experimental results show that by ignoring the source language tag (SLT) and adding the target language tag (TLT) to the encoder, zero-shot translation achieves up to a +8 BLEU improvement over other LT strategies on the IWSLT17, Europarl, and TED Talks translation tasks. (7 pages, 3 figures; accepted to the Findings of ACL 2021)
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://arxiv.org/abs/2106.07930 ; https://dx.doi.org/10.48550/arxiv.2106.07930
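The tag strategy this abstract favors (drop the source language tag, prepend the target language tag to the encoder input) can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's code: the function name and the `<2xx>` tag format are assumptions for illustration.

```python
def tag_encoder_input(src_tokens, tgt_lang):
    """Prepend a target-language tag (TLT) to the source token sequence.

    No source-language tag (SLT) is added: per the abstract's finding,
    the encoder only needs to know which language to translate *into*.
    The "<2xx>" tag format is a common convention, assumed here.
    """
    return [f"<2{tgt_lang}>"] + list(src_tokens)

# Zero-shot example: a model trained on en<->de and en<->fr data can be
# asked for the unseen de->fr direction by tagging a German source with <2fr>.
print(tag_encoder_input(["Guten", "Morgen"], "fr"))
# -> ['<2fr>', 'Guten', 'Morgen']
```

In practice the tag would be a reserved token in the model's shared vocabulary, so the encoder learns its meaning during supervised training and generalizes it to zero-shot directions.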
12. Personalized Transformer for Explainable Recommendation ...
13. Dynamic Knowledge Distillation for Pre-trained Language Models ...
14. Learning Shared Semantic Space for Speech-to-Text Translation ...
15. Glancing Transformer for Non-Autoregressive Neural Machine Translation ...
16. Text AutoAugment: Learning Compositional Augmentation Policy for Text Classification ...
17. Probabilistic Graph Reasoning for Natural Proof Generation ...
18. Document-level Event Extraction via Heterogeneous Graph-based Interaction Model with a Tracker ...
19. Multilingual Translation via Grafting Pre-trained Language Models ...
20. Language Tags Matter for Zero-Shot Neural Machine Translation ...