
Search in the Catalogues and Directories

Hits 1 – 1 of 1

1
Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation ...
NAACL 2021; Hao, Yongchang. - Underline Science Inc., 2021
Abstract: Read the paper at the following link: https://www.aclweb.org/anthology/2021.naacl-main.313/ Abstract: Non-autoregressive machine translation (NAT) models have demonstrated significant inference speedup but suffer from inferior translation accuracy. The common practice to tackle the problem is transferring autoregressive machine translation (AT) knowledge to NAT models, e.g., with knowledge distillation. In this work, we hypothesize and empirically verify that AT and NAT encoders capture different linguistic properties and representations of source sentences. Therefore, we propose to adopt multi-task learning to transfer the AT knowledge to NAT models through encoder sharing. Specifically, we take the AT model as an auxiliary task to enhance NAT model performance. Experimental results on WMT14 English->German and WMT16 English->Romanian datasets show that the proposed multi-task NAT achieves significant improvements over the baseline NAT models. In addition, experimental results demonstrate ...
Keywords: Intelligent System; Machine Intelligence; Natural Language Processing
URL: https://dx.doi.org/10.48448/4rrc-2g63
https://underline.io/lecture/19612-multi-task-learning-with-shared-encoder-for-non-autoregressive-machine-translation
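The abstract describes training a NAT model with the AT model as an auxiliary task, so gradients from both objectives update one shared encoder. A minimal pure-Python sketch of that idea follows; the quadratic losses and the weighting factor `lam` are hypothetical stand-ins for the paper's actual AT and NAT objectives, used only to show how two task losses jointly drive a shared parameter.

```python
# Toy multi-task training with a shared parameter w standing in for the
# shared encoder. The two quadratic losses are assumptions, not the
# paper's real objectives; the structure (joint gradient on shared
# weights) is the point.

def at_loss(w):
    # Hypothetical auxiliary AT-task loss, minimised at w = 2.
    return (w - 2.0) ** 2

def nat_loss(w):
    # Hypothetical main NAT-task loss, minimised at w = 4.
    return (w - 4.0) ** 2

def grad(f, w, eps=1e-6):
    # Central-difference numerical gradient of f at w.
    return (f(w + eps) - f(w - eps)) / (2 * eps)

def train_shared(w=0.0, lam=0.5, lr=0.1, steps=200):
    """Minimise nat_loss(w) + lam * at_loss(w) by gradient descent,
    so both tasks' gradients flow into the same shared parameter."""
    for _ in range(steps):
        g = grad(nat_loss, w) + lam * grad(at_loss, w)
        w -= lr * g
    return w

w_star = train_shared()
# The joint optimum sits between the two single-task minima, pulled
# toward the NAT objective since lam < 1 down-weights the AT task.
```

With `lam = 0.5` the weighted optimum of the two quadratics is (4 + 0.5 * 2) / 1.5 = 10/3, so the shared parameter lands between the single-task solutions rather than at either one, mirroring how a shared encoder is shaped by both tasks at once.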
BASE

Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 1
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings