
Search in the Catalogues and Directories

Hits 1 – 3 of 3

1
Multilingual Translation via Grafting Pre-trained Language Models ...
Sun, Zewei; Wang, Mingxuan; Li, Lei. arXiv, 2021
BASE
2
Multilingual Translation via Grafting Pre-trained Language Models ...
Abstract: Can a pre-trained BERT for one language and a GPT for another be glued together to translate texts? Self-supervised training using only monolingual data has led to the success of pre-trained (masked) language models in many NLP tasks. However, directly connecting BERT as an encoder and GPT as a decoder can be challenging in machine translation, because GPT-like models lack the cross-attention component that seq2seq decoders need. In this paper, we propose Graformer to graft separately pre-trained (masked) language models for machine translation. With monolingual data for pre-training and parallel data for grafting training, we make maximal use of both types of data. Experiments on 60 directions show that our method achieves average improvements of 5.8 BLEU in x2en and 2.9 BLEU in en2x directions compared with a multilingual Transformer of the same size. ...
(A minimal code sketch of this grafting idea follows the hit list below.)
URL: https://dx.doi.org/10.48448/s1t4-ez34
https://underline.io/lecture/38416-multilingual-translation-via-grafting-pre-trained-language-models
BASE
3
Rethinking Document-level Neural Machine Translation ...
Sun, Zewei; Wang, Mingxuan; Zhou, Hao. arXiv, 2020
BASE
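
The abstract in hit 2 describes an architecture: a pre-trained encoder (BERT-like) and a pre-trained decoder (GPT-like) are joined by newly initialised cross-attention layers that are trained on parallel data, supplying the component GPT lacks. The PyTorch sketch below is a minimal illustration of that grafting idea, not the authors' Graformer implementation; all class names and sizes are assumptions, freezing the pre-trained parts is an assumption, and the stand-in blocks omit causal masking, tokenisation, and the output projection.

    import torch
    import torch.nn as nn

    class GraftCrossAttention(nn.Module):
        # Newly initialised block giving a GPT-like layer the cross-attention
        # it lacks: decoder states attend to the encoder's outputs.
        def __init__(self, d_model: int, n_heads: int):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.norm = nn.LayerNorm(d_model)

        def forward(self, dec_states, enc_states):
            out, _ = self.attn(dec_states, enc_states, enc_states)
            return self.norm(dec_states + out)  # residual + layer norm

    class GraftedTranslator(nn.Module):
        # encoder / decoder_blocks stand in for pre-trained BERT/GPT stacks;
        # freezing them and training only the grafts is an assumption here.
        def __init__(self, encoder, decoder_blocks, d_model=768, n_heads=12):
            super().__init__()
            self.encoder = encoder
            self.decoder_blocks = decoder_blocks
            self.grafts = nn.ModuleList(
                GraftCrossAttention(d_model, n_heads) for _ in decoder_blocks)
            for p in self.encoder.parameters():
                p.requires_grad = False
            for p in self.decoder_blocks.parameters():
                p.requires_grad = False

        def forward(self, src_embeds, tgt_embeds):
            enc_states = self.encoder(src_embeds)
            h = tgt_embeds
            for block, graft in zip(self.decoder_blocks, self.grafts):
                h = block(h)              # frozen "GPT" block (causal mask omitted)
                h = graft(h, enc_states)  # trainable cross-attention into encoder
            return h

    # Toy usage with stand-in layers (a real run would load BERT/GPT weights):
    enc = nn.TransformerEncoder(
        nn.TransformerEncoderLayer(768, 12, batch_first=True), num_layers=2)
    dec = nn.ModuleList(
        nn.TransformerEncoderLayer(768, 12, batch_first=True) for _ in range(2))
    model = GraftedTranslator(enc, dec)
    out = model(torch.randn(2, 7, 768), torch.randn(2, 5, 768))
    print(out.shape)  # torch.Size([2, 5, 768])

Only the graft layers receive gradients in this sketch, which mirrors the abstract's split between monolingual pre-training and parallel-data grafting: the pre-trained stacks carry the language knowledge, and the small trainable bridge learns the translation mapping.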

Facet counts: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 3