Catalogue search
Hits 1 – 3 of 3
1. Multilingual Translation via Grafting Pre-trained Language Models ...
   Sun, Zewei; Wang, Mingxuan; Li, Lei. arXiv, 2021. (Source: BASE)
2. Multilingual Translation via Grafting Pre-trained Language Models ...
   The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Li, Lei; Sun, Zewei; Wang, Mingxuan. Underline Science Inc., 2021. (Source: BASE)
Abstract:
Can a pre-trained BERT for one language and a GPT for another be glued together to translate texts? Self-supervised training using only monolingual data has led to the success of pre-trained (masked) language models in many NLP tasks. However, directly connecting BERT as an encoder and GPT as a decoder is challenging for machine translation, because GPT-like models lack the cross-attention component that seq2seq decoders need. In this paper, we propose Graformer to graft separately pre-trained (masked) language models for machine translation. With monolingual data for pre-training and parallel data for grafting training, we make maximal use of both types of data. Experiments on 60 directions show that our method achieves average improvements of 5.8 BLEU in x2en and 2.9 BLEU in en2x directions compared with a multilingual Transformer of the same size. ...
URL:
https://dx.doi.org/10.48448/s1t4-ez34
https://underline.io/lecture/38416-multilingual-translation-via-grafting-pre-trained-language-models
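The abstract above describes grafting a frozen BERT-style encoder onto a frozen GPT-style decoder by inserting a new cross-attention connection, which is the one component GPT-like decoders lack. A toy numpy sketch of that idea follows; all names, shapes, and the random stand-ins for the pre-trained models are illustrative assumptions, not the paper's actual implementation:

```python
# Toy sketch of "grafting": two frozen pre-trained models are connected
# by a newly added, trainable cross-attention layer.
import numpy as np

d_model = 8  # toy hidden size shared by both pre-trained models

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def frozen_encoder(src_len):
    """Stand-in for a pre-trained BERT-style encoder: emits hidden states."""
    rng = np.random.default_rng(0)
    return rng.standard_normal((src_len, d_model))

def frozen_decoder_self(tgt_len):
    """Stand-in for a pre-trained GPT-style decoder after self-attention.
    GPT-like models have no cross-attention, hence the need for grafting."""
    rng = np.random.default_rng(1)
    return rng.standard_normal((tgt_len, d_model))

def graft_cross_attention(dec_states, enc_states, Wq, Wk, Wv):
    """The newly added (trainable) cross-attention connecting the two frozen
    models: decoder states attend over encoder states."""
    q, k, v = dec_states @ Wq, enc_states @ Wk, enc_states @ Wv
    scores = softmax(q @ k.T / np.sqrt(d_model))
    return dec_states + scores @ v  # residual connection

# Only these grafting parameters would be trained on parallel data.
rng = np.random.default_rng(2)
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))

enc = frozen_encoder(src_len=5)        # source sentence, 5 tokens
dec = frozen_decoder_self(tgt_len=3)   # target prefix, 3 tokens
out = graft_cross_attention(dec, enc, Wq, Wk, Wv)
print(out.shape)  # (3, 8): one fused state per target position
```

In this sketch, monolingual pre-training corresponds to the two frozen stand-ins, and "grafting training" corresponds to fitting only the cross-attention weights on parallel data, which is the division of labour between the two data types that the abstract describes.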
3. Rethinking Document-level Neural Machine Translation ...
   Sun, Zewei; Wang, Mingxuan; Zhou, Hao. arXiv, 2020. (Source: BASE)
© 2013 - 2024 Lin|gu|is|tik