
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
Exploring Unsupervised Pretraining Objectives for Machine Translation ...
Abstract: Unsupervised cross-lingual pretraining has achieved strong results in neural machine translation (NMT), by drastically reducing the need for large parallel data. Most approaches adapt masked-language modeling (MLM) to sequence-to-sequence architectures, by masking parts of the input and reconstructing them in the decoder. In this work, we systematically compare masking with alternative objectives that produce inputs resembling real (full) sentences, by reordering and replacing words based on their context. We pretrain models with different methods on English-German, English-Nepali and English-Sinhala monolingual data, and evaluate them on NMT, obtaining very unexpected results. In (semi-) supervised NMT, varying the pretraining objective leads to surprisingly small differences in the finetuned performance, whereas unsupervised NMT is much more sensitive to it. To understand these results, we thoroughly study the pretrained models using ...
Paper: https://www.aclanthology.org/2021.findings-acl.261
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/26352-exploring-unsupervised-pretraining-objectives-for-machine-translation
https://dx.doi.org/10.48448/f0pt-cb53
BASE
2
Attention-based Conditioning Methods for External Knowledge Integration ...
BASE
3
NTUA-SLP at SemEval-2018 Task 2: Predicting Emojis using RNNs with Context-aware Attention ...
BASE
4
NTUA-SLP at SemEval-2018 Task 3: Tracking Ironic Tweets using Ensembles of Word and Character Level Attentive RNNs ...
BASE

Results by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 4
© 2013 – 2024 Lin|gu|is|tik