
Search in the Catalogues and Directories

Hits 1 – 5 of 5

1. MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning (BASE)
2. Non-Parametric Few-Shot Learning for Word Sense Disambiguation (BASE)
3. A Summary of the First Workshop on Language Technology for Language Documentation and Revitalization (BASE)
4. Generalized Data Augmentation for Low-Resource Translation (BASE)
5. Domain Adaptation of Neural Machine Translation by Lexicon Induction (BASE)
Abstract: It has been previously noted that neural machine translation (NMT) is very sensitive to domain shift. In this paper, we argue that this is a dual effect of the highly lexicalized nature of NMT, resulting in failure for sentences with large numbers of unknown words, and lack of supervision for domain-specific words. To remedy this problem, we propose an unsupervised adaptation method which fine-tunes a pre-trained out-of-domain NMT model using a pseudo-in-domain corpus. Specifically, we perform lexicon induction to extract an in-domain lexicon, and construct a pseudo-parallel in-domain corpus by performing word-for-word back-translation of monolingual in-domain target sentences. In five domains over twenty pairwise adaptation settings and two model architectures, our method achieves consistent improvements without using any in-domain parallel sentences, improving up to 14 BLEU over unadapted models, and up to 2 BLEU over strong back-translation baselines.
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1906.00376
DOI: https://dx.doi.org/10.48550/arxiv.1906.00376
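
The abstract above describes building a pseudo-parallel in-domain corpus by translating monolingual target sentences word for word through an induced lexicon, then fine-tuning an out-of-domain NMT model on the result. The following is a minimal Python sketch of that corpus-construction step only; the dict-based lexicon, the fallback of copying unknown tokens through unchanged, and all function names are assumptions for illustration, since the paper itself induces the lexicon (e.g., from cross-lingual signals) rather than assuming one.

# Sketch of word-for-word back-translation with an induced target->source lexicon.
# Names and the copy-through fallback are illustrative assumptions, not the paper's exact method.

def word_for_word_back_translate(target_sentence, tgt2src_lexicon):
    """Map each target-language token to a source-language token via the lexicon.

    Tokens missing from the lexicon are copied through unchanged (an assumed fallback).
    """
    tokens = target_sentence.split()
    return " ".join(tgt2src_lexicon.get(tok, tok) for tok in tokens)


def build_pseudo_parallel_corpus(monolingual_target_sentences, tgt2src_lexicon):
    """Pair each in-domain target sentence with its word-for-word back-translation.

    The resulting (pseudo-source, target) pairs can then be used to fine-tune a
    pre-trained out-of-domain NMT model, as the abstract describes.
    """
    return [
        (word_for_word_back_translate(sent, tgt2src_lexicon), sent)
        for sent in monolingual_target_sentences
    ]


if __name__ == "__main__":
    # Toy German->English example with a tiny hand-written lexicon.
    lexicon = {"das": "the", "Medikament": "drug", "wirkt": "works"}
    corpus = build_pseudo_parallel_corpus(["das Medikament wirkt"], lexicon)
    print(corpus)  # [('the drug works', 'das Medikament wirkt')]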
