
Search in the Catalogues and Directories

Hits 1 – 3 of 3

1
Beyond Noise: Mitigating the Impact of Fine-grained Semantic Divergences on Neural Machine Translation
Read paper: https://www.aclanthology.org/2021.acl-long.562
Abstract: While it has been shown that Neural Machine Translation (NMT) is highly sensitive to noisy parallel training samples, prior work treats all types of mismatches between source and target as noise. As a result, it remains unclear how samples that are mostly equivalent but contain a small number of semantically divergent tokens impact NMT training. To close this gap, we analyze the impact of different types of fine-grained semantic divergences on Transformer models. We show that models trained on synthetic divergences output degenerated text more frequently and are less confident in their predictions. Based on these findings, we introduce a divergent-aware NMT framework that uses factors to help NMT recover from the degradation caused by naturally occurring divergences, improving both translation quality and model calibration on EN-FR tasks.
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25802-beyond-noise-mitigating-the-impact-of-fine-grained-semantic-divergences-on-neural-machine-translation
https://dx.doi.org/10.48448/ac9b-d890
BASE
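
The factor mechanism mentioned in the abstract can be pictured, very roughly, as attaching a token-level divergence tag to each source token. The sketch below is an illustrative assumption, not the authors' implementation: it sums a learned embedding for a hypothetical binary equivalent/divergent factor with the ordinary token embedding before a standard Transformer encoder; the names (FactoredEmbedding, num_factors) and the two-tag factor vocabulary are made up for illustration.

```python
import torch
import torch.nn as nn

class FactoredEmbedding(nn.Module):
    """Token embeddings augmented with a per-token divergence factor.

    Illustrative sketch only: factor 0 = equivalent, factor 1 = divergent.
    The factor embedding is summed with the token embedding before the
    result is fed to a standard Transformer encoder.
    """
    def __init__(self, vocab_size, d_model, num_factors=2):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.fac = nn.Embedding(num_factors, d_model)

    def forward(self, token_ids, factor_ids):
        return self.tok(token_ids) + self.fac(factor_ids)

# Toy usage: a 5-token source sentence where tokens 3 and 4 are flagged divergent.
emb = FactoredEmbedding(vocab_size=1000, d_model=64)
tokens = torch.tensor([[12, 47, 300, 301, 9]])
factors = torch.tensor([[0, 0, 1, 1, 0]])

encoder_layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
out = encoder(emb(tokens, factors))
print(out.shape)  # torch.Size([1, 5, 64])
```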
2
EDITOR: an Edit-Based Transformer with Repositioning for Neural Machine Translation with Soft Lexical Constraints
BASE
3
A Non-Autoregressive Edit-Based Approach to Controllable Text Simplification
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 3