
Search in the Catalogues and Directories

Hits 1 – 20 of 88

1
NMTScore: A Multilingual Analysis of Translation-based Text Similarity Measures ...
Vamvas, Jannis; Sennrich, Rico. - : arXiv, 2022
BASE
2
Improving Zero-shot Cross-lingual Transfer between Closely Related Languages by injecting Character-level Noise ...
Aepli, Noëmi; Sennrich, Rico. - : arXiv, 2021
BASE
3
On Biasing Transformer Attention Towards Monotonicity ...
BASE
4
Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution ...
Emelin, Denis; Sennrich, Rico. - : Association for Computational Linguistics, 2021
BASE
5
ELITR Multilingual Live Subtitling: Demo and Strategy ...
BASE
6
Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation ...
BASE
7
Edinburgh’s End-to-End Multilingual Speech Translation System for IWSLT 2021 ...
Zhang, Biao; Sennrich, Rico. - : ACL Anthology, 2021
BASE
8
On Biasing Transformer Attention Towards Monotonicity ...
Rios, Annette; Amrhein, Chantal; Aepli, Noëmi. - : Association for Computational Linguistics, 2021
BASE
9
Revisiting Negation in Neural Machine Translation ...
Abstract: In this paper, we evaluate the translation of negation both automatically and manually, in English–German (EN–DE) and English–Chinese (EN–ZH). We show that the ability of neural machine translation (NMT) models to translate negation has improved with deeper and more advanced networks, although the performance varies between language pairs and translation directions. The accuracy of manual evaluation in EN–DE, DE–EN, EN–ZH, and ZH–EN is 95.7%, 94.8%, 93.4%, and 91.7%, respectively. In addition, we show that under-translation is the most significant error type in NMT, which contrasts with the more diverse error profile previously observed for statistical machine translation. To better understand the root of the under-translation of negation, we study the model's information flow and training data. While our information flow analysis does not reveal any deficiencies that could be used to detect or fix the under-translation of negation, we find that negation is often rephrased during ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25803-revisiting-negation-in-neural-machine-translation
https://dx.doi.org/10.48448/x27r-fa09
BASE
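The abstract above mentions evaluating the translation of negation automatically as well as manually. As a loose illustration of what flagging the reported under-translation of negation could look like, here is a minimal sketch assuming a naive cue-matching check on German references and MT outputs; it is not the evaluation protocol of the paper, and the cue list, token handling, and pass/fail rule are illustrative assumptions.

```python
# Hedged sketch (not the paper's protocol): a naive check for under-translated
# negation, comparing negation cues in the reference and the MT hypothesis.
# The German cue list and the pass/fail rule are illustrative assumptions.

NEGATION_CUES_DE = {"nicht", "kein", "keine", "keinen", "keinem", "nie", "niemals"}

def has_negation(sentence: str, cues=NEGATION_CUES_DE) -> bool:
    """True if any negation cue occurs among the whitespace-split tokens."""
    return any(token.lower().strip(".,!?") in cues for token in sentence.split())

def negation_preserved(reference: str, hypothesis: str) -> bool:
    """Flag likely under-translation: the reference is negated but the hypothesis is not."""
    if not has_negation(reference):
        return True  # no negation to preserve; sentence is not judged
    return has_negation(hypothesis)

if __name__ == "__main__":
    ref = "Er hat den Vertrag nicht unterschrieben."
    hyp = "Er hat den Vertrag unterschrieben."  # negation dropped by the MT system
    print(negation_preserved(ref, hyp))  # False -> candidate under-translation error
```

Such lexical matching would over-flag cases where negation is legitimately rephrased (for example via antonyms), which is exactly the kind of rephrasing the abstract points to.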
10
Understanding the Properties of Minimum Bayes Risk Decoding in Neural Machine Translation ...
BASE
11
Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation ...
BASE
12
Vision Matters When It Should: Sanity Checking Multimodal Machine Translation Models ...
BASE
13
Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution ...
BASE
14
Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT ...
BASE
15
Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT ...
Voita, Elena; Sennrich, Rico; Titov, Ivan. - : ACL Anthology, 2021
BASE
16
Contrastive Conditioning for Assessing Disambiguation in MT: A Case Study of Distilled Bias ...
BASE
17
Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT ...
BASE
18
On Biasing Transformer Attention Towards Monotonicity ...
NAACL 2021; Aepli, Noëmi; Amrhein, Chantal. - : Underline Science Inc., 2021
BASE
19
Universal rewriting via machine translation
Mallinson, Jonathan. - : The University of Edinburgh, 2021
BASE
20
Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation
In: Zhang, Biao; Bapna, Ankur; Sennrich, Rico; Firat, Orhan (2021). Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation. International Conference on Learning Representations (ICLR), Virtual, 3–7 May 2021.
BASE


Hits by source type: Catalogues 2; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 86.