Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
Do Context-Aware Translation Models Pay the Right Attention? ...
BASE
2
When is Wall a Pared and when a Muro? -- Extracting Rules Governing Lexical Selection ...
BASE
3
When is Wall a Pared and when a Muro?: Extracting Rules Governing Lexical Selection ...
BASE
4
Do Context-Aware Translation Models Pay the Right Attention? ...
Read paper: https://www.aclanthology.org/2021.acl-long.65
Abstract: Context-aware machine translation models are designed to leverage contextual information, but often fail to do so. As a result, they inaccurately disambiguate pronouns and polysemous words that require context for resolution. In this paper, we ask several questions: What contexts do human translators use to resolve ambiguous words? Are models paying large amounts of attention to the same context? What if we explicitly train them to do so? To answer these questions, we introduce SCAT (Supporting Context for Ambiguous Translations), a new English-French dataset comprising supporting context words for 14K translations that professional translators found useful for pronoun disambiguation. Using SCAT, we perform an in-depth analysis of the context used to disambiguate, examining positional and lexical characteristics of the supporting words. Furthermore, we measure the degree of alignment between the model's attention scores and the supporting ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25426-do-context-aware-translation-models-pay-the-right-attentionquestion
https://dx.doi.org/10.48448/20nd-s715
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 4
© 2013 - 2024 Lin|gu|is|tik