
Search in the Catalogues and Directories

Hits 1 – 20 of 657

1
One model for the learning of language.
In: Proceedings of the National Academy of Sciences of the United States of America, vol 119, iss 5 (2022)
2
Solving analogies on words based on minimal complexity transformation
In: International Joint Conference on Artificial Intelligence (IJCAI-2020), Jan 2021, Kyoto, Japan, pp. 1848–1854; https://hal.telecom-paris.fr/hal-02867163 ; https://www.ijcai20.org/ (2021)
3
NUIG at TIAD 2021: Cross-lingual word embeddings for translation inference
4
Gradual emotion induction with a visual Velten method ...
Out, Charlotte. - : Open Science Framework, 2021
6
Inductive general grammar
In: Glossa: a journal of general linguistics, Vol 6, No 1 (2021), Art. 75; ISSN 2397-1835 (2021)
7
Evaluating Multiway Multilingual NMT in the Turkic Languages ...
8
Findings of the WMT 2021 Shared Task on Quality Estimation ...
9
Pushing the Right Buttons: Adversarial Evaluation of Quality Estimation ...
10
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters ...
11
Robust Open-Vocabulary Translation from Visual Text Representations ...
12
Contrastive Learning for Context-aware Neural Machine Translation Using Coreference Information ...
Abstract: Context-aware neural machine translation (NMT) incorporates contextual information from surrounding texts, which can improve the translation quality of document-level machine translation. Many existing works on context-aware NMT have focused on developing new model architectures to incorporate additional context and have shown promising results. However, most existing works rely on cross-entropy loss, resulting in limited use of contextual information. In this paper, we propose CorefCL, a novel data augmentation and contrastive learning scheme based on coreference between the source and contextual sentences. By corrupting automatically detected coreference mentions in the contextual sentence, CorefCL can train the model to be sensitive to coreference inconsistency. We experimented with our method on common context-aware NMT models and two document-level translation tasks. In the experiments, our method consistently improved BLEU over the compared models on English-German and English-Korean tasks. We also ...
Keyword: Bilingual Lexicon Induction; Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Machine translation; Natural Language Processing
URL: https://dx.doi.org/10.48448/aq29-b345
https://underline.io/lecture/39471-contrastive-learning-for-context-aware-neural-machine-translation-using-coreference-information
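The abstract above describes CorefCL's core idea: corrupt a detected coreference mention in the contextual sentence to create a negative example, then train the model to prefer the original context over the corrupted one. A minimal illustrative sketch of that idea follows; this is not the paper's implementation, and the function names `corrupt_mention`, `contrastive_loss`, and the margin-based formulation are assumptions made here for illustration:

```python
import random

def corrupt_mention(context_tokens, mention_span, replacement_pool):
    """Replace an automatically detected coreference mention in the
    contextual sentence with a random distractor token, yielding a
    coreference-inconsistent (negative) context for contrastive training."""
    start, end = mention_span
    distractor = random.choice(replacement_pool)
    return context_tokens[:start] + [distractor] + context_tokens[end:]

def contrastive_loss(score_pos, score_neg, margin=1.0):
    """Hinge-style contrastive objective (an assumed formulation): the
    model should score the original, coreference-consistent context at
    least `margin` higher than the corrupted one."""
    return max(0.0, margin - (score_pos - score_neg))
```

For example, corrupting the pronoun in a context like "He said hi" while translating a sentence that mentions "the teacher" breaks the coreference chain; the loss is zero once the model's score gap exceeds the margin.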
13
To Ship or Not to Ship: An Extensive Evaluation of Automatic Metrics for Machine Translation ...
14
Identifying the Importance of Content Overlap for Better Cross-lingual Embedding Mappings ...
15
Simultaneous Neural Machine Translation with Constituent Label Prediction ...
16
Just Ask! Evaluating Machine Translation by Asking and Answering Questions ...
17
An Analysis of Euclidean vs. Graph-Based Framing for Bilingual Lexicon Induction from Word Embedding Spaces ...
18
Findings of the WMT Shared Task on Machine Translation Using Terminologies ...
19
Translation Transformers Rediscover Inherent Data Domains ...
20
Phrase-level Active Learning for Neural Machine Translation ...


Hits by source type (facet counts):
Catalogues: 38, 6, 78, 0, 0, 0, 0
Bibliographies: 360, 0, 0, 0, 0, 0, 0, 0, 5
Linked Open Data catalogues: 0
Online resources: 0, 0, 0, 0
Open access documents: 280, 0, 0, 0, 0
© 2013 – 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings