
Search in the Catalogues and Directories

Hits 1 – 20 of 28

1. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
2. Emergent Communication Pretraining for Few-Shot Machine Translation ...
3. Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer ...
4. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer ...
5. How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models ...
6. UNKs Everywhere: Adapting Multilingual Language Models to New Scripts ...
7. SemEval-2020 Task 3: Graded Word Similarity in Context ...
8. Emergent Communication Pretraining for Few-Shot Machine Translation ...
Abstract: While state-of-the-art models that rely upon massively multilingual pretrained encoders achieve sample efficiency in downstream applications, they still require abundant amounts of unlabelled text. Nevertheless, most of the world's languages lack such resources. Hence, we investigate a more radical form of unsupervised knowledge transfer in the absence of linguistic data. In particular, for the first time we pretrain neural networks via emergent communication from referential games. Our key assumption is that grounding communication on images (as a crude approximation of real-world environments) inductively biases the model towards learning natural languages. On the one hand, we show that this substantially benefits machine translation in few-shot settings. On the other hand, this also provides an extrinsic evaluation protocol to probe the properties of emergent languages ex vitro. Intuitively, the closer they are to natural languages, the higher the gains from pretraining on them should be. For ...
Keywords: Computer and Information Science; Natural Language Processing; Neural Network
URL: https://underline.io/lecture/6290-emergent-communication-pretraining-for-few-shot-machine-translation
DOI: https://dx.doi.org/10.48448/13e8-dh54
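
The abstract above describes pretraining through a referential game grounded on images: a sender encodes a target image into a discrete message, and a receiver must identify that image among distractors. The following is a minimal PyTorch sketch of such a game; the module names, dimensions, random "image features", and the straight-through Gumbel-softmax training choice are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a referential game for emergent-communication pretraining.
# All hyperparameters and design choices below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM, HIDDEN, VOCAB, MSG_LEN, N_CANDIDATES = 512, 256, 32, 5, 4

class Sender(nn.Module):
    """Maps target-image features to a sequence of discrete symbols."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(FEAT_DIM, HIDDEN)
        self.rnn = nn.GRUCell(VOCAB, HIDDEN)
        self.to_vocab = nn.Linear(HIDDEN, VOCAB)

    def forward(self, target_feats):                       # (B, FEAT_DIM)
        h = torch.tanh(self.proj(target_feats))
        sym = torch.zeros(target_feats.size(0), VOCAB, device=target_feats.device)
        message = []
        for _ in range(MSG_LEN):
            h = self.rnn(sym, h)
            logits = self.to_vocab(h)
            # Straight-through Gumbel-softmax keeps symbols discrete but differentiable.
            sym = F.gumbel_softmax(logits, tau=1.0, hard=True)
            message.append(sym)
        return torch.stack(message, dim=1)                  # (B, MSG_LEN, VOCAB)

class Receiver(nn.Module):
    """Scores candidate images against the received message."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(VOCAB, HIDDEN, batch_first=True)
        self.img_proj = nn.Linear(FEAT_DIM, HIDDEN)

    def forward(self, message, candidate_feats):            # (B, L, V), (B, C, FEAT_DIM)
        _, h = self.rnn(message)                             # h: (1, B, HIDDEN)
        msg_repr = h.squeeze(0)                              # (B, HIDDEN)
        img_repr = self.img_proj(candidate_feats)            # (B, C, HIDDEN)
        return torch.einsum("bh,bch->bc", msg_repr, img_repr)  # similarity scores

def game_step(sender, receiver, optimizer, feats):
    """One training step: the receiver must pick the target among distractors."""
    B = feats.size(0)
    candidates = feats[torch.randint(0, B, (B, N_CANDIDATES))]   # random distractors
    target_idx = torch.randint(0, N_CANDIDATES, (B,))
    candidates[torch.arange(B), target_idx] = feats              # plant the target
    scores = receiver(sender(feats), candidates)
    loss = F.cross_entropy(scores, target_idx)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    sender, receiver = Sender(), Receiver()
    opt = torch.optim.Adam(list(sender.parameters()) + list(receiver.parameters()), lr=1e-3)
    feats = torch.randn(64, FEAT_DIM)        # stand-in for precomputed image features
    for step in range(3):
        print(f"step {step}: loss {game_step(sender, receiver, opt, feats):.3f}")
```

In the paper's framing, the sender and receiver weights trained this way would then be reused to initialise a translation model for few-shot fine-tuning; that transfer step is not shown here.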
9. Manual Clustering and Spatial Arrangement of Verbs for Multilingual Evaluation and Typology Analysis ...
10. A Closer Look at Few-Shot Crosslingual Transfer: The Choice of Shots Matters ...
11. From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers ...
12. Verb Knowledge Injection for Multilingual Event Processing ...
13. Multi-SimLex: A Large-Scale Evaluation of Multilingual and Cross-Lingual Lexical Semantic Similarity ...
14. Probing Pretrained Language Models for Lexical Semantics ...
15. The Secret is in the Spectra: Predicting Cross-lingual Task Performance with Spectral Similarity Measures ...
16. SemEval-2020 Task 2: Predicting Multilingual and Cross-Lingual (Graded) Lexical Entailment ...
17. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
18. Specializing unsupervised pretraining models for word-level semantic similarity
    Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. Association for Computational Linguistics (ACL), 2020
19. Non-linear instance-based cross-lingual mapping for non-isomorphic embedding spaces
    Glavaš, Goran; Vulić, Ivan. Association for Computational Linguistics, 2020
20. Classification-based self-learning for weakly supervised bilingual lexicon induction
    Vulić, Ivan; Korhonen, Anna; Glavaš, Goran. Association for Computational Linguistics, 2020


Sources: all 28 hits are open access documents; no hits in catalogues, bibliographies, Linked Open Data catalogues, or other online resources.