Search in the Catalogues and Directories

Hits 1 – 20 of 41

1. Delving Deeper into Cross-lingual Visual Question Answering
2. Cross-Lingual Dialogue Dataset Creation via Outline-Based Generation
3. Improving Word Translation via Two-Stage Contrastive Learning
4. Towards Zero-shot Language Modeling
5. Multilingual and Cross-Lingual Intent Detection from Spoken Data
6. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems
7. Modelling Latent Translations for Cross-Lingual Transfer
8. Prix-LM: Pretraining for Multilingual Knowledge Base Construction
9. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking
10. xGQA: Cross-Lingual Visual Question Answering
11. On Cross-Lingual Retrieval with Multilingual Text Encoders
12. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
13. Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval
14. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples
15. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders
16. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
17. Emergent Communication Pretraining for Few-Shot Machine Translation
Abstract: While state-of-the-art models that rely upon massively multilingual pretrained encoders achieve sample efficiency in downstream applications, they still require abundant amounts of unlabelled text. Nevertheless, most of the world's languages lack such resources. Hence, we investigate a more radical form of unsupervised knowledge transfer in the absence of linguistic data. In particular, for the first time we pretrain neural networks via emergent communication from referential games. Our key assumption is that grounding communication on images, as a crude approximation of real-world environments, inductively biases the model towards learning natural languages. On the one hand, we show that this substantially benefits machine translation in few-shot settings. On the other hand, this also provides an extrinsic evaluation protocol to probe the properties of emergent languages ex vitro. Intuitively, the closer they are to natural languages, the higher the gains from pretraining on them should be. For instance, ...
Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.2011.00890
https://arxiv.org/abs/2011.00890
18. Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
19. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
20. How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models


Open access documents: 41
© 2013 - 2024 Lin|gu|is|tik