
Search in the Catalogues and Directories

Hits 1 – 16 of 16

1
Delving Deeper into Cross-lingual Visual Question Answering ...
BASE
2
IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages ...
BASE
3
Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation ...
BASE
4
xGQA: Cross-Lingual Visual Question Answering ...
BASE
5
Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation ...
BASE
6
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts ...
BASE
7
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models ...
BASE
8
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer ...
BASE
9
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models ...
Abstract: In this work, we provide a systematic and comprehensive empirical comparison of pretrained multilingual language models versus their monolingual counterparts with regard to their monolingual task performance. We study a set of nine typologically diverse languages with readily available pretrained monolingual models on a set of five diverse monolingual downstream tasks. We first aim to establish, via fair and controlled comparisons, if a gap between the multilingual and the corresponding monolingual representation of that language exists, and subsequently investigate the reason for any performance difference. To disentangle conflating factors, we train new monolingual models on the same data, with monolingually and multilingually trained tokenizers. We find that while the pretraining data size is an important factor, a designated monolingual tokenizer plays an equally important role in the downstream performance. Our results show that languages that are adequately represented in the multilingual model's ... (ACL 2021)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2012.15613
https://dx.doi.org/10.48550/arxiv.2012.15613
BASE
10
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts ...
BASE
11
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer ...
Pfeiffer, Jonas; Vulić, Ivan; Gurevych, Iryna. - : Apollo - University of Cambridge Repository, 2020
BASE
12
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Vulić, Ivan; Pfeiffer, Jonas; Ruder, Sebastian. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
BASE
13
AdapterHub: A Framework for Adapting Transformers
Pfeiffer, Jonas; Rückle, Andreas; Poth, Clifton. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing: System Demonstrations (EMNLP 2020), 2020
BASE
14
Specialising Distributional Vectors of All Words for Lexical Entailment ...
Kamath, Aishwarya; Pfeiffer, Jonas; Ponti, Edoardo. - : Apollo - University of Cambridge Repository, 2019
BASE
15
Specializing distributional vectors of all words for lexical entailment
Ponti, Edoardo Maria; Kamath, Aishwarya; Pfeiffer, Jonas. - : Association for Computational Linguistics, 2019
BASE
16
A neural autoencoder approach for document ranking and query refinement in pharmacogenomic information retrieval
Broscheit, Samuel; Pfeiffer, Jonas; Gemulla, Rainer. - : Association for Computational Linguistics, 2018
BASE

Hits by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 16
© 2013 - 2024 Lin|gu|is|tik