
Search in the Catalogues and Directories

Hits 1 – 20 of 29

1
Welcome to the Modern World of Pronouns: Identity-Inclusive Natural Language Processing beyond Gender ...
BASE
2
Specializing unsupervised pretraining models for word-level semantic similarity
Lauscher, Anne [Author]; Vulic, Ivan [Author]; Ponti, Edoardo Maria [Author]. - Mannheim : Universitätsbibliothek Mannheim, 2021
DNB Subject Category Language
3
MultiCite: Modeling realistic citations requires moving beyond the single-sentence single-label setting ...
BASE
4
RedditBias: A Real-World Resource for Bias Evaluation and Debiasing of Conversational Language Models ...
BASE
5
Language representations for computational argumentation
Lauscher, Anne. - 2021
BASE
6
AraWEAT: Multidimensional Analysis of Biases in Arabic Word Embeddings ...
BASE
7
Rhetoric, Logic, and Dialectic: Advancing Theory-based Argument Quality Assessment in Natural Language Processing ...
BASE
8
Rhetoric, Logic, and Dialectic: Advancing Theory-based Argument Quality Assessment in Natural Language Processing ...
BASE
9
Creating a Domain-diverse Corpus for Theory-based Argument Quality Assessment ...
BASE
10
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers ...
Lauscher, Anne; Ravishankar, Vinit; Vulic, Ivan. - : Apollo - University of Cambridge Repository, 2020
BASE
11
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers ...
Abstract: Massively multilingual transformers pretrained with language modeling objectives (e.g., mBERT, XLM-R) have become a de facto default transfer paradigm for zero-shot cross-lingual transfer in NLP, offering unmatched transfer performance. Current downstream evaluations, however, verify their efficacy predominantly in transfer settings involving languages with sufficient amounts of pretraining data, and with lexically and typologically close languages. In this work, we analyze their limitations and show that cross-lingual transfer via massively multilingual transformers, much like transfer via cross-lingual word embeddings, is substantially less effective in resource-lean scenarios and for distant languages. Our experiments, encompassing three lower-level tasks (POS tagging, dependency parsing, NER), as well as two high-level semantic tasks (NLI, QA), empirically correlate transfer performance with linguistic similarity between the source and target languages, but also with the size of pretraining corpora of ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2005.00633
https://dx.doi.org/10.48550/arxiv.2005.00633
BASE
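The abstract above describes the zero-shot cross-lingual transfer recipe the paper analyzes: fine-tune a massively multilingual transformer on labeled source-language data only, then apply it unchanged to the target language. Below is a minimal sketch of that setup, assuming the HuggingFace transformers library and mBERT (bert-base-multilingual-cased, one of the models the abstract names); the toy sentences, labels, and single optimizer step are hypothetical stand-ins for a real training loop, not the paper's data or code.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# mBERT, one of the multilingual transformers named in the abstract.
model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Step 1: fine-tune on labeled SOURCE-language (here English) data only.
# Two toy sentiment examples and one optimizer step stand in for real training.
src_texts = ["The movie was great.", "The movie was terrible."]
src_labels = torch.tensor([1, 0])
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
batch = tokenizer(src_texts, padding=True, truncation=True, return_tensors="pt")
loss = model(**batch, labels=src_labels).loss
loss.backward()
optimizer.step()

# Step 2: predict directly in the TARGET language, with no target-language
# labels and no further training; transfer rests entirely on the shared
# multilingual representation space learned during pretraining.
model.eval()
tgt_batch = tokenizer(["Der Film war großartig."], return_tensors="pt")
with torch.no_grad():
    prediction = model(**tgt_batch).logits.argmax(dim=-1)
print(prediction.item())
```

The abstract's point is that this recipe, despite strong reported results, degrades substantially when the target language is typologically distant from the source or is underrepresented in the pretraining corpora.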
12
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE
13
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
Lauscher, Anne; Vulic, Ivan; Ponti, Edoardo. - : Apollo - University of Cambridge Repository, 2020
BASE
14
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity
Lauscher, Anne; Vulic, Ivan; Ponti, Edoardo. - : International Committee on Computational Linguistics, 2020. : Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020). : https://www.aclweb.org/anthology/2020.coling-main.118
BASE
15
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers
Ravishankar, Vinit; Glavas, Goran; Lauscher, Anne. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)
BASE
16
Specializing unsupervised pretraining models for word-level semantic similarity
Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. - : Association for Computational Linguistics, ACL, 2020
BASE
17
AraWEAT: Multidimensional analysis of biases in Arabic word embeddings
Lauscher, Anne; Takieddin, Rafik; Ponzetto, Simone Paolo. - : Association for Computational Linguistics, 2020
BASE
18
Common sense or world knowledge? Investigating adapter-based knowledge injection into pretrained transformers
Lauscher, Anne; Majewska, Olga; Ribeiro, Leonardo F. R.. - : Association for Computational Linguistics, 2020
BASE
19
From zero to hero: On the limitations of zero-shot language transfer with multilingual transformers
Ravishankar, Vinit; Glavaš, Goran; Lauscher, Anne. - : Association for Computational Linguistics, 2020
BASE
20
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE


Hits by source:
Catalogues: 1
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 28