Search in the Catalogues and Directories

Hits 21 – 34 of 34

21
Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ...
BASE
22
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
BASE
23
Multilingual and Cross-Lingual Intent Detection from Spoken Data ...
BASE
24
Semantic Data Set Construction from Human Clustering and Spatial Arrangement ...
Majewska, Olga; McCarthy, Diana; Van Den Bosch, Jasper JF. - : Apollo - University of Cambridge Repository, 2021
BASE
25
AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ...
BASE
26
Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
BASE
27
Parameter space factorization for zero-shot learning across tasks and languages
In: Transactions of the Association for Computational Linguistics, 9 (2021)
BASE
28
AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ...
BASE
29
Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
BASE
30
LexFit: Lexical Fine-Tuning of Pretrained Language Models ...
BASE
31
Verb Knowledge Injection for Multilingual Event Processing ...
BASE
32
A Closer Look at Few-Shot Crosslingual Transfer: The Choice of Shots Matters ...
Abstract: Few-shot crosslingual transfer has been shown to outperform its zero-shot counterpart with pretrained encoders like multilingual BERT. Despite its growing popularity, little to no attention has been paid to standardizing and analyzing the design of few-shot experiments. In this work, we highlight a fundamental risk posed by this shortcoming, illustrating that the model exhibits a high degree of sensitivity to the selection of few shots. We conduct a large-scale experimental study on 40 sets of sampled few shots for six diverse NLP tasks across up to 40 languages. We provide an analysis of success and failure cases of few-shot transfer, which highlights the role of lexical features. Additionally, we show that a straightforward full model finetuning approach is quite effective for few-shot transfer, outperforming several state-of-the-art few-shot approaches. As a step towards standardizing few-shot crosslingual experimental designs, we make ...
Read paper: https://www.aclanthology.org/2021.acl-long.447
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25886-a-closer-look-at-few-shot-crosslingual-transfer-the-choice-of-shots-matters
https://dx.doi.org/10.48448/m8s0-3a39
BASE
33
Is supervised syntactic parsing beneficial for language understanding tasks? An empirical investigation
Glavaš, Goran; Vulić, Ivan. - : Association for Computational Linguistics, 2021
BASE
34
Evaluating multilingual text encoders for unsupervised cross-lingual retrieval
BASE

Sources: Catalogues 3; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 31