Catalogue search
Hits 1 – 3 of 3
1
X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering ...
M'hamdi, Meryem; Kim, Doo Soon; Dernoncourt, Franck; Bui, Trung; Ren, Xiang; May, Jonathan. - arXiv, 2021
Abstract:
Multilingual models, such as M-BERT and XLM-R, have gained increasing popularity due to their zero-shot cross-lingual transfer learning capabilities. However, their generalization ability is still inconsistent for typologically diverse languages and across different benchmarks. Recently, meta-learning has garnered attention as a promising technique for enhancing transfer learning under low-resource scenarios, particularly for cross-lingual transfer in Natural Language Understanding (NLU). In this work, we propose X-METRA-ADA, a cross-lingual MEta-TRAnsfer learning ADAptation approach for NLU. Our approach adapts MAML, an optimization-based meta-learning approach, to learn to adapt to new languages. We extensively evaluate our framework on two challenging cross-lingual NLU tasks: multilingual task-oriented dialog and typologically diverse question answering. We show that our approach outperforms naive fine-tuning, reaching competitive performance on both tasks for most languages. Our analysis reveals that ... : Accepted at NAACL 2021 ...
Keyword: Computation and Language (cs.CL); FOS: Computer and information sciences
URL:
https://dx.doi.org/10.48550/arxiv.2104.09696
https://arxiv.org/abs/2104.09696
BASE
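The abstract above says X-METRA-ADA adapts MAML, an optimization-based meta-learning algorithm, so that a model "learns to adapt" to new languages. The characteristic inner-adapt / outer-meta-update structure can be sketched as follows. This is a minimal first-order (FOMAML-style) toy on 1-D linear regression, not the paper's actual model, tasks, or hyperparameters; all names and values here are illustrative assumptions.

```python
# Hedged sketch: first-order MAML on toy 1-D linear regression.
# Each synthetic "task" (standing in for a target language) is y = a * x
# with a different slope a; the meta-update seeks an initialisation w
# from which one inner gradient step adapts well to any task.
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, x, y):
    """Squared-error loss and its gradient for the model y_hat = w * x."""
    err = w * x - y
    return np.mean(err ** 2), np.mean(2 * err * x)

def sample_task():
    """Return (support, query) splits of a random task y = a * x."""
    a = rng.uniform(0.5, 2.0)
    x_sup, x_qry = rng.normal(size=10), rng.normal(size=10)
    return (x_sup, a * x_sup), (x_qry, a * x_qry)

w = 0.0                      # meta-initialisation (a single scalar weight)
alpha, beta = 0.1, 0.05      # inner / outer learning rates (illustrative)
for step in range(500):
    meta_grad = 0.0
    for _ in range(4):                       # batch of tasks per meta-step
        (xs, ys), (xq, yq) = sample_task()
        _, g_sup = loss_and_grad(w, xs, ys)
        w_adapted = w - alpha * g_sup        # inner loop: adapt on support set
        _, g_qry = loss_and_grad(w_adapted, xq, yq)
        meta_grad += g_qry                   # first-order approximation of the
                                             # meta-gradient (no second derivatives)
    w -= beta * meta_grad / 4                # outer loop: update the initialisation
```

With slopes drawn from [0.5, 2.0], the meta-initialisation drifts toward the middle of the task distribution, so a single inner step from `w` fits any sampled task reasonably well; the paper's contribution is applying this kind of adaptation across languages rather than toy regression tasks.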
2
Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation ...
The 2021 Conference on Empirical Methods in Natural Language Processing (2021); Gheini, Mozhdeh; May, Jonathan. - Underline Science Inc., 2021
BASE
3
X-METRA-ADA: Cross-lingual Meta-Transfer learning Adaptation to Natural Language Understanding and Question Answering ...
NAACL 2021 (2021); Bui, Trung; Dernoncourt, Franck. - Underline Science Inc., 2021
BASE
© 2013 - 2024 Lin|gu|is|tik