1 | Improving Word Translation via Two-Stage Contrastive Learning ...
2 | Plan-then-Generate: Controlled Data-to-Text Generation via Planning ...
3 | Prix-LM: Pretraining for Multilingual Knowledge Base Construction ...
4 | Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ...

Abstract: Injecting external domain-specific knowledge (e.g., UMLS) into pretrained language models (LMs) advances their capability to handle specialised in-domain tasks such as biomedical entity linking (BEL). However, such abundant expert knowledge is available only for a handful of languages (e.g., English). In this work, by proposing a novel cross-lingual biomedical entity linking task (XL-BEL) and establishing a new XL-BEL benchmark spanning 10 typologically diverse languages, we first investigate the ability of standard knowledge-agnostic as well as knowledge-enhanced monolingual and multilingual LMs beyond the standard monolingual English BEL task. The scores indicate large gaps to English performance. We then address the challenge of transferring domain-specific knowledge in resource-rich languages to resource-poor ones. To this end, we propose and evaluate a series of cross-lingual transfer methods for the XL-BEL task, and demonstrate that general-domain bitext helps propagate the available English knowledge ... : ACL-IJCNLP 2021 ...

Keywords: Artificial Intelligence cs.AI; Computation and Language cs.CL; FOS Computer and information sciences; Machine Learning cs.LG

URL: https://dx.doi.org/10.48550/arxiv.2105.14398 https://arxiv.org/abs/2105.14398

5 | MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
7 | Visually Grounded Reasoning across Languages and Cultures ...
8 | Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
11 | Self-Alignment Pretraining for Biomedical Entity Representations
Liu, Fangyu; Shareghi, Ehsan; Meng, Zaiqiao. - : Association for Computational Linguistics, 2021. : Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021

12 | Large-scale exploration of neural relation classification architectures ...
13 | Will-They-Won't-They: A Very Large Dataset for Stance Detection on Twitter ...
16 | STANDER: An expert-annotated dataset for news stance detection and evidence retrieval
Conforti, C; Berndt, J; Pilehvar, MT. - : Association for Computational Linguistics, 2020. : Findings of the Association for Computational Linguistics Findings of ACL: EMNLP 2020, 2020

17 | Large-scale exploration of neural relation classification architectures
Le, HQ; Can, DC; Vu, ST. - : Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, EMNLP 2018, 2020