
Search in the Catalogues and Directories

Hits 1 – 10 of 10

1
IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages ...
2
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
Liu, Qianchu; Liu, Fangyu; Collier, Nigel. - : Apollo - University of Cambridge Repository, 2021
3
Visually Grounded Reasoning across Languages and Cultures ...
4
Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ...
Read paper: https://www.aclanthology.org/2021.acl-short.72
Abstract: Injecting external domain-specific knowledge (e.g., UMLS) into pretrained language models (LMs) advances their capability to handle specialised in-domain tasks such as biomedical entity linking (BEL). However, such abundant expert knowledge is available only for a handful of languages (e.g., English). In this work, by proposing a novel cross-lingual biomedical entity linking task (XL-BEL) and establishing a new XL-BEL benchmark spanning 10 typologically diverse languages, we first investigate the ability of standard knowledge-agnostic as well as knowledge-enhanced monolingual and multilingual LMs beyond the standard monolingual English BEL task. The scores indicate large gaps to English performance. We then address the challenge of transferring domain-specific knowledge in resource-rich languages to resource-poor ones. To this end, we propose and evaluate a series of cross-lingual transfer methods for the XL-BEL task, and demonstrate that ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25632-learning-domain-specialised-representations-for-cross-lingual-biomedical-entity-linking
https://dx.doi.org/10.48448/v8cn-0854
5
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
6
Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
Liu, Fangyu; Vulić, Ivan; Korhonen, Anna-Leena. - : Apollo - University of Cambridge Repository, 2021
7
Visually Grounded Reasoning across Languages and Cultures ...
8
Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
9
Self-Alignment Pretraining for Biomedical Entity Representations
Liu, Fangyu; Shareghi, Ehsan; Meng, Zaiqiao. - : Association for Computational Linguistics, 2021. : Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021
10
Upgrading the Newsroom: An Automated Image Selection System for News Articles
In: http://infoscience.epfl.ch/record/280322 (2020)

Facets
Open access documents: 10
Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0
© 2013 – 2024 Linguistik