
Search in the Catalogues and Directories

Hits 1 – 20 of 28

1. Specializing unsupervised pretraining models for word-level semantic similarity
Lauscher, Anne [Author]; Vulić, Ivan [Author]; Ponti, Edoardo Maria [Author]. - Mannheim: Universitätsbibliothek Mannheim, 2021 (DNB)
2. Towards Zero-shot Language Modeling ... (BASE)
3. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ... (BASE)
4. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ... (BASE)
5. Combining Deep Generative Models and Multi-lingual Pretraining for Semi-supervised Document Classification ...
Zhu, Yi; Shareghi, Ehsan; Li, Yingzhen. - arXiv, 2021 (BASE)
6. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ... (BASE)
7. Parameter space factorization for zero-shot learning across tasks and languages ... (BASE)
8. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
Liu, Qianchu; Liu, Fangyu; Collier, Nigel. - Apollo - University of Cambridge Repository, 2021 (BASE)
9. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ... (BASE)
Paper: https://www.aclanthology.org/2021.acl-short.72
Abstract: Injecting external domain-specific knowledge (e.g., UMLS) into pretrained language models (LMs) advances their capability to handle specialised in-domain tasks such as biomedical entity linking (BEL). However, such abundant expert knowledge is available only for a handful of languages (e.g., English). In this work, by proposing a novel cross-lingual biomedical entity linking task (XL-BEL) and establishing a new XL-BEL benchmark spanning 10 typologically diverse languages, we first investigate the ability of standard knowledge-agnostic as well as knowledge-enhanced monolingual and multilingual LMs beyond the standard monolingual English BEL task. The scores indicate large gaps to English performance. We then address the challenge of transferring domain-specific knowledge in resource-rich languages to resource-poor ones. To this end, we propose and evaluate a series of cross-lingual transfer methods for the XL-BEL task, and demonstrate that ...
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25632-learning-domain-specialised-representations-for-cross-lingual-biomedical-entity-linking
https://dx.doi.org/10.48448/v8cn-0854
10. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ... (BASE)
11. Semantic Data Set Construction from Human Clustering and Spatial Arrangement ...
Majewska, Olga; McCarthy, Diana; Van Den Bosch, Jasper JF. - Apollo - University of Cambridge Repository, 2021 (BASE)
12. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
Liu, Fangyu; Vulić, I; Korhonen, Anna-Leena. - Apollo - University of Cambridge Repository, 2021 (BASE)
13. Context vs Target Word: Quantifying Biases in Lexical Semantic Datasets ... (BASE)
14. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ... (BASE)
15. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ... (BASE)
16. Parameter space factorization for zero-shot learning across tasks and languages
In: Transactions of the Association for Computational Linguistics, 9 (2021) (BASE)
17. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ... (BASE)
18. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ... (BASE)
19. Improving Machine Translation of Rare and Unseen Word Senses ... (BASE)
20. LexFit: Lexical Fine-Tuning of Pretrained Language Models ... (BASE)


Hits by source: Catalogues 1; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 27.
© 2013 - 2024 Lin|gu|is|tik