2. Improving Word Translation via Two-Stage Contrastive Learning
4. The MultimorbiditY COllaborative Medication Review And DEcision Making (MyComrade) study: a protocol for a cross-border pilot cluster randomised controlled trial
   In: Pilot Feasibility Stud (2022)
5. "Non-linguistic" Browning: meter and music in "Pietro of Abano"
6. “Non-linguistic” Browning: Meter and music in “Pietro of Abano”
   In: Proceedings of the Linguistic Society of America, Vol. 7, No. 1 (2022), 5240. ISSN 2473-8689
7. Observation of new excited $B_s^0$ states
   In: Eur. Phys. J. C 81(7), 601 (2021). DOI: 10.1140/epjc/s10052-021-09305-3. https://hal.archives-ouvertes.fr/hal-03010999
8. Infective Endocarditis in Patients on Chronic Hemodialysis
   In: Journal of the American College of Cardiology 77(13), 1629-1640 (2021). ISSN 0735-1097. DOI: 10.1016/j.jacc.2021.02.014. https://hal.archives-ouvertes.fr/hal-03369871
9. Plan-then-Generate: Controlled Data-to-Text Generation via Planning
10. Plan-then-Generate: Controlled Data-to-Text Generation via Planning
11. Prix-LM: Pretraining for Multilingual Knowledge Base Construction
12. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking
13. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
14. Language and ethnobiological skills decline precipitously in Papua New Guinea, the world's most linguistically diverse nation
15. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
   Abstract: Recent work indicated that pretrained language models (PLMs) such as BERT and RoBERTa can be transformed into effective sentence and word encoders even via simple self-supervised techniques. Inspired by this line of work, in this paper we propose a fully unsupervised approach to improving word-in-context (WiC) representations in PLMs, achieved via a simple and efficient WiC-targeted fine-tuning procedure: MirrorWiC. The proposed method leverages only raw texts sampled from Wikipedia, assuming no sense-annotated data, and learns context-aware word representations within a standard contrastive learning setup. We experiment with a series of standard and comprehensive WiC benchmarks across multiple languages. Our proposed fully unsupervised MirrorWiC models obtain substantial gains over off-the-shelf PLMs across all monolingual, multilingual and cross-lingual setups. Moreover, on some standard WiC benchmarks, MirrorWiC is even on-par with supervised models fine-tuned with in-task data and sense labels. ...
   Keyword: cs.CL
   URL: https://dx.doi.org/10.17863/cam.78495 https://www.repository.cam.ac.uk/handle/1810/331050
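
The abstract above describes the recipe only at a high level: sample raw sentences, build two views of each target word, and pull matching views together with a contrastive objective. Below is a minimal sketch of that kind of dropout-based contrastive fine-tuning in PyTorch; the model name, the span-matching heuristic, the temperature, and the toy sentences are illustrative assumptions, not the paper's actual augmentations or hyperparameters.

```python
# Minimal sketch of contrastive word-in-context fine-tuning in the spirit
# of the MirrorWiC abstract above. All settings here are assumptions made
# for illustration, not the paper's configuration.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-uncased"  # assumed stand-in; any BERT-style PLM works
tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL)
model.train()  # keep dropout ON: two passes over one batch give two "views"
opt = torch.optim.AdamW(model.parameters(), lr=2e-5)

def target_word_embeddings(sentences, words):
    """Mean-pool the subword vectors of each sentence's target word."""
    enc = tok(sentences, return_tensors="pt", padding=True, truncation=True)
    hidden = model(**enc).last_hidden_state            # (batch, seq, hid)
    vecs = []
    for i, word in enumerate(words):
        # Naive span matching (an assumption, not the paper's method):
        # locate the word's subword ids inside the sentence's ids.
        word_ids = tok(word, add_special_tokens=False)["input_ids"]
        ids = enc["input_ids"][i].tolist()
        start = next(j for j in range(len(ids) - len(word_ids) + 1)
                     if ids[j:j + len(word_ids)] == word_ids)
        vecs.append(hidden[i, start:start + len(word_ids)].mean(dim=0))
    return torch.stack(vecs)                           # (batch, hid)

def info_nce(z1, z2, temperature=0.05):
    """InfoNCE: matching rows are positives, all other rows negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.T / temperature                   # (batch, batch)
    return F.cross_entropy(logits, torch.arange(z1.size(0)))

# One toy training step on raw, unlabelled text (stand-in for Wikipedia).
sentences = ["The bank raised interest rates.",
             "She sat on the river bank."]
words = ["bank", "bank"]
opt.zero_grad()
loss = info_nce(target_word_embeddings(sentences, words),   # view 1
                target_word_embeddings(sentences, words))   # view 2
loss.backward()
opt.step()
print(f"contrastive WiC loss: {loss.item():.4f}")
```

Because dropout stays active, the two forward passes yield slightly different embeddings of the same occurrence; InfoNCE treats those as positives and every other target word in the batch as a negative, which is the "standard contrastive learning setup" the abstract refers to, applied without any sense-annotated data.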
16. Visually Grounded Reasoning across Languages and Cultures
17. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking
18. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models
19. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders
20. Visually Grounded Reasoning across Languages and Cultures