
Search in the Catalogues and Directories

Hits 1 – 3 of 3

1
C3: Continued Pretraining with Contrastive Weak Supervision for Cross Language Ad-Hoc Retrieval
Abstract: Pretrained language models have improved effectiveness on numerous tasks, including ad-hoc retrieval. Recent work has shown that continuing to pretrain a language model with auxiliary objectives before fine-tuning on the retrieval task can further improve retrieval effectiveness. Unlike monolingual retrieval, designing an appropriate auxiliary task for cross-language mappings is challenging. To address this challenge, we use comparable Wikipedia articles in different languages to further pretrain off-the-shelf multilingual pretrained models before fine-tuning on the retrieval task. We show that our approach yields improvements in retrieval effectiveness.
Comments: 6 pages, 2 figures; accepted as a SIGIR 2022 short paper
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Information Retrieval (cs.IR)
URL: https://arxiv.org/abs/2204.11989
https://dx.doi.org/10.48550/arxiv.2204.11989
Source: BASE
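To make the approach described in the abstract above concrete, here is a minimal sketch (not the paper's code) of contrastive weak supervision over comparable article pairs: an InfoNCE-style objective with in-batch negatives is assumed, and the StubEncoder stands in for an off-the-shelf multilingual encoder such as mBERT or XLM-R. All names and hyperparameters are illustrative.

    # Minimal sketch: contrastive continued pretraining on comparable
    # Wikipedia article pairs. Assumes an InfoNCE-style loss with in-batch
    # negatives; the encoder is a runnable placeholder, not the paper's model.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class StubEncoder(nn.Module):
        """Stand-in for a pretrained multilingual encoder (hypothetical)."""
        def __init__(self, vocab_size=30000, dim=256):
            super().__init__()
            self.emb = nn.EmbeddingBag(vocab_size, dim)  # mean-pools token ids

        def forward(self, token_ids):
            return self.emb(token_ids)  # (batch, dim) article representations

    def contrastive_loss(z_src, z_tgt, temperature=0.05):
        """InfoNCE over a batch of comparable-article pairs.

        Row i of z_src (e.g. an English article) should score highest against
        row i of z_tgt (its counterpart in the other language); the other
        rows in the batch act as negatives.
        """
        z_src = F.normalize(z_src, dim=-1)
        z_tgt = F.normalize(z_tgt, dim=-1)
        logits = z_src @ z_tgt.T / temperature   # (batch, batch) similarities
        labels = torch.arange(logits.size(0))    # matching pair is the diagonal
        # Symmetric loss: source-to-target and target-to-source directions.
        return (F.cross_entropy(logits, labels) +
                F.cross_entropy(logits.T, labels)) / 2

    # Toy continued-pretraining step; random ids stand in for tokenized pairs.
    encoder = StubEncoder()
    opt = torch.optim.AdamW(encoder.parameters(), lr=2e-5)
    src_ids = torch.randint(0, 30000, (8, 128))  # 8 articles, 128 tokens each
    tgt_ids = torch.randint(0, 30000, (8, 128))  # their comparable counterparts
    loss = contrastive_loss(encoder(src_ids), encoder(tgt_ids))
    loss.backward()
    opt.step()
    print(f"contrastive loss: {loss.item():.3f}")

After this weakly supervised stage, the encoder would be fine-tuned on the retrieval task itself, as the abstract describes.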
2
Transfer Learning Approaches for Building Cross-Language Dense Retrieval Models
Source: BASE
3
Cross-language Sentence Selection via Data Augmentation and Rationale Training
Source: BASE

Hits by collection: Catalogues 0; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 3.