
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
C3: Continued Pretraining with Contrastive Weak Supervision for Cross Language Ad-Hoc Retrieval ...
BASE
2
Transfer Learning Approaches for Building Cross-Language Dense Retrieval Models ...
Abstract: The advent of transformer-based models such as BERT has led to the rise of neural ranking models. These models have improved the effectiveness of retrieval systems well beyond that of lexical term matching models such as BM25. While monolingual retrieval tasks have benefited from large-scale training collections such as MS MARCO and advances in neural architectures, cross-language retrieval tasks have fallen behind these advancements. This paper introduces ColBERT-X, a generalization of the ColBERT multi-representation dense retrieval model that uses the XLM-RoBERTa (XLM-R) encoder to support cross-language information retrieval (CLIR). ColBERT-X can be trained in two ways. In zero-shot training, the system is trained on the English MS MARCO collection, relying on the XLM-R encoder for cross-language mappings. In translate-train, the system is trained on the MS MARCO English queries coupled with machine translations of the associated MS MARCO passages. Results on ad hoc document ranking tasks in several ...
Note: Accepted at ECIR 2022 (Full paper)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Information Retrieval (cs.IR)
URL: https://arxiv.org/abs/2201.08471
https://dx.doi.org/10.48550/arxiv.2201.08471
BASE
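Both training regimes in the abstract above produce the same scoring model: ColBERT-style "late interaction" (MaxSim), in which every query token embedding is matched against its most similar document token embedding. Below is a minimal sketch of that scoring with an XLM-R encoder, assuming the Hugging Face transformers library; the xlm-roberta-base checkpoint and the 128-dimensional projection are illustrative assumptions, not the paper's exact ColBERT-X configuration.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")
# The 128-dim linear head is an assumption for illustration; ColBERT-style
# models project each contextualized token vector to a small dimension.
project = torch.nn.Linear(encoder.config.hidden_size, 128)

def embed(texts):
    # One normalized vector per token: the "multi-representation" part.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state        # [B, T, H]
        vecs = torch.nn.functional.normalize(project(hidden), dim=-1)
    return vecs, batch["attention_mask"]

def maxsim_score(query, doc):
    # Late interaction: each query token is matched to its most similar
    # document token; the score is the sum of those per-token maxima.
    q, q_mask = embed([query])
    d, d_mask = embed([doc])
    sim = q @ d.transpose(1, 2)                            # [1, Tq, Td]
    sim = sim.masked_fill(d_mask[:, None, :] == 0, -1e4)   # ignore padding
    return (sim.max(dim=-1).values * q_mask).sum().item()

# Cross-language pair: English query, German document. In zero-shot training
# the encoder alone bridges the languages; in translate-train, the model is
# additionally fine-tuned on machine-translated MS MARCO passages.
print(maxsim_score("cross-language retrieval",
                   "Ein Dokument über sprachübergreifende Suche"))

The sketch only illustrates scoring; training (contrastive loss over MS MARCO triples) and the translation step of translate-train are out of scope here.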
3
Goldilocks: Just-Right Tuning of BERT for Technology-Assisted Review ...
BASE
4
USNA: A Dual-Classifier Approach to Contextual Sentiment Analysis
In: DTIC (2013)
BASE

Hit distribution: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 4