
Search in the Catalogues and Directories

Hits 21–40 of 5,129

21
Curlie Dataset - Language-agnostic Website Embedding and Classification ...
Lugeon, Sylvain; Piccardi, Tiziano. - : figshare, 2022
BASE
25
From Examples to Rules: Neural Guided Rule Synthesis for Information Extraction ...
BASE
26
Topic Discovery via Latent Space Clustering of Pretrained Language Model Representations ...
Meng, Yu; Zhang, Yunyi; Huang, Jiaxin. - : arXiv, 2022
BASE
27
Offensive Language Detection in Under-resourced Algerian Dialectal Arabic Language ...
BASE
28
Shedding New Light on the Language of the Dark Web ...
BASE
29
Query Expansion and Entity Weighting for Query Reformulation Retrieval in Voice Assistant Systems ...
BASE
30
LoL: A Comparative Regularization Loss over Query Reformulation Losses for Pseudo-Relevance Feedback ...
BASE
31
Finding Inverse Document Frequency Information in BERT ...
Abstract: For many decades, BM25 and its variants have been the dominant document retrieval approach, whose two underlying features are Term Frequency (TF) and Inverse Document Frequency (IDF). The traditional approach, however, is being rapidly replaced by Neural Ranking Models (NRMs) that can exploit semantic features. In this work, we consider BERT-based NRMs and study whether IDF information is present in them. This simple question is interesting because IDF has been indispensable for traditional lexical matching, yet global features like IDF are not explicitly learned by neural language models, including BERT. We adopt linear probing as the main analysis tool because typical BERT-based NRMs use linear or inner-product-based score aggregators. We analyze input embeddings, the representations of all BERT layers, and the self-attention weights of the CLS token. Studying the MS MARCO dataset with three BERT-based models, we show that all of them contain information that is strongly dependent on IDF. ... : 5 pages ...
Keywords: FOS: Computer and information sciences; H.3.3; cs.IR (Information Retrieval)
URL: https://arxiv.org/abs/2202.12191
https://dx.doi.org/10.48550/arxiv.2202.12191
BASE
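
The abstract above outlines a concrete probing recipe, so a compact illustration may help. The following is a minimal sketch, not the paper's code: it fits a linear regressor from BERT's static wordpiece embeddings to corpus IDF values and reports held-out R². The `bert-base-uncased` checkpoint and the toy corpus are illustrative assumptions; the paper studies MS MARCO and probes, beyond input embeddings, the per-layer representations and CLS self-attention weights.

```python
import numpy as np
import torch
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Toy corpus (assumption; the paper uses MS MARCO). IDF(t) = log(N / df(t)).
corpus = [
    "the cat sat on the mat",
    "the dog barked at the cat",
    "dense retrieval with bert models",
    "sparse lexical matching uses idf",
]
docs = [set(tok.tokenize(d)) for d in corpus]
N = len(docs)
vocab = sorted(set().union(*docs))
idf = np.array([np.log(N / sum(t in d for d in docs)) for t in vocab])

# Static input (wordpiece) embeddings for each vocabulary token.
ids = tok.convert_tokens_to_ids(vocab)
with torch.no_grad():
    emb = model.get_input_embeddings().weight[ids].numpy()

# Linear probe: can a linear map recover IDF from the embeddings?
X_tr, X_te, y_tr, y_te = train_test_split(emb, idf, test_size=0.25, random_state=0)
probe = LinearRegression().fit(X_tr, y_tr)
print(f"held-out R^2: {probe.score(X_te, y_te):.3f}")
```

A high held-out R² would indicate that IDF is linearly decodable from the embeddings; on a realistic corpus the same probe would be repeated on each BERT layer's representations, as the abstract describes.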
32
Improving Word Translation via Two-Stage Contrastive Learning ...
BASE
33
nigam@COLIEE-22: Legal Case Retrieval and Entailment using Cascading of Lexical and Semantic-based models ...
BASE
34
Out-of-Domain Semantics to the Rescue! Zero-Shot Hybrid Retrieval Models ...
Chen, Tao; Zhang, Mingyang; Lu, Jing. - : arXiv, 2022
BASE
35
Dual Skipping Guidance for Document Retrieval with Learned Sparse Representations ...
BASE
36
Introducing Neural Bag of Whole-Words with ColBERTer: Contextualized Late Interactions using Enhanced Reduction ...
BASE
37
Transfer Learning Approaches for Building Cross-Language Dense Retrieval Models ...
BASE
38
TURNER: The Uncertainty-based Retrieval Framework for Chinese NER ...
BASE
39
LaPraDoR: Unsupervised Pretrained Dense Retriever for Zero-Shot Text Retrieval ...
Xu, Canwen; Guo, Daya; Duan, Nan. - : arXiv, 2022
BASE
40
A Deep Learning Approach for Repairing Missing Activity Labels in Event Logs for Process Mining ...
Lu, Yang; Chen, Qifan; Poon, Simon K. - : arXiv, 2022
BASE


Hits by source type (per-source sub-counts omitted):
Catalogues: 301
Bibliographies: 1,653
Linked Open Data catalogues: 0
Online resources: 28
Open access documents: 3,306