
Search in the Catalogues and Directories

Hits 1 – 20 of 27

1. Towards Best Practices for Training Multilingual Dense Retrieval Models ... (BASE)
2. Mr. TyDi: A Multi-lingual Benchmark for Dense Retrieval ... Zhang, Xinyu; Ma, Xueguang; Shi, Peng. arXiv, 2021. (BASE)
3. Sparsifying Sparse Representations for Passage Retrieval by Top-$k$ Masking ... (BASE)
4. Cross-Lingual Relevance Transfer for Document Retrieval ... Shi, Peng; Lin, Jimmy. arXiv, 2019. (BASE)
5. Aligning Cross-Lingual Entities with Multi-Aspect Information ... Yang, Hsiu-Wei; Zou, Yanyan; Shi, Peng. arXiv, 2019. (BASE)
6. End-to-End Open-Domain Question Answering with BERTserini ... Yang, Wei; Xie, Yuqing; Lin, Aileen. arXiv, 2019. (BASE)
7. What Would Elsa Do? Freezing Layers During Transformer Fine-Tuning ... Lee, Jaejun; Tang, Raphael; Lin, Jimmy. arXiv, 2019. (BASE)
Abstract: Pretrained transformer-based language models have achieved state of the art across countless tasks in natural language processing. These models are highly expressive, comprising at least a hundred million parameters and a dozen layers. Recent evidence suggests that only a few of the final layers need to be fine-tuned for high quality on downstream tasks. Naturally, a subsequent research question is, "how many of the last layers do we need to fine-tune?" In this paper, we precisely answer this question. We examine two recent pretrained language models, BERT and RoBERTa, across standard tasks in textual entailment, semantic similarity, sentiment analysis, and linguistic acceptability. We vary the number of final layers that are fine-tuned, then study the resulting change in task-specific effectiveness. We show that only a fourth of the final layers need to be fine-tuned to achieve 90% of the original quality. Surprisingly, we also find that fine-tuning all layers does not always help. ... : 5 pages ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1911.03090
https://dx.doi.org/10.48550/arxiv.1911.03090
8. Simple BERT Models for Relation Extraction and Semantic Role Labeling ... Shi, Peng; Lin, Jimmy. arXiv, 2019. (BASE)
9. Lucene for Approximate Nearest-Neighbors Search on Arbitrary Dense Vectors ... Teofili, Tommaso; Lin, Jimmy. arXiv, 2019. (BASE)
10. Simple Attention-Based Representation Learning for Ranking Short Social Media Posts ... Shi, Peng; Rao, Jinfeng; Lin, Jimmy. arXiv, 2018. (BASE)
11
ARCHITECTURE, MODELS, AND ALGORITHMS FOR TEXTUAL SIMILARITY
He, Hua. - 2018
BASE
Show details
12. Integrating Lexical and Temporal Signals in Neural Ranking Models for Searching Social Media Streams ... Rao, Jinfeng; He, Hua; Zhang, Haotian. arXiv, 2017. (BASE)
13. Searching to Translate and Translating to Search: When Information Retrieval Meets Machine Translation. Ture, Ferhan. 2013. (BASE)
14. Frontiers, Challenges, and Opportunities for Information Retrieval: Report from SWIRL 2012, the Second Strategic Workshop on Information Retrieval in Lorne. Kelly, Diane; Clarke, Charles L.A.; Moffat, Alistair. KTH, Teoretisk datalogi, TCS / ACM, 2012. (BASE)
15. A cost-effective lexical acquisition process for large-scale thesaurus translation. In: Language Resources and Evaluation 43 (2009) 1, 27-40. Dordrecht [u.a.]: Springer. (BLLDB)
16. Computational Analysis of the Conversational Dynamics of the United States Supreme Court. Hawes, Timothy. 2009. (BASE)
17. Special section on question answering in restricted domains. In: Computational Linguistics 33 (2007) 1, 41-133. Cambridge, Mass.: MIT Press. (BLLDB; OLC Linguistik)
18. Answering Clinical Questions with Knowledge-Based and Statistical Techniques. In: Computational Linguistics 33 (2007) 1, 63. Cambridge, Mass.: MIT Press. (OLC Linguistik)
19. Syntactic sentence compression in the biomedical domain: facilitating access to related articles. In: Information Retrieval Journal 10 (2007) 4-5, 393-414. Dordrecht [u.a.]: Springer Science + Business Media B.V. (BLLDB)
20. Multiple Alternative Sentence Compressions as a Tool for Automatic Summarization Tasks. (BASE)

Hits by source: Catalogues 4; Bibliographies 5; Linked Open Data catalogues 0; Online resources 0; Open access documents 20.