
Search in the Catalogues and Directories

Page 1 of 3 · Hits 1–20 of 49

1. Parameter-Efficient Neural Reranking for Cross-Lingual and Multilingual Retrieval (BASE)
2. On cross-lingual retrieval with multilingual text encoders (BASE)
   Litschko, Robert; Vulić, Ivan; Ponzetto, Simone Paolo. Springer Science + Business Media, 2022
3. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems (BASE)
4. On Cross-Lingual Retrieval with Multilingual Text Encoders (BASE)
5. Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval (BASE)
6. RedditBias: A Real-World Resource for Bias Evaluation and Debiasing of Conversational Language Models (BASE)
7. LexFit: Lexical Fine-Tuning of Pretrained Language Models (BASE)
8. Verb Knowledge Injection for Multilingual Event Processing (BASE)
9. Is supervised syntactic parsing beneficial for language understanding tasks? An empirical investigation (BASE)
   Glavaš, Goran; Vulić, Ivan. Association for Computational Linguistics, 2021
10. Evaluating multilingual text encoders for unsupervised cross-lingual retrieval (BASE)
11. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning (BASE)
12. Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer (BASE)
13. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning (BASE)
    Ponti, Edoardo; Glavaš, Goran; Majewska, Olga. Apollo - University of Cambridge Repository, 2020
14. From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers (BASE)
15. Verb Knowledge Injection for Multilingual Event Processing (BASE)
16. Probing Pretrained Language Models for Lexical Semantics (BASE)
    Abstract: The success of large pretrained language models (LMs) such as BERT and RoBERTa has sparked interest in probing their representations, in order to unveil what types of knowledge they implicitly capture. While prior research focused on morphosyntactic, semantic, and world knowledge, it remains unclear to which extent LMs also derive lexical type-level knowledge from words in context. In this work, we present a systematic empirical analysis across six typologically diverse languages and five different lexical tasks, addressing the following questions: 1) How do different lexical knowledge extraction strategies (monolingual versus multilingual source LM, out-of-context versus in-context encoding, inclusion of special tokens, and layer-wise averaging) impact performance? How consistent are the observed effects across tasks and languages? 2) Is lexical knowledge stored in few parameters, or is it scattered throughout the network? 3) How do these representations fare against traditional static word vectors in ... (EMNLP 2020: long paper)
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://arxiv.org/abs/2010.05731
    DOI: https://dx.doi.org/10.48550/arxiv.2010.05731
17. SemEval-2020 Task 2: Predicting Multilingual and Cross-Lingual (Graded) Lexical Entailment (BASE)
18. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity (BASE)
19. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning (BASE)
    Liu, Qianchu; Korhonen, Anna-Leena; Majewska, Olga. Association for Computational Linguistics, Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
20. Specializing unsupervised pretraining models for word-level semantic similarity (BASE)
    Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. Association for Computational Linguistics (ACL), 2020


Result sources: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 49
© 2013 - 2024 Lin|gu|is|tik