
Search in the Catalogues and Directories

Hits 1 – 11 of 11

1. Emergent Communication Pretraining for Few-Shot Machine Translation ...
Li, Yaoyiran; Ponti, Edoardo; Vulic, Ivan. - Apollo - University of Cambridge Repository, 2020
BASE
2. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
Ponti, Edoardo; Glavaš, Goran; Majewska, Olga. - Apollo - University of Cambridge Repository, 2020
BASE
3. Emergent Communication Pretraining for Few-Shot Machine Translation ...
BASE
4. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE
5. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
Lauscher, Anne; Vulic, Ivan; Ponti, Edoardo. - Apollo - University of Cambridge Repository, 2020
BASE
6. Multi-SimLex: A Large-Scale Evaluation of Multilingual and Cross-Lingual Lexical Semantic Similarity ...
Vulic, Ivan; Baker, Simon; Ponti, Edoardo. - Apollo - University of Cambridge Repository, 2020
BASE
7. Probing Pretrained Language Models for Lexical Semantics ...
Vulic, Ivan; Ponti, Edoardo; Litschko, Robert; Glavas, Goran; Korhonen, Anna-Leena. - Apollo - University of Cambridge Repository, 2020
Abstract: The success of large pretrained language models (LMs) such as BERT and RoBERTa has sparked interest in probing their representations, in order to unveil what types of knowledge they implicitly capture. While prior research focused on morphosyntactic, semantic, and world knowledge, it remains unclear to which extent LMs also derive lexical type-level knowledge from words in context. In this work, we present a systematic empirical analysis across six typologically diverse languages and five different lexical tasks, addressing the following questions: 1) How do different lexical knowledge extraction strategies (monolingual versus multilingual source LM, out-of-context versus in-context encoding, inclusion of special tokens, and layer-wise averaging) impact performance? How consistent are the observed effects across tasks and languages? 2) Is lexical knowledge stored in few parameters, or is it scattered throughout the network? 3) How do these representations fare against traditional static word vectors in ...
URL: https://dx.doi.org/10.17863/cam.62212
https://www.repository.cam.ac.uk/handle/1810/315105
BASE
8. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity
Lauscher, Anne; Vulic, Ivan; Ponti, Edoardo. - International Committee on Computational Linguistics, 2020. Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020). https://www.aclweb.org/anthology/2020.coling-main.118
BASE
9. Probing Pretrained Language Models for Lexical Semantics
Vulic, Ivan; Ponti, Edoardo; Litschko, Robert. - Association for Computational Linguistics, 2020. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020).
BASE
10. Emergent Communication Pretraining for Few-Shot Machine Translation
Vulic, Ivan; Ponti, Edoardo; Korhonen, Anna. - International Committee on Computational Linguistics, 2020. Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020). https://www.aclweb.org/anthology/2020.coling-main.416
BASE
11. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Liu, Qianchu; Korhonen, Anna-Leena; Majewska, Olga. - Association for Computational Linguistics, 2020. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020).
BASE

Results by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 11