
Search in the Catalogues and Directories

Hits 1 – 8 of 8

1. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Ponti, Edoardo; Glavaš, Goran; Majewska, Olga. Apollo - University of Cambridge Repository, 2020
Source: BASE
2. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity
Source: BASE
3. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity
Lauscher, Anne; Vulic, Ivan; Ponti, Edoardo. Apollo - University of Cambridge Repository, 2020
Source: BASE
4. Probing Pretrained Language Models for Lexical Semantics
Vulic, Ivan; Ponti, Edoardo; Litschko, Robert. Apollo - University of Cambridge Repository, 2020
Source: BASE
5. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity
Lauscher, Anne; Vulic, Ivan; Ponti, Edoardo. In: Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020). International Committee on Computational Linguistics, 2020
URL: https://www.aclweb.org/anthology/2020.coling-main.118
Source: BASE
6. Probing Pretrained Language Models for Lexical Semantics
Vulic, Ivan; Ponti, Edoardo; Litschko, Robert. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020). Association for Computational Linguistics, 2020
Source: BASE
7. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Liu, Qianchu; Korhonen, Anna-Leena; Majewska, Olga. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020). Association for Computational Linguistics, 2020
Source: BASE
8. Specialising Distributional Vectors of All Words for Lexical Entailment
Kamath, Aishwarya; Pfeiffer, Jonas; Ponti, Edoardo; Glavas, Goran; Vulic, Ivan. Apollo - University of Cambridge Repository, 2019
Abstract: Semantic specialization methods fine-tune distributional word vectors using lexical knowledge from external resources (e.g., WordNet) to accentuate a particular relation between words. However, such post-processing methods suffer from limited coverage, as they affect only vectors of words seen in the external resources. We present the first post-processing method that specializes vectors of all vocabulary words, including those unseen in the resources, for the asymmetric relation of lexical entailment (LE), i.e., the hyponymy-hypernymy relation. Leveraging a partially LE-specialized distributional space, our POSTLE (post-specialization for LE) model learns an explicit global specialization function, allowing for specialization of vectors of unseen words, as well as word vectors from other languages via cross-lingual transfer. We capture the function as a deep feed-forward neural network: its objective re-scales vector ...
URL: https://dx.doi.org/10.17863/cam.44005
https://www.repository.cam.ac.uk/handle/1810/296964
Source: BASE

Results by source:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 8
© 2013 – 2024 Lin|gu|is|tik