
Search in the Catalogues and Directories

Hits 1 – 3 of 3

1
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
Source: BASE
2
Do We Really Need Fully Unsupervised Cross-Lingual Embeddings? ...
Abstract: Recent efforts in cross-lingual word embedding (CLWE) learning have predominantly focused on fully unsupervised approaches that project monolingual embeddings into a shared cross-lingual space without any cross-lingual signal. The lack of any supervision makes such approaches conceptually attractive. Yet, their only core difference from (weakly) supervised projection-based CLWE methods is in the way they obtain a seed dictionary used to initialize an iterative self-learning procedure. The fully unsupervised methods have arguably become more robust, and their primary use case is CLWE induction for pairs of resource-poor and distant languages. In this paper, we question the ability of even the most robust unsupervised CLWE approaches to induce meaningful CLWEs in these more challenging settings. A series of bilingual lexicon induction (BLI) experiments with 15 diverse languages (210 language pairs) show that fully unsupervised CLWE methods still fail for a large number of language pairs (e.g., they yield zero ...)
Comment: EMNLP 2019 (Long paper)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.1909.01638
https://arxiv.org/abs/1909.01638
Source: BASE
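The abstract's key point is that unsupervised and (weakly) supervised projection-based CLWE methods share the same machinery and differ only in where the seed dictionary that initializes iterative self-learning comes from. Below is a minimal sketch of that shared pipeline, using the standard orthogonal Procrustes solution and nearest-neighbour dictionary induction on toy random embeddings; all names, sizes, and the identity seed dictionary are illustrative assumptions, not the method of any specific paper.

```python
# Sketch of projection-based CLWE with a seed dictionary and self-learning,
# as described in the abstract above. Embeddings and the seed dictionary
# are random toy data; names and sizes are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
d, n_src, n_tgt = 50, 200, 200

# Toy monolingual embeddings, length-normalized (as in standard pipelines).
X = rng.standard_normal((n_src, d))
Y = rng.standard_normal((n_tgt, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
Y /= np.linalg.norm(Y, axis=1, keepdims=True)

# Seed dictionary: (source index, target index) pairs. With (weak)
# supervision this comes from an external lexicon; fully unsupervised
# methods must induce it from the monolingual spaces alone.
seed = [(i, i) for i in range(20)]

def procrustes(X, Y, pairs):
    """Solve min_W ||X_s W - Y_t||_F with W orthogonal (SVD solution)."""
    src = np.array([s for s, _ in pairs])
    tgt = np.array([t for _, t in pairs])
    U, _, Vt = np.linalg.svd(X[src].T @ Y[tgt])
    return U @ Vt

def induce_dictionary(X, Y, W):
    """Nearest-neighbour dictionary induction in the shared space."""
    sims = (X @ W) @ Y.T  # cosine similarity, since rows are unit-norm
    return [(i, int(sims[i].argmax())) for i in range(len(X))]

# Iterative self-learning: alternate projection fitting and dictionary
# induction (real systems iterate until the dictionary stabilizes).
W = procrustes(X, Y, seed)
for _ in range(5):
    pairs = induce_dictionary(X, Y, W)
    W = procrustes(X, Y, pairs)
```

The paper's question, in these terms, is whether the fully unsupervised variant can produce a seed dictionary good enough for this loop to converge to a meaningful mapping for distant, resource-poor language pairs.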
3
Informing unsupervised pretraining with external linguistic knowledge
Lauscher, Anne; Vulić, Ivan; Ponti, Edoardo Maria. Cornell University, 2019
Source: BASE

All 3 hits are open access documents (facet counts: Catalogues 0; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 3).