1. Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing. Computational Linguistics, 45(3), pp. 559–601, MIT Press, 2019. ISSN 0891-2017; EISSN 1530-9312. DOI: 10.1162/coli_a_00357. https://hal.archives-ouvertes.fr/hal-02425462 ; https://www.mitpressjournals.org/doi/abs/10.1162/coli_a_00357
2. Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing

3. Show Some Love to Your n-grams: A Bit of Progress and Stronger n-gram Language Modeling Baselines

4. Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity

5. Do We Really Need Fully Unsupervised Cross-Lingual Embeddings?

6. How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions
7. Specialising Distributional Vectors of All Words for Lexical Entailment

Abstract: Semantic specialization methods fine-tune distributional word vectors using lexical knowledge from external resources (e.g., WordNet) to accentuate a particular relation between words. However, such post-processing methods suffer from limited coverage, as they affect only vectors of words seen in the external resources. We present the first post-processing method that specializes vectors of all vocabulary words -- including those unseen in the resources -- for the asymmetric relation of lexical entailment (LE) (i.e., the hyponymy-hypernymy relation). Leveraging a partially LE-specialized distributional space, our POSTLE (i.e., post-specialization for LE) model learns an explicit global specialization function, allowing for specialization of vectors of unseen words, as well as word vectors from other languages via cross-lingual transfer. We capture the function as a deep feed-forward neural network: its objective re-scales vector ...

URL: https://dx.doi.org/10.17863/cam.44005 ; https://www.repository.cam.ac.uk/handle/1810/296964
8. Multilingual and Cross-Lingual Graded Lexical Entailment

9. Generalized Tuning of Distributional Word Vectors for Monolingual and Cross-Lingual Lexical Entailment
10. Generalized Tuning of Distributional Word Vectors for Monolingual and Cross-Lingual Lexical Entailment. Glavaš, G.; Vulić, I. In: Proceedings of ACL 2019 (57th Annual Meeting of the Association for Computational Linguistics), Association for Computational Linguistics, 2019.
11. Multilingual and Cross-Lingual Graded Lexical Entailment. Vulić, I.; Ponzetto, S. P.; Glavaš, G. In: Proceedings of ACL 2019 (57th Annual Meeting of the Association for Computational Linguistics), Association for Computational Linguistics, 2019.
12. How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions. Glavaš, G.; Litschko, R.; Ruder, S. In: Proceedings of ACL 2019 (57th Annual Meeting of the Association for Computational Linguistics), Association for Computational Linguistics, 2019.
13. JW300: A Wide-Coverage Parallel Corpus for Low-Resource Languages. Agić, Ž.; Vulić, I. In: Proceedings of ACL 2019 (57th Annual Meeting of the Association for Computational Linguistics), Association for Computational Linguistics, 2019.
15. Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing

17. Specializing Distributional Vectors of All Words for Lexical Entailment

18. How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions

19. Cross-Lingual Semantic Specialization via Lexical Relation Induction

20. Generalized Tuning of Distributional Word Vectors for Monolingual and Cross-Lingual Lexical Entailment