
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 23

1
Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing
In: Computational Linguistics 45(3), MIT Press, 2019, pp. 559–601. ISSN: 0891-2017; EISSN: 1530-9312. DOI: 10.1162/coli_a_00357. https://hal.archives-ouvertes.fr/hal-02425462 ; https://www.mitpressjournals.org/doi/abs/10.1162/coli_a_00357 (2019)
BASE
2
Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing ...
Ponti, Edoardo; O'Horan, Helen; Berzak, Yevgeni. - : Apollo - University of Cambridge Repository, 2019
BASE
3
Show Some Love to Your n-grams: A Bit of Progress and Stronger n-gram Language Modeling Baselines ...
Shareghi, Ehsan; Gerz, Daniela; Vulić, Ivan; Korhonen, Anna-Leena. - : Apollo - University of Cambridge Repository, 2019
Abstract: In recent years neural language models (LMs) have set state-of-the-art performance for several benchmarking datasets. While the reasons for their success and their computational demand are well-documented, a comparison between neural models and more recent developments in n-gram models is neglected. In this paper, we examine the recent progress in n-gram literature, running experiments on 50 languages covering all morphological language families. Experimental results illustrate that a simple extension of Modified Kneser-Ney outperforms an LSTM language model on 42 languages while a word-level Bayesian n-gram LM outperforms the character-aware neural model on average across all languages, and its extension which explicitly injects linguistic knowledge on 8 languages. Further experiments on larger Europarl datasets for 3 languages indicate that neural architectures are able to outperform computationally much cheaper n-gram models: n-gram training is up to 15,000 times quicker. Our experiments illustrate that ...
URL: https://www.repository.cam.ac.uk/handle/1810/292617
https://dx.doi.org/10.17863/cam.39778
BASE
4
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE
5
Do We Really Need Fully Unsupervised Cross-Lingual Embeddings? ...
BASE
6
How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions ...
BASE
7
Specialising Distributional Vectors of All Words for Lexical Entailment ...
Kamath, Aishwarya; Pfeiffer, Jonas; Ponti, Edoardo. - : Apollo - University of Cambridge Repository, 2019
BASE
8
Multilingual and cross-lingual graded lexical entailment ...
Vulić, Ivan; Ponzetto, SP; Glavaš, G. - : Apollo - University of Cambridge Repository, 2019
BASE
9
Generalized tuning of distributional word vectors for monolingual and cross-lingual lexical entailment ...
Glavaš, G; Vulić, Ivan. - : Apollo - University of Cambridge Repository, 2019
BASE
10
Generalized tuning of distributional word vectors for monolingual and cross-lingual lexical entailment
Glavaš, G; Vulić, Ivan. - : Association for Computational Linguistics, 2019. : ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference, 2019
BASE
11
Multilingual and cross-lingual graded lexical entailment
Vulić, Ivan; Ponzetto, SP; Glavaš, G. - : Association for Computational Linguistics, 2019. : ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference, 2019
BASE
12
How to (properly) evaluate cross-lingual word embeddings: On strong baselines, comparative analyses, and some misconceptions
Glavaš, G; Litschko, R; Ruder, S. - : Association for Computational Linguistics, 2019. : ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference, 2019
BASE
13
JW300: A wide-coverage parallel corpus for low-resource languages
Agić, Ž; Vulić, Ivan. - : Association for Computational Linguistics, 2019. : ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference, 2019
BASE
14
Unsupervised cross-lingual representation learning
Ruder, S; Søgaard, A; Vulić, Ivan. - : ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics, Tutorial Abstracts, 2019
BASE
15
Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing
Reichart, Roi; Shutova, Ekaterina; Korhonen, Anna-Leena. - : MIT Press - Journals, 2019. : COMPUTATIONAL LINGUISTICS, 2019
BASE
16
Multilingual and cross-lingual graded lexical entailment
Glavaš, Goran; Vulić, Ivan; Ponzetto, Simone Paolo. - : Association for Computational Linguistics, 2019
BASE
17
Specializing distributional vectors of all words for lexical entailment
Ponti, Edoardo Maria; Kamath, Aishwarya; Pfeiffer, Jonas. - : Association for Computational Linguistics, 2019
BASE
18
How to (properly) evaluate cross-lingual word embeddings: On strong baselines, comparative analyses, and some misconceptions
Glavaš, Goran; Litschko, Robert; Ruder, Sebastian. - : Association for Computational Linguistics, 2019
BASE
19
Cross-lingual semantic specialization via lexical relation induction
Glavaš, Goran; Vulić, Ivan; Korhonen, Anna. - : Association for Computational Linguistics, 2019
BASE
20
Generalized tuning of distributional word vectors for monolingual and cross-lingual lexical entailment
Vulić, Ivan; Glavaš, Goran. - : Association for Computational Linguistics, 2019
BASE


Catalogues: 0 | Bibliographies: 0 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 23
© 2013 - 2024 Lin|gu|is|tik