5. Deep learning for sentence clustering in essay grading support ...
   Source: BASE
6. Morpho-syntactically annotated corpora provided for the PARSEME Shared Task on Semi-Supervised Identification of Verbal Multiword Expressions (edition 1.2)
10. WikiBERT models: deep transfer learning for many languages ...
11. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ...
12. Dependency parsing of biomedical text with BERT
    In: BMC Bioinformatics (2020)
16. Multilingual is not enough: BERT for Finnish ...
    Abstract: Deep learning-based language models pretrained on large unannotated text corpora have been demonstrated to allow efficient transfer learning for natural language processing, with recent approaches such as the transformer-based BERT model advancing the state of the art across a variety of tasks. While most work on these models has focused on high-resource languages, in particular English, a number of recent efforts have introduced multilingual models that can be fine-tuned to address tasks in a large number of different languages. However, we still lack a thorough understanding of the capabilities of these models, in particular for lower-resourced languages. In this paper, we focus on Finnish and thoroughly evaluate the multilingual BERT model on a range of tasks, comparing it with a new Finnish BERT model trained from scratch. The new language-specific model is shown to systematically and clearly outperform the multilingual model. While the multilingual model largely fails to reach the performance of previously ...
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://arxiv.org/abs/1912.07076 ; https://dx.doi.org/10.48550/arxiv.1912.07076
17. Universal Dependencies 2.2
    In: https://hal.archives-ouvertes.fr/hal-01930733 (2018)