4. Unsupervised Translation of German–Lower Sorbian: Exploring Training and Novel Transfer Methods on a Low-Resource Language (BASE)
7. UDapter: Language Adaptation for Truly Universal Dependency Parsing (BASE)
10. BERTje: A Dutch BERT Model (BASE)

Abstract: The transformer-based pre-trained language model BERT has helped to improve state-of-the-art performance on many natural language processing (NLP) tasks. Using the same architecture and parameters, we developed and evaluated a monolingual Dutch BERT model called BERTje. Compared to the multilingual BERT model, which includes Dutch but is only based on Wikipedia text, BERTje is based on a large and diverse dataset of 2.4 billion tokens. BERTje consistently outperforms the equally-sized multilingual BERT model on downstream NLP tasks (part-of-speech tagging, named-entity recognition, semantic role labeling, and sentiment analysis). Our pre-trained Dutch BERT model is made available at https://github.com/wietsedv/bertje.

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://arxiv.org/abs/1912.09582
DOI: https://dx.doi.org/10.48550/arxiv.1912.09582
|
15. Universal Dependencies 2.0 – CoNLL 2017 Shared Task Development and Test Data (BASE)
17. Bilingual Learning of Multi-sense Embeddings with Discrete Autoencoders (BASE)
20. Approximation and Exactness in Finite State Optimality Theory (BASE)