
Search in the Catalogues and Directories

Hits 1 – 6 of 6

1
UDapter: Language Adaptation for Truly Universal Dependency Parsing
BASE
2
Understanding Cross-Lingual Syntactic Transfer in Multilingual Recurrent Neural Networks
Dhar, Prajit; Bisazza, Arianna. - : arXiv, 2020
BASE
3
BERTje: A Dutch BERT Model
Abstract: The transformer-based pre-trained language model BERT has helped to improve state-of-the-art performance on many natural language processing (NLP) tasks. Using the same architecture and parameters, we developed and evaluated a monolingual Dutch BERT model called BERTje. Compared to the multilingual BERT model, which includes Dutch but is based only on Wikipedia text, BERTje is based on a large and diverse dataset of 2.4 billion tokens. BERTje consistently outperforms the equally-sized multilingual BERT model on downstream NLP tasks (part-of-speech tagging, named-entity recognition, semantic role labeling, and sentiment analysis). Our pre-trained Dutch BERT model is made available at https://github.com/wietsedv/bertje. (A usage sketch for loading this model follows the hit list below.)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1912.09582
https://dx.doi.org/10.48550/arxiv.1912.09582
BASE
4
Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations
Tran, Ke; Bisazza, Arianna. - : arXiv, 2019
BASE
5
Recurrent Memory Networks for Language Modeling
BASE
6
Neural versus Phrase-Based Machine Translation Quality: a Case Study
BASE
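
Usage sketch for entry 3: the BERTje abstract points to https://github.com/wietsedv/bertje for the released model. As a minimal, non-authoritative sketch of how such a pre-trained BERT model is typically loaded, the snippet below uses the Hugging Face transformers fill-mask pipeline. The hub identifier GroNLP/bert-base-dutch-cased is an assumption taken from the BERTje repository's published checkpoints; verify it against the repository before use.

```python
# Minimal sketch: masked-token prediction with BERTje via Hugging Face
# transformers. Assumption: the checkpoint is published on the Hub as
# "GroNLP/bert-base-dutch-cased" (see https://github.com/wietsedv/bertje).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="GroNLP/bert-base-dutch-cased")

# Dutch example: "Ik fiets naar [MASK]." ("I cycle to [MASK].")
# The model should rank plausible Dutch completions for the masked token.
for pred in fill_mask("Ik fiets naar [MASK]."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```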
