
Search in the Catalogues and Directories

Hits 1 – 20 of 25

1
Universal Dependencies 2.9
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
2
Universal Dependencies 2.8.1
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
3
Universal Dependencies 2.8
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
4
Unsupervised Translation of German–Lower Sorbian: Exploring Training and Novel Transfer Methods on a Low-Resource Language ...
5
Universal Dependencies 2.7
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2020
6
Universal Dependencies 2.6
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2020
7
UDapter: Language Adaptation for Truly Universal Dependency Parsing ...
8
Universal Dependencies 2.5
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2019
9
Universal Dependencies 2.4
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. - : Universal Dependencies Consortium, 2019
10
BERTje: A Dutch BERT Model ...
Abstract: The transformer-based pre-trained language model BERT has helped to improve state-of-the-art performance on many natural language processing (NLP) tasks. Using the same architecture and parameters, we developed and evaluated a monolingual Dutch BERT model called BERTje. Compared to the multilingual BERT model, which includes Dutch but is only based on Wikipedia text, BERTje is based on a large and diverse dataset of 2.4 billion tokens. BERTje consistently outperforms the equally-sized multilingual BERT model on downstream NLP tasks (part-of-speech tagging, named-entity recognition, semantic role labeling, and sentiment analysis). Our pre-trained Dutch BERT model is made available at https://github.com/wietsedv/bertje. ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1912.09582
https://dx.doi.org/10.48550/arxiv.1912.09582
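(A brief usage sketch for this model follows the hit list below.)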
11
Universal Dependencies 2.3
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. - : Universal Dependencies Consortium, 2018
12
Universal Dependencies 2.2
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. - : Universal Dependencies Consortium, 2018
13
Universal Dependencies 2.0 alpha (obsolete)
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. - : Universal Dependencies Consortium, 2017
14
Universal Dependencies 2.0
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. - : Universal Dependencies Consortium, 2017
15
Universal Dependencies 2.0 – CoNLL 2017 Shared Task Development and Test Data
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. - : Universal Dependencies Consortium, 2017
16
Universal Dependencies 2.1
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. - : Universal Dependencies Consortium, 2017
17
Bilingual Learning of Multi-sense Embeddings with Discrete Autoencoders ...
18
Universal Dependencies 1.4
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. - : Universal Dependencies Consortium, 2016
19
Universal Dependencies 1.3
Nivre, Joakim; Agić, Željko; Ahrenberg, Lars. - : Universal Dependencies Consortium, 2016
20
Approximation and Exactness in Finite State Optimality Theory ...
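
As noted in hit 10, the following is a minimal sketch of how the BERTje model described above could be loaded with the Hugging Face transformers library. The hub identifier GroNLP/bert-base-dutch-cased is an assumption not stated in the record itself; the repository at https://github.com/wietsedv/bertje documents the official distribution.

# Minimal sketch (see assumptions above): load BERTje and extract
# contextual embeddings for a Dutch sentence.
from transformers import AutoTokenizer, AutoModel

model_name = "GroNLP/bert-base-dutch-cased"  # assumed Hugging Face hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Dit is een voorbeeldzin.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, num_tokens, hidden_size)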


Catalogues: 0 | Bibliographies: 0 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 25