
Search in the Catalogues and Directories

Page: 1 2 3
Hits 1 – 20 of 50

1. SOCIOFILLMORE: A Tool for Discovering Perspectives ... (BASE)
2. IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation ... Sarti, Gabriele; Nissim, Malvina. arXiv, 2022 (BASE)
3. Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer ... (BASE)
4. DALC: the Dutch Abusive Language Corpus ... (BASE)
5. Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer ... (BASE)
6. Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ... (BASE)
7. Generic resources are what you need: Style transfer tasks without task-specific parallel training data ... (BASE)
8. Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ... (BASE)
9. As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ... (BASE)
10. Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students ... (BASE)
11. Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students. Pannitto, Ludovica; Busso, Lucia; Combei, Claudia Roberta. Association for Computational Linguistics, 2021 (BASE)
12. What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models ... (BASE)
13. Personal-ITY: A Novel YouTube-based Corpus for Personality Prediction in Italian ... (BASE)
14. Datasets and Models for Authorship Attribution on Italian Personal Writings ... (BASE)
15. Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias ... (BASE)
16. Matching Theory and Data with Personal-ITY: What a Corpus of Italian YouTube Comments Reveals About Personality ... (BASE)
17. Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias ... (BASE)
18. As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ... de Vries, Wietse; Nissim, Malvina. arXiv, 2020 (BASE)
19. Fair Is Better than Sensational: Man Is to Doctor as Woman Is to Doctor. In: Computational Linguistics, Vol 46, Iss 2, pp. 487–497 (2020) (BASE)
20. BERTje: A Dutch BERT Model ... (BASE)
Abstract: The transformer-based pre-trained language model BERT has helped to improve state-of-the-art performance on many natural language processing (NLP) tasks. Using the same architecture and parameters, we developed and evaluated a monolingual Dutch BERT model called BERTje. Compared to the multilingual BERT model, which includes Dutch but is only based on Wikipedia text, BERTje is based on a large and diverse dataset of 2.4 billion tokens. BERTje consistently outperforms the equally-sized multilingual BERT model on downstream NLP tasks (part-of-speech tagging, named-entity recognition, semantic role labeling, and sentiment analysis). Our pre-trained Dutch BERT model is made available at https://github.com/wietsedv/bertje. ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1912.09582
DOI: https://dx.doi.org/10.48550/arxiv.1912.09582


© 2013 – 2024 Lin|gu|is|tik