
Search in the Catalogues and Directories

Hits 1 – 20 of 50

1
SOCIOFILLMORE: A Tool for Discovering Perspectives ...
2
IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation ...
Sarti, Gabriele; Nissim, Malvina. - : arXiv, 2022
3
Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer ...
4
DALC: the Dutch Abusive Language Corpus ...
5
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer ...
6
Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ...
7
Generic resources are what you need: Style transfer tasks without task-specific parallel training data ...
8
Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ...
9
As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ...
Abstract: Large generative language models have been very successful for English, but other languages lag behind, in part due to data and computational limitations. We propose a method that may overcome these problems by adapting existing pre-trained models to new languages. Specifically, we describe the adaptation of English GPT-2 to Italian and Dutch by retraining lexical embeddings without tuning the Transformer layers. As a result, we obtain lexical embeddings for Italian and Dutch that are aligned with the original English lexical embeddings. Additionally, we scale up complexity by transforming relearned lexical embeddings of GPT-2 small to the GPT-2 medium embedding space. This method minimises the amount of training and prevents losing information during adaptation that was learned by GPT-2. English GPT-2 models with relearned lexical embeddings can generate realistic sentences in Italian and Dutch. Though on average these sentences are ...
Read paper: https://www.aclanthology.org/2021.findings-acl.74
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/2v1w-0k32
https://underline.io/lecture/26165-as-good-as-new.-how-to-successfully-recycle-english-gpt-2-to-make-models-for-other-languages
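The abstract above describes adapting English GPT-2 to a new language by retraining only the lexical embeddings while leaving the Transformer layers untouched. The gist of that setup can be sketched in plain PyTorch with a toy model (a hypothetical stand-in, not the authors' code; the model size, names, and tied input/output embeddings are illustrative assumptions):

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    # Toy stand-in for GPT-2: token embeddings + one Transformer layer + tied output head.
    def __init__(self, vocab_size=100, d_model=32):
        super().__init__()
        self.wte = nn.Embedding(vocab_size, d_model)            # lexical embeddings (retrained)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.body = nn.TransformerEncoder(layer, num_layers=1)  # Transformer layers (kept frozen)
        self.head = nn.Linear(d_model, vocab_size, bias=False)
        self.head.weight = self.wte.weight                      # weight tying, as in GPT-2

    def forward(self, ids):
        return self.head(self.body(self.wte(ids)))

model = TinyLM()
# Freeze everything, then unfreeze only the (tied) lexical embeddings,
# so gradient updates touch the new language's vocabulary alone.
for p in model.parameters():
    p.requires_grad = False
model.wte.weight.requires_grad = True

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
```

Because the output head shares its weight matrix with the input embeddings (as GPT-2 does), unfreezing `wte.weight` retrains both ends of the model at once; `trainable` contains only that single shared parameter.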
10
Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students ...
11
Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students
Pannitto, Ludovica; Busso, Lucia; Combei, Claudia Roberta. - : Association for Computational Linguistics, 2021
12
What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models ...
13
Personal-ITY: A Novel YouTube-based Corpus for Personality Prediction in Italian ...
14
Datasets and Models for Authorship Attribution on Italian Personal Writings ...
15
Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias ...
16
Matching Theory and Data with Personal-ITY: What a Corpus of Italian YouTube Comments Reveals About Personality ...
17
Unmasking Contextual Stereotypes: Measuring and Mitigating BERT's Gender Bias ...
18
As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ...
de Vries, Wietse; Nissim, Malvina. - : arXiv, 2020
19
Fair Is Better than Sensational: Man Is to Doctor as Woman Is to Doctor
In: Computational Linguistics, Vol. 46, Iss. 2, pp. 487-497 (2020)
20
BERTje: A Dutch BERT Model ...


© 2013 - 2024 Lin|gu|is|tik