
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1. DALC: the Dutch Abusive Language Corpus ... (BASE)
2. Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer ... (BASE)
Abstract: Scarcity of parallel data causes formality style transfer models to have scarce success in preserving content. We show that fine-tuning pre-trained language (GPT-2) and sequence-to-sequence (BART) models boosts content preservation, and that this is possible even with limited amounts of parallel data. Augmenting these models with rewards that target style and content, the two core aspects of the task, we achieve a new state-of-the-art.
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
Paper: https://www.aclanthology.org/2021.acl-short.62
URL: https://dx.doi.org/10.48448/d768-0436
https://underline.io/lecture/25601-thank-you-bart!-rewarding-pre-trained-models-improves-formality-style-transfer
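
The setup this abstract describes, casting formality transfer as a sequence-to-sequence task and fine-tuning pre-trained BART on informal-to-formal pairs, can be sketched as below. This is a minimal illustration using the Hugging Face transformers API, not the authors' code: the checkpoint name, the toy sentence pairs, and the hyperparameters are assumptions, and the style/content reward terms the paper adds on top of fine-tuning are omitted.

# Minimal sketch (assumed, not the authors' code): fine-tune a pre-trained
# BART seq2seq model on a handful of informal -> formal parallel pairs.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Hypothetical parallel data: (informal source, formal target).
pairs = [
    ("gotta go, ttyl", "I have to leave now; I will talk to you later."),
    ("that movie was sooo bad lol", "I found that movie quite disappointing."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
for informal, formal in pairs:
    batch = tokenizer(informal, return_tensors="pt")
    labels = tokenizer(formal, return_tensors="pt").input_ids
    # BART shifts the labels internally to build decoder inputs; the loss
    # is plain cross-entropy (no style/content rewards in this sketch).
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Generate a formal rewrite of a new informal sentence with beam search.
model.eval()
inputs = tokenizer("u coming to the mtg?", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=40, num_beams=4)
print(tokenizer.decode(out[0], skip_special_tokens=True))

In practice one would fine-tune on a real parallel corpus such as GYAFC rather than two toy pairs; the point of the sketch is only the mechanics of seq2seq fine-tuning that the abstract credits for improved content preservation.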
3. Adapting Monolingual Models: Data can be Scarce when Language Similarity is High ... (BASE)
4. As Good as New. How to Successfully Recycle English GPT-2 to Make Models for Other Languages ... (BASE)

Results by source:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 4