
Search in the Catalogues and Directories

Hits 1 – 3 of 3

1
End-to-end style-conditioned poetry generation: What does it take to learn from examples alone? ...
Abstract: In this work, we design an end-to-end model for poetry generation based on conditioned recurrent neural network (RNN) language models whose goal is to learn stylistic features (poem length, sentiment, alliteration, and rhyming) from examples alone. We show this model successfully learns the ‘meaning’ of length and sentiment, as we can control it to generate longer or shorter as well as more positive or more negative poems. However, the model does not grasp sound phenomena like alliteration and rhyming, but instead exploits low-level statistical cues. Possible reasons include the size of the training data, the relatively low frequency and difficulty of these sublexical phenomena as well as model biases. We show that more recent GPT-2 models also have problems learning sublexical phenomena such as rhyming from examples alone. ...
Keyword: Computational Creativity; Computational Linguistics; Language Models; Machine Learning; Natural language generation; Natural Language Processing; Neural Network; Sentiment Analysis; Text Generation
URL: https://dx.doi.org/10.48448/vbjh-jw57
https://underline.io/lecture/39414-end-to-end-style-conditioned-poetry-generation-what-does-it-take-to-learn-from-examples-alonequestion
BASE
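As a rough illustration of the style-conditioning idea described in the abstract above, the sketch below shows one common way to condition an RNN language model on stylistic features: a small style vector (e.g. normalized poem length and a sentiment score) is concatenated to every token embedding before the recurrent layer. This is a minimal sketch under assumed dimensions and names, not the authors' implementation.

```python
# Minimal sketch of a style-conditioned RNN language model in PyTorch.
# All names and dimensions are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn


class StyleConditionedLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, style_dim=2, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # The GRU receives the token embedding concatenated with the style
        # vector at every time step, so the style signal stays available
        # throughout the sequence.
        self.rnn = nn.GRU(embed_dim + style_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, style):
        # tokens: (batch, seq_len) token ids
        # style:  (batch, style_dim), e.g. [normalized_length, sentiment]
        emb = self.embed(tokens)                                     # (B, T, E)
        style_rep = style.unsqueeze(1).expand(-1, emb.size(1), -1)   # (B, T, S)
        rnn_in = torch.cat([emb, style_rep], dim=-1)                 # (B, T, E+S)
        hidden, _ = self.rnn(rnn_in)
        return self.out(hidden)                                      # (B, T, V) logits


# Tiny usage example: one next-token cross-entropy training step on random data.
model = StyleConditionedLM(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 20))
style = torch.rand(4, 2)  # e.g. length in [0, 1], sentiment in [0, 1]
logits = model(tokens[:, :-1], style)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, 1000), tokens[:, 1:].reshape(-1)
)
loss.backward()
```

At generation time, varying the style vector (e.g. raising the sentiment value) is what lets such a model be steered toward longer or more positive poems, which is the kind of controllability the abstract reports for length and sentiment.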
2
Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training ...
BASE
3
Adverse Drug Reaction Classification of Tweets with Fusion of Text and Drug Representations ...
BASE
