Catalogue search
Hits 1 – 3 of 3
1
End-to-end style-conditioned poetry generation: What does it take to learn from examples alone?
The 2021 Conference on Empirical Methods in Natural Language Processing 2021
Eger, Steffen
. - : Underline Science Inc., 2021
Abstract:
In this work, we design an end-to-end model for poetry generation based on conditioned recurrent neural network (RNN) language models whose goal is to learn stylistic features (poem length, sentiment, alliteration, and rhyming) from examples alone. We show this model successfully learns the ‘meaning’ of length and sentiment, as we can control it to generate longer or shorter as well as more positive or more negative poems. However, the model does not grasp sound phenomena like alliteration and rhyming, but instead exploits low-level statistical cues. Possible reasons include the size of the training data, the relatively low frequency and difficulty of these sublexical phenomena as well as model biases. We show that more recent GPT-2 models also have problems learning sublexical phenomena such as rhyming from examples alone.
Keyword: Computational Creativity; Computational Linguistics; Language Models; Machine Learning; Natural language generation; Natural Language Processing; Neural Network; Sentiment Analysis; Text Generation
URL:
https://dx.doi.org/10.48448/vbjh-jw57
https://underline.io/lecture/39414-end-to-end-style-conditioned-poetry-generation-what-does-it-take-to-learn-from-examples-alonequestion
BASE
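The abstract above describes conditioning an RNN language model on stylistic features such as length and sentiment. As a rough sketch of that idea (not the authors' code; every name, size, and the feature encoding here are illustrative assumptions), one common way to condition is to concatenate a style-feature vector to each token embedding before the recurrent update:

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative sizes, not those of the paper
VOCAB, EMB, STYLE, HID = 50, 16, 4, 32

# randomly initialized parameters of a single vanilla RNN cell
W_xh = rng.normal(0, 0.1, (EMB + STYLE, HID))
W_hh = rng.normal(0, 0.1, (HID, HID))
W_hy = rng.normal(0, 0.1, (HID, VOCAB))
embed = rng.normal(0, 0.1, (VOCAB, EMB))

def step(token_id, style_vec, h):
    """One conditioned RNN step: the style vector is appended to the embedding."""
    x = np.concatenate([embed[token_id], style_vec])
    h = np.tanh(x @ W_xh + h @ W_hh)
    logits = h @ W_hy
    probs = np.exp(logits - logits.max())  # softmax over next-token logits
    return probs / probs.sum(), h

# hypothetical style vector, e.g. encoding target length and sentiment
style = np.array([1.0, 0.0, 0.5, -0.5])
h = np.zeros(HID)
probs, h = step(3, style, h)
print(probs.shape)  # (50,) — a next-token distribution shaped by the style vector
```

Because the same style vector is fed at every step, gradients from the whole poem flow into it during training, which is how surface-level properties like length can be picked up; the abstract argues that sublexical phenomena (rhyme, alliteration) are much harder to learn this way.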
2
Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training
The 2021 Conference on Empirical Methods in Natural Language Processing 2021
Li, Zhengyan
. - : Underline Science Inc., 2021
BASE
3
Adverse Drug Reaction Classification of Tweets with Fusion of Text and Drug Representations
The 2021 Conference on Empirical Methods in Natural Language Processing 2021
Sakhovskiy, Andrey; Tutubalina, Elena
. - : Underline Science Inc., 2021
BASE
© 2013 - 2024 Lin|gu|is|tik