
Search in the Catalogues and Directories

Hits 1–20 of 572

1. Ara-Women-Hate: The first Arabic Hate Speech corpus regarding Women (BASE)
2. Towards the Early Detection of Child Predators in Chat Rooms: A BERT-based Approach (BASE)
3. One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets (BASE)
4. STaCK: Sentence Ordering with Temporal Commonsense Knowledge (BASE)
5. Searching for an Effective Defender: Benchmarking Defense against Adversarial Word Substitution (BASE)
6. Graphine: A Dataset for Graph-aware Terminology Definition Generation (BASE)
7. End-to-end style-conditioned poetry generation: What does it take to learn from examples alone? (BASE)
Abstract: In this work, we design an end-to-end model for poetry generation based on conditioned recurrent neural network (RNN) language models whose goal is to learn stylistic features (poem length, sentiment, alliteration, and rhyming) from examples alone. We show this model successfully learns the ‘meaning’ of length and sentiment, as we can control it to generate longer or shorter as well as more positive or more negative poems. However, the model does not grasp sound phenomena like alliteration and rhyming, but instead exploits low-level statistical cues. Possible reasons include the size of the training data, the relatively low frequency and difficulty of these sublexical phenomena, as well as model biases. We show that more recent GPT-2 models also have problems learning sublexical phenomena such as rhyming from examples alone.
Keywords: Computational Creativity; Computational Linguistics; Language Models; Machine Learning; Natural Language Generation; Natural Language Processing; Neural Network; Sentiment Analysis; Text Generation
URL: https://dx.doi.org/10.48448/vbjh-jw57
https://underline.io/lecture/39414-end-to-end-style-conditioned-poetry-generation-what-does-it-take-to-learn-from-examples-alonequestion
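The conditioning mechanism this abstract describes can be illustrated with a minimal sketch: a hypothetical PyTorch model (ConditionedRNNLM; all names, dimensions, and the choice of a GRU are invented here for illustration and are not the paper's implementation) that concatenates a style vector, e.g. a target-length value and a sentiment score, to every token embedding before the recurrent layer, so the language model can modulate generation by style.

# Minimal sketch of a style-conditioned RNN language model (assumes PyTorch).
# Illustrative only: names and dimensions are hypothetical, not the authors' code.
import torch
import torch.nn as nn

class ConditionedRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, cond_dim=2, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # GRU sees the token embedding plus the style condition at every step.
        self.rnn = nn.GRU(embed_dim + cond_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, cond, hidden=None):
        # tokens: (batch, seq) token ids; cond: (batch, cond_dim) style vector
        emb = self.embed(tokens)                               # (batch, seq, embed_dim)
        cond_seq = cond.unsqueeze(1).expand(-1, emb.size(1), -1)
        rnn_in = torch.cat([emb, cond_seq], dim=-1)
        output, hidden = self.rnn(rnn_in, hidden)
        return self.out(output), hidden                        # logits over vocabulary

# Toy usage: condition on (normalized target length, sentiment in [-1, 1]).
model = ConditionedRNNLM(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 12))                       # dummy token ids
cond = torch.tensor([[0.8, 1.0]] * 4)                          # long, positive poems
logits, _ = model(tokens, cond)
print(logits.shape)                                            # torch.Size([4, 12, 1000])

Feeding the condition at every step, rather than only initializing the hidden state with it, is one common design choice: it keeps the style signal available throughout long sequences instead of letting it fade from the recurrent state.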
8. To what extent do human explanations of model behavior align with actual model behavior? (BASE)
9. Time-aware Graph Neural Network for Entity Alignment between Temporal Knowledge Graphs (BASE)
10. What’s Hidden in a One-layer Randomly Weighted Transformer? (BASE)
11. Finetuning Pretrained Transformers into RNNs (BASE)
12. Comparing Span Extraction Methods for Semantic Role Labeling (BASE)
13. Sometimes We Want Ungrammatical Translations (BASE)
14. Pruning Neural Machine Translation for Speed Using Group Lasso (BASE)
15. Elementary-Level Math Word Problem Generation using Pre-Trained Transformers (BASE)
16. Does External Knowledge Help Explainable Natural Language Inference? Automatic Evaluation vs. Human Ratings (BASE)
17. The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation (BASE)
18. Knowledge Graph Representation Learning using Ordinary Differential Equations (BASE)
19. What Models Know About Their Attackers: Deriving Attacker Information From Latent Representations (BASE)
20. Mind the Context: The Impact of Contextualization in Neural Module Networks for Grounding Visual Referring Expressions (BASE)


Result sources: all 572 hits are open access documents; no hits from catalogues, bibliographies, Linked Open Data catalogues, or online resources.