
Search in the Catalogues and Directories

Hits 1 – 20 of 33

1. WANLI: Worker and AI Collaboration for Natural Language Inference Dataset Creation (BASE)
2. COLD Decoding: Energy-based Constrained Text Generation with Langevin Dynamics (BASE)
3. Annotators with Attitudes: How Annotator Beliefs And Identities Bias Toxic Language Detection (BASE)
4. PIGLeT: Language Grounding Through Neuro-Symbolic Interaction in a 3D World (BASE)
5. Contrastive Explanations for Model Interpretability (BASE)
6. PIGLeT: Language Grounding Through Neuro-Symbolic Interaction in a 3D World (BASE)
7. Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models (BASE)
   Abstract: Publicly available, large pretrained Language Models (LMs) generate text with remarkable quality, but only sequentially from left to right. As a result, they are not immediately applicable to generation tasks that break the unidirectional assumption, such as paraphrasing or text-infilling, necessitating task-specific supervision. In this paper, we present Reflective Decoding, a novel unsupervised algorithm that allows for direct application of unidirectional LMs to non-sequential tasks. Our 2-step approach requires no supervision or even parallel corpora, only two off-the-shelf pretrained LMs in opposite directions: forward and backward. First, in the contextualization step, we use LMs to generate ensembles of past and future contexts which collectively capture the input (e.g. the source sentence for paraphrasing). Second, in the reflection step, we condition on these "context ensembles", generating outputs that are compatible with them. ...
   Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
   Paper: https://www.aclanthology.org/2021.acl-long.114
   URL: https://underline.io/lecture/26006-reflective-decoding-beyond-unidirectional-generation-with-off-the-shelf-language-models
   DOI: https://dx.doi.org/10.48448/swwt-ax71
8. Conversational Multi-Hop Reasoning with Neural Commonsense Knowledge and Symbolic Logic Rules (BASE)
9. Edited Media Understanding Frames: Reasoning About the Intent and Implications of Visual Misinformation (BASE)
10. GO FIGURE: A Meta Evaluation of Factuality in Summarization (BASE)
11. CLIPScore: A Reference-free Evaluation Metric for Image Captioning (BASE)
12. Moral Stories: Situated Reasoning about Norms, Intents, Actions, and their Consequences (BASE)
13. DExperts: Decoding-Time Controlled Text Generation with Experts and Anti-Experts (BASE)
14. Challenges in Automated Debiasing for Toxic Language Detection (BASE)
15. NeuroLogic A*esque Decoding: Constrained Text Generation with Lookahead Heuristics (BASE)
    Lu, Ximing; Welleck, Sean; West, Peter. - arXiv, 2021
16. From Physical to Social Commonsense: Natural Language and the Natural World (BASE)
    Forbes, Maxwell. - 2021
17. Positive AI with Social Commonsense Models (BASE)
    Sap, Maarten. - 2021
18. Generative Data Augmentation for Commonsense Reasoning (BASE)
19. NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints (BASE)
20. Understanding Natural Language with Commonsense Knowledge Representation, Reasoning, and Simulation (BASE)


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 33
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings