
Search in the Catalogues and Directories

Hits 1 – 20 of 33

1. WANLI: Worker and AI Collaboration for Natural Language Inference Dataset Creation (BASE)
2. COLD Decoding: Energy-based Constrained Text Generation with Langevin Dynamics (BASE)
3. Annotators with Attitudes: How Annotator Beliefs And Identities Bias Toxic Language Detection (BASE)
4. PIGLeT: Language Grounding Through Neuro-Symbolic Interaction in a 3D World (BASE)
5. Contrastive Explanations for Model Interpretability (BASE)
6. PIGLeT: Language Grounding Through Neuro-Symbolic Interaction in a 3D World (BASE)
7. Reflective Decoding: Beyond Unidirectional Generation with Off-the-Shelf Language Models (BASE)
8. Conversational Multi-Hop Reasoning with Neural Commonsense Knowledge and Symbolic Logic Rules (BASE)
9. Edited Media Understanding Frames: Reasoning About the Intent and Implications of Visual Misinformation (BASE)
10. GO FIGURE: A Meta Evaluation of Factuality in Summarization (BASE)
11. CLIPScore: A Reference-free Evaluation Metric for Image Captioning (BASE)
12. Moral Stories: Situated Reasoning about Norms, Intents, Actions, and their Consequences (BASE)
13. DExperts: Decoding-Time Controlled Text Generation with Experts and Anti-Experts (BASE)
14. Challenges in Automated Debiasing for Toxic Language Detection (BASE)
15. NeuroLogic A*esque Decoding: Constrained Text Generation with Lookahead Heuristics
    Lu, Ximing; Welleck, Sean; West, Peter. - arXiv, 2021 (BASE)
16. From Physical to Social Commonsense: Natural Language and the Natural World
    Forbes, Maxwell. - 2021
    Abstract: Thesis (Ph.D.), University of Washington, 2021. Along with the meteoric rise of computation-hungry models, NLP research has also produced new handcrafted datasets. These datasets allow us to study problems that are difficult by web scraping alone. We can use such data to evaluate and extend machine learning models into new areas. One area of natural interest is work that connects NLP to the outside world. This dissertation describes four projects that present such datasets and computational models. Each project attempts to situate NLP in a context broader than text alone. As a common thread throughout, we make use of commonsense knowledge, either explicitly or implicitly. The first half of the dissertation covers two projects, Verb Physics and Social Chemistry, which contain explicit representations of commonsense knowledge. Respectively, they capture physical commonsense (e.g., that my house is bigger than I am) and social commonsense (e.g., that it's rude for my roommate to run the blender at 5am). The second half studies language production and evaluation. In this half, commonsense implicitly informs the work. Neural Naturalist addresses language generation from image comparisons. Scarecrow focuses on evaluating text generated by large language models. In the conclusion, we urge the field to embrace communication, not merely natural language, and thereby extend the richness of groundings we consider.
    Keywords: commonsense; Computer science; Computer science and engineering; computer vision; machine learning; natural language processing; NLP; social norms
    URL: http://hdl.handle.net/1773/48229 (BASE)
17. Positive AI with Social Commonsense Models
    Sap, Maarten. - 2021 (BASE)
18. Generative Data Augmentation for Commonsense Reasoning (BASE)
19. NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints (BASE)
20. Understanding Natural Language with Commonsense Knowledge Representation, Reasoning, and Simulation (BASE)


© 2013 - 2024 Lin|gu|is|tik