
Search in the Catalogues and Directories

Hits 1 – 20 of 31

1. Natural Language Descriptions of Deep Visual Features ...
2. Compositionality as Lexical Symmetry ...
Akyürek, Ekin; Andreas, Jacob. arXiv, 2022
3. Language as a bootstrap for compositional visual reasoning
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)
4. Compositional Models for Few Shot Sequence Learning
Akyurek, Ekin. Massachusetts Institute of Technology, 2021
5. Cetacean Translation Initiative: a roadmap to deciphering the communication of sperm whales ...
6. Implicit Representations of Meaning in Neural Language Models ...
7. Implicit Representations of Meaning in Neural Language Models ...
8. How Do Neural Sequence Models Generalize? Local and Global Cues for Out-of-Distribution Prediction ...
9. Value-Agnostic Conversational Semantic Parsing ...
Paper: https://www.aclanthology.org/2021.acl-long.284
Abstract: Conversational semantic parsers map user utterances to executable programs given dialogue histories composed of previous utterances, programs, and system responses. Existing parsers typically condition on rich representations of history that include the complete set of values and computations previously discussed. We propose a model that abstracts over values to focus prediction on type- and function-level context. This approach provides a compact encoding of dialogue histories and predicted programs, improving generalization and computational efficiency. Our model incorporates several other components, including an atomic span copy operation and structural enforcement of well-formedness constraints on predicted programs, that are particularly advantageous in the low-data regime. Trained on the SMCalFlow and TreeDST datasets, our model outperforms prior work by 7.3% and 10.6% respectively in terms of absolute accuracy. Trained on only a ...
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25635-value-agnostic-conversational-semantic-parsing
DOI: https://dx.doi.org/10.48448/475n-cx87
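The abstract above describes value abstraction only at a high level. As a rough illustrative sketch (not the paper's implementation; the function name, placeholder tags, and example programs here are hypothetical), the idea of replacing concrete values in a dialogue history with typed placeholders so that a parser conditions only on function- and type-level context could look like this:

```python
import re

# Hypothetical sketch of "value abstraction": concrete literals in a
# LISP-like dialogue-history program are replaced with typed placeholders,
# leaving only function- and type-level structure for the parser to condition on.
# (Illustrative only; not the model or dataset format from the paper.)

def abstract_values(program: str) -> str:
    program = re.sub(r'"[^"]*"', "<String>", program)      # string literals
    program = re.sub(r"\b\d+(\.\d+)?\b", "<Number>", program)  # numeric literals
    return program

history = [
    '(CreateEvent (subject "lunch with Ana") (hour 12) (minutes 30))',
    '(FindPerson (name "Ana"))',
]

for turn in history:
    print(abstract_values(turn))
# -> (CreateEvent (subject <String>) (hour <Number>) (minutes <Number>))
# -> (FindPerson (name <String>))
```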
10. Lexicon Learning for Few-Shot Neural Sequence Modeling ...
Akyürek, Ekin; Andreas, Jacob. arXiv, 2021
11. What Context Features Can Transformer Language Models Use? ...
O'Connor, Joe; Andreas, Jacob. arXiv, 2021
12. Quantifying Adaptability in Pre-trained Language Models with 500 Tasks ...
13. One-Shot Lexicon Learning for Low-Resource Machine Translation ...
14. Lexicon Learning for Few Shot Sequence Modeling ...
15. What Context Features Can Transformer Language Models Use? ...
16. The Low-Dimensional Linear Geometry of Contextualized Word Representations ...
Hernandez, Evan; Andreas, Jacob. arXiv, 2021
17. The Low-Dimensional Linear Geometry of Contextualized Word Representations ...
18. Experience Grounds Language ...
19. Compositional Explanations of Neurons ...
Mu, Jesse; Andreas, Jacob. arXiv, 2020
20. A Benchmark for Systematic Generalization in Grounded Language Understanding ...


Hits by source type: Catalogues 0 | Bibliographies 0 | Linked Open Data catalogues 0 | Online resources 0 | Open access documents 31