3 | Language as a bootstrap for compositional visual reasoning
    In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)
5 | Cetacean Translation Initiative: a roadmap to deciphering the communication of sperm whales ...
6 | Implicit Representations of Meaning in Neural Language Models ...
8 | How Do Neural Sequence Models Generalize? Local and Global Cues for Out-of-Distribution Prediction ...
9 | Value-Agnostic Conversational Semantic Parsing ...
    The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021; Andreas, Jacob; Guo, Alan (algu@microsoft.com); Klein, Dan; Krishnamurthy, Jayant; Kyte, Alex; Pauls, Adam; Platanios, Anthony; Roy, Subhro; Thomson, Sam; Wolfe, Jason; Zhang, Yuchen. - Underline Science Inc., 2021

    Abstract: Conversational semantic parsers map user utterances to executable programs given dialogue histories composed of previous utterances, programs, and system responses. Existing parsers typically condition on rich representations of history that include the complete set of values and computations previously discussed. We propose a model that abstracts over values to focus prediction on type- and function-level context. This approach provides a compact encoding of dialogue histories and predicted programs, improving generalization and computational efficiency. Our model incorporates several other components, including an atomic span copy operation and structural enforcement of well-formedness constraints on predicted programs, that are particularly advantageous in the low-data regime. Trained on the SMCalFlow and TreeDST datasets, our model outperforms prior work by 7.3% and 10.6% respectively in terms of absolute accuracy. Trained on only a ...

    Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics

    URL: https://underline.io/lecture/25635-value-agnostic-conversational-semantic-parsing
    Paper: https://www.aclanthology.org/2021.acl-long.284
    DOI: https://dx.doi.org/10.48448/475n-cx87
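    As a rough illustration of the value-abstraction idea described in the abstract above, the following minimal Python sketch strips concrete values out of a toy dialogue-history program, leaving only function- and type-level structure. It is not the authors' implementation; the program syntax, the abstract_values function, and the placeholder names are invented for this illustration.

        import re

        # Toy program from an earlier dialogue turn, written in an invented
        # Lisp-like form loosely inspired by SMCalFlow-style plans (hypothetical syntax).
        TOY_HISTORY = '(CreateEvent (subject "team sync") (start (DateTime 2021 7 14)) (attendees 3))'

        def abstract_values(program: str) -> str:
            """Replace concrete values with type placeholders so that only
            function names and argument structure remain (the 'type- and
            function-level context' the abstract refers to)."""
            program = re.sub(r'"[^"]*"', 'STRING', program)  # string literals -> STRING
            program = re.sub(r'\b\d+\b', 'NUMBER', program)  # numeric literals -> NUMBER
            return program

        print(abstract_values(TOY_HISTORY))
        # (CreateEvent (subject STRING) (start (DateTime NUMBER NUMBER NUMBER)) (attendees NUMBER))

    In a parser conditioned on such abstracted histories, the concrete values would be reintroduced at prediction time (e.g., via a copy mechanism), which is the division of labor the abstract attributes to the atomic span copy operation.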
11 | What Context Features Can Transformer Language Models Use? ...
12 | Quantifying Adaptability in Pre-trained Language Models with 500 Tasks ...
13 | One-Shot Lexicon Learning for Low-Resource Machine Translation ...
16 | The Low-Dimensional Linear Geometry of Contextualized Word Representations ...
20 | A Benchmark for Systematic Generalization in Grounded Language Understanding ...