
Search in the Catalogues and Directories

Hits 1 – 5 of 5

1
Code Generation from Natural Language with Less Prior Knowledge and More Monolingual Data ...
BASE
2
Hierarchical Neural Data Synthesis for Semantic Parsing ...
Yang, Wei; Xu, Peng; Cao, Yanshuai. arXiv, 2021
BASE
3
Optimizing Deeper Transformers on Small Datasets ...
Abstract: It is a common belief that training deep transformers from scratch requires large datasets. Consequently, for small datasets, people usually use shallow and simple additional layers on top of pre-trained models during fine-tuning. This work shows that this does not always need to be the case: with proper initialization and optimization, the benefits of very deep transformers can carry over to challenging tasks with small datasets, including Text-to-SQL semantic parsing and logical reading comprehension. In particular, we successfully train 48 layers of transformers, comprising 24 fine-tuned layers from pre-trained RoBERTa and 24 relation-aware layers trained from scratch. With fewer training steps and no task-specific pre-training, we obtain state-of-the-art performance on the challenging cross-domain Text-to-SQL parsing benchmark Spider. We achieve this by deriving a novel Data-dependent Transformer Fixed-update initialization scheme ...
Read paper: https://www.aclanthology.org/2021.acl-long.163
(A hedged sketch of the initialization idea follows this entry.)
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/ehsy-3055
https://underline.io/lecture/25482-optimizing-deeper-transformers-on-small-datasets
BASE
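The abstract names a Data-dependent Transformer Fixed-update initialization (DT-Fixup) but does not spell out the formula. Below is a minimal, hypothetical PyTorch sketch of the general idea as described: estimate input magnitudes from a sample batch, then shrink the freshly initialized layers' weights so that early updates stay bounded. The scaling factor, the parameter-name filter ("out_proj", "linear2", which follow PyTorch's TransformerEncoderLayer naming), and the usage setup are all illustrative assumptions, not the paper's exact scheme.

```python
import torch
import torch.nn as nn

def scale_new_layers(layers: nn.ModuleList, sample_batch: torch.Tensor) -> None:
    """Illustrative data-dependent down-scaling in the spirit of DT-Fixup.

    NOT the paper's exact formula; a plausible stand-in only.
    """
    n = len(layers)
    with torch.no_grad():
        # Data-dependent term: largest per-token representation norm in one batch.
        max_norm = sample_batch.norm(dim=-1).max().clamp(min=1.0)
        # Heuristic factor: shrinks with depth and with input magnitude.
        scale = (2 * n) ** -0.5 / max_norm
        for layer in layers:
            for name, param in layer.named_parameters():
                # Rescale only the residual-branch output projections.
                if param.dim() > 1 and ("out_proj" in name or "linear2" in name):
                    param.mul_(scale)

# Hypothetical usage: 24 layers trained from scratch on top of RoBERTa outputs.
# Plain encoder layers stand in for the paper's relation-aware layers.
new_layers = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model=768, nhead=8, batch_first=True)
    for _ in range(24)
)
encoder_out = torch.randn(4, 32, 768)  # placeholder for real RoBERTa outputs
scale_new_layers(new_layers, encoder_out)
```

In the paper's setting, the sample batch would come from the 24 fine-tuned RoBERTa layers rather than random noise; the key design point is that the scale is computed from data before training begins, rather than being a fixed depth-only constant.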
4
TURING: an Accurate and Interpretable Multi-Hypothesis Cross-Domain Natural Language Database Interface ...
BASE
5
Adversarial Contrastive Estimation ...
BASE

Open access documents: 5