Search in the Catalogues and Directories

Hits 1 – 20 of 73

1. AUTOLEX: An Automatic Framework for Linguistic Exploration ...
2. MCoNaLa: A Benchmark for Code Generation from Multiple Natural Languages ...
3. A Systematic Evaluation of Large Language Models of Code ...
4. Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation ...
5. Phoneme Recognition through Fine Tuning of Phonetic Representations: a Case Study on Luhya Language Varieties ...
6. Few-shot Language Coordination by Modeling Theory of Mind ...
7. Systematic Inequalities in Language Technology Performance across the World's Languages ...
8. Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ...
9. Multi-view Subword Regularization ...
10. MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning ...
11. XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation ...
12. When Does Translation Require Context? A Data-driven, Multilingual Exploration ...
13. Breaking Down Multilingual Machine Translation ...
14. Efficient Test Time Adapter Ensembling for Low-resource Language Varieties ...
15. Distributionally Robust Multilingual Machine Translation ...
    Zhou, Chunting; Levy, Daniel; Li, Xian. arXiv, 2021
16. AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages ...
17. Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
18. Meta Back-translation ...
    Pham, Hieu; Wang, Xinyi; Yang, Yiming. arXiv, 2021
19. On The Ingredients of an Effective Zero-shot Semantic Parser ...
    Abstract: Semantic parsers map natural language utterances into meaning representations (e.g., programs). Such models are typically bottlenecked by the paucity of training data due to the required laborious annotation efforts. Recent studies have performed zero-shot learning by synthesizing training examples of canonical utterances and programs from a grammar, and further paraphrasing these utterances to improve linguistic diversity. However, such synthetic examples cannot fully capture patterns in real data. In this paper we analyze zero-shot parsers through the lenses of the language and logical gaps (Herzig and Berant, 2019), which quantify the discrepancy of language and programmatic patterns between the canonical examples and real-world user-issued ones. We propose bridging these gaps using improved grammars, stronger paraphrasers, and efficient learning methods using canonical examples that most likely reflect real user intents. Our model achieves strong performance on two semantic parsing benchmarks (Scholar, ...
    Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
    URL: https://dx.doi.org/10.48550/arxiv.2110.08381
    https://arxiv.org/abs/2110.08381
20. DEEP: DEnoising Entity Pre-training for Neural Machine Translation ...


Results by source: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 73
© 2013 – 2024 Lin|gu|is|tik