
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1
Universal Dependencies 2.9
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
2
Universal Dependencies 2.8.1
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
3
Universal Dependencies 2.8
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
4
BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models
5
Including Signed Languages in Natural Language Processing
6
Including Signed Languages in Natural Language Processing
7
Contrastive Explanations for Model Interpretability
8
Provable Limitations of Acquiring Meaning from Ungrounded Form: What will Future Language Models Understand?
9
Measuring and Improving Consistency in Pretrained Language Models
10
Aligning Faithful Interpretations with their Social Attribution
11
Amnesic Probing: Behavioral Explanation With Amnesic Counterfactuals
12
Data Augmentation for Sign Language Gloss Translation
Abstract: Sign language translation (SLT) is often decomposed into video-to-gloss recognition and gloss-to-text translation, where a gloss is a sequence of transcribed spoken-language words in the order in which they are signed. We focus here on gloss-to-text translation, which we treat as a low-resource neural machine translation (NMT) problem. Unlike traditional low-resource NMT, however, gloss-text pairs often have higher lexical overlap and lower syntactic overlap than pairs of spoken languages. We exploit this lexical overlap and handle the syntactic divergence by proposing two rule-based heuristics that generate pseudo-parallel gloss-text pairs from monolingual spoken-language text. Pre-training on the synthetic data obtained this way improves translation from American Sign Language (ASL) to English and from German Sign Language (DGS) to German by up to 3.14 and 2.20 BLEU, respectively.
Comment: 4 pages, 1 page abstract
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2105.07476
https://dx.doi.org/10.48550/arxiv.2105.07476
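The paper's two rule-based heuristics are not reproduced in this record, but the general idea described in the abstract, turning monolingual spoken-language text into pseudo-glosses by dropping function words and reducing inflection, can be sketched roughly as follows. Everything in the sketch (the function-word list, the crude suffix-stripping "lemmatizer", the upper-cased gloss convention) is an illustrative assumption, not the paper's actual rules.

```python
# Minimal sketch of a rule-based pseudo-gloss generator for monolingual text.
# Illustrative only: the rules below are assumptions, not the paper's heuristics.
import re
from typing import List, Tuple

# Hypothetical set of function words that gloss sequences typically omit.
FUNCTION_WORDS = {
    "a", "an", "the", "is", "are", "was", "were", "be", "been",
    "to", "of", "do", "does", "did", "will",
}

def naive_lemma(token: str) -> str:
    """Very rough lemmatizer: strip a few common English suffixes."""
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def text_to_pseudo_gloss(sentence: str) -> str:
    """Turn a spoken-language sentence into a gloss-like token sequence."""
    tokens = re.findall(r"[a-zA-Z']+", sentence.lower())
    content = [naive_lemma(t) for t in tokens if t not in FUNCTION_WORDS]
    # Glosses are conventionally written in upper case.
    return " ".join(t.upper() for t in content)

def make_pseudo_parallel(corpus: List[str]) -> List[Tuple[str, str]]:
    """Build (pseudo-gloss, original text) pairs from monolingual text."""
    return [(text_to_pseudo_gloss(s), s) for s in corpus]

if __name__ == "__main__":
    for gloss, text in make_pseudo_parallel(["The weather was nice yesterday."]):
        print(gloss, "->", text)
        # WEATHER NICE YESTERDAY -> The weather was nice yesterday.
```

Pairs produced this way would serve as synthetic pre-training data for a gloss-to-text NMT model, before fine-tuning on the real gloss-text corpus, which is how the abstract describes the pre-training step being used.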
13
Effects of Parameter Norm Growth During Transformer Training: Inductive Bias from Gradient Descent
14
Asking It All: Generating Contextualized Questions for any Semantic Role
15
Counterfactual Interventions Reveal the Causal Effect of Relative Clause Representations on Agreement Prediction
16
Neural Extractive Search
17
Counterfactual Interventions Reveal the Causal Effect of Relative Clause Representations on Agreement Prediction
18
Ab Antiquo: Neural Proto-language Reconstruction
NAACL 2021; Goldberg, Yoav; Meloni, Carlo. - : Underline Science Inc., 2021

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 18