
Search in the Catalogues and Directories

Hits 1 – 20 of 55

1. Ara-Women-Hate: The first Arabic Hate Speech corpus regarding Women ...
2. Towards the Early Detection of Child Predators in Chat Rooms: A BERT-based Approach ...
3. STaCK: Sentence Ordering with Temporal Commonsense Knowledge ...
4. Searching for an Effective Defender: Benchmarking Defense against Adversarial Word Substitution ...
5. Graphine: A Dataset for Graph-aware Terminology Definition Generation ...
6. To what extent do human explanations of model behavior align with actual model behavior? ...
7. Time-aware Graph Neural Network for Entity Alignment between Temporal Knowledge Graphs ...
8. What’s Hidden in a One-layer Randomly Weighted Transformer? ...
9. Finetuning Pretrained Transformers into RNNs ...
10. Sometimes We Want Ungrammatical Translations ...
11. Pruning Neural Machine Translation for Speed Using Group Lasso ...
12. Elementary-Level Math Word Problem Generation using Pre-Trained Transformers ...
13. Does External Knowledge Help Explainable Natural Language Inference? Automatic Evaluation vs. Human Ratings ...
14. The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation ...
Abstract: A “bigger is better” explosion in the number of parameters in deep neural networks has made it increasingly challenging to make state-of-the-art networks accessible in compute-restricted environments. Compression techniques have taken on renewed importance as a way to bridge the gap. However, evaluation of the trade-offs incurred by popular compression techniques has been centered on high-resource datasets. In this work, we instead consider the impact of compression in a data-limited regime. We introduce the term low-resource double bind to refer to the co-occurrence of data limitations and compute resource constraints. This is a common setting for NLP for low-resource languages, yet the trade-offs in performance are poorly studied. Our work offers surprising insights into the relationship between capacity and generalization in data-limited regimes for the task of machine translation. Our experiments on magnitude pruning for translations from English into Yoruba, Hausa, Igbo and German show that in ...
Keyword: Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Machine translation; Natural Language Processing; Neural Network
URL: https://underline.io/lecture/39484-the-low-resource-double-bind-an-empirical-study-of-pruning-for-low-resource-machine-translation
https://dx.doi.org/10.48448/mdq2-6d93
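
The abstract refers to magnitude pruning. As a rough illustration only, not the paper's actual code or sparsity schedule, here is a minimal sketch of global unstructured magnitude pruning in PyTorch; the function name magnitude_prune, the restriction to Linear layers, and the 60% sparsity in the usage lines are all illustrative assumptions:

import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float) -> None:
    # Collect the weights of all Linear layers (illustrative scope;
    # the paper's exact pruning targets are not reproduced here).
    weights = [m.weight for m in model.modules() if isinstance(m, nn.Linear)]
    # Rank every weight by absolute magnitude, globally across layers.
    magnitudes = torch.cat([w.detach().abs().flatten() for w in weights])
    k = int(sparsity * magnitudes.numel())
    if k == 0:
        return  # nothing to prune at this sparsity level
    # The k-th smallest magnitude serves as the pruning threshold.
    threshold = torch.kthvalue(magnitudes, k).values
    with torch.no_grad():
        for w in weights:
            # Zero out every weight at or below the threshold.
            w.mul_((w.abs() > threshold).to(w.dtype))

# Usage: prune 60% of the weights of a toy two-layer model.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))
magnitude_prune(model, sparsity=0.6)

Note that pruning studies such as this one typically interleave pruning with continued training; the sketch above only applies a one-shot mask.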
15. Knowledge Graph Representation Learning using Ordinary Differential Equations ...
16. What Models Know About Their Attackers: Deriving Attacker Information From Latent Representations ...
17. Mind the Context: The Impact of Contextualization in Neural Module Networks for Grounding Visual Referring Expressions ...
18. EM ALBERT: a step towards equipping Manipuri for NLP ...
19. ProtoInfoMax: Prototypical Networks with Mutual Information Maximization for Out-of-Domain Detection ...
20. Influence Tuning: Demoting Spurious Correlations via Instance Attribution and Instance-Driven Updates ...
