Search in the Catalogues and Directories

Hits 21–40 of 1,423 (page 2 of 72)

21. HIT - A Hierarchically Fused Deep Attention Network for Robust Code-mixed Language Representation ... (BASE)
22. Minimally-Supervised Morphological Segmentation using Adaptor Grammars with Linguistic Priors ... (BASE)
23. Bridging Subword Gaps in Pretrain-Finetune Paradigm for Natural Language Generation ... (BASE)
24. LearnDA: Learnable Knowledge-Guided Data Augmentation for Event Causality Identification ... (BASE)
25. Quotation Recommendation and Interpretation Based on Transformation from Queries to Quotations ... (BASE)
26. How Did This Get Funded?! Automatically Identifying Quirky Scientific Achievements ... (BASE)
27. Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ... (BASE)
Abstract: Model-agnostic meta-learning (MAML) has recently been put forth as a strategy for learning resource-poor languages in a sample-efficient fashion. Nevertheless, the properties of these languages are often not well represented by those available during training. Hence, we argue that the i.i.d. assumption ingrained in MAML makes it ill-suited for cross-lingual NLP. In fact, under a decision-theoretic framework, MAML can be interpreted as minimising the expected risk across training languages (with a uniform prior), which is known as the Bayes criterion. To increase its robustness to outlier languages, we create two variants of MAML based on alternative criteria: Minimax MAML reduces the maximum risk across languages, while Neyman–Pearson MAML constrains the risk in each language to a maximum threshold. Both criteria constitute fully differentiable two-player games. In light of this, we propose a new adaptive optimiser solving for a local ...
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
Paper: https://www.aclanthology.org/2021.findings-acl.106
Lecture: https://underline.io/lecture/26197-minimax-and-neyman-pearson-meta-learning-for-outlier-languages
DOI: https://dx.doi.org/10.48448/zydv-7c20
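The three decision-theoretic criteria contrasted in the abstract of item 27 can be illustrated on a toy set of per-language risks. This is a minimal sketch only: the risk values, the threshold `delta`, and the fixed multipliers `lambdas` are hypothetical and not taken from the paper, which treats the multipliers as the adversary in a differentiable two-player game.

```python
# Per-language validation risks for a meta-learned model.
# All numbers below are illustrative, not from the paper.
risks = [0.20, 0.25, 0.80]   # e.g. two training languages and one outlier
delta = 0.30                 # Neyman-Pearson per-language risk threshold
lambdas = [1.0, 1.0, 1.0]    # Lagrange multipliers (fixed here; in the paper
                             # they are optimised adversarially)

# Bayes criterion (standard MAML view): expected risk under a uniform prior.
bayes = sum(risks) / len(risks)

# Minimax criterion: the worst-case risk across languages.
minimax = max(risks)

# Neyman-Pearson criterion: expected risk plus Lagrangian penalties for
# languages whose risk exceeds the threshold delta.
neyman_pearson = bayes + sum(
    l * max(r - delta, 0.0) for l, r in zip(lambdas, risks)
)
```

On this toy example the Bayes criterion averages away the outlier language, while the minimax and Neyman–Pearson criteria both surface its high risk, which is the robustness argument the abstract makes.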
28. CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding ... (BASE)
29. Towards Protecting Vital Healthcare Programs by Extracting Actionable Knowledge from Policy ... (BASE)
30. DYPLOC: Dynamic Planning of Content Using Mixed Language Models for Text Generation ... (BASE)
31. Automated Concatenation of Embeddings for Structured Prediction ... (BASE)
32. QASR: QCRI Aljazeera Speech Resource – A Large Scale Annotated Arabic Speech Corpus ... (BASE)
33. Code Generation from Natural Language with Less Prior Knowledge and More Monolingual Data ... (BASE)
34. On the Distribution, Sparsity, and Inference-time Quantization of Attention Values in Transformers ... (BASE)
35. Learning Disentangled Latent Topics for Twitter Rumour Veracity Classification ... (BASE)
36. Sequence Models for Computational Etymology of Borrowings ... (BASE)
37. Scaling Within Document Coreference to Long Texts ... (BASE)
38. How to Split: the Effect of Word Segmentation on Gender Bias in Speech Translation ... (BASE)
39. Prefix-Tuning: Optimizing Continuous Prompts for Generation ... (BASE)
40. Chase: A Large-Scale and Pragmatic Chinese Dataset for Cross-Database Context-Dependent Text-to-SQL ... (BASE)


Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 1,423
© 2013 – 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings