
Search in the Catalogues and Directories

Hits 1 – 20 of 37

1
Neural machine translation with a polysynthetic low resource language [Journal]
Ortega, John E. [Author]; Castro Mamani, Richard [Author]; Cho, Kyunghyun [Author]
DNB Subject Category: Language
2
Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search ...
BASE
3
Monotonic Simultaneous Translation with Chunk-wise Reordering and Refinement ...
BASE
4
Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search ...
BASE
5
The Future is not One-dimensional: Complex Event Schema Induction by Graph Modeling for Event Prediction ...
BASE
6
Comparing Test Sets with Item Response Theory ...
Abstract: Recent years have seen numerous NLP datasets introduced to evaluate the performance of fine-tuned models on natural language understanding tasks. Recent results from large pretrained models, though, show that many of these datasets are largely saturated and unlikely to be able to detect further progress. What kind of datasets are still effective at discriminating among strong models, and what kind of datasets should we expect to be able to detect future improvements? To measure this uniformly across datasets, we draw on Item Response Theory and evaluate 29 datasets using predictions from 18 pretrained Transformer models on individual test examples. We find that Quoref, HellaSwag, and MC-TACO are best suited for distinguishing among state-of-the-art models, while SNLI, MNLI, and CommitmentBank seem to be saturated for current strong models. We also observe span selection task format, which is used for QA datasets like QAMR or SQuAD2.0, is ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
Read paper: https://www.aclanthology.org/2021.acl-long.92
URL: https://underline.io/lecture/25976-comparing-test-sets-with-item-response-theory
https://dx.doi.org/10.48448/0k5e-6z96
BASE
7
DEEP: DEnoising Entity Pre-training for Neural Machine Translation ...
BASE
8
VisualSem: A High-quality Knowledge Graph for Vision and Language ...
BASE
9
Learning to Learn Morphological Inflection for Resource-Poor Languages ...
BASE
10
Improving Conversational Question Answering Systems after Deployment using Feedback-Weighted Learning ...
BASE
11
Cold-start universal information extraction
Huang, Lifu. - 2020
BASE
12
AdapterHub: A Framework for Adapting Transformers
Pfeiffer, Jonas; Rückle, Andreas; Poth, Clifton. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing: System Demonstrations (EMNLP 2020), 2020
BASE
13
Neural Machine Translation with Byte-Level Subwords ...
BASE
14
Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations ...
Gu, Jiatao; Wang, Yong; Cho, Kyunghyun. - : arXiv, 2019
BASE
15
Emergent Linguistic Phenomena in Multi-Agent Communication Games ...
BASE
16
Countering Language Drift via Visual Grounding ...
BASE
17
Insertion-based Decoding with Automatically Inferred Generation Order
In: Transactions of the Association for Computational Linguistics, Vol 7, pp. 661–676 (2019)
BASE
18
Meta-Learning for Low-Resource Neural Machine Translation ...
Gu, Jiatao; Wang, Yong; Chen, Yun. - : arXiv, 2018
BASE
19
Multi-lingual Common Semantic Space Construction via Cluster-consistent Word Embedding ...
BASE
20
From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042)
Cho, Kyunghyun; Dyer, Chris; Blunsom, Phil. - : Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2017. : Dagstuhl Reports. Dagstuhl Reports, Volume 7, Issue 1, 2017
BASE


Hits by source type: Catalogues 1 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 36
© 2013 - 2024 Lin|gu|is|tik