
Search in the Catalogues and Directories

Hits 1 – 20 of 37

1
Neural machine translation with a polysynthetic low resource language [Journal]
Ortega, John E. [Author]; Castro Mamani, Richard [Author]; Cho, Kyunghyun [Author]
DNB Subject Category: Language
2
Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search
BASE
3
Monotonic Simultaneous Translation with Chunk-wise Reordering and Refinement
BASE
4
Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search
BASE
5
The Future is not One-dimensional: Complex Event Schema Induction by Graph Modeling for Event Prediction
BASE
6
Comparing Test Sets with Item Response Theory
BASE
7
DEEP: DEnoising Entity Pre-training for Neural Machine Translation
BASE
8
VisualSem: A High-quality Knowledge Graph for Vision and Language
BASE
9
Learning to Learn Morphological Inflection for Resource-Poor Languages
BASE
10
Improving Conversational Question Answering Systems after Deployment using Feedback-Weighted Learning
BASE
11
Cold-start universal information extraction
Huang, Lifu. - 2020
BASE
12
AdapterHub: A Framework for Adapting Transformers
Pfeiffer, Jonas; Rückle, Andreas; Poth, Clifton. - Association for Computational Linguistics, 2020. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing: System Demonstrations (EMNLP 2020), 2020
BASE
13
Neural Machine Translation with Byte-Level Subwords
BASE
14
Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations
Gu, Jiatao; Wang, Yong; Cho, Kyunghyun. - arXiv, 2019
BASE
15
Emergent Linguistic Phenomena in Multi-Agent Communication Games
BASE
16
Countering Language Drift via Visual Grounding
Abstract: Emergent multi-agent communication protocols are very different from natural language and not easily interpretable by humans. We find that agents that were initially pretrained to produce natural language can also experience detrimental language drift: when a non-linguistic reward is used in a goal-based task, e.g. some scalar success metric, the communication protocol may easily and radically diverge from natural language. We recast translation as a multi-agent communication game and examine auxiliary training constraints for their effectiveness in mitigating language drift. We show that a combination of syntactic (language model likelihood) and semantic (visual grounding) constraints gives the best communication performance, allowing pre-trained agents to retain English syntax while learning to accurately convey the intended meaning. (Accepted to EMNLP 2019)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.1909.04499
https://arxiv.org/abs/1909.04499
BASE
17
Insertion-based Decoding with Automatically Inferred Generation Order
In: Transactions of the Association for Computational Linguistics, Vol 7, Pp 661-676 (2019)
BASE
18
Meta-Learning for Low-Resource Neural Machine Translation
Gu, Jiatao; Wang, Yong; Chen, Yun. - arXiv, 2018
BASE
19
Multi-lingual Common Semantic Space Construction via Cluster-consistent Word Embedding
BASE
20
From Characters to Understanding Natural Language (C2NLU): Robust End-to-End Deep Learning for NLP (Dagstuhl Seminar 17042)
Cho, Kyunghyun; Dyer, Chris; Blunsom, Phil. - Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2017. In: Dagstuhl Reports, Volume 7, Issue 1, 2017
BASE


Hits by source
Catalogues: 1
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 36