
Search in the Catalogues and Directories

Hits 1 – 20 of 32

1. Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ... (BASE)
2. Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ... (BASE)
3. On the Copying Behaviors of Pre-Training for Neural Machine Translation ... (BASE)
4. Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation ... (BASE)
5. On the Inference Calibration of Neural Machine Translation ... Wang, Shuo; Tu, Zhaopeng; Shi, Shuming. arXiv, 2020. (BASE)
6. EmpDG: Multi-resolution Interactive Empathetic Dialogue Generation ... (BASE)
7. On the Sparsity of Neural Machine Translation Models ... (BASE)
8. Assessing the Bilingual Knowledge Learned by Neural Machine Translation Models ... He, Shilin; Wang, Xing; Shi, Shuming. arXiv, 2020. (BASE)
9. Understanding and Improving Lexical Choice in Non-Autoregressive Translation ... Ding, Liang; Wang, Longyue; Liu, Xuebo. arXiv, 2020. (BASE)
10. Information Aggregation for Multi-Head Attention with Routing-by-Agreement ... Li, Jian; Yang, Baosong; Dou, Zi-Yi. arXiv, 2019. (BASE)
11. Neuron Interaction Based Representation Composition for Neural Machine Translation ... (BASE)
Abstract: Recent NLP studies reveal that substantial linguistic information can be attributed to single neurons, i.e., individual dimensions of the representation vectors. We hypothesize that modeling strong interactions among neurons helps to better capture complex information by composing the linguistic properties embedded in individual neurons. Starting from this intuition, we propose a novel approach to compose representations learned by different components in neural machine translation (e.g., multi-layer networks or multi-head attention), based on modeling strong interactions among neurons in the representation vectors. Specifically, we leverage bilinear pooling to model pairwise multiplicative interactions among individual neurons, and a low-rank approximation to make the model computationally feasible. We further propose extended bilinear pooling to incorporate first-order representations. Experiments on WMT14 English-German and English-French translation tasks show that our model consistently improves ... : AAAI 2020 ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.1911.09877
https://arxiv.org/abs/1911.09877
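The abstract of hit 11 describes low-rank bilinear pooling with first-order terms. As a rough illustration of that idea only (not the paper's implementation: the matrices `U` and `V`, the dimensions, and the random initialization below are assumptions for this sketch), the elementwise product of two learned projections gives a rank-limited approximation of all pairwise multiplicative interactions, and appending a constant 1 to each input folds the first-order terms into the same expression:

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 8, 16  # input dimension and projection dimension (illustrative sizes)

# Projection matrices of the low-rank factorization. Randomly initialized
# here; in a real model they would be learned end-to-end.
U = rng.standard_normal((d + 1, k))
V = rng.standard_normal((d + 1, k))

def extended_bilinear_pooling(x, y):
    """Low-rank bilinear pooling with first-order terms.

    Full bilinear pooling computes all pairwise products x_i * y_j, which
    costs O(d^2) per output unit. The low-rank trick replaces each weight
    matrix W_j with an outer product u_j v_j^T, so the whole output reduces
    to (U^T x) * (V^T y), an elementwise product in k dimensions. Appending
    a constant 1 to each input ("extended" pooling) makes the first-order
    projections of x and y appear in the same expression.
    """
    x1 = np.append(x, 1.0)          # [x; 1]
    y1 = np.append(y, 1.0)          # [y; 1]
    return (U.T @ x1) * (V.T @ y1)  # Hadamard product of the projections

x = rng.standard_normal(d)
y = rng.standard_normal(d)
z = extended_bilinear_pooling(x, y)
print(z.shape)  # (16,)
```

Each output unit `z[j]` equals `x1 @ np.outer(U[:, j], V[:, j]) @ y1`, i.e., a bilinear form whose weight matrix has rank one; stacking k such units yields the low-rank approximation of the full bilinear map.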
12. Multi-Granularity Self-Attention for Neural Machine Translation ... Hao, Jie; Wang, Xing; Shi, Shuming. arXiv, 2019. (BASE)
13. Towards Understanding Neural Machine Translation with Word Importance ... He, Shilin; Tu, Zhaopeng; Wang, Xing. arXiv, 2019. (BASE)
14. Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons ... Hao, Jie; Wang, Xing; Shi, Shuming. arXiv, 2019. (BASE)
15. Translating pro-drop languages with reconstruction models. Wang, Longyue (ORCID 0000-0002-9062-6183); Tu, Zhaopeng; Shi, Shuming; Zhang, Tong; Graham, Yvette; Liu, Qun (ORCID 0000-0002-7000-1792). In: Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18), 2-7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8 (2018). (BASE)
16. Translating pro-drop languages with reconstruction models. Wang, Longyue (ORCID 0000-0002-9062-6183); Tu, Zhaopeng; Shi, Shuming; Zhang, Tong; Graham, Yvette; Liu, Qun (ORCID 0000-0002-7000-1792). In: 32nd AAAI Conference on Artificial Intelligence (AAAI 2018), 2-7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8 (2018). (BASE)
17. Translating Pro-Drop Languages with Reconstruction Models ... (BASE)
18. Exploiting Deep Representations for Neural Machine Translation ... Dou, Zi-Yi; Tu, Zhaopeng; Wang, Xing. arXiv, 2018. (BASE)
19. A novel and robust approach for pro-drop language translation [Journal]. Wang, Longyue (author); Tu, Zhaopeng (contributor); Zhang, Xiaojun (contributor). DNB Subject Category: Language.
20. Exploiting cross-sentence context for neural machine translation. Wang, Longyue (ORCID 0000-0002-9062-6183); Tu, Zhaopeng; Way, Andy (ORCID 0000-0001-5736-5930); Liu, Qun (ORCID 0000-0002-7000-1792). In: 2017 Conference on Empirical Methods in Natural Language Processing, 7-8 Sept 2017, Copenhagen, Denmark (2017). (BASE)


© 2013 - 2024 Lin|gu|is|tik