
Search in the Catalogues and Directories

Hits 1 – 20 of 32

1
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ...
BASE
2
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ...
BASE
3
On the Copying Behaviors of Pre-Training for Neural Machine Translation ...
BASE
4
Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation ...
BASE
5
On the Inference Calibration of Neural Machine Translation ...
Wang, Shuo; Tu, Zhaopeng; Shi, Shuming. - : arXiv, 2020
BASE
6
EmpDG: Multi-resolution Interactive Empathetic Dialogue Generation ...
BASE
7
On the Sparsity of Neural Machine Translation Models ...
BASE
8
Assessing the Bilingual Knowledge Learned by Neural Machine Translation Models ...
He, Shilin; Wang, Xing; Shi, Shuming. - : arXiv, 2020
BASE
9
Understanding and Improving Lexical Choice in Non-Autoregressive Translation ...
Ding, Liang; Wang, Longyue; Liu, Xuebo. - : arXiv, 2020
BASE
10
Information Aggregation for Multi-Head Attention with Routing-by-Agreement ...
Li, Jian; Yang, Baosong; Dou, Zi-Yi. - : arXiv, 2019
BASE
11
Neuron Interaction Based Representation Composition for Neural Machine Translation ...
Li, Jian; Wang, Xing; Yang, Baosong. - : arXiv, 2019
BASE
12
Multi-Granularity Self-Attention for Neural Machine Translation ...
Hao, Jie; Wang, Xing; Shi, Shuming. - : arXiv, 2019
BASE
13
Towards Understanding Neural Machine Translation with Word Importance ...
He, Shilin; Tu, Zhaopeng; Wang, Xing. - : arXiv, 2019
BASE
14
Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons ...
Hao, Jie; Wang, Xing; Shi, Shuming. - : arXiv, 2019
BASE
15
Translating pro-drop languages with reconstruction models
In: Wang, Longyue; Tu, Zhaopeng; Shi, Shuming; Zhang, Tong; Graham, Yvette; Liu, Qun (2018). Translating pro-drop languages with reconstruction models. In: Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18), 2–7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8.
BASE
16
Translating pro-drop languages with reconstruction models
In: Wang, Longyue; Tu, Zhaopeng; Shi, Shuming; Zhang, Tong; Graham, Yvette; Liu, Qun (2018). Translating pro-drop languages with reconstruction models. In: 32nd AAAI Conference on Artificial Intelligence (AAAI 2018), 2–7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8.
BASE
17
Translating Pro-Drop Languages with Reconstruction Models ...
BASE
18
Exploiting Deep Representations for Neural Machine Translation ...
Abstract: Advanced neural machine translation (NMT) models generally implement the encoder and decoder as multiple layers, which allows systems to model complex functions and capture complicated linguistic structures. However, only the top layers of the encoder and decoder are leveraged in the subsequent process, which misses the opportunity to exploit the useful information embedded in other layers. In this work, we propose to simultaneously expose all of these signals with layer aggregation and multi-layer attention mechanisms. In addition, we introduce an auxiliary regularization term to encourage different layers to capture diverse information. Experimental results on the widely used WMT14 English-German and WMT17 Chinese-English translation data demonstrate the effectiveness and universality of the proposed approach. (EMNLP 2018; an illustrative sketch of this layer-aggregation idea follows the results list.)
Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.1810.10181
https://arxiv.org/abs/1810.10181
BASE
19
A novel and robust approach for pro-drop language translation [Journal]
Wang, Longyue [Author]; Tu, Zhaopeng [Contributor]; Zhang, Xiaojun [Contributor].
DNB Subject Category Language
20
Exploiting cross-sentence context for neural machine translation
In: Wang, Longyue; Tu, Zhaopeng; Way, Andy; Liu, Qun (2017). Exploiting cross-sentence context for neural machine translation. In: 2017 Conference on Empirical Methods in Natural Language Processing, 7-8 Sept 2017, Copenhagen, Denmark.
BASE
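
Note on entry 18: the abstract describes exposing the outputs of all encoder and decoder layers through layer aggregation and multi-layer attention, together with an auxiliary regularization term that pushes layers toward diverse representations. The sketch below illustrates that general idea in PyTorch; the class name LayerAggregation, the learned softmax weighting over layers, and the cosine-similarity diversity penalty are assumptions made for illustration, not the formulation in the paper (https://arxiv.org/abs/1810.10181).

import torch
import torch.nn as nn
import torch.nn.functional as F


class LayerAggregation(nn.Module):
    """Fuse the outputs of all layers instead of using only the top layer."""

    def __init__(self, num_layers: int, d_model: int):
        super().__init__()
        # One learnable score per layer; a softmax turns them into mixing weights.
        self.layer_scores = nn.Parameter(torch.zeros(num_layers))
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, layer_outputs: list) -> torch.Tensor:
        # layer_outputs: list of [batch, seq_len, d_model] tensors, one per layer.
        stacked = torch.stack(layer_outputs, dim=0)           # [L, B, T, D]
        weights = F.softmax(self.layer_scores, dim=0)         # [L]
        fused = (weights.view(-1, 1, 1, 1) * stacked).sum(0)  # [B, T, D]
        return self.out_proj(fused)


def diversity_penalty(layer_outputs: list) -> torch.Tensor:
    # An assumed regulariser: penalise pairwise cosine similarity between
    # layer representations so that different layers encode different signals.
    flat = [F.normalize(h.flatten(1), dim=-1) for h in layer_outputs]
    penalty = layer_outputs[0].new_zeros(())
    for i in range(len(flat)):
        for j in range(i + 1, len(flat)):
            penalty = penalty + (flat[i] * flat[j]).sum(-1).mean()
    return penalty


if __name__ == "__main__":
    # Toy usage: six fake encoder layers, batch of 2, sequence length 5, d_model 8.
    layers = [torch.randn(2, 5, 8) for _ in range(6)]
    agg = LayerAggregation(num_layers=6, d_model=8)
    fused = agg(layers)
    aux = diversity_penalty(layers)
    print(fused.shape, float(aux))

In a full NMT system the fused representation would feed the next component in place of the usual top-layer output, and the penalty would be added to the translation loss with a small weight; both choices here are sketch-level assumptions rather than details taken from the paper.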

