1. Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation
3. On the Copying Behaviors of Pre-Training for Neural Machine Translation
4. Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation
5. On the Inference Calibration of Neural Machine Translation
6. EmpDG: Multi-resolution Interactive Empathetic Dialogue Generation
8. Assessing the Bilingual Knowledge Learned by Neural Machine Translation Models
9. Understanding and Improving Lexical Choice in Non-Autoregressive Translation
10. Information Aggregation for Multi-Head Attention with Routing-by-Agreement
11. Neuron Interaction Based Representation Composition for Neural Machine Translation
12. Multi-Granularity Self-Attention for Neural Machine Translation
13. Towards Understanding Neural Machine Translation with Word Importance
14. Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons
16. Translating pro-drop languages with reconstruction models

Wang, Longyue (ORCID: 0000-0002-9062-6183), Tu, Zhaopeng, Shi, Shuming, Zhang, Tong, Graham, Yvette and Liu, Qun (ORCID: 0000-0002-7000-1792) (2018) Translating pro-drop languages with reconstruction models. In: 32nd AAAI Conference on Artificial Intelligence (AAAI 2018), 2-7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8.

Abstract: Pronouns are frequently omitted in pro-drop languages such as Chinese, which generally poses significant challenges for producing complete translations. To date, very little attention has been paid to the dropped pronoun (DP) problem in neural machine translation (NMT). In this work, we propose a novel reconstruction-based approach to alleviating DP translation problems for NMT models. First, DPs in all source sentences are automatically annotated with parallel information extracted from the bilingual training corpus. Next, the annotated source sentence is reconstructed from hidden representations in the NMT model. With auxiliary training objectives, in terms of reconstruction scores, the parameters of the NMT model are guided to produce enhanced hidden representations that embed the annotated DP information as fully as possible. Experimental results on both Chinese-English and Japanese-English dialogue translation tasks show that the proposed approach significantly and consistently improves translation performance over a strong NMT baseline built directly on the DP-annotated training data.

Keywords: Artificial intelligence; Computational linguistics; Dialogue; Dropped Pronoun; Neural Machine Translation; Pro-Drop Language; Reconstruction Model

URL: http://doras.dcu.ie/23122/
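The training objective summarized in the abstract above, a translation loss augmented with an auxiliary reconstruction score, can be sketched as a weighted sum. This is a minimal illustrative sketch under assumptions, not the authors' implementation; the function name `joint_loss` and the weight `lambda_rec` are hypothetical, and the numeric inputs are placeholder per-batch losses.

```python
def joint_loss(translation_nll, reconstruction_nll, lambda_rec=1.0):
    """Combine the standard translation loss with an auxiliary
    reconstruction loss (hypothetical sketch of the paper's setup).

    The reconstruction term rewards hidden representations from which
    the DP-annotated source sentence can be re-generated, pushing the
    NMT model to retain the annotated dropped-pronoun information.
    """
    return translation_nll + lambda_rec * reconstruction_nll


# Placeholder per-batch negative log-likelihoods, purely illustrative.
loss = joint_loss(translation_nll=2.5, reconstruction_nll=1.0, lambda_rec=0.5)
print(loss)  # 3.0
```

Setting `lambda_rec=0` recovers a plain NMT objective, which makes the weight a convenient knob for ablating the reconstruction term.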
18. Exploiting Deep Representations for Neural Machine Translation
20. Exploiting cross-sentence context for neural machine translation

Wang, Longyue (ORCID: 0000-0002-9062-6183), Tu, Zhaopeng, Way, Andy (ORCID: 0000-0001-5736-5930) and Liu, Qun (ORCID: 0000-0002-7000-1792) (2017) Exploiting cross-sentence context for neural machine translation. In: 2017 Conference on Empirical Methods in Natural Language Processing, 7-8 Sept 2017, Copenhagen, Denmark.