1 | Multilingual Neural Machine Translation: Can Linguistic Hierarchies Help? ...
2 | Generalised Unsupervised Domain Adaptation of Neural Machine Translation with Cross-Lingual Data Selection ...
3 | Uncertainty-Aware Balancing for Multilingual and Multi-Domain Neural Machine Translation Training ...
5 | Harnessing Cross-lingual Features to Improve Cognate Detection for Low-resource Languages ...
6 | Learning Coupled Policies for Simultaneous Machine Translation using Imitation Learning ...
7 | SummPip: Unsupervised Multi-Document Summarization with Sentence Graph Compression ...
8 | Leveraging Discourse Rewards for Document-Level Neural Machine Translation ...
Abstract:
Document-level machine translation focuses on the translation of entire documents from a source to a target language. It is widely regarded as a challenging task, since the translation of the individual sentences in the document needs to retain aspects of the discourse at document level. However, document-level translation models are usually not trained to explicitly ensure discourse quality. In this paper we therefore propose a training approach that explicitly optimizes two established discourse metrics, lexical cohesion (LC) and coherence (COH), by using a reinforcement learning objective. Experiments over four language pairs and three translation domains show that our training approach achieves more cohesive and coherent document translations than other competitive approaches, without compromising faithfulness to the reference translation. For the Zh-En language pair, our method achieves an improvement of 2.46 percentage points (pp) in LC and 1.17 pp ...
Comment: Accepted at COLING 2020
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2010.03732 ; https://arxiv.org/abs/2010.03732
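
The reinforcement-learning objective described in this abstract can be illustrated with a minimal REINFORCE-style update: sample a translation from the model, score it with a weighted mix of LC and COH, and scale the sample's log-likelihood by that reward. The Python sketch below is an illustration under assumptions, not the authors' implementation: the metric stubs lexical_cohesion and coherence, the mixing weight alpha, and the constant baseline are all hypothetical placeholders.

import torch

def lexical_cohesion(tokens):
    # Crude LC proxy (hypothetical): share of repeated tokens in the output.
    if len(tokens) < 2:
        return 0.0
    return 1.0 - len(set(tokens)) / len(tokens)

def coherence(tokens):
    # Placeholder COH score (hypothetical); the paper uses an established
    # coherence metric computed at document level.
    return 0.5

def discourse_reward(tokens, alpha=0.5):
    # Mix LC and COH into one scalar reward; alpha is an assumed weight.
    return alpha * lexical_cohesion(tokens) + (1.0 - alpha) * coherence(tokens)

# Toy decoder output: logits over a 100-word vocabulary for a 5-token sample.
logits = torch.randn(5, 100, requires_grad=True)
sampled = [3, 7, 3, 9, 3]  # token ids sampled from the model during training

# REINFORCE-style loss: -(reward - baseline) * sum of log-probs of the sample.
log_probs = torch.log_softmax(logits, dim=-1)[torch.arange(5), torch.tensor(sampled)]
baseline = 0.4  # e.g., a running average of recent rewards
loss = -(discourse_reward(sampled) - baseline) * log_probs.sum()
loss.backward()  # gradients flow into the model that produced the logits

In a real system the reward would be computed over whole documents with the paper's actual LC and COH metrics, and this RL term would typically be combined with the standard likelihood loss to preserve translation faithfulness.
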
9 | Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation ...
10 | Learning to Multi-Task Learn for Better Neural Machine Translation ...
12 | Contextual Neural Model for Translating Bilingual Multi-Speaker Conversations ...
13 | Sequence to Sequence Mixture Model for Diverse Machine Translation ...
14 | Learning how to actively learn: a deep imitation learning approach
16 | Neural Machine Translation for Bilingually Scarce Scenarios: A Deep Multi-task Learning Approach ...
17 | Phonemic transcription of low-resource tonal languages
In: Australasian Language Technology Association Workshop 2017, Dec 2017, Brisbane, Australia, pp. 53-60 (ISSN 1834-7037). https://halshs.archives-ouvertes.fr/halshs-01656683
18 | Towards Decoding as Continuous Optimization in Neural Machine Translation ...
19 | Leveraging linguistic resources for improving neural text classification