
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1
Generating Authentic Adversarial Examples beyond Meaning-preserving with Doubly Round-trip Translation ...
Lai, Siyu; Yang, Zhen; Meng, Fandong. - : arXiv, 2022
BASE
2
Conditional Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation ...
BASE
3
MSCTD: A Multimodal Sentiment Chat Translation Dataset ...
BASE
4
ClidSum: A Benchmark Dataset for Cross-Lingual Dialogue Summarization ...
Wang, Jiaan; Meng, Fandong; Lu, Ziyao. - : arXiv, 2022
BASE
5
EAG: Extract and Generate Multi-way Aligned Corpus for Complete Multi-lingual Neural Machine Translation ...
Xu, Yulin; Yang, Zhen; Meng, Fandong. - : arXiv, 2022
BASE
6
A Survey on Cross-Lingual Summarization ...
Wang, Jiaan; Meng, Fandong; Zheng, Duo. - : arXiv, 2022
BASE
7
Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation ...
BASE
8
Modeling Bilingual Conversational Characteristics for Neural Chat Translation ...
BASE
9
Sequence-Level Training for Non-Autoregressive Neural Machine Translation ...
Abstract: In recent years, Neural Machine Translation (NMT) has achieved notable results in various translation tasks. However, the word-by-word generation manner imposed by the autoregressive mechanism leads to high translation latency and restricts NMT's use in low-latency applications. Non-Autoregressive Neural Machine Translation (NAT) removes the autoregressive mechanism and achieves significant decoding speedup by generating target words independently and simultaneously. Nevertheless, NAT still takes the word-level cross-entropy loss as the training objective, which is not optimal because the output of NAT cannot be properly evaluated due to the multimodality problem. In this article, we propose using sequence-level training objectives to train NAT models, which evaluate the NAT output as a whole and correlate well with the real translation quality. Firstly, we propose training NAT models to optimize sequence-level evaluation metrics (e.g., BLEU) based on several novel reinforcement algorithms ... : Computational Linguistics Journal ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; I.2.7; Machine Learning (cs.LG)
URL: https://arxiv.org/abs/2106.08122
https://dx.doi.org/10.48550/arxiv.2106.08122
BASE
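The abstract above states the idea only at a high level: replace NAT's word-level cross-entropy loss with a sequence-level objective whose reward scores the output translation as a whole. Below is a minimal Python/PyTorch sketch of one standard way to do this with REINFORCE; the toy parallel decoder, the vocabulary and batch sizes, and the crude n-gram-precision reward are illustrative assumptions, not the paper's actual architecture or algorithms.

    import torch
    import torch.nn as nn

    VOCAB, SRC_LEN, TGT_LEN, DIM = 100, 8, 8, 32  # toy sizes, not from the paper

    class ToyNAT(nn.Module):
        """Toy non-autoregressive decoder: predicts every target word in parallel."""
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(VOCAB, DIM)
            self.proj = nn.Linear(DIM, VOCAB)

        def forward(self, src):
            h = self.embed(src)   # (batch, len, dim); toy setup: src_len == tgt_len
            return self.proj(h)   # independent per-position logits

    def ngram_precision(hyp, ref, n=2):
        """Crude sentence-level n-gram precision, standing in for BLEU."""
        score = 0.0
        for k in range(1, n + 1):
            hyp_ngrams = [tuple(hyp[i:i + k]) for i in range(len(hyp) - k + 1)]
            ref_ngrams = [tuple(ref[i:i + k]) for i in range(len(ref) - k + 1)]
            matches = sum(g in ref_ngrams for g in hyp_ngrams)
            score += matches / max(len(hyp_ngrams), 1)
        return score / n

    model = ToyNAT()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    src = torch.randint(0, VOCAB, (4, SRC_LEN))   # fake source batch
    ref = torch.randint(0, VOCAB, (4, TGT_LEN))   # fake references

    logits = model(src)
    dist = torch.distributions.Categorical(logits=logits)
    sample = dist.sample()                        # one sampled translation per sentence
    seq_log_prob = dist.log_prob(sample).sum(-1)  # log-likelihood of each whole sequence

    # REINFORCE: weight each sequence's log-likelihood by its sentence-level
    # reward, so gradients favor outputs that score well as a whole rather
    # than optimizing each word position independently, as cross-entropy does.
    rewards = torch.tensor([ngram_precision(s.tolist(), r.tolist())
                            for s, r in zip(sample, ref)])
    loss = -(rewards * seq_log_prob).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()

The "several novel reinforcement algorithms" mentioned in the abstract presumably refine this basic estimator (for instance, to reduce the variance of the sampled reward); the sketch shows only the shared skeleton.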
10
Competence-based Curriculum Learning for Multilingual Machine Translation ...
BASE
11
Marginal Utility Diminishes: Exploring the Minimum Knowledge for BERT Knowledge Distillation ...
BASE
12
Towards Making the Most of Dialogue Characteristics for Neural Chat Translation ...
BASE
13
Prevent the Language Model from being Overconfident in Neural Machine Translation ...
BASE
14
Modeling Bilingual Conversational Characteristics for Neural Chat Translation ...
BASE
15
Selective Knowledge Distillation for Neural Machine Translation ...
BASE
16
Target-oriented Fine-tuning for Zero-Resource Named Entity Recognition ...
BASE
17
Exploring Dynamic Selection of Branch Expansion Orders for Code Generation ...
BASE
18
The ICT's Patent MT System Description for NTCIR-9
In: http://research.nii.ac.jp/ntcir/workshop/OnlineProceedings9/NTCIR/19-NTCIR9-PATENTMT-XiongH.pdf
BASE

Hits by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 18
© 2013 – 2024 Lin|gu|is|tik