
Search in the Catalogues and Directories

Hits 1 – 20 of 46

1. Observation of new excited $B_s^0$ states
In: Eur.Phys.J.C 81 (7), pp. 601 (2021). ⟨10.1140/epjc/s10052-021-09305-3⟩; https://hal.archives-ouvertes.fr/hal-03010999 [BASE]
2. Generative Imagination Elevates Machine Translation
NAACL 2021; Li, Lei; Long, Quanyu. - : Underline Science Inc., 2021 [BASE]
3. End-to-end Speech Translation via Cross-modal Progressive Training
Ye, Rong; Wang, Mingxuan; Li, Lei. - : arXiv, 2021 [BASE]
4. Locate then Segment: A Strong Pipeline for Referring Image Segmentation
Jing, Ya; Kong, Tao; Wang, Wei. - : arXiv, 2021 [BASE]
5. Personalized Transformer for Explainable Recommendation
Li, Lei; Zhang, Yongfeng; Chen, Li. - : arXiv, 2021 [BASE]
6. Learning Language Specific Sub-network for Multilingual Machine Translation
Lin, Zehui; Wu, Liwei; Wang, Mingxuan. - : arXiv, 2021 [BASE]
7. Contrastive Learning for Many-to-many Multilingual Neural Machine Translation
Pan, Xiao; Wang, Mingxuan; Wu, Liwei. - : arXiv, 2021 [BASE]
8. Multilingual Translation via Grafting Pre-trained Language Models
Sun, Zewei; Wang, Mingxuan; Li, Lei. - : arXiv, 2021 [BASE]
9. Counter-Interference Adapter for Multilingual Machine Translation
Abstract: Developing a unified multilingual model has long been a pursuit for machine translation. However, existing approaches suffer from performance degradation: a single multilingual model is inferior to separately trained bilingual ones on rich-resource languages. We conjecture that this phenomenon is caused by interference from joint training with multiple languages. To address the issue, we propose CIAT, an adapted Transformer model with a small parameter overhead for multilingual machine translation. We evaluate CIAT on multiple benchmark datasets, including IWSLT, OPUS-100, and WMT. Experiments show that CIAT consistently outperforms strong multilingual baselines on 64 of the 66 language directions, 42 of which improve by more than 0.5 BLEU. Our code is available at https://github.com/Yaoming95/CIAT. (12 pages; accepted to EMNLP 2021 Findings)
Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2104.08154
https://arxiv.org/abs/2104.08154 [BASE]
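The record above describes CIAT only as "an adapted Transformer model with a small parameter overhead"; the exact architecture is in the linked paper and repository. As a hedged illustration of the adapter family this belongs to, the PyTorch sketch below shows how a small residual bottleneck inserted after a Transformer layer keeps per-language overhead low while leaving the shared weights untouched. The names BottleneckAdapter and AdaptedTransformerLayer, and the placement of one adapter per language, are assumptions for illustration, not CIAT's actual modules.

    import torch
    import torch.nn as nn

    class BottleneckAdapter(nn.Module):
        # Hypothetical module: down-project, nonlinearity, up-project,
        # with a residual connection around the whole bottleneck.
        def __init__(self, d_model, bottleneck=64):
            super().__init__()
            self.down = nn.Linear(d_model, bottleneck)
            self.up = nn.Linear(bottleneck, d_model)
            self.act = nn.ReLU()

        def forward(self, x):
            # The residual path leaves the shared representation intact,
            # so the adapter only adds a small per-language correction.
            return x + self.up(self.act(self.down(x)))

    class AdaptedTransformerLayer(nn.Module):
        # A standard Transformer encoder layer followed by one adapter
        # per language; only the adapters are language-specific.
        def __init__(self, d_model=512, nhead=8, languages=("de", "fr")):
            super().__init__()
            self.base = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.adapters = nn.ModuleDict(
                {lang: BottleneckAdapter(d_model) for lang in languages}
            )

        def forward(self, x, lang):
            return self.adapters[lang](self.base(x))

    layer = AdaptedTransformerLayer()
    out = layer(torch.randn(2, 10, 512), lang="de")  # (batch, seq_len, d_model)

Under this sketch each adapter costs roughly 2 x d_model x bottleneck weights (about 66k parameters here), a few percent of a full Transformer layer, which is consistent with the abstract's "small parameter overhead" claim; whether CIAT places its adapters exactly this way is an assumption.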
10. MTG: A Benchmarking Suite for Multilingual Text Generation [BASE]
11. Language Tags Matter for Zero-Shot Neural Machine Translation [BASE]
12. Personalized Transformer for Explainable Recommendation [BASE]
13. Dynamic Knowledge Distillation for Pre-trained Language Models [BASE]
14. Learning Shared Semantic Space for Speech-to-Text Translation [BASE]
15. Glancing Transformer for Non-Autoregressive Neural Machine Translation [BASE]
16. Text AutoAugment: Learning Compositional Augmentation Policy for Text Classification [BASE]
17. Probabilistic Graph Reasoning for Natural Proof Generation [BASE]
18. Document-level Event Extraction via Heterogeneous Graph-based Interaction Model with a Tracker [BASE]
19. Multilingual Translation via Grafting Pre-trained Language Models [BASE]
20. Language Tags Matter for Zero-Shot Neural Machine Translation [BASE]


Hits by source type: Catalogues 2; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 44