1. SMDT: Selective Memory-Augmented Neural Document Translation
2. StableMoE: Stable Routing Strategy for Mixture of Experts
4. Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt
5. On the Representation Collapse of Sparse Mixture of Experts
6. A Unified Strategy for Multilingual Grammatical Error Correction with Pre-trained Cross-Lingual Language Model
7. Controllable Natural Language Generation with Contrastive Prefixes
8. Blow the Dog Whistle: A Chinese Dataset for Cant Understanding with Common Sense and World Knowledge
9. Allocating Large Vocabulary Capacity for Cross-lingual Language Model Pre-training
10. Improving Pretrained Cross-Lingual Language Models via Self-Labeled Word Alignment
11. s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning
12. LayoutXLM: Multimodal Pre-training for Multilingual Visually-rich Document Understanding
13. Towards Making the Most of Multilingual Pretraining for Zero-Shot Neural Machine Translation
14. Jointly Learning to Repair Code and Generate Commit Message
15. Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders
16. MT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs
17. Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task
18. DeltaLM: Encoder-Decoder Pre-training for Language Generation and Translation by Augmenting Pretrained Multilingual Encoders
19. XLM-E: Cross-lingual Language Model Pre-training via ELECTRA
20. LayoutLMv2: Multi-modal Pre-training for Visually-rich Document Understanding