1. StableMoE: Stable Routing Strategy for Mixture of Experts
2. Allocating Large Vocabulary Capacity for Cross-lingual Language Model Pre-training
3. Improving Pretrained Cross-Lingual Language Models via Self-Labeled Word Alignment
4. XLM-E: Cross-lingual Language Model Pre-training via ELECTRA
5. Consistency Regularization for Cross-Lingual Fine-Tuning
6. Allocating Large Vocabulary Capacity for Cross-Lingual Language Model Pre-Training
7. Persuasiveness of text messages generated by machine learning language model
9. Towards Better UD Parsing: Deep Contextualized Word Embeddings, Ensemble, and Treebank Concatenation