1. s2s-ft: Fine-Tuning Pretrained Transformer Encoders for Sequence-to-Sequence Learning ...
2. Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains ...
3. InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training ...
4. MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers ...
5. Harvesting and Refining Question-Answer Pairs for Unsupervised QA ...
6. Cross-Lingual Natural Language Generation via Pre-Training ...