2. huggingface/transformers: v4.4.0: S2T, M2M100, I-BERT, mBART-50, DeBERTa-v2, XLSR-Wav2Vec2 ...
   Source: BASE
4. Transformers: State-of-the-Art Natural Language Processing ...
   Wolf, Thomas; Debut, Lysandre; Sanh, Victor; Chaumond, Julien; Delangue, Clement; Moi, Anthony; Cistac, Pierric; Ma, Clara; Jernite, Yacine; Plu, Julien; Xu, Canwen; Le Scao, Teven; Gugger, Sylvain; Drame, Mariama; Lhoest, Quentin; Rush, Alexander M. Zenodo, 2020.
   Abstract: New model additions. Perceiver: eight new models are released as part of the Perceiver implementation: PerceiverModel, PerceiverForMaskedLM, PerceiverForSequenceClassification, PerceiverForImageClassificationLearned, PerceiverForImageClassificationFourier, PerceiverForImageClassificationConvProcessing, PerceiverForOpticalFlow, PerceiverForMultimodalAutoencoding, in PyTorch. The Perceiver IO model was proposed in "Perceiver IO: A General Architecture for Structured Inputs & Outputs" by Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira. Add Perceiver IO by @NielsRogge in https://github.com/huggingface/transformers/pull/14487. Compatible checkpoints can be found on the hub: https://huggingface.co/models?other=perceiver. mLUKE: the mLUKE tokenizer is added. The tokenizer can be used for the multilingual variant of ... If you use this software, please cite it using these metadata. ...
   URL: https://zenodo.org/record/5770483 https://dx.doi.org/10.5281/zenodo.5770483
5. Transformers: State-of-the-Art Natural Language Processing ...
6. huggingface/transformers: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa ...
7. huggingface/transformers: Trainer, TFTrainer, Multilingual BART, Encoder-decoder improvements, Generation Pipeline ...
8. huggingface/pytorch-transformers: DistilBERT, GPT-2 Large, XLM multilingual models, bug fixes ...