
Search in the Catalogues and Directories

Hits 1 – 7 of 7

1. huggingface/datasets: 1.18.1 ... (BASE)
2. huggingface/transformers: v4.4.0: S2T, M2M100, I-BERT, mBART-50, DeBERTa-v2, XLSR-Wav2Vec2 ... (BASE)
3. huggingface/datasets: 1.16.0 ... (BASE)
4. Transformers: State-of-the-Art Natural Language Processing ... (BASE)
   Abstract: New model additions. Perceiver: eight new models are released as part of the Perceiver implementation, in PyTorch: PerceiverModel, PerceiverForMaskedLM, PerceiverForSequenceClassification, PerceiverForImageClassificationLearned, PerceiverForImageClassificationFourier, PerceiverForImageClassificationConvProcessing, PerceiverForOpticalFlow, and PerceiverForMultimodalAutoencoding. The Perceiver IO model was proposed in "Perceiver IO: A General Architecture for Structured Inputs & Outputs" by Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, and João Carreira. Perceiver IO was added by @NielsRogge in https://github.com/huggingface/transformers/pull/14487; compatible checkpoints can be found on the hub: https://huggingface.co/models?other=perceiver (a minimal usage sketch follows this results list). mLUKE: the mLUKE tokenizer is added. The tokenizer can be used for the multilingual variant of ... If you use this software, please cite it using these metadata. ...
   URL: https://zenodo.org/record/5770483
   DOI: https://dx.doi.org/10.5281/zenodo.5770483
5. Transformers: State-of-the-Art Natural Language Processing ... (BASE)
6. huggingface/transformers: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa ... (BASE)
7. MultiFiT: Efficient Multi-lingual Language Model Fine-tuning ... (BASE)
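Since the record in hit 4 announces the new Perceiver classes, the following is a minimal sketch of what using one of them looks like. It assumes a transformers version that ships the Perceiver additions; the "deepmind/language-perceiver" checkpoint name and the byte offsets for the masked span are illustrative assumptions drawn from the Hub, not details taken from the record itself.

```python
# Minimal sketch: byte-level masked language modeling with the
# PerceiverForMaskedLM class announced in the record above.
# Assumptions (not from the record): the "deepmind/language-perceiver"
# checkpoint and the byte offsets used for masking.
from transformers import PerceiverTokenizer, PerceiverForMaskedLM

tokenizer = PerceiverTokenizer.from_pretrained("deepmind/language-perceiver")
model = PerceiverForMaskedLM.from_pretrained("deepmind/language-perceiver")

text = "This is an incomplete sentence where some words are missing."
inputs = tokenizer(text, padding="max_length", return_tensors="pt")

# The Perceiver tokenizer operates on raw bytes; positions 52:61 cover
# " missing." (shifted by one for the leading special token).
inputs["input_ids"][0, 52:61] = tokenizer.mask_token_id

outputs = model(**inputs)
predictions = outputs.logits[0, 52:61].argmax(dim=-1).tolist()
print(tokenizer.decode(predictions))
```

The same record also announces the mLUKE tokenizer; under the same caveats it would be loaded analogously, e.g. MLukeTokenizer.from_pretrained("studio-ousia/mluke-base"), where the checkpoint name is again an assumption rather than part of the record.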

Hits by resource type: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 7