
Search in the Catalogues and Directories

Hits 1 – 8 of 8

1. huggingface/datasets: 1.18.1 ... (BASE)
2. huggingface/transformers: v4.4.0: S2T, M2M100, I-BERT, mBART-50, DeBERTa-v2, XLSR-Wav2Vec2 ... (BASE)
3. huggingface/datasets: 1.16.0 ... (BASE)
4. Transformers: State-of-the-Art Natural Language Processing ... (BASE)
5. Transformers: State-of-the-Art Natural Language Processing ... (BASE)
6. huggingface/transformers: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa ... (BASE)
Abstract: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa. ProphetNet: Two new models are released as part of the ProphetNet implementation: ProphetNet and XLM-ProphetNet. ProphetNet is an encoder-decoder model that can predict the n future tokens for "ngram" language modeling instead of just the next token. XLM-ProphetNet is an encoder-decoder model with an architecture identical to ProphetNet, but trained on the multilingual "wiki100" Wikipedia dump. The ProphetNet model was proposed in "ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training" by Yu Yan, Weizhen Qi, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang, and Ming Zhou on 13 Jan 2020. It was added to the library in PyTorch with the following checkpoints:
microsoft/xprophetnet-large-wiki100-cased-xglue-ntg
microsoft/prophetnet-large-uncased
microsoft/prophetnet-large-uncased-cnndm
microsoft/xprophetnet-large-wiki100-cased
microsoft/xprophetnet-large-wiki100-cased-xglue-qg
Contributions: ProphetNet #7157 (@qiweizhen, ...
URL: https://dx.doi.org/10.5281/zenodo.4110065
https://zenodo.org/record/4110065
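The release notes above describe ProphetNet's training objective: at each position the decoder is trained to predict the next n tokens ("ngram" language modeling) rather than only the immediate next token. A minimal toy sketch of how such n-gram targets could be read off a token sequence; this illustrates the idea only and is not the library's actual implementation:

```python
def future_ngram_targets(token_ids, n):
    # For each position i, collect the next n token ids the model would be
    # trained to predict; near the end of the sequence fewer are available.
    return [token_ids[i + 1 : i + 1 + n] for i in range(len(token_ids))]

# With n = 2, each position targets the next two tokens:
targets = future_ngram_targets([5, 6, 7, 8], n=2)
# targets == [[6, 7], [7, 8], [8], []]
```

With n = 1 this reduces to ordinary next-token language modeling, which is the contrast the release notes draw.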
7. huggingface/transformers: Trainer, TFTrainer, Multilingual BART, Encoder-decoder improvements, Generation Pipeline ... (BASE)
8. huggingface/pytorch-transformers: DistilBERT, GPT-2 Large, XLM multilingual models, bug fixes ... (BASE)

Hits by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 8
© 2013 - 2024 Lin|gu|is|tik