2. huggingface/transformers: v4.4.0: S2T, M2M100, I-BERT, mBART-50, DeBERTa-v2, XLSR-Wav2Vec2 ...
   Source: BASE
4. Motor constraints influence cultural evolution of rhythm
   In: Proceedings of the Royal Society B: Biological Sciences, The Royal Society, 2020, 287 (1937). ISSN: 0962-8452; EISSN: 1471-2954; DOI: 10.1098/rspb.2020.2001; https://jeannicod.ccsd.cnrs.fr/ijn_03085983
5. Transformers: State-of-the-Art Natural Language Processing ...
7. huggingface/transformers: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa ...
   Wolf, Thomas; Debut, Lysandre; Chaumond, Julien; von Platen, Patrick; Shleifer, Sam; Gugger, Sylvain; Sanh, Victor; Romero, Manuel; Funtowicz, Morgan; Bekman, Stas; Augustin, Aymeric; Louf, Rémi; Schweter, Stefan; , Denis; Patil, Suraj; Erenup; Plu, Julien; , Matt; Molino, Piero; Châtel, Grégory; Vanroy, Bram; Moi, Anthony; Teven; , Clement; Davison, Joe; Briem, Gunnlaugur Thor; Xu, Kevin Canwen; Rault, Tim; Pietsch, Malte; Voss, Catalin. Zenodo, 2020.
Abstract:
ProphetNet, Blenderbot, SqueezeBERT, DeBERTa. Two new models are released as part of the ProphetNet implementation: ProphetNet and XLM-ProphetNet. ProphetNet is an encoder-decoder model that can predict n future tokens ("n-gram" language modeling) instead of just the next token. XLM-ProphetNet is an encoder-decoder model with an architecture identical to ProphetNet, but trained on the multilingual "wiki100" Wikipedia dump. The ProphetNet model was proposed in "ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training" by Yu Yan, Weizhen Qi, Yeyun Gong, Dayiheng Liu, Nan Duan, Jiusheng Chen, Ruofei Zhang, and Ming Zhou on 13 Jan 2020. It was added to the library in PyTorch with the following checkpoints: microsoft/xprophetnet-large-wiki100-cased-xglue-ntg, microsoft/prophetnet-large-uncased, microsoft/prophetnet-large-uncased-cnndm, microsoft/xprophetnet-large-wiki100-cased, microsoft/xprophetnet-large-wiki100-cased-xglue-qg. Contributions: ProphetNet #7157 (@qiweizhen, ...
URL: https://dx.doi.org/10.5281/zenodo.4110065 https://zenodo.org/record/4110065
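The "predict n future tokens" objective described in the abstract can be illustrated with a minimal sketch. This is plain Python with a hypothetical helper name, not the library's actual implementation: for each position in a sequence, a standard language model is trained to predict only the next token, while a ProphetNet-style objective collects the next n tokens as targets.

```python
def ngram_targets(tokens, n=2):
    """For each position i, collect up to the next n tokens as prediction
    targets -- the 'future n-gram' idea behind ProphetNet's objective.
    (Illustrative sketch; the real model does this over subword IDs
    with a dedicated n-stream self-attention mechanism.)"""
    return [tokens[i + 1 : i + 1 + n] for i in range(len(tokens) - 1)]

seq = ["the", "cat", "sat", "down"]
# With n=1 this reduces to ordinary next-token targets; with n=2 each
# position is paired with its next two tokens (fewer near the end).
print(ngram_targets(seq, n=2))
# → [['cat', 'sat'], ['sat', 'down'], ['down']]
```

Predicting several future tokens per position forces the model to plan beyond the immediately next token, which is the motivation the ProphetNet paper gives for the objective.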
8. huggingface/transformers: Trainer, TFTrainer, Multilingual BART, Encoder-decoder improvements, Generation Pipeline ...
9. huggingface/pytorch-transformers: DistilBERT, GPT-2 Large, XLM multilingual models, bug fixes ...