
Search in the Catalogues and Directories

Hits 1 – 13 of 13

1
huggingface/datasets: 1.18.1 ...
BASE
2
huggingface/transformers: v4.4.0: S2T, M2M100, I-BERT, mBART-50, DeBERTa-v2, XLSR-Wav2Vec2 ...
BASE
3
huggingface/datasets: 1.16.0 ...
BASE
4
Motor constraints influence cultural evolution of rhythm
In: Proceedings of the Royal Society B: Biological Sciences, Royal Society, 287 (1937), 2020. ISSN: 0962-8452; EISSN: 1471-2954. DOI: 10.1098/rspb.2020.2001. https://jeannicod.ccsd.cnrs.fr/ijn_03085983
BASE
5
Transformers: State-of-the-Art Natural Language Processing ...
BASE
6
Transformers: State-of-the-Art Natural Language Processing ...
BASE
7
huggingface/transformers: ProphetNet, Blenderbot, SqueezeBERT, DeBERTa ...
BASE
8
huggingface/transformers: Trainer, TFTrainer, Multilingual BART, Encoder-decoder improvements, Generation Pipeline ...
BASE
9
huggingface/pytorch-transformers: DistilBERT, GPT-2 Large, XLM multilingual models, bug fixes ...
Abstract: New model architecture: DistilBERT. Adds Hugging Face's new transformer architecture, DistilBERT, described in "Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT" by Victor Sanh, Lysandre Debut and Thomas Wolf. This new architecture comes with two pretrained checkpoints: distilbert-base-uncased (the base DistilBERT model) and distilbert-base-uncased-distilled-squad (DistilBERT fine-tuned with distillation on SQuAD).
An awaited new pretrained checkpoint: GPT-2 large (774M parameters). The third OpenAI GPT-2 checkpoint (GPT-2 large) is available in the library under the shortcut name gpt2-large: 774M parameters, 36 layers, and 20 heads.
New XLM multilingual pretrained checkpoints in 17 and 100 languages, which obtain better performance than multilingual BERT on the XNLI cross-lingual classification task.
New dependency: sacremoses. Support for XLM is improved by carefully reproducing the original tokenization ...
URL: https://dx.doi.org/10.5281/zenodo.3385998
https://zenodo.org/record/3385998
BASE
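For context, a minimal sketch of loading the distilbert-base-uncased-distilled-squad checkpoint named in the abstract above, using the current transformers API (the release described in this record still shipped under the pytorch-transformers package name, so the exact calls there differed slightly); the question and context strings are illustrative only:

    import torch
    from transformers import AutoTokenizer, AutoModelForQuestionAnswering

    # Load the SQuAD-distilled DistilBERT checkpoint by its shortcut name.
    name = "distilbert-base-uncased-distilled-squad"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForQuestionAnswering.from_pretrained(name)

    question = "Who introduced DistilBERT?"
    context = "DistilBERT was introduced by Victor Sanh, Lysandre Debut and Thomas Wolf."
    inputs = tokenizer(question, context, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Decode the highest-scoring answer span from the start/end logits.
    start = int(outputs.start_logits.argmax())
    end = int(outputs.end_logits.argmax()) + 1
    print(tokenizer.decode(inputs["input_ids"][0][start:end]))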
10
Brüder, Geister und Fossilien : Eduard Mörikes Erfahrungen der Umwelt
Wolf, Thomas [Author]. - Berlin/Boston : De Gruyter, 2001
DNB Subject Category Language
11
Forward pruning and other heuristic search techniques in tsume go
In: Information sciences. - New York, NY : Elsevier Science Inc. 122 (2000) 1, 59-76
OLC Linguistik
12
Pustkuchen und Goethe : Die Streitschrift als produktives Verwirrspiel
Wolf, Thomas [Author]. - Berlin/Boston : De Gruyter, 1999
DNB Subject Category Language
13
Reading reconsidered
In: Thought & language/language & reading (Cambridge, MA, 1980), p. 109-127
MPI für Psycholinguistik

Hits by source type:
Catalogues: 3
Bibliographies: 1
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 9