
Search in the Catalogues and Directories

Hits 61 – 80 of 137

61
MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Vulić, Ivan; Pfeiffer, Jonas; Ruder, Sebastian. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
BASE
62
SemEval-2020 Task 3: Graded Word Similarity in Context
Santos Armendariz, Carlos; Purver, Matthew; Pollak, Senja. - : International Committee on Computational Linguistics, 2020. : https://www.aclweb.org/anthology/2020.semeval-1.3, 2020. : Proceedings of the 14th International Workshop on Semantic Evaluation (SemEval 2020), 2020
BASE
63
Multidirectional Associative Optimization of Function-Specific Word Representations
Gerz, Daniela; Vulić, Ivan; Rei, Marek. - : Association for Computational Linguistics, 2020. : 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), 2020
BASE
64
AdapterHub: A Framework for Adapting Transformers
Pfeiffer, Jonas; Rücklé, Andreas; Poth, Clifton; Kamath, Aishwarya; Vulić, Ivan; Ruder, Sebastian; Cho, Kyunghyun; Gurevych, Iryna. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing: System Demonstrations (EMNLP 2020), 2020
Abstract: The current modus operandi in NLP involves downloading and fine-tuning pre-trained models consisting of hundreds of millions, or even billions, of parameters. Storing and sharing such large trained models is expensive, slow, and time-consuming, which impedes progress towards more general and versatile NLP methods that learn from and for many tasks. Adapters (small learnt bottleneck layers inserted within each layer of a pre-trained model) ameliorate this issue by avoiding full fine-tuning of the entire model. However, sharing and integrating adapter layers is not straightforward. We propose AdapterHub, a framework that allows dynamic "stitching-in" of pre-trained adapters for different tasks and languages. The framework, built on top of the popular HuggingFace Transformers library, enables extremely easy and quick adaptations of state-of-the-art pre-trained models (e.g., BERT, RoBERTa, XLM-R) across tasks and languages. Downloading, sharing, and training adapters is as seamless as possible using minimal changes to the training scripts and a specialized infrastructure. Our framework enables scalable and easy access to sharing of task-specific models, particularly in low-resource scenarios. AdapterHub includes all recent adapter architectures and can be found at AdapterHub.ml.
URL: https://doi.org/10.17863/CAM.62205
https://www.repository.cam.ac.uk/handle/1810/315098
BASE
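To make the abstract's workflow concrete, here is a minimal sketch of loading a pre-trained adapter with the adapter-transformers package that backs AdapterHub.ml. The model name, the adapter identifier "sentiment/sst-2@ukp", and the example sentence are illustrative assumptions, and class and method names have varied across library versions, so treat this as a sketch rather than the paper's exact code.

    # Minimal sketch of the AdapterHub workflow described in the abstract.
    # Assumes: pip install adapter-transformers (the library behind AdapterHub.ml).
    # The adapter identifier "sentiment/sst-2@ukp" and model name are illustrative.
    import torch
    from transformers import AutoModelWithHeads, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

    # Download a task adapter (plus its prediction head) from the Hub and
    # activate it; the base model's weights stay untouched.
    adapter_name = model.load_adapter("sentiment/sst-2@ukp")
    model.set_active_adapters(adapter_name)

    inputs = tokenizer("AdapterHub makes sharing adapters seamless.",
                       return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs)[0]  # classification logits from the adapter head
    print(logits)

Because only the small bottleneck weights (typically a few megabytes) are downloaded, switching to another task or language amounts to changing the adapter identifier, which is what makes the "stitching-in" cheap.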
65
Efficient Intent Detection with Dual Sentence Encoders
Casanueva, Inigo; Temcinas, Tadas; Gerz, Daniela. - : NLP for Conversational AI, 2020
BASE
66
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Liu, Qianchu; Korhonen, Anna-Leena; Majewska, Olga. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
BASE
67
XHate-999: Analyzing and Detecting Abusive Language Across Domains and Languages
Glavaš, Goran; Karan, Mladen; Vulić, Ivan. - : International Committee on Computational Linguistics, 2020. : https://www.aclweb.org/anthology/2020.coling-main.559, 2020. : Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020), 2020
BASE
68
Span-ConveRT: Few-shot Span Extraction for Dialog with Pretrained Conversational Representations
Coope, Sam; Farghly, Tyler; Gerz, Daniela. - : Association for Computational Linguistics, 2020. : 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), 2020
BASE
69
Specializing unsupervised pretraining models for word-level semantic similarity
Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. - : Association for Computational Linguistics, ACL, 2020
BASE
70
Non-linear instance-based cross-lingual mapping for non-isomorphic embedding spaces
Glavaš, Goran; Vulić, Ivan. - : Association for Computational Linguistics, 2020
BASE
71
Classification-based self-learning for weakly supervised bilingual lexicon induction
Vulić, Ivan; Korhonen, Anna; Glavaš, Goran. - : Association for Computational Linguistics, 2020
BASE
72
Probing pretrained language models for lexical semantics
Vulić, Ivan; Korhonen, Anna; Litschko, Robert. - : Association for Computational Linguistics, 2020
BASE
73
Common sense or world knowledge? Investigating adapter-based knowledge injection into pretrained transformers
Lauscher, Anne; Majewska, Olga; Ribeiro, Leonardo F. R.. - : Association for Computational Linguistics, 2020
BASE
74
XHate-999: analyzing and detecting abusive language across domains and languages
Glavaš, Goran; Karan, Mladen; Vulić, Ivan. - : Association for Computational Linguistics, 2020
BASE
75
XCOPA: A multilingual dataset for causal commonsense reasoning
Ponti, Edoardo Maria; Majewska, Olga; Liu, Qianchu. - : Association for Computational Linguistics, 2020
BASE
76
Improving bilingual lexicon induction with unsupervised post-processing of monolingual word vector spaces
Glavaš, Goran; Korhonen, Anna; Vulić, Ivan. - : Association for Computational Linguistics, 2020
BASE
77
From zero to hero: On the limitations of zero-shot language transfer with multilingual transformers
Ravishankar, Vinit; Glavaš, Goran; Lauscher, Anne. - : Association for Computational Linguistics, 2020
BASE
78
SemEval-2020 Task 2: Predicting multilingual and cross-lingual (graded) lexical entailment
Glavaš, Goran; Vulić, Ivan; Korhonen, Anna. - : Association for Computational Linguistics, 2020
BASE
79
Towards instance-level parser selection for cross-lingual transfer of dependency parsers
Litschko, Robert; Vulić, Ivan; Agić, Željko. - : Association for Computational Linguistics, 2020
BASE
80
Modeling Language Variation and Universals: A Survey on Typological Linguistics for Natural Language Processing
In: Computational Linguistics 45(3), pp. 559–601. MIT Press, 2019. ISSN: 0891-2017; EISSN: 1530-9312. DOI: 10.1162/coli_a_00357. URLs: https://hal.archives-ouvertes.fr/hal-02425462 ; https://www.mitpressjournals.org/doi/abs/10.1162/coli_a_00357
BASE
