- BERT is to NLP what AlexNet is to CV: Can Pre-Trained Language Models Identify Analogies?
- Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection
- Distilling Relation Embeddings from Pre-trained Language Models
- Modelling General Properties of Nouns by Selectively Averaging Contextualised Embeddings
- A Mixture-of-Experts Model for Learning Multi-Facet Entity Embeddings. In: The 28th International Conference on Computational Linguistics (COLING 2020), Barcelona, Spain, 2020. https://hal-univ-artois.archives-ouvertes.fr/hal-03300233
- Cardiff University at SemEval-2020 Task 6: Fine-tuning BERT for Domain-Specific Definition Classification
- Learning Cross-Lingual Word Embeddings from Twitter via Distant Supervision. In: Proceedings of the International AAAI Conference on Web and Social Media, Vol. 14 (2020), pp. 72-82.
- Don't Patronize Me! An Annotated Dataset with Patronizing and Condescending Language towards Vulnerable Communities
- Learning Conceptual Spaces with Disentangled Facets. In: 23rd Conference on Computational Natural Language Learning (CoNLL 2019), Hong Kong SAR China, 2019, pp. 131-139. https://hal-univ-artois.archives-ouvertes.fr/hal-03300237
- Meemi: A Simple Method for Post-processing and Integrating Cross-lingual Word Embeddings
- On the Robustness of Unsupervised and Semi-supervised Cross-lingual Word Embedding Learning
- Word and Document Embedding with vMF-Mixture Priors on Context Word Vectors