
Search in the Catalogues and Directories

Hits 1 – 8 of 8

1. First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT (BASE)
2. It's not Greek to mBERT: Inducing Word-Level Translations from Multilingual BERT (BASE)
3. The Extraordinary Failure of Complement Coercion Crowdsourcing (BASE)
4. Do Language Embeddings Capture Scales? (BASE)
Abstract: Pretrained Language Models (LMs) have been shown to possess significant linguistic, common sense, and factual knowledge. One form of knowledge that has not been studied yet in this context is information about the scalar magnitudes of objects. We show that pretrained language models capture a significant amount of this information but fall short of the capability required for general common-sense reasoning. We identify contextual information in pre-training and numeracy as two key factors affecting their performance, and show that a simple method of canonicalizing numbers can have a significant effect on the results. (A sketch of one possible number canonicalization follows the hit list below.)
Comment: Accepted at EMNLP Findings 2020 and the BlackboxNLP 2020 workshop at EMNLP; 8 pages, 2 figures; minor changes to the acknowledgment section.
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; I.2.7
URL: https://dx.doi.org/10.48550/arxiv.2010.05345
URL: https://arxiv.org/abs/2010.05345
5. Amnesic Probing: Behavioral Explanation with Amnesic Counterfactuals (BASE)
6. Evaluating Models' Local Decision Boundaries via Contrast Sets (BASE)
7. Unsupervised Distillation of Syntactic Information from Contextualized Word Representations (BASE)
8. How Large Are Lions? Inducing Distributions over Quantitative Attributes (BASE)
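The abstract of hit 4 reports that a simple canonicalization of numbers noticeably helps LMs with scalar magnitudes, but the listing does not spell out the scheme. A minimal sketch, assuming canonicalization means rewriting numerals into a uniform scientific-notation form before tokenization; the function name, the regular expression, and the two-digit precision are illustrative assumptions, not details taken from the paper.

```python
import re

# Match comma-grouped numbers ("2,500") first, then plain integers/decimals.
_NUMBER = re.compile(r"\d{1,3}(?:,\d{3})+(?:\.\d+)?|\d+(?:\.\d+)?")

def canonicalize_numbers(text: str, digits: int = 2) -> str:
    """Rewrite every numeral in `text` into scientific notation (hypothetical helper)."""
    def _repl(match: re.Match) -> str:
        value = float(match.group(0).replace(",", ""))
        if value == 0:
            return "0"
        return f"{value:.{digits}e}"  # e.g. 2500 -> "2.50e+03"
    return _NUMBER.sub(_repl, text)

if __name__ == "__main__":
    print(canonicalize_numbers("An adult lion weighs about 190 kg, or 419 pounds."))
    # -> "An adult lion weighs about 1.90e+02 kg, or 4.19e+02 pounds."
```

The idea is that a uniform surface form exposes magnitude directly in the token sequence, rather than leaving the model to infer it from varied numeral spellings.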

Source breakdown: Open access documents 8; Catalogues 0; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0.