
Search in the Catalogues and Directories

Hits 1 – 20 of 515 (page 1 of 26)

1. Priorless Recurrent Networks Learn Curiously ... (BASE)
2. Composing Byte-Pair Encodings for Morphological Sequence Classification ... (BASE)
3. Variation in Universal Dependencies annotation: A token based typological case study on adpossessive constructions ... (BASE)
4. Corpus evidence for word order freezing in Russian and German ... (BASE)
5. An analysis of language models for metaphor recognition ... (BASE)
6. Noise Isn't Always Negative: Countering Exposure Bias in Sequence-to-Sequence Inflection Models ... (BASE)
7. Exhaustive Entity Recognition for Coptic - Challenges and Solutions ... (BASE)
8. Imagining Grounded Conceptual Representations from Perceptual Information in Situated Guessing Games ... (BASE)
9. Attentively Embracing Noise for Robust Latent Representation in BERT ... (BASE)
10. Catching Attention with Automatic Pull Quote Selection ... (BASE)
11. Classifier Probes May Just Learn from Linear Context Features ... (BASE)
12. Seeing the world through text: Evaluating image descriptions for commonsense reasoning in machine reading comprehension ... (BASE)
13. Part 6 - Cross-linguistic Studies ... (BASE)
14. Manifold Learning-based Word Representation Refinement Incorporating Global and Local Information ... (BASE)
15. HMSid and HMSid2 at PARSEME Shared Task 2020: Computational Corpus Linguistics and unseen-in-training MWEs ... (BASE)
16. Autoencoding Improves Pre-trained Word Embeddings ... (BASE)
17. Exploring End-to-End Differentiable Natural Logic Modeling ... (BASE)
18. AutoMeTS: The Autocomplete for Medical Text Simplification. ... (BASE)
19. A Closer Look at Linguistic Knowledge in Masked Language Models: The Case of Relative Clauses in American English ... (BASE)
Abstract: Transformer-based language models achieve high performance on various tasks, but we still lack understanding of the kind of linguistic knowledge they learn and rely on. We evaluate three models (BERT, RoBERTa, and ALBERT), testing their grammatical and semantic knowledge by sentence-level probing, diagnostic cases, and masked prediction tasks. We focus on relative clauses (in American English) as a complex phenomenon that requires contextual information and antecedent identification to be resolved. Based on a naturalistic dataset, probing shows that all three models indeed capture linguistic knowledge about grammaticality, achieving high performance. Evaluation on diagnostic cases and masked prediction tasks considering fine-grained linguistic knowledge, however, shows pronounced model-specific weaknesses, especially on semantic knowledge, strongly impacting the models' performance. Our results highlight the importance of (a) model comparison in evaluation tasks and (b) building up claims of model performance and the ...
Keywords: Computer and Information Science; Natural Language Processing; Neural Network
URL: https://dx.doi.org/10.48448/xwge-6p50
https://underline.io/lecture/6358-a-closer-look-at-linguistic-knowledge-in-masked-language-models-the-case-of-relative-clauses-in-american-english
20. SemEval Task 6: DeftEval ... (BASE)


Hits by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 515