
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1. SCiL 2022 Editors' Note
In: Proceedings of the Society for Computation in Linguistics (2022)
2. Sorting through the noise: Testing robustness of information processing in pre-trained language models
3. On the Interplay Between Fine-tuning and Composition in Transformers
Yu, Lang; Ettinger, Allyson. arXiv, 2021
4. On the Interplay Between Fine-tuning and Composition in Transformers
Paper: https://www.aclanthology.org/2021.findings-acl.201
Abstract: Pre-trained transformer language models have shown remarkable performance on a variety of NLP tasks. However, recent research has suggested that phrase-level representations in these models reflect heavy influences of lexical content, but lack evidence of sophisticated, compositional phrase information. Here we investigate the impact of fine-tuning on the capacity of contextualized embeddings to capture phrase meaning information beyond lexical content. Specifically, we fine-tune models on an adversarial paraphrase classification task with high lexical overlap, and on a sentiment classification task. After fine-tuning, we analyze phrasal representations in controlled settings following prior work. We find that fine-tuning largely fails to benefit compositionality in these representations, though training on sentiment yields a small, localized benefit for certain models. In follow-up analyses, we identify confounding cues in the ...
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/h1kw-6946
https://underline.io/lecture/26292-on-the-interplay-between-fine-tuning-and-composition-in-transformers
5. Pragmatic competence of pre-trained language models through the lens of discourse connectives
6. Pragmatic competence of pre-trained language models through the lens of discourse connectives
7. Preface: SCiL 2021 Editors' Note
In: Proceedings of the Society for Computation in Linguistics (2021)
8. Exploring BERT's Sensitivity to Lexical Cues using Tests from Semantic Priming
9. Assessing Phrasal Representation and Composition in Transformers
Yu, Lang; Ettinger, Allyson. arXiv, 2020
10. Preface: SCiL 2020 Editors' Note
In: Proceedings of the Society for Computation in Linguistics (2020)
11. What BERT Is Not: Lessons from a New Suite of Psycholinguistic Diagnostics for Language Models
In: Transactions of the Association for Computational Linguistics, Vol. 8, pp. 34-48 (2020)
12. Mandarin utterance-final particle ba (吧) in the conversational scoreboard
In: Proceedings of Sinn und Bedeutung, Vol. 19 (2015), pp. 232-251. ISSN 2629-6055
13. Assessing Composition in Sentence Vector Representations
14. Relating lexical and syntactic processes in language: Bridging research in humans and machines
Ettinger, Allyson. Digital Repository at the University of Maryland, 2018
15. Relating lexical and syntactic processes in language: Bridging research in humans and machines
16. Towards Linguistically Generalizable NLP Systems: A Workshop and Shared Task
17. The role of morphology in phoneme prediction: Evidence from MEG
In: Brain & Language 129 (2014), pp. 14-23. Orlando, FL: Elsevier
18. Mandarin utterance-final particle ba in the conversational scoreboard
In: LSA Annual Meeting Extended Abstracts, Vol. 4 (2013), 13:1-5. ISSN 2377-3367

Hits by source: Catalogues: 1; Bibliographies: 0; Linked Open Data catalogues: 0; Online resources: 0; Open access documents: 17
© 2013 - 2024 Lin|gu|is|tik