1. How Efficiency Shapes Human Language
In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)
2. When classifying grammatical role, BERT doesn't care about word order... except when it matters ...
3. Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...
4. When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters
In: Proceedings of the Society for Computation in Linguistics (2022)
Abstract:
We probe nouns in BERT's contextual embedding space for grammatical role (subject vs. object of a clause), and examine how probing results vary between prototypical examples, where the role matches what we would expect from seeing that word in the context, and non-prototypical examples, where the role is mostly imparted by the context. In this way, we engage with the contrast that has arisen in the literature between studies showing that contextual models are grammatically sensitive and others showing that these models are robust to changes in word order. Our experiments yield three results: 1) grammatical role is recovered in later layers for difficult non-prototypical cases, while prototypical cases are classified accurately without many layers of context; 2) when we swap the subject and the object of a sentence (e.g., The chef cut the onion vs. The onion cut the chef), the same word (e.g., onion) can be fluently identified as both a subject and an object; 3) subjecthood probing breaks down if we ablate local word order by shuffling words locally, breaking grammaticality.
Keywords:
BERT; Computational Linguistics; Contextual embeddings; grammatical role; prototype; subjecthood; verb arguments; word order
URL: https://scholarworks.umass.edu/scil/vol5/iss1/18 https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1244&context=scil
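The layer-wise probing setup described in the abstract can be sketched in a few lines. This is a hedged, self-contained illustration, not the authors' code: synthetic two-dimensional vectors stand in for per-layer BERT embeddings of nouns (real experiments would extract those from the model), and a logistic-regression probe is trained to separate subjects from objects. The "early layer" vs. "late layer" separations are invented here to mimic the reported finding that role information emerges in later layers.

```python
# Minimal sketch of layer-wise probing for grammatical role (subject vs.
# object). Tiny synthetic 2-d vectors stand in for contextual embeddings
# so the sketch runs on its own.
import math
import random

random.seed(0)

def make_layer(separation):
    """Synthetic 'layer': subject (label 1) and object (label 0) vectors,
    with class means separated along dimension 0 by 2 * separation."""
    data = []
    for _ in range(50):
        data.append(([random.gauss(+separation, 1.0), random.gauss(0.0, 1.0)], 1))
        data.append(([random.gauss(-separation, 1.0), random.gauss(0.0, 1.0)], 0))
    return data

def train_probe(data, epochs=200, lr=0.1):
    """Logistic-regression probe trained with plain stochastic gradient descent."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the log-loss with respect to z
            w[0] -= lr * g * x[0]
            w[1] -= lr * g * x[1]
            b -= lr * g
    return w, b

def accuracy(probe, data):
    w, b = probe
    hits = sum((w[0] * x[0] + w[1] * x[1] + b > 0) == (y == 1) for x, y in data)
    return hits / len(data)

# A weakly separated "early layer" vs. a well-separated "late layer" mimics
# the finding that role information for hard cases appears only later.
accs = {}
for name, sep in [("early layer", 0.3), ("late layer", 2.0)]:
    layer = make_layer(sep)
    accs[name] = accuracy(train_probe(layer), layer)
    print(f"{name}: probe accuracy = {accs[name]:.2f}")
```

The probe itself is deliberately linear, as is standard in probing work: if a linear classifier over frozen embeddings can recover the role, the information is linearly present in that layer.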
5. Efficient communication and the organization of the lexicon
In: OUP volume on the Mental Lexicon, in press; https://hal.archives-ouvertes.fr/hal-03482414; ⟨10.31234/osf.io/4an6v⟩ (2021)
6. Decrypting Cryptic Crosswords: Semantically Complex Wordplay Puzzles as a Target for NLP ...
7. A Massively Multilingual Analysis of Cross-linguality in Shared Embedding Space ...
8. Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...
9. Multilingual BERT, Ergativity, and Grammatical Subjecthood ...
17. Replicating a Fundamental Finding in Psycholinguistics: Syntactic Priming ...
19. How (Non-)Optimal is the Lexicon?
In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)