
Search in the Catalogues and Directories

Hits 1 – 20 of 45

1
How Efficiency Shapes Human Language
In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)
BASE
2
When classifying grammatical role, BERT doesn't care about word order... except when it matters ...
BASE
3
Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...
BASE
4
When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters
In: Proceedings of the Society for Computation in Linguistics (2022)
BASE
5
Efficient communication and the organization of the lexicon
In: OUP volume on the Mental Lexicon, in press; https://hal.archives-ouvertes.fr/hal-03482414; ⟨10.31234/osf.io/4an6v⟩ (2021)
BASE
6
Decrypting Cryptic Crosswords: Semantically Complex Wordplay Puzzles as a Target for NLP ...
BASE
7
A Massively Multilingual Analysis of Cross-linguality in Shared Embedding Space ...
BASE
8
Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...
Abstract: We investigate how Multilingual BERT (mBERT) encodes grammar by examining how the high-order grammatical feature of morphosyntactic alignment (how different languages define what counts as a "subject") is manifested across the embedding spaces of different languages. To understand if and how morphosyntactic alignment affects contextual embedding spaces, we train classifiers to recover the subjecthood of mBERT embeddings in transitive sentences (which do not contain overt information about morphosyntactic alignment) and then evaluate them zero-shot on intransitive sentences (where subjecthood classification depends on alignment), within and across languages. We find that the resulting classifier distributions reflect the morphosyntactic alignment of their training languages. Our results demonstrate that mBERT representations are influenced by high-level grammatical features that are not manifested in any one input sentence, and that this is robust across languages. Further examining the characteristics that ... (EACL 2021)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2101.11043
https://dx.doi.org/10.48550/arxiv.2101.11043
BASE
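The abstract of entry 8 describes a probing setup: a classifier is trained to predict grammatical subjecthood from mBERT token embeddings in transitive sentences and then evaluated zero-shot on intransitive sentences. Below is a minimal sketch of that kind of probe, assuming HuggingFace transformers and scikit-learn; the toy sentences and labels are hypothetical stand-ins for the annotated treebank data such a study would use, not the paper's actual data or code.

```python
# Minimal subjecthood-probing sketch (assumptions: HuggingFace transformers,
# scikit-learn; toy examples below are hypothetical, not the paper's data).
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of the first subword of `word`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]   # (seq_len, hidden_dim)
    word_index = sentence.split().index(word)        # naive whitespace lookup
    tok_index = enc.word_ids(0).index(word_index)    # first subword of the word
    return hidden[tok_index]

# Transitive training examples: (sentence, target word, is_subject label).
train = [
    ("The dog chased the cat", "dog", 1),
    ("The dog chased the cat", "cat", 0),
    ("The teacher praised the student", "teacher", 1),
    ("The teacher praised the student", "student", 0),
]
X = torch.stack([embed_word(s, w) for s, w, _ in train]).numpy()
y = [label for _, _, label in train]

probe = LogisticRegression(max_iter=1000).fit(X, y)

# Zero-shot evaluation on intransitive clauses (no object present).
test = [("The dog slept", "dog"), ("The student arrived", "student")]
X_test = torch.stack([embed_word(s, w) for s, w in test]).numpy()
print(probe.predict_proba(X_test)[:, 1])  # P(subject) for each test word
```

In the cross-lingual version described in the abstract, the probe would be trained on one language and the predicted subjecthood probabilities inspected on intransitive sentences of other languages with differing morphosyntactic alignment.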
9
Multilingual BERT, Ergativity, and Grammatical Subjecthood ...
Papadimitriou, Isabel; Chi, Ethan A.; Futrell, Richard. - University of Massachusetts Amherst, 2021
BASE
10
Decrypting cryptic crosswords: Semantically complex wordplay puzzles as a target for NLP ...
BASE
11
Decrypting cryptic crosswords: Semantically complex wordplay puzzles as a target for NLP ...
BASE
12
Decrypting cryptic crosswords: Semantically complex wordplay puzzles as a target for NLP ...
BASE
13
Decrypting cryptic crosswords: Semantically complex wordplay puzzles as a target for NLP ...
BASE
14
Decrypting cryptic crosswords: Semantically complex wordplay puzzles as a target for NLP ...
BASE
15
Decrypting cryptic crosswords: Semantically complex wordplay puzzles as a target for NLP ...
BASE
16
A Massively Multilingual Analysis of Cross-linguality in Shared Embedding Space ...
BASE
17
Replicating a Fundamental Finding in Psycholinguistics: Syntactic Priming ...
Mahowald, Kyle. - Open Science Framework, 2021
BASE
18
How (Non-)Optimal is the Lexicon? ...
BASE
19
How (Non-)Optimal is the Lexicon?
In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
BASE
20
How (Non-)Optimal is the Lexicon? ...
NAACL 2021; Blasi, Damián; Cotterell, Ryan. - Underline Science Inc., 2021
BASE


Catalogues: 2
Bibliographies: 2
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 43