Search results (retrieved via BASE):

1. Are neural language models sensitive to false belief? A computational study. ...
2. Can distributional semantics explain performance on the false belief task? ...
3. sj-docx-1-las-10.1177_00238309221087715 – Supplemental material for The Role of Prosody in Disambiguating English Indirect Requests ...
4. The Role of Prosody in Disambiguating English Indirect Requests ...
5. The Role of Prosody in Disambiguating English Indirect Requests ...
6. sj-docx-1-las-10.1177_00238309221087715 – Supplemental material for The Role of Prosody in Disambiguating English Indirect Requests ...
7. Can comprehenders use prosody to interpret potential indirect requests? ...
9. Ambiguity in the mental lexicon: Pre-registered replication ...
10. Taboo: Analysis of referring expressions in an interactive communication game ...
11. RAW-C: Relatedness of Ambiguous Words in Context (A New Lexical Resource for English) ...

   Read paper: https://www.aclanthology.org/2021.acl-long.550

   Abstract: Most words are ambiguous––i.e., they convey distinct meanings in different contexts––and even the meanings of unambiguous words are context-dependent. Both phenomena present a challenge for NLP. Recently, the advent of contextualized word embeddings has led to success on tasks involving lexical ambiguity, such as Word Sense Disambiguation. However, there are few tasks that directly evaluate how well these contextualized embeddings accommodate the more continuous, dynamic nature of word meaning––particularly in a way that matches human intuitions. We introduce RAW-C, a dataset of graded, human relatedness judgments for 112 ambiguous words in context (with 672 sentence pairs total), as well as human estimates of sense dominance. The average inter-annotator agreement (assessed using a leave-one-annotator-out method) was 0.79. We then show that a measure of cosine distance, computed using contextualized embeddings from BERT and ELMo, correlates ...

   Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics

   URL: https://dx.doi.org/10.48448/fc6y-2r22
   https://underline.io/lecture/25785-raw-c-relatedness-of-ambiguous-words-in-context-(a-new-lexical-resource-for-english)