2 | Exploiting Microblog Conversation Structures to Detect Rumors ...
BASE
3 | Enhanced Labelling in Active Learning for Coreference Resolution ...
4 | Predicting Coreference in Abstract Meaning Representations ...
5 | NILC at SR'20: Exploring Pre-Trained Models in Surface Realisation ...
6 | Persuasiveness of News Editorials depending on Ideology and Personality ...
7 | E.T.: Entity-Transformers. Coreference augmented Neural Language Model for richer mention representations via Entity-Transformer blocks ...
Abstract:
In the last decade, the field of Neural Language Modelling has witnessed enormous changes, with the development of novel models through the use of Transformer architectures. However, even these models struggle to model long sequences due to memory constraints and increasing computational complexity. Coreference annotations over the training data can provide context far beyond the modelling limitations of such language models. In this paper we present an extension of the Transformer-block architecture used in neural language models, specifically in GPT2, in order to incorporate entity annotations during training. Our model, GPT2E, extends the Transformer-layer architecture of GPT2 to Entity-Transformers, an architecture designed to handle coreference information when present. In doing so, we achieve richer representations for entity mentions, at insignificant training cost. We show the comparative model performance between GPT2 and GPT2E in terms of Perplexity on the CoNLL 2012 and LAMBADA datasets as ...
Keywords: Computer and Information Science; Digital Media; Engineering; Information and Knowledge Engineering; Natural Language Processing; Neural Network
URL: https://underline.io/lecture/6573-e.t.-entity-transformers.-coreference-augmented-neural-language-model-for-richer-mention-representations-via-entity-transformer-blocks
DOI: https://dx.doi.org/10.48448/payz-qm06
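The abstract above describes fusing coreference (entity) annotations into a language model's token representations so that mentions of the same entity share information. As a hedged illustration only — the function name, the additive fusion, and the toy embeddings below are assumptions for exposition, not the paper's actual GPT2E mechanism — the general idea can be sketched as: tokens annotated with the same coreference cluster share an entity vector that is combined with their token embeddings.

```python
# Illustrative sketch (not the paper's code): tokens that corefer share an
# entity vector, which is added to their token embeddings so that mentions
# of the same entity receive related representations.

def add_entity_info(token_embs, cluster_ids, entity_embs):
    """Add each token's coreference-cluster embedding to its token embedding.

    token_embs:  list of token embedding vectors (lists of floats)
    cluster_ids: coreference cluster id per token, or None if not a mention
    entity_embs: dict mapping cluster id -> entity embedding vector
    """
    out = []
    for emb, cid in zip(token_embs, cluster_ids):
        if cid is None:                       # token is not part of any mention
            out.append(list(emb))
        else:                                 # fuse token and entity information
            ent = entity_embs[cid]
            out.append([t + e for t, e in zip(emb, ent)])
    return out

# Toy example: "Mary saw her" — "Mary" and "her" corefer (cluster 0)
token_embs  = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
cluster_ids = [0, None, 0]
entity_embs = {0: [0.5, 1.0]}

fused = add_entity_info(token_embs, cluster_ids, entity_embs)
# "Mary" and "her" now share an additive entity component; "saw" is unchanged.
```

In the paper's setting this fusion happens inside the Transformer blocks during training rather than as a one-off preprocessing step, and the model is trained to handle the case where coreference annotations are absent.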
8 | TwiConv: A Coreference-annotated Corpus of Twitter Conversations ...
9 | Resolving Pronouns in Twitter Streams: Context can Help! ...
10 | Matching Theory and Data with Personal-ITY: What a Corpus of Italian YouTube Comments Reveals About Personality ...
12 | IMSurReal Too: IMS at the Surface Realization Shared Task 2020 ...
13 | Multilingual Emoticon Prediction of Tweets about COVID-19 ...
15 | BME-TUW at SR’20: Lexical grammar induction for surface realization ...
16 | It's absolutely divine! Can fine-grained sentiment analysis benefit from coreference resolution? ...
17 | Integrating knowledge graph embeddings to improve mention representation for bridging anaphora resolution ...
19 | Anaphoric Zero Pronoun Identification: A Multilingual Approach ...