
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1. Neural Coreference Resolution for Arabic ... (BASE)
2. Enhanced Labelling in Active Learning for Coreference Resolution ... (BASE)
3. Predicting Coreference in Abstract Meaning Representations ... (BASE)
4. NILC at SR'20: Exploring Pre-Trained Models in Surface Realisation ... (BASE)
5. Persuasiveness of News Editorials depending on Ideology and Personality ... (BASE)
6. E.T.: Entity-Transformers. Coreference augmented Neural Language Model for richer mention representations via Entity-Transformer blocks ... (BASE)
   Abstract: In the last decade, the field of Neural Language Modelling has witnessed enormous changes, with the development of novel models built on Transformer architectures. However, even these models struggle to model long sequences due to memory constraints and increasing computational complexity. Coreference annotations over the training data can provide context far beyond the modelling limitations of such language models. In this paper we present an extension of the Transformer-block architecture used in neural language models, specifically in GPT2, in order to incorporate entity annotations during training. Our model, GPT2E, extends the Transformer-layer architecture of GPT2 to Entity-Transformers, an architecture designed to handle coreference information when present. In this way, we achieve richer representations for entity mentions at negligible training cost. We report the comparative performance of GPT2 and GPT2E in terms of perplexity on the CoNLL 2012 and LAMBADA datasets as ...
   (A hedged code sketch of the Entity-Transformer idea follows the result list.)
   Keywords: Computer and Information Science; Digital Media; Engineering; Information and Knowledge Engineering; Natural Language Processing; Neural Network
   URL: https://underline.io/lecture/6573-e.t.-entity-transformers.-coreference-augmented-neural-language-model-for-richer-mention-representations-via-entity-transformer-blocks
   DOI: https://dx.doi.org/10.48448/payz-qm06
7. TwiConv: A Coreference-annotated Corpus of Twitter Conversations ... (BASE)
8. Resolving Pronouns in Twitter Streams: Context can Help! ... (BASE)
9. Matching Theory and Data with Personal-ITY: What a Corpus of Italian YouTube Comments Reveals About Personality ... (BASE)
10. Surface Realization Using Pretrained Language Models ... (BASE)
11. IMSurReal Too: IMS at the Surface Realization Shared Task 2020 ... (BASE)
12. Multilingual Emoticon Prediction of Tweets about COVID-19 ... (BASE)
13. Coreference Strategies in English-German Translation ... (BASE)
14. BME-TUW at SR’20: Lexical grammar induction for surface realization ... (BASE)
15. It's absolutely divine! Can fine-grained sentiment analysis benefit from coreference resolution? ... (BASE)
16. Integrating knowledge graph embeddings to improve mention representation for bridging anaphora resolution ... (BASE)
17. Partially-supervised Mention Detection ... (BASE)
18. Anaphoric Zero Pronoun Identification: A Multilingual Approach ... (BASE)
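
The GPT2E entry (hit 6) describes, at a high level, how GPT2's Transformer blocks are extended to Entity-Transformers so that coreference annotations can be consumed during training. The sketch below shows one minimal way such a block could be wired, assuming a PyTorch-style implementation; the class name EntityTransformerBlock, the entity_ids input, and the additive entity-embedding scheme are illustrative assumptions, not the authors' published code.

# Hypothetical sketch of a coreference-augmented Transformer block, loosely
# following the idea in the GPT2E abstract (hit 6): entity annotations are
# injected alongside the token states. Names and wiring are assumptions;
# the causal attention mask used by GPT2 is omitted for brevity.
import torch
import torch.nn as nn

class EntityTransformerBlock(nn.Module):
    def __init__(self, d_model: int = 768, n_heads: int = 12, n_entities: int = 512):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln1 = nn.LayerNorm(d_model)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
        )
        # Embedding table for coreference cluster ids; id 0 means "no entity"
        # and is pinned to the zero vector, so unannotated text is unaffected.
        self.entity_emb = nn.Embedding(n_entities, d_model, padding_idx=0)

    def forward(self, x: torch.Tensor, entity_ids: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) token states; entity_ids: (batch, seq) cluster ids.
        h = x + self.entity_emb(entity_ids)          # add entity information when present
        a, _ = self.attn(self.ln1(h), self.ln1(h), self.ln1(h), need_weights=False)
        h = h + a                                    # residual self-attention
        return h + self.mlp(self.ln2(h))             # residual feed-forward

# Toy usage: two tokens in the sequence belong to the same coreference cluster (id 7).
block = EntityTransformerBlock()
tokens = torch.randn(1, 6, 768)
entity_ids = torch.tensor([[7, 0, 0, 7, 0, 0]])
out = block(tokens, entity_ids)
print(out.shape)  # torch.Size([1, 6, 768])

An additive embedding keeps the extra parameters and compute small, which is consistent with the abstract's claim of negligible training cost; whether GPT2E actually uses addition, gating, or a separate attention stream over entities is not stated in the snippet reproduced above.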

Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 18