CLEVE: Contrastive Pre-training for Event Extraction
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021; Han, Xu; Hou, Lei; Li, Juanzi; Li, Peng; Lin, Yankai; Liu, Zhiyuan; Wang, Ziqi; Wang, Xiaozhi; Zhou, Jie. Underline Science Inc., 2021
Abstract:
Event extraction (EE) has considerably benefited from fine-tuning pre-trained language models (PLMs). However, existing pre-training methods do not model event characteristics, so the resulting EE models cannot take full advantage of large-scale unsupervised data. To this end, we propose CLEVE, a contrastive pre-training framework for EE that better learns event knowledge from large unsupervised data and the semantic structures (e.g., AMR) obtained from them with automatic parsers. CLEVE contains a text encoder to learn event semantics and a graph encoder to learn event structures. Specifically, the text encoder learns event semantic representations by self-supervised contrastive learning, representing words of the same event closer to each other than to unrelated words; the graph encoder learns event structure representations by graph contrastive pre-training on parsed event-related semantic structures. The two ...

Read paper: https://www.aclanthology.org/2021.acl-long.491
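The abstract above describes the text encoder's objective: self-supervised contrastive learning that pulls words of the same event together and pushes unrelated words apart. As a rough illustration only, here is a minimal PyTorch sketch of an InfoNCE-style loss of the kind such contrastive pre-training commonly uses; the function name info_nce_loss, the tensor shapes, and the temperature value are assumptions made for this sketch, not CLEVE's actual implementation (see the linked paper for that).

import torch
import torch.nn.functional as F

def info_nce_loss(anchor, positive, negatives, temperature=0.07):
    # Illustrative sketch, not CLEVE's code.
    # anchor:    (d,)   embedding of an event trigger word
    # positive:  (d,)   embedding of a word from the same event
    # negatives: (n, d) embeddings of unrelated words
    anchor = F.normalize(anchor, dim=-1)
    candidates = F.normalize(torch.cat([positive.unsqueeze(0), negatives]), dim=-1)
    # Scaled cosine similarity between the anchor and each candidate.
    logits = (candidates @ anchor / temperature).unsqueeze(0)  # (1, 1 + n)
    # The positive sits at index 0; cross-entropy pulls it toward the
    # anchor and pushes the negatives away.
    return F.cross_entropy(logits, torch.zeros(1, dtype=torch.long))

# Toy usage: random vectors stand in for contextual PLM embeddings.
d = 768
loss = info_nce_loss(torch.randn(d), torch.randn(d), torch.randn(8, d))
print(loss.item())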
Keywords:
Computational Linguistics; Deep Learning; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25941-cleve-contrastive-pre-training-for-event-extraction
DOI: https://dx.doi.org/10.48448/pjm0-5118