
Search in the Catalogues and Directories

Hits 1 – 20 of 23

1. Towards the Early Detection of Child Predators in Chat Rooms: A BERT-based Approach ... (BASE)
2. To what extent do human explanations of model behavior align with actual model behavior? ... (BASE)
3. Elementary-Level Math Word Problem Generation using Pre-Trained Transformers ... (BASE)
4. What Models Know About Their Attackers: Deriving Attacker Information From Latent Representations ... (BASE)
5. EM ALBERT: a step towards equipping Manipuri for NLP ... (BASE)
6. Language, Brains & Interpretability ... (BASE)
7. Segment, Mask, and Predict: Augmenting Chinese Word Segmentation with Self-Supervision ... (BASE)
8. Coral: An Approach for Conversational Agents in Mental Health Applications ... (BASE)
9. #WhyDidTheyStay: An NLP-driven approach to analyzing the factors that affect domestic violence victims ... (BASE)
10. SciBERT-based Multitasking Deep Neural Architecture to identify Contribution Statements from Scientific articles ... (BASE)
11. "I don't know who she is": Discourse and Knowledge Driven Coreference Resolution ... (BASE)
12. Certified Robustness to Programmable Transformations in LSTMs ... (BASE)
13. Sinhala-English Code-mixed and Code-switched Data Classification ... (BASE)
14. Adverse Drug Reaction Classification of Tweets with Fusion of Text and Drug Representations ... (BASE)
15. Learning Cross-lingual Representations for Event Coreference Resolution with Multi-view Alignment and Optimal Transport ... (BASE)
    Abstract: We study a new problem of cross-lingual transfer learning for event coreference resolution (ECR) where models trained on data from a source language are adapted for evaluations in different target languages. We introduce the first baseline model for this task based on XLM-RoBERTa, a state-of-the-art multilingual pre-trained language model. We also explore language adversarial neural networks (LANN) that present language discriminators to distinguish texts from the source and target languages to improve the language generalization for ECR. In addition, we introduce two novel mechanisms to further enhance the general representation learning of LANN, featuring: (i) multi-view alignment to penalize cross coreference-label alignment of examples in the source and target languages, and (ii) optimal transport to select close examples in the source and target languages to provide better training signals for the language discriminators. Finally, we perform extensive experiments for cross-lingual ECR from English to ...
    Keywords: Computational Linguistics; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Processing; Neural Network
    URL: https://dx.doi.org/10.48448/s96d-s948
    https://underline.io/lecture/39632-learning-cross-lingual-representations-for-event-coreference-resolution-with-multi-view-alignment-and-optimal-transport
    (A hedged code sketch of the language-adversarial component described in this abstract follows the hit list below.)
16. VisualSem: a high-quality knowledge graph for vision and language ... (BASE)
17. Specializing Multilingual Language Models: An Empirical Study ... (BASE)
18. On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning ... (BASE)
19. Occupational Gender stereotypes in Indic Languages ... (BASE)
20. One-Shot Lexicon Learning for Low-Resource Machine Translation ... (BASE)
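The abstract in hit 15 describes a language-adversarial training setup built on XLM-RoBERTa, in which a language discriminator tries to tell source-language from target-language texts while the shared encoder learns to fool it. The following is a minimal PyTorch sketch of that adversarial component only, assuming a gradient-reversal layer, first-token pooling, and illustrative head sizes; none of the class names, the `grl_lambda` weight, or the layer dimensions come from the paper, and its multi-view alignment and optimal transport mechanisms are not modeled here.

```python
# Hedged sketch (not the authors' code): a shared multilingual encoder with a
# pairwise coreference head and a gradient-reversal language discriminator.
import torch
import torch.nn as nn
from transformers import XLMRobertaModel, XLMRobertaTokenizer


class GradientReversal(torch.autograd.Function):
    """Identity on the forward pass; negates and scales gradients on the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class AdversarialECRSketch(nn.Module):
    """Illustrative assumption of the LANN idea, not the authors' architecture."""

    def __init__(self, model_name="xlm-roberta-base", grl_lambda=0.1):
        super().__init__()
        self.encoder = XLMRobertaModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.grl_lambda = grl_lambda
        # Scores whether two event-mention representations corefer.
        self.coref_head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )
        # Predicts source vs. target language from the gradient-reversed representation.
        self.lang_discriminator = nn.Sequential(
            nn.Linear(hidden, hidden // 2), nn.ReLU(), nn.Linear(hidden // 2, 2)
        )

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        sent_repr = out.last_hidden_state[:, 0]  # first-token ("CLS"-style) pooling
        reversed_repr = GradientReversal.apply(sent_repr, self.grl_lambda)
        lang_logits = self.lang_discriminator(reversed_repr)
        return sent_repr, lang_logits

    def coref_score(self, repr_a, repr_b):
        """Pairwise coreference score for two mention representations (illustrative)."""
        return self.coref_head(torch.cat([repr_a, repr_b], dim=-1))


if __name__ == "__main__":
    tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
    model = AdversarialECRSketch()
    batch = tokenizer(["The attack occurred on Monday."], return_tensors="pt")
    sent_repr, lang_logits = model(batch["input_ids"], batch["attention_mask"])
    # In training, lang_logits would feed a cross-entropy loss against language
    # labels; the reversed gradient pushes the encoder toward language-invariant
    # features, which the coreference head then consumes pairwise.
    print(sent_repr.shape, lang_logits.shape)
```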


Hits by source type: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 23