
Search in the Catalogues and Directories

Hits 1 – 20 of 887

1. On Homophony and Rényi Entropy ... (BASE)
2. Identity-Based Patterns in Deep Convolutional Networks: Generative Adversarial Phonology and Reduplication ... (BASE)
3. Signed Coreference Resolution ... (BASE)
4. Backtranslation in Neural Morphological Inflection ... (BASE)
5. Rule-based Morphological Inflection Improves Neural Terminology Translation ... (BASE)
6. Translating Headers of Tabular Data: A Pilot Study of Schema Translation ... (BASE)
7. A Prototype Free/Open-Source Morphological Analyser and Generator for Sakha ... (BASE)
8. Automatic Error Type Annotation for Arabic ... (BASE)
9. Developing Conversational Data and Detection of Conversational Humor in Telugu ... (BASE)
10. An Information-Theoretic Characterization of Morphological Fusion ... (BASE)
11. Cross-document Event Identity via Dense Annotation ... (BASE)
12. Navigating the Kaleidoscope of COVID-19 Misinformation Using Deep Learning ... (BASE)
13. (Mis)alignment Between Stance Expressed in Social Media Data and Public Opinion Surveys ... (BASE)
14. Adversarial Regularization as Stackelberg Game: An Unrolled Optimization Approach ... (BASE)
15. Rewards with Negative Examples for Reinforced Topic-Focused Abstractive Summarization ... (BASE)
16. Distantly-Supervised Named Entity Recognition with Noise-Robust Learning and Language Model Augmented Self-Training (BASE)
Anthology paper link: https://aclanthology.org/2021.emnlp-main.810/
Abstract: We study the problem of training named entity recognition (NER) models using only distantly-labeled data, which can be obtained automatically by matching entity mentions in raw text against entity types in a knowledge base. The biggest challenge of distantly-supervised NER is that the distant supervision may induce incomplete and noisy labels, rendering the straightforward application of supervised learning ineffective. In this paper, we propose (1) a noise-robust learning scheme, comprising a new loss function and a noisy-label removal step, for training NER models on distantly-labeled data, and (2) a self-training method that uses contextualized augmentations created by pre-trained language models to improve the generalization ability of the NER model. On three benchmark datasets, our method achieves superior performance, outperforming existing distantly-supervised NER models by significant margins.
Keywords: Computational Linguistics; Information Extraction; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
URL: https://underline.io/lecture/37472-distantly-supervised-named-entity-recognition-with-noise-robust-learning-and-language-model-augmented-self-training
DOI: https://dx.doi.org/10.48448/yba1-0652
17. Low-Resource Dialogue Summarization with Domain-Agnostic Multi-Source Pretraining ... (BASE)
18. HittER: Hierarchical Transformers for Knowledge Graph Embeddings ... (BASE)
19. Ara-Women-Hate: The first Arabic Hate Speech corpus regarding Women ... (BASE)
20. Detecting Gender Bias using Explainability ... (BASE)


Results by source:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 887
© 2013 - 2024 Lin|gu|is|tik