
Search in the Catalogues and Directories

Hits 1 – 20 of 881

1. Multi-National Topics Maps for Parliamentary Debate Analysis (BASE)
2. Assessing English language sentences readability using machine learning models. In: PeerJ Comput Sci (2022) (BASE)
3. Identity-Based Patterns in Deep Convolutional Networks: Generative Adversarial Phonology and Reduplication ... (BASE)
4. G-4 - A pipeline for Hand 2-D Keypoint Localization using Unpaired Image to Image Translation ... (BASE)
5. Signed Coreference Resolution ... (BASE)
6. Backtranslation in Neural Morphological Inflection ... (BASE)
7. Rule-based Morphological Inflection Improves Neural Terminology Translation ... (BASE)
8. Translating Headers of Tabular Data: A Pilot Study of Schema Translation ... (BASE)
9. A Prototype Free/Open-Source Morphological Analyser and Generator for Sakha ... (BASE)
10. Automatic Error Type Annotation for Arabic ... (BASE)
11. Developing Conversational Data and Detection of Conversational Humor in Telugu ... (BASE)
12. Cross-document Event Identity via Dense Annotation ... (BASE)
13. Navigating the Kaleidoscope of COVID-19 Misinformation Using Deep Learning ... (BASE)
Anthology paper link: https://aclanthology.org/2021.emnlp-main.485/
Abstract: Irrespective of the success of the deep learning-based mixed-domain transfer learning approach for solving various Natural Language Processing tasks, it does not lend a generalizable solution for detecting misinformation from COVID-19 social media data. Due to the inherent complexity of this type of data, caused by its dynamic (context evolves rapidly), nuanced (misinformation types are often ambiguous), and diverse (skewed, fine-grained, and overlapping categories) nature, it is imperative for an effective model to capture both the local and global context of the target domain. By conducting a systematic investigation, we show that: (i) the deep Transformer-based pre-trained models, utilized via mixed-domain transfer learning, are only good at capturing the local context and thus exhibit poor generalization, and (ii) a combination of shallow network-based domain-specific models and convolutional neural networks can efficiently ...
Keywords: Computational Linguistics; Covid-19; Deep Learning; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
URL: https://dx.doi.org/10.48448/yyza-sr36
https://underline.io/lecture/37959-navigating-the-kaleidoscope-of-covid-19-misinformation-using-deep-learning
(An illustrative sketch of the shallow-CNN idea mentioned in this abstract appears after the result list below.)
14. (Mis)alignment Between Stance Expressed in Social Media Data and Public Opinion Surveys ... (BASE)
15. Adversarial Regularization as Stackelberg Game: An Unrolled Optimization Approach ... (BASE)
16. Rewards with Negative Examples for Reinforced Topic-Focused Abstractive Summarization ... (BASE)
17. Distantly-Supervised Named Entity Recognition with Noise-Robust Learning and Language Model Augmented Self-Training ... (BASE)
18. Low-Resource Dialogue Summarization with Domain-Agnostic Multi-Source Pretraining ... (BASE)
19. HittER: Hierarchical Transformers for Knowledge Graph Embeddings ... (BASE)
20. Ara-Women-Hate: The first Arabic Hate Speech corpus regarding Women ... (BASE)
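The abstract in entry 13 contrasts Transformer-based mixed-domain transfer learning, which it finds captures only local context, with a combination of shallow, domain-specific models and convolutional neural networks. As a rough, non-authoritative illustration of what a shallow CNN text classifier over domain-specific embeddings can look like, here is a minimal PyTorch sketch; the class name, hyperparameters, and embedding setup are assumptions made for this example and are not taken from the cited paper.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ShallowCNNClassifier(nn.Module):
    """Shallow 1-D CNN over token embeddings (hypothetical configuration)."""

    def __init__(self, vocab_size, embed_dim=300, num_classes=5,
                 kernel_sizes=(2, 3, 4), num_filters=100,
                 pretrained_embeddings=None):
        super().__init__()
        # In a domain-specific setup this embedding matrix would be
        # initialized from vectors trained on in-domain social media text.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        if pretrained_embeddings is not None:
            self.embedding.weight.data.copy_(pretrained_embeddings)
        # One convolution per kernel size captures local n-gram patterns;
        # max-pooling over time aggregates them into a global representation.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        self.dropout = nn.Dropout(0.5)
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor of token indices
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        features = []
        for conv in self.convs:
            c = F.relu(conv(x))                        # (batch, num_filters, L)
            features.append(F.max_pool1d(c, c.size(2)).squeeze(2))
        out = self.dropout(torch.cat(features, dim=1))
        return self.classifier(out)                    # (batch, num_classes) logits


if __name__ == "__main__":
    # Smoke test with random data; real usage would tokenize posts with an
    # in-domain vocabulary and train with a cross-entropy loss.
    model = ShallowCNNClassifier(vocab_size=10_000)
    dummy_batch = torch.randint(1, 10_000, (8, 40))    # 8 posts, 40 tokens each
    print(model(dummy_batch).shape)                    # torch.Size([8, 5])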


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 881 (BASE)