
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1
Cross-lingual Aspect-based Sentiment Analysis with Aspect Term Code-Switching ...
BASE
2
Knowledge Based Multilingual Language Model ...
Abstract: Knowledge-enriched language representation learning has shown promising performance across various knowledge-intensive NLP tasks. However, existing knowledge-based language models are all trained with monolingual knowledge graph data, which limits their application to more languages. In this work, we present a novel framework to pretrain knowledge-based multilingual language models (KMLMs). We first generate a large number of code-switched synthetic sentences and reasoning-based multilingual training data using the Wikidata knowledge graphs. Then, based on the intra- and inter-sentence structures of the generated data, we design pretraining tasks to facilitate knowledge learning, which allows the language models to not only memorize the factual knowledge but also learn useful logical patterns. Our pretrained KMLMs demonstrate significant performance improvements on a wide range of knowledge-intensive cross-lingual NLP tasks, including named entity recognition, factual knowledge retrieval, relation ...
Keyword: Artificial Intelligence cs.AI; Computation and Language cs.CL; FOS Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2111.10962
https://arxiv.org/abs/2111.10962
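The abstract above describes generating code-switched training sentences by drawing on multilingual entity labels from Wikidata. A minimal sketch of that idea, using a toy hand-written entity dictionary in place of real Wikidata output (the QIDs and labels here are illustrative, not queried from Wikidata):

```python
# Toy stand-in for Wikidata: multilingual labels keyed by a QID-style ID.
# In the KMLM setup these would come from the actual knowledge graph.
ENTITY_LABELS = {
    "Q64": {"en": "Berlin", "de": "Berlin", "zh": "柏林"},
    "Q183": {"en": "Germany", "de": "Deutschland", "zh": "德国"},
}

def code_switch(sentence: str, mentions: dict, target_lang: str) -> str:
    """Replace each entity mention in `sentence` with its label in
    `target_lang`, producing a code-switched sentence.

    `mentions` maps surface strings found in the sentence to entity IDs.
    Mentions with no label in the target language are left unchanged.
    """
    for surface, qid in mentions.items():
        label = ENTITY_LABELS.get(qid, {}).get(target_lang)
        if label:
            sentence = sentence.replace(surface, label)
    return sentence

sent = "Berlin is the capital of Germany."
mentions = {"Berlin": "Q64", "Germany": "Q183"}
print(code_switch(sent, mentions, "de"))
# Berlin is the capital of Deutschland.
```

This only illustrates the entity-substitution step; the paper's full pipeline additionally builds reasoning-based training data and dedicated pretraining tasks on top of such sentences.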
BASE
3
MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER ...
Zhou, Ran; Li, Xin; He, Ruidan. arXiv, 2021
BASE
4
On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation ...
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 4
© 2013 – 2024 Lin|gu|is|tik