21. Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
    In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
    Source: BASE

22. Measuring Terminology Consistency in Translated Corpora: Implementation of the Herfindahl-Hirshman Index
    In: Information; Volume 13; Issue 2; Pages: 43 (2022)

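The Herfindahl-Hirschman index named in the title above is a standard concentration measure: the sum of the squared shares of each category. Applied to terminology, a plausible reading is that each "share" is the relative frequency of one translation variant of a source term. The sketch below follows that reading only; the function name and toy data are illustrative, not the paper's actual implementation.

```python
from collections import Counter

def herfindahl_hirschman_index(variants):
    """HHI over the translation variants observed for one source term.

    Each variant's share is its relative frequency; the index is the
    sum of squared shares. 1.0 means perfectly consistent terminology,
    while values near 1/n mean an even split across n variants.
    """
    counts = Counter(variants)
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

# A term always rendered the same way scores 1.0 ...
print(herfindahl_hirschman_index(["memoria", "memoria", "memoria"]))  # 1.0
# ... while a 50/50 split between two renderings scores 0.5.
print(herfindahl_hirschman_index(["memoria", "almacenamiento"]))  # 0.5
```

Higher values therefore indicate more consistent terminology; comparing per-term HHI values across corpora gives a corpus-level consistency signal.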
23. Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models
    In: Applied Sciences; Volume 12; Issue 9; Pages: 4522 (2022)
    Abstract: Transformer-based pretrained language models have recently demonstrated excellent performance on natural language understanding (NLU) tasks. For example, bidirectional encoder representations from transformers (BERT) achieves strong results through masked self-supervised pretraining and transformer-based modeling. However, the original BERT may be effective only for English NLU tasks, and its effectiveness for other languages, such as Korean, is limited. The applicability of BERT-based language models pretrained in languages other than English must therefore be investigated for NLU tasks in those languages. In this study, we comparatively evaluated seven BERT-based pretrained language models and their expected applicability to Korean NLU tasks. We used the climate technology dataset, a large Korean text-classification dataset of research proposals spanning 45 classes. The BERT-based model pretrained on the most recent Korean corpus performed best on Korean multiclass text classification, suggesting the need for pretraining optimized for specific NLU tasks, particularly in languages other than English.
    Keywords: bidirectional encoder representations from transformers; cross-lingual representation learning; multiclass text classification; multilingual representation learning; natural language understanding; transfer learning
    URL: https://doi.org/10.3390/app12094522

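Fine-tuning any of the encoders compared in the abstract above reduces to placing a 45-way softmax classification head on top of the pooled sentence representation. The following is a self-contained sketch of that head alone, in plain Python: the dimensions match BERT-base, but the weights and input are random stand-ins, and no actual pretrained encoder is loaded.

```python
import math
import random

random.seed(0)
HIDDEN, NUM_CLASSES = 768, 45  # BERT-base pooled size; 45 proposal classes

# Hypothetical linear classification head over a pooled sentence vector.
# Fine-tuning any of the compared encoders adds a head of exactly this
# shape on top; the pretrained encoder itself is not reproduced here.
W = [[random.gauss(0.0, 0.02) for _ in range(HIDDEN)] for _ in range(NUM_CLASSES)]
b = [0.0] * NUM_CLASSES

def classify(pooled):
    """Return (predicted class index, softmax probabilities)."""
    logits = [sum(w * x for w, x in zip(row, pooled)) + bias
              for row, bias in zip(W, b)]
    m = max(logits)                              # subtract max for stability
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    probs = [e / z for e in exps]
    return probs.index(max(probs)), probs

pred, probs = classify([random.gauss(0.0, 1.0) for _ in range(HIDDEN)])
assert 0 <= pred < NUM_CLASSES and abs(sum(probs) - 1.0) < 1e-9
```

In the paper's comparative setting, the head stays fixed in shape while the encoder beneath it is swapped, so held-out accuracy differences isolate the effect of each model's pretraining corpus.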
24. The Role of Task Complexity and Dominant Articulatory Routines in the Acquisition of L3 Spanish
    In: Languages; Volume 7; Issue 2; Pages: 90 (2022)

25. Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
    In: Information; Volume 13; Issue 5; Pages: 220 (2022)

26. Analyzing COVID-19 Medical Papers Using Artificial Intelligence: Insights for Researchers and Medical Professionals
    In: Big Data and Cognitive Computing; Volume 6; Issue 1; Pages: 4 (2022)

27. The Effects of Event Depictions in Second Language Phrasal Vocabulary Learning

30. Ethnocultural and Sociolinguistic Factors in Teaching Russian as a Foreign Language ...

34. Toward an Epistemic Web
    In: 197; RatSWD Working Paper Series; 22 (2022)

35. StaResGRU-CNN with CMedLMs: A Stacked Residual GRU-CNN with Pre-trained Biomedical Language Models for Predictive Intelligence

37. An Empirical Study of Factors Affecting Language-Independent Models

38. „A Hund is er scho’“ (Bavarian: "He's quite a sly dog"): The Migration of an Expression and Its Bavarian-Hungarian Transfer History

39. Neural-based Knowledge Transfer in Natural Language Processing

40. Chinese Idioms: Stepping Into L2 Student’s Shoes
    In: Acta Linguistica Asiatica, Vol 12, Iss 1 (2022)