1. MAGIC DUST FOR CROSS-LINGUAL ADAPTATION OF MONOLINGUAL WAV2VEC-2.0
In: ICASSP 2022, May 2022, Singapore, Singapore; https://hal.archives-ouvertes.fr/hal-03544515 (2022)
BASE

2. Cross-lingual few-shot hate speech and offensive language detection using meta learning
In: IEEE Access, IEEE, 2022, vol. 10, pp. 14880-14896, ISSN 2169-3536; ⟨10.1109/ACCESS.2022.3147588⟩; https://hal.archives-ouvertes.fr/hal-03559484 (2022)

5. Attributions of Successful English Language Learners in Transfer-Level English
In: Doctoral Dissertations and Projects (2022)

6. Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)

7. Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models
In: Applied Sciences; Volume 12; Issue 9; Pages: 4522 (2022)
Abstract: Recently, transformer-based pretrained language models have demonstrated stellar performance in natural language understanding (NLU) tasks. For example, bidirectional encoder representations from transformers (BERT) have achieved outstanding performance through masked self-supervised pretraining and transformer-based modeling. However, the original BERT may be effective only for English-based NLU tasks; its effectiveness for other languages, such as Korean, is limited. Thus, the applicability of BERT-based language models pretrained in languages other than English to NLU tasks in those languages must be investigated. In this study, we comparatively evaluated seven BERT-based pretrained language models and their expected applicability to Korean NLU tasks. We used the climate technology dataset, a large Korean text-classification dataset of research proposals spanning 45 classes. We found that the BERT-based model pretrained on the most recent Korean corpus performed best in Korean multiclass text classification, suggesting the necessity of optimal pretraining for specific NLU tasks, particularly those in languages other than English.
Keywords: bidirectional encoder representations from transformers; cross-lingual representation learning; multiclass text classification; multilingual representation learning; natural language understanding; transfer learning
URL: https://doi.org/10.3390/app12094522

8. Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
In: Information; Volume 13; Issue 5; Pages: 220 (2022)

9. The Effects of Event Depictions in Second Language Phrasal Vocabulary Learning

10. ETHNOCULTURAL AND SOCIOLINGUISTIC FACTORS IN TEACHING RUSSIAN AS A FOREIGN LANGUAGE ...

14. StaResGRU-CNN with CMedLMs: a stacked residual GRU-CNN with pre-trained biomedical language models for predictive intelligence

15. An Empirical Study of Factors Affecting Language-Independent Models

16. „A Hund is er scho’“: The Migration of an Expression and Its Bavarian-Hungarian Transfer History

17. Neural-based Knowledge Transfer in Natural Language Processing

18. Chinese Idioms: Stepping Into L2 Student’s Shoes
In: Acta Linguistica Asiatica, Vol 12, Iss 1 (2022)

19. Some remarks on the history of transfer in language studies
In: Proceedings of the Linguistic Society of America, Vol 7, No 1 (2022), 5206; ISSN 2473-8689

20. The Value and Use of the Telugu Language in Young Adults of Telugu-Speaking Backgrounds in New Zealand