
Search in the Catalogues and Directories

Hits 1 – 20 of 631

1
MAGIC DUST FOR CROSS-LINGUAL ADAPTATION OF MONOLINGUAL WAV2VEC-2.0
In: ICASSP 2022 ; https://hal.archives-ouvertes.fr/hal-03544515 ; ICASSP 2022, May 2022, Singapore (2022)
BASE
2
Cross-lingual few-shot hate speech and offensive language detection using meta learning
In: ISSN: 2169-3536 ; EISSN: 2169-3536 ; IEEE Access ; https://hal.archives-ouvertes.fr/hal-03559484 ; IEEE Access, IEEE, 2022, 10, pp.14880-14896. ⟨10.1109/ACCESS.2022.3147588⟩ (2022)
BASE
3
APPLIED LINGUISTICS AND LANGUAGE TEACHER'S STRATERGY ...
Jamoldinov Sanjarbek. - Zenodo, 2022
BASE
5
Attributions of Successful English Language Learners in Transfer-Level English
In: Doctoral Dissertations and Projects (2022)
BASE
6
Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
Abstract: Due to the promising performance of pre-trained language models for task-oriented dialogue systems (DS) in English, some efforts to provide multilingual models for task-oriented DS in low-resource languages have emerged. These efforts still face a long-standing challenge, namely the lack of high-quality data for these languages, especially Arabic. To circumvent cost- and time-intensive data collection and annotation, cross-lingual transfer learning can be used when only limited training data are available in the low-resource target language. This study therefore explores the effectiveness of cross-lingual transfer learning in building an end-to-end Arabic task-oriented DS using the mT5 transformer model. We use the Arabic task-oriented dialogue dataset (Arabic-TOD) to train and test the model. We deploy cross-lingual transfer learning with three different approaches: mSeq2Seq, Cross-lingual Pre-training (CPT), and Mixed-Language Pre-training (MLT). Our model obtains good results compared to the literature on the Chinese language under the same settings. Furthermore, cross-lingual transfer learning deployed with the MLT approach outperforms the other two approaches. Finally, we show that our results can be improved by increasing the training dataset size.
Keyword: Arabic language; cross-lingual transfer learning; mixed-language pre-training; mT5; multilingual transformer model; natural language processing; task-oriented dialogue systems
URL: https://doi.org/10.3390/math10050746
BASE
7
Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models
In: Applied Sciences; Volume 12; Issue 9; Pages: 4522 (2022)
BASE
8
Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
In: Information; Volume 13; Issue 5; Pages: 220 (2022)
BASE
9
The Effects of Event Depictions in Second Language Phrasal Vocabulary Learning
Nguyen, Huong Thi Thu. - Humboldt-Universität zu Berlin, 2022
BASE
10
ETHNOCULTURAL AND SOCIOLINGUISTIC FACTORS IN TEACHING RUSSIAN AS A FOREIGN LANGUAGE ...
Sanobar Saidovna Abdumuratova. - Academic research in educational sciences, 2022
BASE
12
The Rise and Fall of Linguistic Transfer ...
Ozernyi, Daniil M. - Zenodo, 2022
BASE
14
StaResGRU-CNN with CMedLMs: a stacked residual GRU-CNN with pre-trained biomedical language models for predictive intelligence
Ni, Pin; Li, Gangmin; Hung, Patrick C.K. - Elsevier Ltd, 2022
BASE
15
An Empirical Study of Factors Affecting Language-Independent Models
BASE
16
„A Hund is er scho’“. Die Migration eines Ausdrucks und seine bayerisch-ungarische Transfergeschichte [The migration of an expression and its Bavarian-Hungarian transfer history]
Weithmann, Michael. - Universität Tübingen, 2022
BASE
17
Neural-based Knowledge Transfer in Natural Language Processing
Wang, Chao. - 2022
BASE
18
Chinese Idioms: Stepping Into L2 Student’s Shoes
In: Acta Linguistica Asiatica, Vol 12, Iss 1 (2022) (2022)
BASE
19
Some remarks on the history of transfer in language studies
In: Proceedings of the Linguistic Society of America; Vol 7, No 1 (2022): Proceedings of the Linguistic Society of America; 5206 ; 2473-8689 (2022)
BASE
20
The Value and Use of the Telugu Language in Young Adults of Telugu-Speaking Backgrounds in New Zealand
Kasarla, Lahari. - Auckland University of Technology, 2021
BASE


Hits by source type: Catalogues: 28 · Bibliographies: 68 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 547
© 2013 - 2024 Lin|gu|is|tik