
Search in the Catalogues and Directories

Hits 1 – 9 of 9

1. Transformer-Based Abstractive Summarization for Reddit and Twitter: Single Posts vs. Comment Pools in Three Languages
In: Future Internet; Volume 14; Issue 3; Pages: 69 (2022)
2. FedQAS: Privacy-Aware Machine Reading Comprehension with Federated Learning
In: Applied Sciences; Volume 12; Issue 6; Pages: 3130 (2022)
3. Correcting Diacritics and Typos with a ByT5 Transformer Model
In: Applied Sciences; Volume 12; Issue 5; Pages: 2636 (2022)
4. Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
Abstract: Given the promising performance of pre-trained language models for task-oriented dialogue systems (DS) in English, efforts have emerged to provide multilingual models for task-oriented DS in low-resource languages. These efforts still face a long-standing challenge: the lack of high-quality data for these languages, especially Arabic. To avoid costly and time-intensive data collection and annotation, cross-lingual transfer learning can be used when only limited training data are available in the low-resource target language. This study therefore explores the effectiveness of cross-lingual transfer learning in building an end-to-end Arabic task-oriented DS with the mT5 transformer model. We use the Arabic task-oriented dialogue dataset (Arabic-TOD) to train and test the model. We deploy cross-lingual transfer learning with three different approaches: mSeq2Seq, Cross-lingual Pre-training (CPT), and Mixed-Language Pre-training (MLT). Our model obtains good results compared to those reported in the literature for the Chinese language under the same settings. Furthermore, cross-lingual transfer learning deployed with the MLT approach outperforms the other two approaches. Finally, we show that our results can be improved by increasing the training dataset size.
Keyword: Arabic language; cross-lingual transfer learning; mixed-language pre-training; mT5; multilingual transformer model; natural language processing; task-oriented dialogue systems
URL: https://doi.org/10.3390/math10050746
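A minimal sketch of the mixed-language fine-tuning step this abstract describes, using the Hugging Face transformers API. The checkpoint (google/mt5-small), the example dialogue pairs, and the hyperparameters are illustrative assumptions, not the authors' actual configuration.

    # Hypothetical single training step for mT5 on mixed English/Arabic
    # dialogue pairs, in the spirit of Mixed-Language Pre-training (MLT).
    # Checkpoint, data, and hyperparameters are illustrative assumptions.
    import torch
    from transformers import MT5ForConditionalGeneration, MT5Tokenizer

    tokenizer = MT5Tokenizer.from_pretrained("google/mt5-small")
    model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

    # MLT interleaves high-resource (English) and low-resource (Arabic)
    # dialogue pairs in a single training stream.
    pairs = [
        ("user: book a table for two tonight",
         "system: which restaurant do you prefer?"),
        ("user: احجز طاولة لشخصين الليلة",
         "system: أي مطعم تفضل؟"),
    ]

    inputs = tokenizer([src for src, _ in pairs], return_tensors="pt",
                       padding=True, truncation=True)
    labels = tokenizer([tgt for _, tgt in pairs], return_tensors="pt",
                       padding=True, truncation=True).input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
    loss.backward()
    optimizer.step()

The point of interleaving, per the abstract, is that task structure learned from the high-resource pairs transfers to the low-resource Arabic pairs, which matters when only limited Arabic training data are available.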
5. AraConv: Developing an Arabic Task-Oriented Dialogue System Using Multi-Lingual Transformer Model mT5
In: Applied Sciences; Volume 12; Issue 4; Pages: 1881 (2022)
6. Retrieval-Based Transformer Pseudocode Generation
In: Mathematics; Volume 10; Issue 4; Pages: 604 (2022)
7. Breaking Down the Invisible Wall of Informal Fallacies in Online Discussions
In: ACL-IJCNLP 2021 - Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Aug 2021, Online, France; https://hal.inria.fr/hal-03351649; https://2021.aclweb.org/ (2021)
8. Language Representation Models: An Overview
In: Entropy; Volume 23; Issue 11 (2021)
9. Exploiting BERT and RoBERTa to Improve Performance for Aspect Based Sentiment Analysis
Narayanaswamy, Gagan Reddy. Technological University Dublin, 2021
In: Dissertations (2021)

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 9 (all nine hits are open-access records indexed in BASE)