
Search in the Catalogues and Directories

Hits 1 – 20 of 121

1
Cross-lingual few-shot hate speech and offensive language detection using meta learning
In: IEEE Access, vol. 10, pp. 14880–14896, 2022. ISSN/EISSN: 2169-3536. ⟨10.1109/ACCESS.2022.3147588⟩. https://hal.archives-ouvertes.fr/hal-03559484
BASE
2
Measuring Terminology Consistency in Translated Corpora: Implementation of the Herfindahl-Hirschman Index
In: Information; Volume 13; Issue 2; Pages: 43 (2022)
BASE
3
Ethnocultural and Sociolinguistic Factors in Teaching Russian as a Foreign Language ...
Sanobar Saidovna Abdumuratova. - : Academic research in educational sciences, 2022
BASE
4
Neural-based Knowledge Transfer in Natural Language Processing
Wang, Chao. - 2022
BASE
5
Science research writing: for native and non-native speakers of English
Glasman-Deal, Hilary. - : World Scientific, 2021
BASE
6
Hate speech and offensive language detection using transfer learning approaches
Mozafari, Marzieh. - : HAL CCSD, 2021
In: https://tel.archives-ouvertes.fr/tel-03276023 ; Document and Text Processing. Institut Polytechnique de Paris, 2021. English. ⟨NNT : 2021IPPAS007⟩ (2021)
BASE
7
Improving Multilingual Models for the Swedish Language: Exploring Cross-Lingual Transferability and Stereotypical Biases
Katsarou, Styliani. - : KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021
Abstract: The best-performing Transformer-based language models are monolingual and focus mainly on high-resource languages such as English. To extend their usage to more languages, multilingual models have been introduced. Nevertheless, a multilingual model still underperforms on a specific language compared to a similarly sized monolingual model trained solely on that language. The main objective of this thesis project is to explore how a multilingual model can be improved for Swedish, which is a low-resource language. We study whether a multilingual model can benefit from further pre-training on Swedish, or on a mix of English and Swedish text, before fine-tuning. Our results on the task of semantic text similarity show that further pre-training increases the Pearson correlation score by 5% for specific cross-lingual language settings. Taking into account the responsibilities that arise from the increased use of language models in real-world applications, we supplement our work with additional experiments that measure stereotypical biases associated with gender, using a new dataset that we designed specifically for that purpose. Our systematic study compares Swedish to English as well as various model sizes. The insights from our exploration indicate that the Swedish language carries less gender bias than English, and that larger language models manifest more gender bias.
Keyword: Computer and Information Sciences; Cross-Lingual Transfer; Data- och informationsvetenskap; Deep Learning; Multilingual Models; Natural Language Processing; Stereotypical Biases; Transformers
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-305920
BASE
8
Europe, l'autre cap, entre traductions et transferts
In: https://hal.archives-ouvertes.fr/hal-03124335 (2020)
BASE
9
Clickbait detection using multimodal fusion and transfer learning
In: https://tel.archives-ouvertes.fr/tel-03139880 ; Social and Information Networks [cs.SI]. Institut Polytechnique de Paris, 2020. English. ⟨NNT : 2020IPPAS025⟩ (2020)
BASE
10
Interpretation of Swedish Sign Language using Convolutional Neural Networks and Transfer Learning
Halvardsson, Gustaf; Peterson, Johanna. - : KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020
BASE
11
Comments on the book "Architects of Intelligence" by Martin Ford in the light of the SP Theory of Intelligence
In: https://hal.archives-ouvertes.fr/hal-02061171 ; 2019 (2019)
BASE
12
The dialectical meaning of offshored work: neoliberal desires and labour arbitrage in post-socialist Romania
Miszczyński, Miłosz. - : BRILL, 2019
BASE
13
Theory and approaches of group decision making with uncertain linguistic expressions
Wang, Hai; Xu, Zeshui. - : Springer, 2019
BASE
14
Terminology acquisition methods in Arabic: Application in the medical domain
Neifar, Wafa. - : HAL CCSD, 2019
In: https://tel.archives-ouvertes.fr/tel-02326714 ; Computer science and language [cs.CL]. Université Paris Saclay (COmUE); Université de Sfax (Tunisie). Faculté des Sciences économiques et de gestion, 2019. French. ⟨NNT : 2019SACLS085⟩ (2019)
BASE
15
Neural Methods Towards Concept Discovery from Text via Knowledge Transfer
In: http://rave.ohiolink.edu/etdc/view?acc_num=osu1572387318988274 (2019)
BASE
16
Perceptual attention as the locus of transfer to nonnative speech perception
Chang, Charles B. - : Academic Press (Elsevier Science Ltd), 2018
BASE
17
Business Perspectives of Philosophy and Knowledge: Occident Contra Islam
Fascia, Michael. - : Universidad Politécnica de Valencia, 2018
BASE
18
Communicating science: a practical guide for engineers and physical scientists
Boxman, Raymond; Boxman, Edith. - : World Scientific, 2017
BASE
19
Data-driven problem-solving in international business communication: examining the use of bilingual web-based tools for text production with advanced English as a foreign language professionals
Zielonka, Alexander. - : Peter Lang, 2017
BASE
20
Handbook of business communication: linguistic approaches
Mautner, Gerlinde; Rainer, Franz. - : De Gruyter, 2017
BASE


Hits by source — Catalogues: 0; Bibliographies: 3; Linked Open Data catalogues: 0; Online resources: 0; Open access documents: 118
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings