
Search in the Catalogues and Directories

Hits 1 – 20 of 510

1
Neural-based Knowledge Transfer in Natural Language Processing
Wang, Chao. - 2022
Abstract: In Natural Language Processing (NLP), neural-based knowledge transfer, which transfers out-of-domain (OOD) knowledge into task-specific neural networks, has been applied to many NLP tasks. To explore neural-based knowledge transfer in NLP further, this dissertation considers both structured and unstructured OOD knowledge and addresses several representative NLP tasks.

For structured OOD knowledge, we study neural-based knowledge transfer in Machine Reading Comprehension (MRC). In single-passage MRC tasks, to bridge the gap between MRC models and human beings, which is mainly reflected in data hunger and robustness to noise, we integrate the neural networks of MRC models with the general knowledge of human beings embodied in knowledge bases. On the one hand, we propose a data enrichment method that uses WordNet to extract inter-word semantic connections as general knowledge from each given passage-question pair. On the other hand, we propose a novel MRC model named Knowledge Aided Reader (KAR), which explicitly uses the extracted general knowledge to assist its attention mechanisms. According to the experimental results, KAR is comparable in performance with the state-of-the-art MRC models and significantly more robust to noise than they are. On top of that, when only a subset (20%-80%) of the training examples is available, KAR outperforms the state-of-the-art MRC models by a large margin and remains reasonably robust to noise.

In multi-hop MRC tasks, to probe the strength of Graph Neural Networks (GNNs), we propose a novel multi-hop MRC model named Graph Aided Reader (GAR), which uses GNN methods to perform multi-hop reasoning but is free of any pre-trained language model and completely end-to-end. For graph construction, GAR utilizes the topic-referencing relations between passages and the entity-sharing relations between sentences, aiming to obtain the most sensible reasoning clues. For message passing, GAR simulates both top-down and bottom-up reasoning, aiming to make the best use of the obtained reasoning clues. According to the experimental results, GAR even outperforms several competitors relying on pre-trained language models and filter-reader pipelines, which implies that GAR benefits substantially from its GNN methods. On this basis, GAR can further benefit from applying pre-trained language models, although these mainly facilitate its within-passage reasoning rather than its cross-passage reasoning. Moreover, compared with the competitors constructed as filter-reader pipelines, GAR is not only easier to train but also more applicable to low-resource cases.

For unstructured OOD knowledge, we study neural-based knowledge transfer in Natural Language Understanding (NLU) and focus on knowledge transfer between languages, also known as Cross-Lingual Transfer Learning (CLTL). To facilitate the CLTL of NLU models, especially between distant languages, we propose a novel CLTL model named Translation Aided Language Learner (TALL), in which CLTL is integrated with Machine Translation (MT). Specifically, we adopt a pre-trained multilingual language model as our baseline model and construct TALL by appending a decoder to it. On this basis, we directly fine-tune the baseline model as an NLU model to conduct CLTL, but put TALL through an MT-oriented pre-training before its NLU-oriented fine-tuning. To make use of unannotated data, we apply the recently proposed Unsupervised Machine Translation (UMT) technique in the MT-oriented pre-training of TALL. According to the experimental results, UMT enables TALL to consistently achieve better CLTL performance than the baseline model without using more annotated data, and the performance gain is especially pronounced for distant languages.
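The data-enrichment step described above, extracting inter-word semantic connections from WordNet for each passage-question pair, can be sketched in a few lines. This is a minimal illustration, not the dissertation's code: the function names, the one-hop hypernym/hyponym expansion, and the token-level matching are assumptions, and it relies only on the standard NLTK WordNet corpus reader (nltk.download('wordnet') is required).

    from nltk.corpus import wordnet as wn

    def related_lemmas(word, hops=1):
        # Lemmas reachable from `word` via its synsets and, optionally,
        # one hypernym/hyponym hop.
        lemmas = set()
        for synset in wn.synsets(word):
            lemmas.update(lemma.name().lower() for lemma in synset.lemmas())
            if hops > 0:
                for neighbor in synset.hypernyms() + synset.hyponyms():
                    lemmas.update(lemma.name().lower() for lemma in neighbor.lemmas())
        return lemmas

    def semantic_connections(passage_tokens, question_tokens, hops=1):
        # Pairs (passage word, question word) linked by a WordNet relation.
        connections = []
        for p in passage_tokens:
            related = related_lemmas(p, hops)
            connections.extend((p, q) for q in question_tokens if q.lower() in related)
        return connections

    # "canine" in the passage is linked to "dog" in the question via hyponymy.
    print(semantic_connections(["the", "canine", "barked"], ["which", "dog", "barked"]))

KAR would then feed connections of this kind into its attention mechanisms; that model-specific part is not sketched here.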
Keywords: Cross-lingual transfer learning; Graph neural network; Information technology; Knowledge base; Knowledge graph; Knowledge transfer; Machine reading comprehension; Multi-hop reasoning; Natural language processing; Natural language understanding; Neural network; Unsupervised machine translation
URL: http://hdl.handle.net/10315/39096
BASE
2
English machine reading comprehension: new approaches to answering multiple-choice questions
Dzendzik, Daria. - : Dublin City University. School of Computing, 2021. : Dublin City University. ADAPT, 2021
In: Dzendzik, Daria (2021) English machine reading comprehension: new approaches to answering multiple-choice questions. PhD thesis, Dublin City University. (2021)
BASE
3
繪本閱讀教學研究──以國小一年級為對象 ; The Research of Picture Books Teaching Activity: 1st Graders of a Bilingual Elementary School as Research Subjects
BASE
4
Home language at school: Crosslinguistic sentence integration supports second language comprehension of oral and written school-based discourse
BASE
5
La comprensión lectora: relación con la teoría de la mente y las funciones ejecutivas. Estudio comparativo en adolescentes con implante coclear y con desarrollo típico ; Reading comprehension: its relationship with theory of mind and executive functions. A comparative study of adolescents with cochlear implants and with typical development
Figueroa González, Mario. - : Universitat Autònoma de Barcelona, 2021
In: TDX (Tesis Doctorals en Xarxa) (2021)
BASE
6
Comprensión lectora en L2: Patrones textuales, organizadores gráficos y el uso de la literatura ; L2 reading comprehension: Text patterns, graphic organizers, and the use of literature
Martul Beltrán, Luis Mª. - : Universitat de Barcelona, 2021
In: TDX (Tesis Doctorals en Xarxa) (2021)
BASE
7
Is There Any Space for Critical Literacy? English Language Teachers’ Perceptions, Views, and Perceived Challenges
Dehdary, N. - : University of Exeter, 2021. : Graduate School of Education, 2021
BASE
8
Comprensión lectora en L2: Patrones textuales, organizadores gráficos y el uso de la literatura ; L2 reading comprehension: Text patterns, graphic organizers, and the use of literature
Martul Beltrán, Luis Mª. - : Universitat de Barcelona, 2021
BASE
9
O gênero textual conto de animais e a compreensão em leitura : a sequência didática e os gestos didáticos como instrumentos de mediação na educação infantil ; The animal-tale text genre and reading comprehension: the didactic sequence and didactic gestures as mediation tools in early childhood education
BASE
10
Improving the Reading Achievement of Language Minority and Disadvantaged Youth at Risk of Academic Failure
Iwenofu, Linda. - : University of Toronto, 2021
BASE
11
Impacto de variáveis cognitivo-linguísticas na compreensão da leitura ; Effect of cognitive-linguistic variables on reading comprehension
BASE
12
Propuesta de intervención no implementada para desarrollar la competencia lectora a partir de la educación física ; A non-implemented intervention proposal for developing reading competence through physical education
BASE
13
Une approche computationnelle de la complexité linguistique par le traitement automatique du langage naturel et l'oculométrie ; A computational approach to linguistic complexity through natural language processing and eye-tracking
BASE
14
Dificultades en la comprensión lectora. Una aplicación desde el modelo de respuesta a la intervención ; Difficulties in reading comprehension. An application based on the response-to-intervention model
BASE
15
Reading Analyses with Chilean Children
BASE
16
A Less Simple View of Reading: The Role of Inhibition and Working Memory in the Decoding-Comprehension Relationship
McClure, Jane. - : Brock University, 2020
BASE
17
臺灣國中生英語閱讀困難及策略使用之研究 ; A Study on Taiwanese Junior High School Students’ Difficulties and Strategy-use in Reading English
BASE
18
高中英文閱讀理解次序選擇題測驗之效度檢核 ; The Validation of an English Reading Comprehension Test with Ordered Multiple-Choice Items for EFL Senior High School Students
BASE
19
外語學習者對於手機裝置為語言學習工具之實務評估 ; Practical Evaluation of a Mobile Language Learning Tool for EFL Learners
BASE
20
Conhecimento metalinguístico e compreensão da leitura em alunos surdos do 2º e 3º ciclos ; Metalinguistic knowledge and reading comprehension in deaf students of the 2nd and 3rd cycles
BASE
