81 |
Neural-based Knowledge Transfer in Natural Language Processing
Abstract:
In Natural Language Processing (NLP), neural-based knowledge transfer, i.e., transferring out-of-domain (OOD) knowledge into task-specific neural networks, has been applied to many NLP tasks. To explore neural-based knowledge transfer in NLP further, this dissertation considers both structured and unstructured OOD knowledge across several representative NLP tasks.

For structured OOD knowledge, we study neural-based knowledge transfer in Machine Reading Comprehension (MRC). In single-passage MRC tasks, to bridge the gap between MRC models and human beings, which is mainly reflected in the hunger for data and the robustness to noise, we integrate the neural networks of MRC models with the general knowledge of human beings embodied in knowledge bases. On the one hand, we propose a data enrichment method that uses WordNet to extract inter-word semantic connections as general knowledge from each given passage-question pair. On the other hand, we propose a novel MRC model named Knowledge Aided Reader (KAR), which explicitly uses this extracted general knowledge to assist its attention mechanisms. According to the experimental results, KAR is comparable in performance with state-of-the-art MRC models and significantly more robust to noise than they are. Moreover, when only a subset (20%-80%) of the training examples is available, KAR outperforms the state-of-the-art MRC models by a large margin while remaining reasonably robust to noise. In multi-hop MRC tasks, to probe the strength of Graph Neural Networks (GNNs), we propose a novel multi-hop MRC model named Graph Aided Reader (GAR), which uses GNN methods to perform multi-hop reasoning but is free of any pre-trained language model and completely end-to-end. For graph construction, GAR utilizes the topic-referencing relations between passages and the entity-sharing relations between sentences, with the aim of obtaining the most sensible reasoning clues. For message passing, GAR simulates both a top-down reasoning and a bottom-up reasoning, with the aim of making the best use of these reasoning clues. According to the experimental results, GAR outperforms even several competitors relying on pre-trained language models and filter-reader pipelines, which implies that GAR benefits substantially from its GNN methods. On this basis, GAR can benefit further from applying pre-trained language models, although these mainly facilitate its within-passage reasoning rather than its cross-passage reasoning. Moreover, compared with the competitors constructed as filter-reader pipelines, GAR is not only easier to train but also more applicable to low-resource cases.

For unstructured OOD knowledge, we study neural-based knowledge transfer in Natural Language Understanding (NLU), focusing on knowledge transfer between languages, also known as Cross-Lingual Transfer Learning (CLTL). To facilitate the CLTL of NLU models, especially between distant languages, we propose a novel CLTL model named Translation Aided Language Learner (TALL), in which CLTL is integrated with Machine Translation (MT). Specifically, we adopt a pre-trained multilingual language model as our baseline model and construct TALL by appending a decoder to it. On this basis, we directly fine-tune the baseline model as an NLU model to conduct CLTL, but put TALL through an MT-oriented pre-training before its NLU-oriented fine-tuning. To make use of unannotated data, we implement the recently proposed Unsupervised Machine Translation (UMT) technique in the MT-oriented pre-training of TALL. According to the experimental results, the application of UMT enables TALL to consistently achieve better CLTL performance than the baseline model without using more annotated data, and the performance gain is relatively prominent in the case of distant languages.
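The WordNet-based data enrichment described in the abstract can be illustrated with a minimal sketch: a passage word is linked to a question word when their semantic neighborhoods (e.g., synonyms and hypernyms reachable within a few hops) intersect. The tiny `SEMANTIC_EDGES` graph, `expand`, and `semantic_connections` below are hypothetical stand-ins for illustration, not the dissertation's actual WordNet implementation.

```python
# Hypothetical miniature semantic graph standing in for WordNet
# (directed synonym/hypernym edges); KAR uses the real WordNet.
SEMANTIC_EDGES = {
    "dog": {"canine", "animal"},
    "canine": {"animal"},
    "cat": {"feline", "animal"},
    "feline": {"animal"},
    "car": {"vehicle"},
}

def expand(word, hops=2):
    """Collect `word` plus all words reachable within `hops` semantic edges."""
    frontier, seen = {word}, {word}
    for _ in range(hops):
        frontier = {n for w in frontier
                    for n in SEMANTIC_EDGES.get(w, set())} - seen
        seen |= frontier
    return seen

def semantic_connections(passage_words, question_words, hops=2):
    """Return passage words whose expanded neighborhood meets an
    expanded question word, i.e., inter-word semantic connections."""
    q_expanded = set().union(*(expand(q, hops) for q in question_words))
    return {p for p in passage_words if expand(p, hops) & q_expanded}
```

For example, `semantic_connections(["dog", "car"], ["cat"])` links "dog" to "cat" because both neighborhoods contain "animal", while "car" stays unlinked.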
Keywords:
Cross-lingual transfer learning; Graph neural network; Information technology; Knowledge base; Knowledge graph; Knowledge transfer; Machine reading comprehension; Multi-hop reasoning; Natural language processing; Natural language understanding; Neural network; Unsupervised machine translation
URL: http://hdl.handle.net/10315/39096
BASE
83 |
Investigating alignment interpretability for low-resource NMT
In: ISSN: 0922-6567 ; EISSN: 1573-0573 ; Machine Translation ; https://hal.archives-ouvertes.fr/hal-03139744 ; Machine Translation, Springer Verlag, 2021, ⟨10.1007/s10590-020-09254-w⟩ (2021)
84 |
The 2011 Tohoku Tsunami from the Sky: A Review on the Evolution of Artificial Intelligence Methods for Damage Assessment
In: ISSN: 2076-3263 ; Geosciences ; https://hal.archives-ouvertes.fr/hal-03168500 ; Geosciences, MDPI, 2021, ⟨10.3390/geosciences11030133⟩ (2021)
85 |
End-to-end speaker segmentation for overlap-aware resegmentation
In: Interspeech 2021 ; https://hal-univ-lemans.archives-ouvertes.fr/hal-03257524 ; Interspeech 2021, Aug 2021, Brno, Czech Republic ; https://www.interspeech2021.org/ (2021)
86 |
High-resolution speaker counting in reverberant rooms using CRNN with Ambisonics features
In: EUSIPCO 2020 - 28th European Signal Processing Conference (EUSIPCO) ; https://hal.archives-ouvertes.fr/hal-03537323 ; EUSIPCO 2020 - 28th European Signal Processing Conference (EUSIPCO), Jan 2021, Amsterdam, Netherlands. pp.71-75, ⟨10.23919/Eusipco47968.2020.9287637⟩ (2021)
87 |
Tackling Morphological Analogies Using Deep Learning -- Extended Version
In: https://hal.inria.fr/hal-03425776 ; 2021 (2021)
88 |
Alternate Endings: Improving Prosody for Incremental Neural TTS with Predicted Future Text Input
In: Interspeech 2021 - 22nd Annual Conference of the International Speech Communication Association ; https://hal.archives-ouvertes.fr/hal-03372802 ; Interspeech 2021 - 22nd Annual Conference of the International Speech Communication Association, Aug 2021, Brno, Czech Republic. pp.3865-3869, ⟨10.21437/Interspeech.2021-275⟩ (2021)
89 |
Ontology as neuronal-space manifold: Towards symbolic and numerical artificial embedding
In: KRHCAI 2021 Workshop on Knowledge Representation for Hybrid & Compositional AI @ KR2021 ; https://hal.inria.fr/hal-03360307 ; KRHCAI 2021 Workshop on Knowledge Representation for Hybrid & Compositional AI @ KR2021, Nov 2021, Hanoi, Vietnam (2021)
92 |
Speaker Attentive Speech Emotion Recognition
In: Proceedings of Interspeech 2021 ; Interspeech 2021 ; https://hal.archives-ouvertes.fr/hal-03554368 ; Interspeech 2021, Aug 2021, Brno, Czech Republic. pp.2866-2870, ⟨10.21437/interspeech.2021-573⟩ (2021)
93 |
Recognition of Grammatical Class of Imagined Words from EEG Signals using Convolutional Neural Network
94 |
Deep Learning Methods for Human Behavior Recognition
Lu, Jia. - : Auckland University of Technology, 2021
95 |
Brain-Inspired Audio-Visual Information Processing Using Spiking Neural Networks
Wendt, Anne. - : Auckland University of Technology, 2021
96 |
Gender Bias in Neural Translation: a preliminary study
In: Actes de la 28e Conférence sur le Traitement Automatique des Langues Naturelles. Volume 1 : conférence principale ; Traitement Automatique des Langues Naturelles ; https://hal.archives-ouvertes.fr/hal-03265895 ; Traitement Automatique des Langues Naturelles, 2021, Lille, France. pp.11-25 ; https://talnrecital2021.inria.fr/ (2021)
97 |
Developmental changes in neural letter‐selectivity: A 1‐year follow‐up of beginning readers
In: ISSN: 1363-755X ; EISSN: 1467-7687 ; Developmental Science ; https://hal.archives-ouvertes.fr/hal-02931200 ; Developmental Science, Wiley, 2021, 21 (1), pp.e12999. ⟨10.1111/desc.12999⟩ (2021)
98 |
SPLADE: Sparse Lexical and Expansion Model for First Stage Ranking
In: SIGIR '21: The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval ; https://hal.sorbonne-universite.fr/hal-03290774 ; SIGIR '21: The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul 2021, Virtual Event, Canada. pp.2288-2292, ⟨10.1145/3404835.3463098⟩ (2021)
99 |
Recognizing lexical units in low-resource language contexts with supervised and unsupervised neural networks
In: https://hal.archives-ouvertes.fr/hal-03429051 ; [Research Report] LACITO (UMR 7107). 2021 (2021)
100 |
What does the Canary Say? Low-Dimensional GAN Applied to Birdsong
In: https://hal.inria.fr/hal-03244723 ; 2021 (2021)