
Search in the Catalogues and Directories

Hits 41–60 of 4,248

41
Linguistic Mathematical Relationships Saved or Lost in Translating Texts: Extension of the Statistical Theory of Translation and Its Application to the New Testament
In: Information; Volume 13; Issue 1; Pages: 20 (2022)
42
Sign Language Avatars: A Question of Representation
In: Information; Volume 13; Issue 4; Pages: 206 (2022)
43
Identifying Source-Language Dialects in Translation
In: Mathematics; Volume 10; Issue 9; Pages: 1431 (2022)
44
Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
In: Information; Volume 13; Issue 5; Pages: 220 (2022)
45
X-Transformer: A Machine Translation Model Enhanced by the Self-Attention Mechanism
In: Applied Sciences; Volume 12; Issue 9; Pages: 4502 (2022)
46
Retrieval-Based Transformer Pseudocode Generation
In: Mathematics; Volume 10; Issue 4; Pages: 604 (2022)
47
Evaluating the Impact of Integrating Similar Translations into Neural Machine Translation
In: Information; Volume 13; Issue 1; Pages: 19 (2022)
48
Hebrew Transformed: Machine Translation of Hebrew Using the Transformer Architecture
Crater, David T. - 2022
49
Technology in audiovisual translation practices and training ; Las tecnologías en la formación y las prácticas de traducción audiovisual
In: CLINA Revista Interdisciplinaria de Traducción Interpretación y Comunicación Intercultural; Vol. 7 No. 1 (2021); 17-24; ISSN 2444-1961; DOI 10.14201/clina202171 (2022)
50
Training in machine translation post-editing for foreign language students
Zhang, Hong; Torres-Hostench, Olga. - University of Hawaii National Foreign Language Resource Center; Center for Language & Technology (co-sponsored by the Center for Open Educational Resources and Language Learning, University of Texas at Austin), 2022
51
Pushing the right buttons: adversarial evaluation of quality estimation
In: Proceedings of the Sixth Conference on Machine Translation; pp. 625-638 (2022)
52
Some Contributions to Interactive Machine Translation and to the Applications of Machine Translation for Historical Documents
Domingo Ballester, Miguel. - Universitat Politècnica de València, 2022
53
Neural-based Knowledge Transfer in Natural Language Processing
Wang, Chao. - 2022
Abstract: In Natural Language Processing (NLP), neural-based knowledge transfer, i.e. transferring out-of-domain (OOD) knowledge into task-specific neural networks, has been applied to many NLP tasks. To explore it further, this dissertation considers both structured and unstructured OOD knowledge and addresses several representative NLP tasks.

For structured OOD knowledge, we study neural-based knowledge transfer in Machine Reading Comprehension (MRC). In single-passage MRC tasks, to bridge the gap between MRC models and human beings, which mainly shows in data hunger and robustness to noise, we integrate the neural networks of MRC models with the general knowledge of human beings embodied in knowledge bases. On the one hand, we propose a data enrichment method that uses WordNet to extract inter-word semantic connections as general knowledge from each given passage-question pair. On the other hand, we propose a novel MRC model named Knowledge Aided Reader (KAR), which explicitly uses the extracted general knowledge to assist its attention mechanisms. According to the experimental results, KAR is comparable in performance with state-of-the-art MRC models and significantly more robust to noise. On top of that, when only a subset (20%-80%) of the training examples is available, KAR outperforms the state-of-the-art MRC models by a large margin and remains reasonably robust to noise. (A minimal code sketch of the WordNet-based extraction idea follows this record.)

In multi-hop MRC tasks, to probe the strength of Graph Neural Networks (GNNs), we propose a novel multi-hop MRC model named Graph Aided Reader (GAR), which uses GNN methods to perform multi-hop reasoning but is free of any pre-trained language model and completely end-to-end. For graph construction, GAR utilizes the topic-referencing relations between passages and the entity-sharing relations between sentences, so as to obtain the most sensible reasoning clues. For message passing, GAR simulates top-down and bottom-up reasoning, so as to make the best use of those clues. According to the experimental results, GAR even outperforms several competitors relying on pre-trained language models and filter-reader pipelines, which implies that GAR benefits considerably from its GNN methods. On this basis, GAR can further benefit from applying pre-trained language models, although these mainly facilitate its within-passage rather than cross-passage reasoning. Moreover, compared with the competitors constructed as filter-reader pipelines, GAR is not only easier to train but also more applicable to low-resource cases.

For unstructured OOD knowledge, we study neural-based knowledge transfer in Natural Language Understanding (NLU) and focus on transfer between languages, also known as Cross-Lingual Transfer Learning (CLTL). To facilitate the CLTL of NLU models, especially between distant languages, we propose a novel CLTL model named Translation Aided Language Learner (TALL), which integrates CLTL with Machine Translation (MT). Specifically, we adopt a pre-trained multilingual language model as our baseline model and construct TALL by appending a decoder to it. On this basis, we fine-tune the baseline model directly as an NLU model to conduct CLTL, but put TALL through an MT-oriented pre-training before its NLU-oriented fine-tuning. To make use of unannotated data, we implement the recently proposed Unsupervised Machine Translation (UMT) technique in the MT-oriented pre-training of TALL. According to the experimental results, UMT enables TALL to consistently achieve better CLTL performance than the baseline model without using more annotated data, and the performance gain is relatively prominent for distant languages.
Keywords: Cross-lingual transfer learning; Graph neural network; Information technology; Knowledge base; Knowledge graph; Knowledge transfer; Machine reading comprehension; Multi-hop reasoning; Natural language processing; Natural language understanding; Neural network; Unsupervised machine translation
URL: http://hdl.handle.net/10315/39096
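The data-enrichment step described in the abstract above, extracting inter-word semantic connections from a passage-question pair with WordNet, can be illustrated with a short sketch. What follows is a minimal illustration of the general idea using NLTK's WordNet interface, not code from the dissertation; the function names (semantic_hull, inter_word_connections) and the one-hop hypernym/hyponym expansion are assumptions made for this example.

# Minimal sketch (not from the dissertation): link a passage word to a question
# word when their WordNet neighbourhoods overlap. Requires NLTK with the
# WordNet corpus installed (nltk.download("wordnet")).
from nltk.corpus import wordnet as wn

def semantic_hull(word, hops=1):
    """Synsets of `word` plus hypernyms/hyponyms reachable within `hops` steps."""
    frontier = set(wn.synsets(word))
    hull = set(frontier)
    for _ in range(hops):
        nxt = set()
        for syn in frontier:
            nxt.update(syn.hypernyms())
            nxt.update(syn.hyponyms())
        frontier = nxt - hull
        hull |= frontier
    return hull

def inter_word_connections(passage_tokens, question_tokens, hops=1):
    """Return (passage_word, question_word) pairs whose semantic hulls intersect."""
    question_hulls = {q: semantic_hull(q, hops) for q in set(question_tokens)}
    pairs = []
    for p in set(passage_tokens):
        p_hull = semantic_hull(p, hops)
        pairs.extend((p, q) for q, q_hull in question_hulls.items() if p_hull & q_hull)
    return pairs

# Example: words without WordNet entries (articles, etc.) simply yield no pairs,
# while e.g. "dog" and "animal" connect via a shared neighbour such as domestic_animal.n.01.
print(inter_word_connections("the dog chased a cat".split(), "which animal ran".split()))

The sketch covers only the extraction step; KAR itself feeds such connections into its attention mechanisms, and a real pipeline would also handle tokenisation, part-of-speech filtering, and sense disambiguation.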
54
Machine Translation of polysemic words: current technology in light of Cognitive Linguistics ; Tradução automática de palavras polissêmicas: tecnologias atuais à luz da Linguística Cognitiva
In: Entrepalavras; v. 11, n. 3 (11): Linguagem e Tecnologia; 52-74 (2022)
55
The role of machine translation in translation education: A thematic analysis of translator educators’ beliefs
In: Translation and Interpreting: The International Journal of Translation and Interpreting Research, Vol 14, Iss 1, pp 177-197 (2022)
56
Predicting and Critiquing Machine Virtuosity: Mawwal Accompaniment as Case Study
In: International Computer Music Conference, Jul 2021, Santiago, Chile; https://hal.archives-ouvertes.fr/hal-03044066 (2021)
57
Investigating alignment interpretability for low-resource NMT
In: Machine Translation, Springer Verlag, 2021; ISSN 0922-6567; EISSN 1573-0573; DOI 10.1007/s10590-020-09254-w; https://hal.archives-ouvertes.fr/hal-03139744 (2021)
58
Parallel Corpora Preparation for English-Amharic Machine Translation
In: IWANN 2021 - International Work-Conference on Artificial Neural Networks, Springer LNCS proceedings, Jun 2021, Online, Spain; https://hal.inria.fr/hal-03272258 (2021)
59
Designing to Support Sensemaking in Cross-Lingual Computer-Mediated Communication Using NLP Techniques
Lim, Hajin. - 2021
60
DELA Corpus - A Document-Level Corpus Annotated with Context-Related Issues
In: Castilho, Sheila; Cavalheiro Camargo, João Lucas; Menezes, Miguel and Way, Andy (2021) DELA Corpus - A Document-Level Corpus Annotated with Context-Related Issues. In: Sixth Conference on Machine Translation (WMT21), 10-11 Nov 2021, Punta Cana, Dominican Republic (Online). ISBN 978-1-954085-94-7 (2021)

