
Search in the Catalogues and Directories

Hits 1 – 12 of 12

1. Europarl Direct Translationese Dataset ... (BASE)
2. Europarl Direct Translationese Dataset ... (BASE)
3. Europarl Direct Translationese Dataset ... (BASE)
4. Integrating Unsupervised Data Generation into Self-Supervised Neural Machine Translation for Low-Resource Languages ... (BASE)
5. Comparing Feature-Engineering and Feature-Learning Approaches for Multilingual Translationese Classification ... (BASE)
6. Investigating the Helpfulness of Word-Level Quality Estimation for Post-Editing Machine Translation Output ... (BASE)
7. Multi-Head Highly Parallelized LSTM Decoder for Neural Machine Translation ... (BASE)
8. Comparing Feature-Engineering and Feature-Learning Approaches for Multilingual Translationese Classification ... (BASE)
9. Modeling Task-Aware MIMO Cardinality for Efficient Multilingual Neural Machine Translation ... (BASE)
Abstract: Neural machine translation has achieved great success in bilingual settings, as well as in multilingual settings. As the number of languages grows, multilingual systems tend to underperform their bilingual counterparts. Model capacity has been found crucial for massively multilingual NMT to support language pairs with varying typological characteristics. Previous work increases the modeling capacity by deepening or widening the Transformer. However, modeling cardinality, i.e. aggregating a set of transformations with the same topology, has proven more effective than going deeper or wider when increasing capacity. In this paper, we propose to efficiently increase the capacity for multilingual NMT by increasing the cardinality. Unlike previous work, which feeds the same input to several transformations and merges their outputs into one, we present a Multi-Input-Multi-Output (MIMO) architecture that allows each transformation ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://www.aclanthology.org/2021.acl-short.46
URL: https://dx.doi.org/10.48448/29pk-ag57
URL: https://underline.io/lecture/25472-modeling-task-aware-mimo-cardinality-for-efficient-multilingual-neural-machine-translation
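The abstract above contrasts increasing cardinality (summing several transformations that share one topology, as in ResNeXt-style blocks) with going deeper or wider. As a rough illustration only, here is a minimal numpy sketch of that aggregation idea; it is not the paper's MIMO variant, and all names, dimensions, and the random-weight setup are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(params, x):
    # One low-dimensional transformation: project down, ReLU, project back up.
    w_in, w_out = params
    return np.maximum(x @ w_in, 0.0) @ w_out

def aggregated_transform(x, d_model=8, d_branch=2, cardinality=4):
    # Cardinality-style aggregation: the sum of several branches that all
    # share the same topology (same shapes, independent random weights).
    branches = [
        (rng.normal(size=(d_model, d_branch)), rng.normal(size=(d_branch, d_model)))
        for _ in range(cardinality)
    ]
    return sum(branch(p, x) for p in branches)

x = rng.normal(size=(3, 8))   # batch of 3 token vectors, d_model = 8
y = aggregated_transform(x)
print(y.shape)                # (3, 8): output keeps the model dimension
```

Each branch is cheap (d_model x d_branch), so raising `cardinality` adds capacity without widening or deepening any single transformation.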
10. A Bidirectional Transformer Based Alignment Model for Unsupervised Word Alignment ... (BASE)
11. Automatic classification of human translation and machine translation: a study from the perspective of lexical diversity (BASE)
    Fu, Yingxue; Nederhof, Mark Jan. Linköping University Electronic Press, 2021
12. Transformer-based NMT: modeling, training and implementation (BASE)
    Xu, Hongfei. Saarländische Universitäts- und Landesbibliothek, 2021

Catalogues: 0 | Bibliographies: 0 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 12
© 2013 – 2024 Lin|gu|is|tik