
Search in the Catalogues and Directories

Hits 1 – 20 of 56

1
MultiTraiNMT: training materials to approach neural machine translation from scratch
In: TRITON 2021 (Translation and Interpreting Technology Online), Jul 2021, Online, United Kingdom; https://hal.archives-ouvertes.fr/hal-03272570 (2021)
2
Rethinking Data Augmentation for Low-Resource Neural Machine Translation: A Multi-Task Learning Approach ...
3
Rethinking Data Augmentation for Low-Resource Neural Machine Translation: A Multi-Task Learning Approach
Sánchez-Cartagena, Víctor M.; Sánchez-Martínez, Felipe; Pérez-Ortiz, Juan Antonio. - : Association for Computational Linguistics, 2021
4
Fuzzy-Match Repair in Computer-Aided Translation Using Black-Box Machine Translation
Ortega, John E. - : Universidad de Alicante, 2021
5
Understanding the effects of word-level linguistic annotations in under-resourced neural machine translation ...
6
Understanding the effects of word-level linguistic annotations in under-resourced neural machine translation
Sánchez-Cartagena, Víctor M.; Pérez-Ortiz, Juan Antonio; Sánchez-Martínez, Felipe. - : Association for Computational Linguistics, 2020
7
The Universitat d'Alacant submissions to the English-to-Kazakh news translation task at WMT 2019 ...
8
The Universitat d'Alacant submissions to the English-to-Kazakh news translation task at WMT 2019 ...
9
Reading comprehension of machine translation output: what makes for a better read?
In: Castilho, Sheila (ORCID 0000-0002-8416-6555) and Guerberof Arenas, Ana (ORCID 0000-0001-9820-7074) (2018) Reading comprehension of machine translation output: what makes for a better read? In: 21st Annual Conference of the European Association for Machine Translation, 28-30 May 2018, Alacant/Alicante, Spain. ISBN 978-84-09-01901-4
10
Predicting insertion positions in word-level machine translation quality estimation
11
Towards Optimizing MT for Post-Editing Effort: Can BLEU Still Be Useful?
12
Assisting non-expert speakers of under-resourced languages in assigning stems and inflectional paradigms to new word entries of morphological dictionaries
Forcada, Mikel L.; Carrasco, Rafael C.; Pérez-Ortiz, Juan Antonio. - : Springer Science+Business Media Dordrecht, 2017
13
Towards Optimizing MT for Post-Editing Effort: Can BLEU Still Be Useful?
In: Prague Bulletin of Mathematical Linguistics, Vol 108, Iss 1, Pp 183-195 (2017)
Abstract: We propose a simple, linear-combination automatic evaluation measure (AEM) to approximate post-editing (PE) effort. Effort is measured both as PE time and as the number of PE operations performed. The ultimate goal is to define an AEM that can be used to optimize machine translation (MT) systems to minimize PE effort, but without having to perform unfeasible repeated PE during optimization. As PE effort is expected to be an extensive magnitude (i.e., one growing linearly with the sentence length and which may be simply added to represent the effort for a set of sentences), we use a linear combination of extensive and pseudo-extensive features. One such pseudo-extensive feature, (1 − BLEU) times the length of the reference, proves to be almost as good a predictor of PE effort as the best combination of extensive features. Surprisingly, effort predictors computed using independently obtained reference translations perform reasonably closely to those using actual post-edited references. At this early stage of the research, and given the inherent complexity of carrying out experiments with professional post-editors, we decided to carry out an automatic evaluation of the AEMs proposed rather than a manual evaluation to measure the effort needed to post-edit the output of an MT system tuned on these AEMs. The results obtained seem to support current tuning practice using BLEU, while pointing at some limitations. Apart from this intrinsic evaluation, an extrinsic evaluation was also carried out in which the AEMs proposed were used to build synthetic training corpora for MT quality estimation, with results comparable to those obtained when training with measured PE efforts.
Keyword: Computational linguistics. Natural language processing; P98-98.5
URL: https://doi.org/10.1515/pralin-2017-0019
https://doaj.org/article/40f7759398754624ae8baf6dca0dbad5
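The abstract above centres on one pseudo-extensive feature: (1 − BLEU) multiplied by the length of the reference, which grows with sentence length and can therefore be summed over a document like PE effort itself. The sketch below is an illustration only, not the authors' implementation: it uses a generic add-one-smoothed sentence-level BLEU (the paper does not specify its exact BLEU variant), and the weights of the full linear combination would have to be fitted on measured post-editing effort, which is not reproduced here.

```python
# Minimal sketch (assumed, not the paper's code) of the pseudo-extensive
# feature (1 - BLEU) * reference length described in the abstract above.
from collections import Counter
import math

def sentence_bleu(hyp, ref, max_n=4):
    """Simplified sentence-level BLEU with add-one smoothing on n-gram counts."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = Counter(tuple(hyp[i:i + n]) for i in range(len(hyp) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        overlap = sum(min(c, ref_ngrams[g]) for g, c in hyp_ngrams.items())
        total = sum(hyp_ngrams.values())
        precisions.append((overlap + 1) / (total + 1))  # add-one smoothing
    # Brevity penalty: penalize hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

def pseudo_extensive_feature(mt_output, reference):
    """(1 - BLEU) * reference length: scales with sentence length, so
    per-sentence values can simply be added up over a set of sentences."""
    hyp, ref = mt_output.split(), reference.split()
    return (1.0 - sentence_bleu(hyp, ref)) * len(ref)

# Toy usage: sum the feature over a small "document" of two sentence pairs.
pairs = [
    ("the cat sat in the mat", "the cat sat on the mat"),
    ("he go home yesterday evening", "he went home yesterday evening"),
]
print(sum(pseudo_extensive_feature(mt, ref) for mt, ref in pairs))
```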
14
Integrating Rules and Dictionaries from Shallow-Transfer Machine Translation into Phrase-Based Statistical Machine Translation
15
RuLearn: an Open-source Toolkit for the Automatic Inference of Shallow-transfer Rules for Machine Translation
16
Phrase-based statistical machine translation: explanation of its processes and statistical models and evaluation of the English to Spanish translations produced
17
Using external sources of bilingual information for word-level quality estimation in translation technologies
Esplà-Gomis, Miquel. - : Universidad de Alicante, 2016
18
RuLearn: an Open-source Toolkit for the Automatic Inference of Shallow-transfer Rules for Machine Translation
In: Prague Bulletin of Mathematical Linguistics, Vol 106, Iss 1, Pp 193-204 (2016)
19
A generalised alignment template formalism and its application to the inference of shallow-transfer machine translation rules from scarce bilingual corpora
20
Using Machine Translation to Provide Target-Language Edit Hints in Computer Aided Translation Based on Translation Memories


Sources of the 56 hits: Catalogues (5), Bibliographies (4), Linked Open Data catalogues (0), Online resources (0), Open access documents (51)