
Search in the Catalogues and Directories

Hits 1 – 17 of 17

1
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ...
BASE
2
TexSmart: A System for Enhanced Natural Language Understanding ...
BASE
3
Tail-to-Tail Non-Autoregressive Sequence Prediction for Chinese Grammatical Error Correction ...
BASE
4
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ...
BASE
5
On the Copying Behaviors of Pre-Training for Neural Machine Translation ...
BASE
6
deepQuest-py: large and distilled models for quality estimation
Alva-Manchego, Fernando; Obamuyide, Abiola; Gajbhiye, Amit. - : Association for Computational Linguistics, 2021
BASE
7
deepQuest-py: large and distilled models for quality estimation
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pp. 382–389 (2021)
Abstract: © (2021) The Authors. Published by Association for Computational Linguistics. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: https://aclanthology.org/2021.emnlp-demo.42/ ; We introduce deepQuest-py, a framework for training and evaluation of large and lightweight models for Quality Estimation (QE). deepQuest-py provides access to (1) state-of-the-art models based on pre-trained Transformers for sentence-level and word-level QE; (2) lightweight and efficient sentence-level models implemented via knowledge distillation; and (3) a web interface for testing models and visualising their predictions. deepQuest-py is available at https://github.com/sheffieldnlp/deepQuest-py under a CC BY-NC-SA licence.
Keyword: machine translation; quality estimation
URL: http://hdl.handle.net/2436/624377
https://doi.org/10.18653/v1/2021.emnlp-demo.42
BASE
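The deepQuest-py record above describes sentence-level Quality Estimation built on pre-trained Transformers. The following is a minimal, purely illustrative sketch of that general approach, not the deepQuest-py API: the xlm-roberta-base backbone, the single-output regression head, and the quality_score helper are assumptions for illustration, and the untrained head returns meaningless scores until fine-tuned on QE data.

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed multilingual backbone; the actual deepQuest-py models may differ.
MODEL_NAME = "xlm-roberta-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=1 gives a single-score regression head on top of the encoder,
# the usual shape for sentence-level QE (predicting e.g. a quality score).
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)

def quality_score(source: str, translation: str) -> float:
    # Encode the source sentence and its machine translation as one sentence pair.
    inputs = tokenizer(source, translation, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()

# Example call (scores are arbitrary until the head is fine-tuned on QE labels).
print(quality_score("Das Haus ist klein.", "The house is small."))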
8
On the Inference Calibration of Neural Machine Translation ...
Wang, Shuo; Tu, Zhaopeng; Shi, Shuming. - : arXiv, 2020
BASE
9
Assessing the Bilingual Knowledge Learned by Neural Machine Translation Models ...
He, Shilin; Wang, Xing; Shi, Shuming. - : arXiv, 2020
BASE
10
Neuron Interaction Based Representation Composition for Neural Machine Translation ...
Li, Jian; Wang, Xing; Yang, Baosong. - : arXiv, 2019
BASE
11
Multi-Granularity Self-Attention for Neural Machine Translation ...
Hao, Jie; Wang, Xing; Shi, Shuming. - : arXiv, 2019
BASE
12
Towards Understanding Neural Machine Translation with Word Importance ...
He, Shilin; Tu, Zhaopeng; Wang, Xing. - : arXiv, 2019
BASE
13
Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons ...
Hao, Jie; Wang, Xing; Shi, Shuming. - : arXiv, 2019
BASE
14
Translating pro-drop languages with reconstruction models
In: Wang, Longyue (ORCID: 0000-0002-9062-6183), Tu, Zhaopeng, Shi, Shuming, Zhang, Tong, Graham, Yvette and Liu, Qun (ORCID: 0000-0002-7000-1792) (2018) Translating pro-drop languages with reconstruction models. In: Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18), 2–7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8 (2018)
BASE
15
Translating pro-drop languages with reconstruction models
In: Wang, Longyue (ORCID: 0000-0002-9062-6183), Tu, Zhaopeng, Shi, Shuming, Zhang, Tong, Graham, Yvette and Liu, Qun (ORCID: 0000-0002-7000-1792) (2018) Translating pro-drop languages with reconstruction models. In: 32nd AAAI Conference on Artificial Intelligence (AAAI 2018), 2–7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8 (2018)
BASE
16
Translating Pro-Drop Languages with Reconstruction Models ...
BASE
17
Exploiting Deep Representations for Neural Machine Translation ...
Dou, Zi-Yi; Tu, Zhaopeng; Wang, Xing. - : arXiv, 2018
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 17