2. Rule-based Morphological Inflection Improves Neural Terminology Translation (source: BASE)
4. Evaluating the Evaluation Metrics for Style Transfer: A Case Study in Multilingual Formality Transfer
5. Beyond Noise: Mitigating the Impact of Fine-grained Semantic Divergences on Neural Machine Translation
6. The UMD Submission to the Explainable MT Quality Estimation Shared Task: Combining Explanation Models with Sequence Labeling
8. How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation?
10. A Non-Autoregressive Edit-Based Approach to Controllable Text Simplification
11. Detecting Fine-Grained Cross-Lingual Semantic Divergences without Supervision by Learning to Rank
12. Incorporating Terminology Constraints in Automatic Post-Editing
13. EDITOR: an Edit-Based Transformer with Repositioning for Neural Machine Translation with Soft Lexical Constraints

Abstract: We introduce an Edit-Based Transformer with Repositioning (EDITOR), which makes sequence generation flexible by seamlessly allowing users to specify preferences in output lexical choice. Building on recent models for non-autoregressive sequence generation (Gu et al., 2019), EDITOR generates new sequences by iteratively editing hypotheses. It relies on a novel reposition operation designed to disentangle lexical choice from word positioning decisions, while enabling efficient oracles for imitation learning and parallel edits at decoding time. Empirically, EDITOR uses soft lexical constraints more effectively than the Levenshtein Transformer (Gu et al., 2019) while speeding up decoding dramatically compared to constrained beam search (Post and Vilar, 2018). EDITOR also achieves comparable or better translation quality with faster decoding speed than the Levenshtein Transformer on standard Romanian-English, English-German, and English-Japanese machine translation tasks. (TACL 2021)

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)

URL: https://arxiv.org/abs/2011.06868
DOI: https://dx.doi.org/10.48550/arxiv.2011.06868
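The abstract above describes EDITOR's decoding style: a hypothesis is refined iteratively, with a reposition operation that folds reordering and deletion into one decision, separate from the insertion of new tokens. A purely illustrative toy sketch of that loop follows; the function names, data shapes, and hand-picked edit decisions are invented here and are not the paper's implementation, which learns these decisions with a non-autoregressive Transformer.

```python
def reposition(tokens, slots):
    """Toy reposition step: each output slot picks an input token index,
    or None to drop it. One operation thus covers both reordering and
    deletion, mirroring the disentanglement the abstract describes."""
    return [tokens[i] for i in slots if i is not None]

def insert_tokens(tokens, insertions):
    """Toy insertion step: `insertions` maps a gap position to the new
    tokens placed there (gap i precedes tokens[i])."""
    out = []
    for i, tok in enumerate(tokens):
        out.extend(insertions.get(i, []))
        out.append(tok)
    out.extend(insertions.get(len(tokens), []))
    return out

# Seed decoding with the user's soft lexical constraints (here, tokens
# in the wrong order), then refine: reposition first, then fill gaps.
hyp = ["cat", "the", "sat"]
hyp = reposition(hyp, [1, 0, 2])         # -> ["the", "cat", "sat"]
hyp = insert_tokens(hyp, {3: ["down"]})  # -> ["the", "cat", "sat", "down"]
print(" ".join(hyp))                     # prints "the cat sat down"
```

Because the constraints seed the hypothesis rather than being enforced at every search step, they remain soft: a learned model could still reposition a constraint token away (index None), which is the contrast the abstract draws with hard constrained beam search.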
14. Controlling Neural Machine Translation Formality with Synthetic Supervision
15. Controlling Text Complexity in Neural Machine Translation
17. Formality Style Transfer Within and Across Languages with Limited Supervision
18. Identifying Semantic Divergences in Parallel Text without Annotations
19. Bi-Directional Neural Machine Translation with Synthetic Parallel Data
20. Multi-Task Neural Models for Translating Between Styles Within and Across Languages