
Search in the Catalogues and Directories

Hits 1 – 12 of 12

1. MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning ... (BASE)
2. When Does Translation Require Context? A Data-driven, Multilingual Exploration ... (BASE)
3. Distributionally Robust Multilingual Machine Translation ... Zhou, Chunting; Levy, Daniel; Li, Xian. arXiv, 2021 (BASE)
4. Meta Back-translation ... Pham, Hieu; Wang, Xinyi; Yang, Yiming. arXiv, 2021 (BASE)
5. Do Context-Aware Translation Models Pay the Right Attention? ... (BASE)
6. The Return of Lexical Dependencies: Neural Lexicalized PCFGs ... (BASE)
7. XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization ... (BASE)
8. A Bilingual Generative Transformer for Semantic Sentence Embedding ... (BASE)
9. Improving Robustness of Machine Translation with Synthetic Noise ... (BASE)
10. Cross-lingual Alignment vs Joint Training: A Comparative Study and A Simple Unified Framework ... Wang, Zirui; Xie, Jiateng; Xu, Ruochen. arXiv, 2019 (BASE)
11. Bilingual Lexicon Induction with Semi-supervision in Non-Isometric Embedding Spaces ... (BASE)
Abstract: Recent work on bilingual lexicon induction (BLI) has frequently depended either on aligned bilingual lexicons or on distribution matching, often assuming that the two embedding spaces are isometric. We propose a technique to quantitatively estimate this isometry assumption and empirically show that it weakens as the languages in question become more etymologically distant. We then propose Bilingual Lexicon Induction with Semi-Supervision (BLISS), a semi-supervised approach that relaxes the isometric assumption while leveraging both limited aligned bilingual lexicons and a larger set of unaligned word embeddings, as well as a novel hubness filtering technique. Our proposed method obtains state-of-the-art results on 15 of 18 language pairs in the MUSE dataset and does particularly well when the embedding spaces do not appear to be isometric. In addition, we show that adding supervision stabilizes the learning procedure and is effective even ... (ACL 2019)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://arxiv.org/abs/1908.06625
DOI: https://dx.doi.org/10.48550/arxiv.1908.06625
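The hubness filtering mentioned in the abstract above targets a well-known failure mode of nearest-neighbour lexicon induction: a few "hub" target words end up as the nearest neighbour of many source words. A common mitigation in the BLI literature (not necessarily the exact filtering technique of the BLISS paper) is CSLS retrieval, which penalizes candidates by their average similarity to their own cross-lingual neighbourhood. A minimal sketch, assuming row-wise word embeddings in two NumPy matrices:

```python
import numpy as np

def csls_scores(src, tgt, k=2):
    """CSLS similarity between source and target embedding rows."""
    # Normalize rows so dot products are cosine similarities.
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    sim = src @ tgt.T  # (n_src, n_tgt) cosine similarity matrix
    # Mean similarity of each word to its k nearest cross-lingual neighbours.
    r_src = np.sort(sim, axis=1)[:, -k:].mean(axis=1, keepdims=True)
    r_tgt = np.sort(sim, axis=0)[-k:, :].mean(axis=0, keepdims=True)
    # Penalizing dense neighbourhoods demotes "hub" targets.
    return 2 * sim - r_src - r_tgt

def induce_lexicon(src, tgt, k=2):
    """Map each source row to the target row with the highest CSLS score."""
    return csls_scores(src, tgt, k).argmax(axis=1)
```

With identical or permuted copies of the same embeddings, `induce_lexicon` recovers the matching rows; on real cross-lingual embeddings the CSLS correction mainly changes which of several near-tied candidates wins.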
12. Parameter Sharing Methods for Multilingual Self-Attentional Translation Models ... (BASE)

© 2013 – 2024 Lin|gu|is|tik