1 | Generating Authentic Adversarial Examples beyond Meaning-preserving with Doubly Round-trip Translation ...
2 | Conditional Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation ...
4 | ConSLT: A Token-level Contrastive Framework for Sign Language Translation ...
5 | A Simple Multi-Modality Transfer Learning Baseline for Sign Language Translation ...
6 | Focus on the Target's Vocabulary: Masked Label Smoothing for Machine Translation ...
7 | USTC-NELSLIP at SemEval-2022 Task 11: Gazetteer-Adapted Integration Network for Multilingual Complex Named Entity Recognition ...
8 | Towards the Next 1000 Languages in Multilingual Machine Translation: Exploring the Synergy Between Supervised and Self-Supervised Learning ...
9 | GL-CLeF: A Global-Local Contrastive Learning Framework for Cross-lingual Spoken Language Understanding ...
11 | Delving Deeper into Cross-lingual Visual Question Answering ...
12 | Multi-Level Contrastive Learning for Cross-Lingual Alignment ...

Abstract:
Cross-language pre-trained models such as multilingual BERT (mBERT) have achieved significant performance on various cross-lingual downstream NLP tasks. This paper proposes a multi-level contrastive learning (ML-CTL) framework to further improve the cross-lingual ability of pre-trained models. The proposed method uses translated parallel data to encourage the model to generate similar semantic embeddings for different languages. However, unlike the sentence-level alignment used in most previous studies, this paper explicitly integrates the word-level information of each pair of parallel sentences into contrastive learning. Moreover, a cross-zero noise contrastive estimation (CZ-NCE) loss is proposed to alleviate the impact of floating-point error when training with a small batch size. The proposed method significantly improves the cross-lingual transfer ability of the basic model (mBERT) and outperforms same-size models on multiple zero-shot cross-lingual downstream tasks in ...

Comment: Accepted by ICASSP 2022

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://dx.doi.org/10.48550/arxiv.2202.13083 https://arxiv.org/abs/2202.13083
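To make the multi-level idea above concrete, here is a minimal PyTorch sketch of contrastive alignment on parallel data at both the sentence level and the word level. It is an illustrative sketch, not the authors' code: the function names, the temperature, and the word-level weight are assumptions, and a plain InfoNCE loss stands in for the paper's CZ-NCE loss (which additionally targets floating-point error at small batch sizes).

import torch
import torch.nn.functional as F

def info_nce(anchor, positive, temperature=0.05):
    # Each row of `positive` is the translation of the same row of `anchor`;
    # every other row in the batch serves as an in-batch negative.
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature                 # (B, B) similarities
    labels = torch.arange(anchor.size(0), device=anchor.device)  # diagonal = positives
    return F.cross_entropy(logits, labels)

def multi_level_loss(src_sent, tgt_sent, src_words, tgt_words, word_weight=0.5):
    # Sentence level: pooled embeddings of parallel sentence pairs are pulled together.
    # Word level: embeddings of pre-aligned word pairs within those sentences, too.
    return info_nce(src_sent, tgt_sent) + word_weight * info_nce(src_words, tgt_words)

# Toy usage: random tensors stand in for mBERT sentence and word embeddings.
B, N, d = 8, 32, 768  # sentence pairs, aligned word pairs, hidden size
src_s, tgt_s = torch.randn(B, d, requires_grad=True), torch.randn(B, d, requires_grad=True)
src_w, tgt_w = torch.randn(N, d, requires_grad=True), torch.randn(N, d, requires_grad=True)
multi_level_loss(src_s, tgt_s, src_w, tgt_w).backward()  # would update the shared encoder

In real training the embeddings would come from the shared encoder (mBERT), and the paper's CZ-NCE formulation and its exact scheme for pairing word-level positives would replace the plain InfoNCE used here.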
13 | Cross-Lingual Text Classification with Multilingual Distillation and Zero-Shot-Aware Training ...
15 | HFL at SemEval-2022 Task 8: A Linguistics-inspired Regression Model with Data Augmentation for Multilingual News Similarity ...
17 | Controllable Natural Language Generation with Contrastive Prefixes ...
18 | SGL: Symbolic Goal Learning in a Hybrid, Modular Framework for Human Instruction Following ...
19 | Local-Global Context Aware Transformer for Language-Guided Video Segmentation ...
20 | Automatic Speech Recognition Datasets in Cantonese: A Survey and New Dataset ...