1. XTREME-S: Evaluating Cross-lingual Speech Representations ...

BASE

2. Multilingual Speech Translation from Efficient Finetuning of Pretrained Models ...

The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Auli, Michael; Baevski, Alexei; Conneau, Alexis; Li, Xian; Pino, Juan; Tang, Yuqing; Tang, Yun; Tran, Chau; Wang, Changhan. - : Underline Science Inc., 2021

Abstract:
Read paper: https://www.aclanthology.org/2021.acl-long.68 Abstract: We present a simple yet effective approach to build multilingual speech-to-text (ST) translation through efficient transfer learning from a pretrained speech encoder and text decoder. Our key finding is that a minimalistic LNA (LayerNorm and Attention) finetuning can achieve zero-shot crosslingual and cross-modality transfer ability by only finetuning 10~50% of the pretrained parameters. This effectively leverages large pretrained models at low training cost such as wav2vec 2.0 for acoustic modeling, and mBART for multilingual text generation. This sets a new state-of-the-art for 36 translation directions (and surpassing cascaded ST for 26 of them) on the large-scale multilingual ST benchmark CoVoST 2 (+6.4 BLEU on average for En-X directions and +6.7 BLEU for X-En directions). Our approach demonstrates strong zero-shot performance in a many-to-many multilingual model (+5.6 BLEU on average across 28 non-English directions), making it an ...
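The LNA (LayerNorm and Attention) finetuning described in the abstract amounts to freezing a pretrained checkpoint and leaving only the LayerNorm and attention parameters trainable. A minimal sketch of that selection step follows; the parameter names and the `LNA_PATTERNS` substrings are hypothetical stand-ins for illustration, not the authors' actual naming in wav2vec 2.0 or mBART.

```python
# Sketch: pick the LNA (LayerNorm + Attention) parameter subset to
# finetune, freezing everything else. Name patterns are assumptions.
LNA_PATTERNS = ("layer_norm", "self_attn", "encoder_attn")

def lna_parameter_names(all_names):
    """Return the subset of parameter names to leave trainable."""
    return [n for n in all_names if any(p in n for p in LNA_PATTERNS)]

# Hypothetical parameter names, for illustration only.
names = [
    "encoder.layers.0.self_attn.q_proj.weight",
    "encoder.layers.0.fc1.weight",
    "encoder.layers.0.layer_norm.weight",
    "decoder.layers.0.encoder_attn.k_proj.weight",
    "decoder.layers.0.fc2.weight",
]
trainable = lna_parameter_names(names)
print(trainable)
```

In a real setup one would set `requires_grad = False` on every parameter whose name is not returned by this filter, which is how the 10~50% finetuned-parameter fraction mentioned in the abstract would arise.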

Keywords:
Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics

URL: https://dx.doi.org/10.48448/z2bj-rv03 https://underline.io/lecture/25429-multilingual-speech-translation-from-efficient-finetuning-of-pretrained-models

3. Specializing distributional vectors of all words for lexical entailment

4. What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties

In: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics ; https://hal.archives-ouvertes.fr/hal-01898412 ; ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Jul 2018, Melbourne, Australia. pp.2126-2136 (2018)

5. Very Deep Convolutional Networks for Text Classification

In: European Chapter of the Association for Computational Linguistics EACL'17 ; https://hal.archives-ouvertes.fr/hal-01454940 ; European Chapter of the Association for Computational Linguistics EACL'17, 2017, Valencia, Spain (2017)

6. What you can cram into a single $&!#* vector: probing sentence embeddings for linguistic properties