
Search in the Catalogues and Directories

Hits 1 – 15 of 15

1. XTREME-S: Evaluating Cross-lingual Speech Representations ... (BASE)
2. mSLAM: Massively multilingual joint pre-training for speech and text ... (BASE)
Bapna, Ankur; Cherry, Colin; Zhang, Yu. arXiv, 2022.
3. Larger-Scale Transformers for Multilingual Masked Language Modeling ... (BASE)
Goyal, Naman; Du, Jingfei; Ott, Myle. arXiv, 2021.
4. Multilingual Speech Translation from Efficient Finetuning of Pretrained Models ... (BASE)
5. Unsupervised Cross-lingual Representation Learning for Speech Recognition ... (BASE)
Abstract: This paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource ...
Keyword: Audio and Speech Processing eess.AS; Computation and Language cs.CL; FOS Computer and information sciences; FOS Electrical engineering, electronic engineering, information engineering; Machine Learning cs.LG; Sound cs.SD
URL: https://arxiv.org/abs/2006.13979
https://dx.doi.org/10.48550/arxiv.2006.13979
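The contrastive pretraining objective this abstract summarizes (a context vector must identify the true quantized latent of a masked time step among distractors) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: `contrastive_loss`, its parameters, and the per-step `(T, D)` arrays are all hypothetical names introduced here.

```python
import numpy as np

def contrastive_loss(context, quantized, mask_idx,
                     num_distractors=5, temperature=0.1, seed=0):
    """InfoNCE-style contrastive loss over masked time steps: for each
    masked position t, the context vector context[t] must identify the
    true quantized latent quantized[t] among distractors sampled from
    other masked positions of the same utterance."""
    rng = np.random.default_rng(seed)

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

    losses = []
    for t in mask_idx:
        # Candidate set: the true latent first, then distractors drawn
        # without replacement from the other masked steps.
        others = [i for i in mask_idx if i != t]
        k = min(num_distractors, len(others))
        distractors = rng.choice(others, size=k, replace=False)
        candidates = [quantized[t]] + [quantized[i] for i in distractors]
        logits = np.array([cosine(context[t], c) for c in candidates]) / temperature
        # Numerically stable log-softmax; the true latent sits at index 0,
        # so the per-step loss is the negative log-probability of index 0.
        m = logits.max()
        log_probs = logits - (m + np.log(np.exp(logits - m).sum()))
        losses.append(-log_probs[0])
    return float(np.mean(losses))
```

For example, passing the same `(T, D)` array as both `context` and `quantized` gives each true latent a cosine similarity of 1 against itself, so the loss falls well below the chance level of `log(1 + num_distractors)`.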
6. Multilingual Speech Translation with Efficient Finetuning of Pretrained Models ... (BASE)
Li, Xian; Wang, Changhan; Tang, Yun. arXiv, 2020.
7. Unsupervised Cross-lingual Representation Learning at Scale ... (BASE)
8. Emerging Cross-lingual Structure in Pretrained Language Models ... (BASE)
9. Specializing distributional vectors of all words for lexical entailment (BASE)
Ponti, Edoardo Maria; Kamath, Aishwarya; Pfeiffer, Jonas. Association for Computational Linguistics, 2019.
10. What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties (BASE)
In: ACL 2018, 56th Annual Meeting of the Association for Computational Linguistics, Jul 2018, Melbourne, Australia, pp. 2126–2136. https://hal.archives-ouvertes.fr/hal-01898412
11. XNLI: Evaluating Cross-lingual Sentence Representations ... (BASE)
12. What you can cram into a single vector: Probing sentence embeddings for linguistic properties ... (BASE)
13. Very Deep Convolutional Networks for Text Classification (BASE)
In: European Chapter of the Association for Computational Linguistics, EACL'17, 2017, Valencia, Spain. https://hal.archives-ouvertes.fr/hal-01454940
14. Word Translation Without Parallel Data ... (BASE)
15. What you can cram into a single $&!#* vector: probing sentence embeddings for linguistic properties (BASE)
Kruszewski, German; Barrault, Loïc; Baroni, Marco. ACL (Association for Computational Linguistics).

Source facets: all 15 hits are open access documents; catalogues, bibliographies, Linked Open Data catalogues, and other online resources each returned 0 results.
© 2013 – 2024 Lin|gu|is|tik