
Search in the Catalogues and Directories

Hits 1 – 11 of 11

1
When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
In: NAACL-HLT 2021: Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Jun 2021, Mexico City, Mexico (2021). https://hal.inria.fr/hal-03251105
Abstract: Transfer learning based on pretraining language models on a large amount of raw data has become a new norm to reach state-of-the-art performance in NLP. Still, it remains unclear how this approach should be applied for unseen languages that are not covered by any available large-scale multilingual language model and for which only a small amount of raw data is generally available. In this work, by comparing multilingual and monolingual models, we show that such models behave in multiple ways on unseen languages. Some languages greatly benefit from transfer learning and behave similarly to closely related high-resource languages, whereas others apparently do not. Focusing on the latter, we show that this failure to transfer is largely related to the impact of the script used to write such languages. We show that transliterating those languages significantly improves the potential of large-scale multilingual language models on downstream tasks. This result provides a promising direction towards making these massively multilingual models useful for a new set of unseen languages.
Keyword: [INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL]
URL: https://hal.inria.fr/hal-03251105/file/NAACL21_Muller_et_al.pdf
https://hal.inria.fr/hal-03251105/document
https://hal.inria.fr/hal-03251105
BASE
2
SD-QA: Spoken Dialectal Question Answering for the Real World ...
BASE
3
Evaluating the Morphosyntactic Well-formedness of Generated Texts ...
BASE
4
Towards more equitable question answering systems: How much more data do you need? ...
BASE
5
When is Wall a Pared and when a Muro?: Extracting Rules Governing Lexical Selection ...
BASE
6
Lexically-Aware Semi-Supervised Learning for OCR Post-Correction ...
BASE
7
AlloVera: a multilingual allophone database
In: LREC 2020: 12th Language Resources and Evaluation Conference, European Language Resources Association, May 2020, Marseille, France (2020). https://halshs.archives-ouvertes.fr/halshs-02527046 ; https://lrec2020.lrec-conf.org/
BASE
8
It’s Easier to Translate out of English than into it: Measuring Neural Translation Difficulty by Cross-Mutual Information ...
BASE
9
It’s Easier to Translate out of English than into it: Measuring Neural Translation Difficulty by Cross-Mutual Information
In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (2020)
BASE
10
AlloVera: a multilingual allophone database
In: LREC 2020: 12th Language Resources and Evaluation Conference, European Language Resources Association, May 2020, Marseille, France (2020). https://halshs.archives-ouvertes.fr/halshs-02527046 ; https://lrec2020.lrec-conf.org/
BASE
11
A small Griko-Italian speech translation corpus
In: 6th International Workshop on Spoken Language Technologies for Under-resourced Languages (SLTU'18), Aug 2018, New Delhi, India (2018). https://hal.archives-ouvertes.fr/hal-01962528
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 11
© 2013 - 2024 Lin|gu|is|tik