
Search in the Catalogues and Directories

Hits 1 – 18 of 18

1
First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
In: https://hal.inria.fr/hal-03161685 (2021)
BASE
2
Can Multilingual Language Models Transfer to an Unseen Dialect? A Case Study on North African Arabizi
In: https://hal.inria.fr/hal-03161677 (2021)
BASE
3
First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
In: EACL 2021 - The 16th Conference of the European Chapter of the Association for Computational Linguistics, Apr 2021, Kyiv / Virtual, Ukraine ; https://hal.inria.fr/hal-03239087 ; https://2021.eacl.org/ (2021)
BASE
4
When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
In: NAACL-HLT 2021 - 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Jun 2021, Mexico City, Mexico ; https://hal.inria.fr/hal-03251105 (2021)
BASE
5
Cross-Lingual GenQA: A Language-Agnostic Generative Question Answering Approach for Open-Domain Question Answering ...
Abstract: Open-Retrieval Generative Question Answering (GenQA) has proven to deliver high-quality, natural-sounding answers in English. In this paper, we present the first generalization of the GenQA approach to the multilingual setting. To this end, we present the GenTyDiQA dataset, which extends the TyDiQA evaluation data (Clark et al., 2020) with natural-sounding, well-formed answers in Arabic, Bengali, English, Japanese, and Russian. For all these languages, we show that a GenQA sequence-to-sequence-based model outperforms a state-of-the-art Answer Sentence Selection model. We also show that a multilingually-trained model competes with, and in some cases outperforms, its monolingual counterparts. Finally, we show that our system can compete with strong baselines even when fed with information from a variety of languages. Essentially, our system is able to answer a question in any language of our language set using information from many languages, making it the first Language-Agnostic GenQA system. ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2110.07150
https://dx.doi.org/10.48550/arxiv.2110.07150
BASE
6
First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT ...
BASE
7
When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models ...
NAACL 2021 ; Muller, Benjamin. - : Underline Science Inc., 2021
BASE
8
Establishing a New State-of-the-Art for French Named Entity Recognition
In: LREC 2020 - 12th Language Resources and Evaluation Conference, May 2020, Marseille, France ; https://hal.inria.fr/hal-02617950 ; http://www.lrec-conf.org (2020)
BASE
9
Building a User-Generated Content North-African Arabizi Treebank: Tackling Hell
In: ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Jul 2020, Seattle / Virtual, United States ; https://hal.inria.fr/hal-02889804 ; ⟨10.18653/v1/2020.acl-main.107⟩ (2020)
BASE
10
CamemBERT: a Tasty French Language Model
In: ACL 2020 - 58th Annual Meeting of the Association for Computational Linguistics, Jul 2020, Seattle / Virtual, United States ; https://hal.inria.fr/hal-02889805 ; ⟨10.18653/v1/2020.acl-main.645⟩ (2020)
BASE
11
When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models
In: https://hal.inria.fr/hal-03109106 (2020)
BASE
12
Can Multilingual Language Models Transfer to an Unseen Dialect? A Case Study on North African Arabizi ...
BASE
13
When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models ...
BASE
14
Unsupervised Learning for Handling Code-Mixed Data: A Case Study on POS Tagging of North-African Arabizi Dialect
In: EurNLP - First annual EurNLP, Oct 2019, London, United Kingdom ; https://hal.archives-ouvertes.fr/hal-02270527 (2019)
BASE
15
CamemBERT: a Tasty French Language Model
In: https://hal.inria.fr/hal-02445946 (2019)
BASE
16
Enhancing BERT for Lexical Normalization
In: The 5th Workshop on Noisy User-generated Text (W-NUT), Nov 2019, Hong Kong, China ; https://hal.inria.fr/hal-02294316 (2019)
BASE
17
ELMoLex: Connecting ELMo and Lexicon features for Dependency Parsing
In: CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, Oct 2018, Brussels, Belgium ; https://hal.inria.fr/hal-01959045 ; ⟨10.18653/v1/K18-2023⟩ (2018)
BASE
18
CoNLL 2018 Shared Task System Outputs
Zeman, Daniel; Potthast, Martin; Duthoo, Elie. - : Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (UFAL), 2018
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 18