Catalogue search
Refine your search:
Keyword:
Computation and Language cs.CL (14)
FOS Computer and information sciences (14)
Computational Linguistics (6)
Deep Learning (5)
Neural Network (5)
Condensed Matter Physics (4)
Electromagnetism (4)
FOS Physical sciences (4)
Semantics (4)
Information and Knowledge Engineering (2)
Creator / Publisher:
Kann, Katharina (25)
Ebrahimi, Abteen (4)
Mager, Manuel (4)
Schütze, Hinrich (4)
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021 (4)
Bowman, Samuel R. (3)
Ganesh, Ananya (3)
Neubig, Graham (3)
Oncevay, Arturo (3)
Ortega, John (3)
Year:
2022 (2)
2021 (11)
2020 (3)
2019 (6)
2018 (2)
2017 (1)
Type:
Miscellaneous (14)
Article (11)
BLLDB-Access:
free (25)
subject to license (0)
Hits 1 – 20 of 25
1
Morphological Processing of Low-Resource Languages: Where We Are and What's Next ...
Wiemerslage, Adam; Silfverberg, Miikka; Yang, Changbing. - arXiv, 2022
2
Match the Script, Adapt if Multilingual: Analyzing the Effect of Multilingual Pretraining on Cross-lingual Transferability ...
Fujinuma, Yoshinari; Boyd-Graber, Jordan; Kann, Katharina. - arXiv, 2022
3
Don't Rule Out Monolingual Speakers: A Method For Crowdsourcing Machine Translation Data ...
Bhatnagar, Rajat; Ganesh, Ananya; Kann, Katharina. - arXiv, 2021
4
Findings of the LoResMT 2021 Shared Task on COVID and Sign Language for Low-resource Languages ...
Ojha, Atul Kr.; Liu, Chao-Hong; Kann, Katharina. - arXiv, 2021
5
How to Adapt Your Pretrained Multilingual Model to 1600 Languages ...
Ebrahimi, Abteen; Kann, Katharina. - arXiv, 2021
6
Findings of the AmericasNLP 2021 Shared Task on Open Machine Translation for Indigenous Languages of the Americas ...
Mager, Manuel; Oncevay, Arturo; Ebrahimi, Abteen. - Association for Computational Linguistics, 2021
7
PROST: Physical Reasoning about Objects through Space and Time ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Aroca-Ouellette, Stéphane; Kann, Katharina. - Underline Science Inc., 2021
8
Don't Rule Out Monolingual Speakers: A Method For Crowdsourcing Machine Translation Data ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Bhatnagar, Rajat; Ganesh, Ananya. - Underline Science Inc., 2021
9
What Would a Teacher Do? Predicting Future Talk Moves ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Ganesh, Ananya; Kann, Katharina. - Underline Science Inc., 2021
10
How to Adapt Your Pretrained Multilingual Model to 1600 Languages ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Ebrahimi, Abteen; Kann, Katharina. - Underline Science Inc., 2021
Abstract:
Read paper: https://www.aclanthology.org/2021.acl-long.351 Pretrained multilingual models (PMMs) enable zero-shot learning via cross-lingual transfer, performing best for languages seen during pretraining. While methods exist to improve performance for unseen languages, they have almost exclusively been evaluated using amounts of raw text only available for a small fraction of the world's languages. In this paper, we evaluate the performance of existing methods to adapt PMMs to new languages using a resource available for close to 1600 languages: the New Testament. This is challenging for two reasons: (1) the small corpus size, and (2) the narrow domain. While performance drops for all approaches, we surprisingly still see gains of up to 17.69% accuracy for part-of-speech tagging and 6.29 F1 for NER on average over all languages as compared to XLM-R. Another unexpected finding is that continued pretraining, the simplest approach, performs best. Finally, we perform a case study to disentangle the ...
Keyword:
Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL:
https://underline.io/lecture/26066-how-to-adapt-your-pretrained-multilingual-model-to-1600-languages
https://dx.doi.org/10.48448/xj48-9d02
11
AmericasNLI: Evaluating Zero-shot Natural Language Understanding of Pretrained Multilingual Models in Truly Low-resource Languages ...
Ebrahimi, Abteen; Mager, Manuel; Oncevay, Arturo. - arXiv, 2021
12
CLiMP: A Benchmark for Chinese Language Model Evaluation ...
Xiang, Beilei; Yang, Changbing; Li, Yu. - arXiv, 2021
13
Proceedings of the First Workshop on Natural Language Processing for Indigenous Languages of the Americas
In: Proceedings of the First Workshop on Natural Language Processing for Indigenous Languages of the Americas. Edited by: Mager, Manuel; Oncevay, Arturo; Rios, Annette; Meza Ruiz, Ivan Vladimir; Palmer, Alexis; Neubig, Graham; Kann, Katharina. Online: Association for Computational Linguistics. (2021)
14
Unsupervised Morphological Paradigm Completion ...
Jin, Huiming; Cai, Liwei; Peng, Yihui. - arXiv, 2020
15
Learning to Learn Morphological Inflection for Resource-Poor Languages ...
Kann, Katharina; Bowman, Samuel R.; Cho, Kyunghyun. - arXiv, 2020
16
Acquisition of Inflectional Morphology in Artificial Neural Networks With Prior Knowledge
Kann, Katharina. In: Proceedings of the Society for Computation in Linguistics (2020)
17
Probing for Semantic Classes: Diagnosing the Meaning Content of Word Embeddings
Yaghoobzadeh, Yadollah; Kann, Katharina; Hazen, Timothy. - Ludwig-Maximilians-Universität München, 2019
18
Probing for Semantic Classes: Diagnosing the Meaning Content of Word Embeddings ...
Yaghoobzadeh, Yadollah; Kann, Katharina; Hazen, Timothy. - Association for Computational Linguistics, 2019
19
Acquisition of Inflectional Morphology in Artificial Neural Networks With Prior Knowledge ...
Kann, Katharina. - arXiv, 2019
20
Grammatical Gender, Neo-Whorfianism, and Word Embeddings: A Data-Driven Approach to Linguistic Relativity ...
Kann, Katharina. - arXiv, 2019
© 2013 - 2024 Lin|gu|is|tik