
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 36

1
Distributed representations for multilingual language processing
Dufter, Philipp [Verfasser]; Schütze, Hinrich [Akademischer Betreuer]. - München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2021
DNB Subject Category Language
2
Graph Algorithms for Multiparallel Word Alignment
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Association for Computational Linguistics, Nov 2021, Punta Cana, Dominican Republic. https://hal.archives-ouvertes.fr/hal-03424044 ; https://2021.emnlp.org/ (2021)
BASE
3
Distributed representations for multilingual language processing
Dufter, Philipp. - : Ludwig-Maximilians-Universität München, 2021
BASE
4
Distributed representations for multilingual language processing ...
Dufter, Philipp. - : Ludwig-Maximilians-Universität München, 2021
BASE
5
Static Embeddings as Efficient Knowledge Bases? ...
BASE
6
ParCourE: A Parallel Corpus Explorer for a Massively Multilingual Corpus ...
BASE
7
Multilingual LAMA: Investigating Knowledge in Multilingual Pretrained Language Models ...
BASE
8
Wine is Not v i n. On the Compatibility of Tokenizations Across Languages ...
BASE
9
Graph Algorithms for Multiparallel Word Alignment ...
BASE
10
Locating Language-Specific Information in Contextualized Embeddings ...
BASE
11
BERT Cannot Align Characters ...
BASE
12
Static Embeddings as Efficient Knowledge Bases? ...
NAACL 2021 2021; Dufter, Philipp; Kassner, Nora; Schütze, Hinrich. - : Underline Science Inc., 2021
Abstract: Read the paper at the following link: https://www.aclweb.org/anthology/2021.naacl-main.186/ Recent research investigates factual knowledge stored in large pretrained language models (PLMs). Instead of structured knowledge base (KB) queries, masked sentences such as “Paris is the capital of [MASK]” are used as probes. The good performance on this analysis task has been interpreted as PLMs becoming potential repositories of factual knowledge. In experiments across ten linguistically diverse languages, we study knowledge contained in static embeddings. We show that, when restricting the output space to a candidate set, simple nearest neighbor matching using static embeddings performs better than PLMs. E.g., static embeddings perform 1.6% points better than BERT while using just 0.3% of the energy for training. One important factor in their good comparative performance is that static embeddings are standardly learned for a large vocabulary. In contrast, BERT exploits its more sophisticated, but expensive ...
URL: https://underline.io/lecture/20004-static-embeddings-as-efficient-knowledge-basesquestion
https://dx.doi.org/10.48448/cf3z-dn34
BASE
13
ParCourE: A Parallel Corpus Explorer for a Massively Multilingual Corpus ...
BASE
14
Wine is not v i n. On the Compatibility of Tokenizations across Languages ...
BASE
15
SimAlign: High Quality Word Alignments Without Parallel Training Data Using Static and Contextualized Embeddings
In: EMNLP 2020, Association for Computational Linguistics, Nov 2020, Online, United States, pp. 1627–1643. https://hal.archives-ouvertes.fr/hal-03013194 (2020)
BASE
16
Identifying Necessary Elements for BERT’s Multilinguality
BASE
17
Identifying Elements Essential for BERT’s Multilinguality
BASE
18
SimAlign: High Quality Word Alignments without Parallel Training Data using Static and Contextualized Embeddings
In: Findings of ACL: EMNLP 2020 (2020)
BASE
19
Monolingual and Multilingual Reduction of Gender Bias in Contextualized Representations
BASE
20
Increasing Learning Efficiency of Self-Attention Networks through Direct Position Interactions, Learnable Temperature, and Convoluted Attention
BASE


Hits by source:
Catalogues: 1
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 35
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings