3. Applying the Transformer to Character-level Transduction
In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)

4. Telling BERT's Full Story: from Local Attention to Global Aggregation
In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)
Abstract: We take a deep look into the behaviour of self-attention heads in the transformer architecture. In light of recent work discouraging the use of attention distributions for explaining a model's behaviour, we show that attention distributions can nevertheless provide insights into the local behaviour of attention heads. This way, we propose a distinction between local patterns revealed by attention and global patterns that refer back to the input, and analyze BERT from both angles. We use gradient attribution to analyze how the output of an attention head depends on the input tokens, effectively extending the local attention-based analysis to account for the mixing of information throughout the transformer layers. We find that there is a significant mismatch between attention and attribution distributions, caused by the mixing of context inside the model. We quantify this discrepancy and observe that interestingly, there are some patterns that persist across all layers despite the mixing.
URL: https://doi.org/10.3929/ethz-b-000496002
URL: https://hdl.handle.net/20.500.11850/496002

5. Disambiguatory Signals are Stronger in Word-initial Positions
In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)

6. Multi-Adversarial Learning for Cross-Lingual Word Embeddings ...

7. RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding

10. Is Supervised Syntactic Parsing Beneficial for Language Understanding Tasks? An Empirical Investigation

12. Weakly-Supervised Concept-based Adversarial Learning for Cross-lingual Word Embeddings
URL: http://infoscience.epfl.ch/record/275419 (2020)

15. Movement and Structure Effects on Universal 20 Word Order Frequencies: A Quantitative Study
In: Glossa: a journal of general linguistics, Vol. 3, No. 1, Article 84 (2018). ISSN 2397-1835

16. Word Order Variation and Dependency Length Minimisation: A Cross-linguistic Computational Approach

18. CLCL (Geneva) DINN Parser: a Neural Network Dependency Parser Ten Years Later
In: Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, pp. 228-236 (2017)

19. Some Recent Results on Cross-Linguistic, Corpus-Based Quantitative Modelling of Word Order and Aspect
In: Formal Models in the Study of Language, pp. 451-464 (2017). ISBN 978-3-319-48831-8

20. Quantitative Computational Syntax: Some Initial Results
In: Italian Journal of Computational Linguistics, Vol. 2, No. 1, pp. 11-29 (2016). ISSN 2499-4553