23 | Higher-order Derivatives of Weighted Finite-state Machines ...
BASE
25 | A surprisal–duration trade-off across and within the world's languages ...
31 | What About the Precedent: An Information-Theoretic Analysis of Common Law ...
35 | Examining the Inductive Bias of Neural Language Models with Artificial Languages ...
36 | Finding Concept-specific Biases in Form–Meaning Associations ...
37 | Differentiable subset pruning of transformer heads ...

Abstract:
Multi-head attention, a collection of several attention mechanisms that independently attend to different parts of the input, is the key ingredient in the Transformer. Recent work has shown, however, that a large proportion of the heads in a Transformer's multi-head attention mechanism can be safely pruned away without significantly harming the performance of the model; such pruning leads to models that are noticeably smaller and faster in practice. Our work introduces a new head pruning technique that we term differentiable subset pruning. Intuitively, our method learns per-head importance variables and then enforces a user-specified hard constraint on the number of unpruned heads. The importance variables are learned via stochastic gradient descent. We conduct experiments on natural language inference and machine translation; we show that differentiable subset pruning performs comparably to or better than previous works while offering precise control of the sparsity level.
Transactions of the Association for Computational Linguistics, 9
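The hard sparsity constraint described in the abstract — keep exactly k heads, ranked by learned per-head importance — can be sketched in plain Python. The function and variable names below are illustrative assumptions, not the paper's code; the paper additionally makes this selection differentiable so the importance variables can be trained with stochastic gradient descent.

```python
def topk_head_mask(importance_logits, k):
    """Return a 0/1 mask keeping exactly the k highest-scoring heads.

    Illustrative sketch only: the paper's method also provides a
    differentiable relaxation of this top-k selection for training.
    """
    # Rank head indices by learned importance, highest first.
    ranked = sorted(range(len(importance_logits)),
                    key=lambda i: importance_logits[i], reverse=True)
    keep = set(ranked[:k])
    # Hard constraint: exactly k heads remain unpruned.
    return [1.0 if i in keep else 0.0 for i in range(len(importance_logits))]

# Example: 12 attention heads, a user-specified budget of 4 unpruned heads.
logits = [0.3, 1.2, -0.5, 2.0, 0.1, 0.9, -1.1, 0.7, 1.5, 0.0, 0.4, -0.2]
mask = topk_head_mask(logits, k=4)
```

Unlike threshold-based pruning, this formulation gives exact control of the sparsity level: the mask always keeps k heads, regardless of how the importance scores are distributed.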
URL: http://hdl.handle.net/20.500.11850/528141 https://dx.doi.org/10.3929/ethz-b-000528141
39 | Efficient computation of expectations under spanning tree distributions ...
40 | Multimodal pretraining unmasked: A meta-analysis and a unified framework of vision-and-language BERTs ...