
Search in the Catalogues and Directories

Hits 1 – 5 of 5

1
Sparsity and Sentence Structure in Encoder-Decoder Attention of Summarization Systems
Abstract (Anthology paper link: https://aclanthology.org/2021.emnlp-main.739/): Transformer models have achieved state-of-the-art results in a wide range of NLP tasks, including summarization. Training and inference using large transformer models can be computationally expensive. Previous work has focused on one important bottleneck, the quadratic self-attention mechanism in the encoder. Modified encoder architectures such as LED or LoBART use local attention patterns to address this problem for summarization. In contrast, this work focuses on the transformer's encoder-decoder attention mechanism. The cost of this attention becomes more significant in inference or training approaches that require model-generated histories. First, we examine the complexity of the encoder-decoder attention. We demonstrate empirically that there is a sparse sentence structure in document summarization that can be exploited by constraining the attention mechanism to a subset of input sentences, whilst maintaining system ...
URL: https://dx.doi.org/10.48448/t5yn-e724
https://underline.io/lecture/37392-sparsity-and-sentence-structure-in-encoder-decoder-attention-of-summarization-systems
BASE
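The abstract above describes constraining encoder-decoder (cross) attention to a subset of input sentences. A minimal sketch of that idea, using a mask over encoder positions, is shown below; the function name, toy dimensions, and NumPy implementation are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def sentence_masked_attention(scores, sent_ids, keep_sents):
    """Softmax over encoder positions, restricted to tokens whose
    sentence index is in keep_sents; all other positions are masked out.
    (Hypothetical illustration, not the authors' implementation.)"""
    mask = np.isin(sent_ids, keep_sents)        # (src_len,) bool: tokens to keep
    masked = np.where(mask, scores, -np.inf)    # hide tokens of other sentences
    e = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy example: 6 encoder tokens drawn from 3 sentences; the decoder
# attends only to sentences 0 and 2, so sentence 1 gets zero weight.
rng = np.random.default_rng(0)
scores = rng.standard_normal((2, 6))            # (tgt_len, src_len) raw scores
sent_ids = np.array([0, 0, 1, 1, 2, 2])         # sentence index of each token
attn = sentence_masked_attention(scores, sent_ids, keep_sents=[0, 2])
```

Because masked positions receive probability exactly zero, the per-step cost of cross-attention scales with the number of retained sentences rather than the full input length.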
2
Sparsity and Sentence Structure in Encoder-Decoder Attention of Summarization Systems
Manakul, Potsawee; Gales, Mark. Apollo - University of Cambridge Repository, 2021
BASE
3
Sparsity and Sentence Structure in Encoder-Decoder Attention of Summarization Systems
BASE
4
Long-Span Summarization via Local Attention and Content Selection
BASE
5
Long-span summarization via local attention and content selection
Manakul, Potsawee; Gales, Mark. ACL-IJCNLP 2021: 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference, 2021
BASE

Result counts by resource type: Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 5
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings