
Search in the Catalogues and Directories

Hits 1 – 1 of 1

1
Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation ...
Anthology paper link: https://aclanthology.org/2021.emnlp-main.132/
Abstract: We study the power of cross-attention in the Transformer architecture within the context of transfer learning for machine translation, and extend the findings of studies into cross-attention when training from scratch. We conduct a series of experiments through fine-tuning a translation model on data where either the source or target language has changed. These experiments reveal that fine-tuning only the cross-attention parameters is nearly as effective as fine-tuning all parameters (i.e., the entire translation model). We provide insights into why this is the case and observe that limiting fine-tuning in this manner yields cross-lingually aligned embeddings. The implications of this finding for researchers and practitioners include a mitigation of catastrophic forgetting, the potential for zero-shot translation, and the ability to extend machine translation models to several new language pairs with reduced parameter storage ...
Keyword: Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Machine translation; Natural Language Processing
URL: https://dx.doi.org/10.48448/pgv2-cn55
https://underline.io/lecture/37970-cross-attention-is-all-you-need-adapting-pretrained-transformers-for-machine-translation
BASE
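
To make the paper's key idea concrete, the following is a minimal, hypothetical sketch (not taken from the paper or its released code) of fine-tuning only the cross-attention parameters of a pretrained encoder-decoder translation model. It assumes a Hugging Face MarianMT-style checkpoint in which the decoder's cross-attention modules are named "encoder_attn"; the checkpoint name and module name are illustrative assumptions, not details from the paper.

# Illustrative sketch only: freeze everything except the decoder's
# cross-attention parameters before fine-tuning on a new language pair.
# The checkpoint and the "encoder_attn" module name are assumptions for a
# Hugging Face MarianMT-style encoder-decoder model.
import torch
from transformers import MarianMTModel

model = MarianMTModel.from_pretrained("Helsinki-NLP/opus-mt-en-de")

for name, param in model.named_parameters():
    # Only cross-attention weights remain trainable; all other parameters are frozen.
    param.requires_grad = "encoder_attn" in name

optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=5e-5
)
# A standard translation fine-tuning loop on the new source/target data follows.

Restricting updates in this way leaves the pretrained encoder and the rest of the decoder untouched, which is what the abstract credits for mitigating catastrophic forgetting and for keeping per-language-pair parameter storage small.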

Catalogues: 0 | Bibliographies: 0 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 1