
Search in the Catalogues and Directories

Hits 1 – 1 of 1

1. Anti-transfer learning for task invariance in convolutional neural networks for speech processing
Guizzo, E.; Weyde, T.; Tarroni, G. - Elsevier, 2021
Abstract: We introduce the novel concept of anti-transfer learning for speech processing with convolutional neural networks. While transfer learning assumes that the learning process for a target task will benefit from re-using representations learned for another task, anti-transfer avoids the learning of representations that have been learned for an orthogonal task, i.e., one that is not relevant to and potentially confounding for the target task, such as speaker identity for speech recognition or speech content for emotion recognition. This extends the potential use of pre-trained models that have become increasingly available. In anti-transfer learning, we penalize similarity between activations of a network being trained on a target task and another one previously trained on an orthogonal task, which yields more suitable representations. This leads to better generalization and provides a degree of control over correlations that are spurious or undesirable, e.g., to avoid social bias. We have implemented anti-transfer for convolutional neural networks in different configurations with several similarity metrics and aggregation functions, which we evaluate and analyze on several speech and audio tasks and settings, using six datasets. We show that anti-transfer actually leads to the intended invariance to the orthogonal task and to more appropriate features for the target task at hand. Anti-transfer learning consistently improves classification accuracy in all test cases. While anti-transfer incurs computation and memory costs at training time, the computation cost is relatively small when pre-trained models for the orthogonal tasks are used. Anti-transfer is widely applicable and particularly useful where a specific invariance is desirable or where labeled data for orthogonal tasks are difficult to obtain for a given dataset but pre-trained models are available.
Keyword: P Philology. Linguistics; QA75 Electronic computers. Computer science; RC0321 Neuroscience. Biological psychiatry. Neuropsychiatry
URL: https://doi.org/10.1016/j.neunet.2021.05.012
https://openaccess.city.ac.uk/id/eprint/26378/1/2006.06494v2.pdf
https://openaccess.city.ac.uk/id/eprint/26378/
Source: BASE
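The abstract describes the core mechanism: while the target network trains, the similarity between its activations and those of a frozen network pre-trained on the orthogonal task is added as a penalty to the task loss. The following is a minimal PyTorch sketch of that idea, not the authors' implementation: the names target_net, orthogonal_net, and weight, the tuple-returning forward pass, and the choice of cosine similarity are illustrative assumptions (the paper evaluates several similarity metrics and aggregation functions).

import torch
import torch.nn.functional as F

def anti_transfer_penalty(feats_target, feats_orth):
    # Mean absolute cosine similarity between two activation tensors,
    # flattened per example. Cosine similarity is one plausible choice
    # among the several metrics and aggregations the paper compares.
    a = feats_target.flatten(start_dim=1)
    b = feats_orth.flatten(start_dim=1)
    return F.cosine_similarity(a, b, dim=1).abs().mean()

def training_step(target_net, orthogonal_net, x, y, task_criterion,
                  optimizer, weight=0.1):
    # One optimization step with an anti-transfer penalty. target_net and
    # orthogonal_net are hypothetical CNNs returning (logits, activations);
    # orthogonal_net is pre-trained on the orthogonal task and kept frozen.
    logits, feats = target_net(x)
    with torch.no_grad():                # frozen orthogonal network
        _, feats_orth = orthogonal_net(x)
    loss = (task_criterion(logits, y)
            + weight * anti_transfer_penalty(feats, feats_orth))
    optimizer.zero_grad()
    loss.backward()                      # gradients flow only into target_net
    optimizer.step()
    return loss.item()

Minimizing this combined loss pushes the target network toward representations that solve the target task while remaining dissimilar to those of the orthogonal network, which is the invariance the abstract reports.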
