
Search in the Catalogues and Directories

Hits 1 – 5 of 5

1. Unsupervised word segmentation from speech with attention
   Godard, P.; Boito, M.Z.; Ondel, L. - ISCA, 2018
   Source: BASE
2. Similarity Measures for the Detection of Clinical Conditions with Verbal Fluency Tasks
   Paula, F.; Wilkens, R.; Idiart, M. - Association for Computational Linguistics, 2018
   Source: BASE
3. A corpus study of verbal multiword expressions in Brazilian Portuguese
   Ramisch, C.; Ramisch, R.; Zilio, L. - Springer International Publishing, 2018
   Source: BASE
4. Unwritten languages demand attention too! Word discovery with encoder-decoder models
   Source: BASE
5. Restricted recurrent neural tensor networks: Exploiting word frequency and compositionality
   Salle, A.; Villavicencio, A. - Association for Computational Linguistics, 2018
   Abstract: Increasing the capacity of recurrent neural networks (RNN) usually involves augmenting the size of the hidden layer, with a significant increase in computational cost. Recurrent neural tensor networks (RNTN) increase capacity using distinct hidden layer weights for each word, but at greater cost in memory usage. In this paper, we introduce restricted recurrent neural tensor networks (r-RNTN), which reserve distinct hidden layer weights for frequent vocabulary words while sharing a single set of weights for infrequent words. Perplexity evaluations show that for fixed hidden layer sizes, r-RNTNs improve language model performance over RNNs using only a small fraction of the parameters of unrestricted RNTNs. These results hold for r-RNTNs using Gated Recurrent Units and Long Short-Term Memory.
   URL: http://eprints.whiterose.ac.uk/153558/
   Source: BASE
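The core idea in the abstract above — a separate recurrent weight matrix for each of the K most frequent words, and one shared matrix for all remaining words — can be sketched in a few lines of NumPy. This is a minimal illustration under assumed names and sizes (the class, dimensions, and initialization are hypothetical), not the authors' implementation, which uses GRU and LSTM variants:

```python
import numpy as np

class RestrictedRNTNCell:
    """Sketch of the r-RNTN idea: the k most frequent words each get
    their own recurrent matrix; every other word shares one matrix.
    All names and sizes here are illustrative assumptions."""

    def __init__(self, vocab_size, hidden_size, k_frequent, seed=0):
        rng = np.random.default_rng(seed)
        self.k = k_frequent
        # k distinct recurrent matrices, plus one shared matrix at
        # index k used by every infrequent word.
        self.W_hh = rng.normal(scale=0.1,
                               size=(k_frequent + 1, hidden_size, hidden_size))
        # Per-word input vectors (a stand-in for embeddings).
        self.W_xh = rng.normal(scale=0.1, size=(vocab_size, hidden_size))

    def step(self, word_id, h):
        # Assumes word ids are sorted by frequency: id < k => frequent.
        idx = word_id if word_id < self.k else self.k
        return np.tanh(self.W_xh[word_id] + self.W_hh[idx] @ h)

cell = RestrictedRNTNCell(vocab_size=10, hidden_size=4, k_frequent=3)
h = np.zeros(4)
for w in [0, 7, 2, 9]:   # words 0 and 2 are "frequent", 7 and 9 share weights
    h = cell.step(w, h)
print(h.shape)  # (4,)
```

Note the memory trade-off the abstract describes: an unrestricted RNTN would store one matrix per vocabulary word (10 here), while the r-RNTN stores only k_frequent + 1 (4 here), with the shared matrix absorbing the long tail of rare words.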

Results by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 5
© 2013 - 2024 Lin|gu|is|tik