3. Hemipustulopora tuoyuan Liu & Liu & Zágoršek 2019, n. sp. ...
4. Hemipustulopora tuoyuan Liu & Liu & Zágoršek 2019, n. sp. ...
5. Exploiting future word contexts in neural network language models for speech recognition
6. Jointly Learning Entity and Relation Representations for Entity Alignment
7. Ensuring interpreting quality in legal and courtroom settings: Australian Language Service Providers’ perspectives on their role
8. See you in court: How do Australian institutions train legal interpreters?
10. Attitudes toward hospice care: A comparison of Cantonese- and Mandarin-speaking Chinese American older adults
11. Achieving accuracy in a bilingual courtroom: the effectiveness of specialised legal interpreter training
13. Investigating bidirectional recurrent neural network language models for speech recognition
14. Investigation of Back-off Based Interpolation Between Recurrent Neural Network and N-gram Language Models (Author's Manuscript)
15. Sentiment analysis: text, pre-processing, reader views and cross domains
16. Facial landmark localization by curvature maps and profile analysis
17. Paraphrastic recurrent neural network language models
Abstract:
Recurrent neural network language models (RNNLMs) have become an increasingly popular choice for state-of-the-art speech recognition systems. Linguistic factors influencing the realization of surface word sequences, for example expressive richness, are only implicitly learned by RNNLMs: observed sentences and their alternative paraphrases expressing the same meaning are not explicitly related during training. To improve context coverage and generalization, paraphrastic RNNLMs are investigated in this paper. Multiple paraphrase variants were automatically generated and used in paraphrastic RNNLM training. Using a paraphrastic multi-level RNNLM modelling both word and phrase sequences, a significant absolute error rate reduction of 0.6% and a relative perplexity reduction of 10% were obtained over the baseline RNNLM on a large-vocabulary conversational telephone speech recognition system trained on 2000 hours of audio and 545 million words of text. The overall relative improvement over the baseline n-gram LM increased from 8.4% to 11.6%.

Funding: The research leading to these results was supported by EPSRC grant EP/I031022/1 (Natural Speech Technology) and by DARPA under the Broad Operational Language Translation (BOLT) and RATS programs. The paper does not necessarily reflect the position or policy of the US Government, and no official endorsement should be inferred. Xie Chen is supported by Toshiba Research Europe Ltd, Cambridge Research Lab.

Note: This is the author accepted manuscript. The final version is available from IEEE via http://dx.doi.org/10.1109/ICASSP.2015.7179004
Keywords:
language model; paraphrase; recurrent neural network; speech recognition
URL: https://www.repository.cam.ac.uk/handle/1810/247441
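The core idea in this abstract, pooling each observed training sentence with automatically generated paraphrases so that alternative surface forms of the same meaning are explicitly related during RNNLM training, can be outlined in a short sketch. The following is a minimal illustration rather than the authors' implementation; RNNLM, generate_paraphrases, paraphrastic_corpus, and train_step are hypothetical names, and the paraphrase generator is left as a placeholder.

```python
# Minimal sketch (assumed PyTorch, not the paper's code): a word-level
# LSTM language model trained on a corpus pooled with automatically
# generated paraphrases, as described in the abstract above.
import torch
import torch.nn as nn


class RNNLM(nn.Module):
    """Word-level recurrent LM: predicts the next word at each position."""

    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):            # tokens: (batch, seq_len)
        hidden, _ = self.rnn(self.embed(tokens))
        return self.proj(hidden)          # logits: (batch, seq_len, vocab)


def generate_paraphrases(sentence):
    """Hypothetical placeholder: the paper generates multiple paraphrase
    variants automatically; a real generator would plug in here."""
    return []


def paraphrastic_corpus(corpus):
    # Pool every observed sentence with its paraphrase variants so that
    # alternative realizations of the same meaning appear in training.
    return [v for s in corpus for v in [s] + generate_paraphrases(s)]


def train_step(model, batch, optimizer, loss_fn):
    # Standard next-word objective: inputs are all tokens but the last,
    # targets are the same sequence shifted left by one position.
    inputs, targets = batch[:, :-1], batch[:, 1:]
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The paper goes further than this sketch: its paraphrastic multi-level RNNLM models phrase sequences as well as words, and the final system interpolates the RNNLM with a baseline n-gram LM; the code above only illustrates the paraphrase-pooling step.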