
Search in the Catalogues and Directories

Hits 1 – 20 of 33

1
Neighborhood Matching Network for Entity Alignment
Wu, Y; Liu, X; Feng, Y. - : The Association for Computational Linguistics, 2020
BASE
2
Understanding (Mis)Behavior on the EOSIO Blockchain
Huang, Y; Wang, H; Wu, L. - 2020
BASE
3
Hemipustulopora tuoyuan Liu & Liu & Zágoršek 2019, n. sp. ...
Liu, H.; Liu, X.; Zágoršek, K. - : Zenodo, 2019
BASE
4
Hemipustulopora tuoyuan Liu & Liu & Zágoršek 2019, n. sp. ...
Liu, H.; Liu, X.; Zágoršek, K. - : Zenodo, 2019
BASE
5
Exploiting future word contexts in neural network language models for speech recognition
Chen, X.; Liu, X.; Wang, Y. - : Institute of Electrical and Electronics Engineers (IEEE), 2019
BASE
6
Jointly Learning Entity and Relation Representations for Entity Alignment
Wu, Y; Liu, X; Feng, Y. - : Association for Computational Linguistics, 2019
BASE
7
Ensuring interpreting quality in legal and courtroom settings: Australian Language Service Providers’ perspectives on their role
Liu, X; Stern, L. - : Roehampton University, 2019
BASE
8
See you in court: How do Australian institutions train legal interpreters?
Stern, L; Liu, X. - : Taylor & Francis (Routledge), 2019
BASE
9
Future word contexts in neural network language models
Chen, X.; Liu, X.; Ragni, A. - : IEEE, 2018
BASE
10
Attitudes toward hospice care: a comparison of Cantonese- and Mandarin-speaking Chinese American older adults
Liu, X; Berkman, C. - : Oxford University Press, 2018
BASE
11
Achieving accuracy in a bilingual courtroom: the effectiveness of specialised legal interpreter training
Hale, SB; Liu, X. - : Taylor & Francis (Routledge), 2018
BASE
12
Future word contexts in neural network language models
Chen, X.; Liu, X.; Ragni, A. - : IEEE, 2017
BASE
13
Investigating bidirectional recurrent neural network language models for speech recognition
Chen, X.; Ragni, A.; Liu, X.. - : International Speech Communication Association (ISCA), 2017
BASE
14
Investigation of Back-off Based Interpolation Between Recurrent Neural Network and N-gram Language Models (Author's Manuscript)
BASE
15
Sentiment analysis: text, pre-processing, reader views and cross domains
Haddi, Emma. - : Brunel University London, 2015
BASE
16
Facial landmark localization by curvature maps and profile analysis
Lippold, C. (Carsten); Liu, X. (Xiang); Wangdo, K. (Kim). - 2015
BASE
17
Paraphrastic recurrent neural network language models
Liu, X.; Chen, X.; Gales, M.; Woodland, P. - : IEEE, 2015. In: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings, 2015
Abstract: Recurrent neural network language models (RNNLMs) have become an increasingly popular choice for state-of-the-art speech recognition systems. Linguistic factors influencing the realization of surface word sequences, for example expressive richness, are only implicitly learned by RNNLMs: observed sentences and their associated alternative paraphrases representing the same meaning are not explicitly related during training. In order to improve context coverage and generalization, paraphrastic RNNLMs are investigated in this paper. Multiple paraphrase variants were automatically generated and used in paraphrastic RNNLM training. Using a paraphrastic multi-level RNNLM modelling both word and phrase sequences, a significant error rate reduction of 0.6% absolute and a perplexity reduction of 10% relative were obtained over the baseline RNNLM on a large-vocabulary conversational telephone speech recognition system trained on 2000 hours of audio and 545 million words of text. The overall improvement over the baseline n-gram LM was increased from 8.4% to 11.6% relative.

The research leading to these results was supported by EPSRC grant EP/I031022/1 (Natural Speech Technology) and by DARPA under the Broad Operational Language Translation (BOLT) and RATS programs. The paper does not necessarily reflect the position or policy of the US Government, and no official endorsement should be inferred. Xie Chen is supported by Toshiba Research Europe Ltd, Cambridge Research Lab.

This is the author accepted manuscript. The final version is available from IEEE via http://dx.doi.org/10.1109/ICASSP.2015.7179004
Keyword: language model; paraphrase; recurrent neural network; speech recognition
URL: https://www.repository.cam.ac.uk/handle/1810/247441
BASE
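The abstract above describes augmenting RNNLM training data with automatically generated paraphrase variants of each observed sentence, so that alternative surface forms of the same meaning are seen during training. As a rough illustration only (the phrase-level paraphrase table and all names below are invented for this sketch and are not taken from the paper), the data-augmentation step might look like:

```python
from collections import Counter

def augment_with_paraphrases(corpus, paraphrase_table):
    """Extend a corpus with sentence variants built by substituting
    phrase-level paraphrases into each sentence that contains them."""
    augmented = list(corpus)
    for sentence in corpus:
        for phrase, variants in paraphrase_table.items():
            if phrase in sentence:
                for variant in variants:
                    augmented.append(sentence.replace(phrase, variant))
    return augmented

# Toy stand-in for automatic paraphrase generation.
corpus = ["i want to purchase a ticket"]
paraphrase_table = {"purchase": ["buy", "get"]}

augmented = augment_with_paraphrases(corpus, paraphrase_table)
# The LM is then trained on the augmented corpus; the unigram counts
# below show the broadened word/context coverage the paper targets.
counts = Counter(w for s in augmented for w in s.split())
```

The paper's actual system generates paraphrases automatically and models both word and phrase sequences in a multi-level RNNLM; this sketch only shows the corpus-expansion idea behind it.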
18
Paraphrastic language models
In: Computer speech and language. - Amsterdam [etc.] : Elsevier 28 (2014) 6, 1298-1316
OLC Linguistik
Show details
19
Use of contexts in language model interpolation and adaptation
In: Computer speech and language. - Amsterdam [etc.] : Elsevier 27 (2013) 1, 301-321
OLC Linguistik
20
Language model cross adaptation for LVCSR system combination
In: Computer speech and language. - Amsterdam [etc.] : Elsevier 27 (2013) 4, 928-942
OLC Linguistik


© 2013 - 2024 Lin|gu|is|tik