
Search in the Catalogues and Directories

Hits 1 – 20 of 26

1
Rapid Assessment of Non-Verbal Auditory Perception in Normal-Hearing Participants and Cochlear Implant Users
In: ISSN: 2077-0383 ; Journal of Clinical Medicine ; https://hal.archives-ouvertes.fr/hal-03413817 ; Journal of Clinical Medicine, MDPI, 2021, 10 (10), pp.2093. ⟨10.3390/jcm10102093⟩ (2021)
BASE
2
Rapid Assessment of Non-Verbal Auditory Perception in Normal-Hearing Participants and Cochlear Implant Users
In: ISSN: 2077-0383 ; Journal of Clinical Medicine ; https://hal.archives-ouvertes.fr/hal-03375356 ; Journal of Clinical Medicine, MDPI, 2021, 10 (10), pp.2093. ⟨10.3390/jcm10102093⟩ (2021)
BASE
3
The Iambic Trochaic Law ...
Wagner, Michael. - : Open Science Framework, 2021
BASE
4
The Iambic Trochaic Law ...
Wagner, Michael. - : Open Science Framework, 2021
BASE
5
The Iambic Trochaic Law ...
Wagner, Michael. - : Open Science Framework, 2021
BASE
6
Sound source segregation of multiple concurrent talkers via Short-Time Target Cancellation
BASE
7
Complex acoustic environments: concepts, methods and auditory perception
Weisser, Adam. - : Sydney, Australia : Macquarie University, 2018
BASE
8
Temporal processing in audition: insights from music
BASE
9
Early cortical metabolic rearrangement related to clinical data in idiopathic sudden sensorineural hearing loss
Schillaci, O; Alessandrini, M; Chiaravalloti, A. - : ELSEVIER SCIENCE BV, 2017
BASE
10
Using Energy Difference for Speech Separation of Dual-microphone Close-talk System
In: http://www.sensorsportal.com/HTML/DIGEST/may_2013/Special_issue/P_SI_353.pdf (2013)
BASE
11
Cortical responses to changes in acoustic regularity are differentially modulated by attentional load
BASE
12
Temporal Coding of Speech in Human Auditory Cortex
Ding, Nai. - 2012
Abstract: Human listeners can reliably recognize speech in complex listening environments. The underlying neural mechanisms, however, remain unclear and cannot yet be emulated by any artificial system. In this dissertation, we study how speech is represented in the human auditory cortex and how the neural representation contributes to reliable speech recognition. Cortical activity from normal hearing human subjects is noninvasively recorded using magnetoencephalography, during natural speech listening. It is first demonstrated that neural activity from auditory cortex is precisely synchronized to the slow temporal modulations of speech, when the speech signal is presented in a quiet listening environment. How this neural representation is affected by acoustic interference is then investigated. Acoustic interference degrades speech perception via two mechanisms, informational masking and energetic masking, which are addressed respectively by using a competing speech stream and a stationary noise as the interfering sound. When two speech streams are presented simultaneously, cortical activity is predominantly synchronized to the speech stream the listener attends to, even if the unattended, competing speech stream is 8 dB more intense. When speech is presented together with spectrally matched stationary noise, cortical activity remains precisely synchronized to the temporal modulations of speech until the noise is 9 dB more intense. Critically, the accuracy of neural synchronization to speech predicts how well individual listeners can understand speech in noise. Further analysis reveals that two neural sources contribute to speech synchronized cortical activity, one with a shorter response latency of about 50 ms and the other with a longer response latency of about 100 ms. The longer-latency component, but not the shorter-latency component, shows selectivity to the attended speech and invariance to background noise, indicating a transition from encoding the acoustic scene to encoding the behaviorally important auditory object, in auditory cortex. Taken together, we have demonstrated that during natural speech comprehension, neural activity in the human auditory cortex is precisely synchronized to the slow temporal modulations of speech. This neural synchronization is robust to acoustic interference, whether speech or noise, and therefore provides a strong candidate for the neural basis of acoustic background invariant speech recognition.
Keyword: auditory scene analysis; Electrical engineering; Engineering; human auditory cortex; magnetoencephalography (MEG); Neurosciences; spectro-temporal response function (STRF); speech; temporal processing
URL: http://hdl.handle.net/1903/12988
BASE
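The abstract above centres on one measurable quantity: how precisely cortical activity is synchronized to the slow temporal modulations (the envelope) of speech, and how that synchronization degrades under masking. As a rough, self-contained illustration of that idea only (not the dissertation's actual MEG/STRF pipeline), the Python sketch below builds a synthetic amplitude-modulated signal, extracts its sub-8 Hz envelope, and scores how well a delayed, noisy "cortical" response tracks it via lagged correlation and spectral coherence. The sampling rate, 8 Hz cutoff, 100 ms latency, and noise level are all illustrative assumptions.

# Illustrative sketch (not from the dissertation): quantifying how well a
# "neural" signal tracks the slow temporal modulations (envelope) of speech.
# The speech and neural signals here are synthetic stand-ins.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt, coherence

fs = 200.0                      # assumed common sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s of signal
rng = np.random.default_rng(0)

# Low-pass filter used both to create and to extract sub-8 Hz modulations.
b_lp, a_lp = butter(4, 8 / (fs / 2), btype="low")

# Stand-in "speech": noise carrier amplitude-modulated below ~8 Hz.
slow_mod = filtfilt(b_lp, a_lp, rng.standard_normal(t.size))
speech = (1.0 + slow_mod / np.abs(slow_mod).max()) * rng.standard_normal(t.size)

# Envelope of speech = slow temporal modulations (Hilbert magnitude, low-passed).
envelope = filtfilt(b_lp, a_lp, np.abs(hilbert(speech)))

# Stand-in "cortical" response: delayed, noisy copy of the envelope (~100 ms latency).
delay = int(0.1 * fs)
neural = np.roll(envelope, delay) + 0.5 * rng.standard_normal(t.size)

# Envelope-tracking measures: peak lagged correlation and coherence below 8 Hz.
lags = np.arange(-int(0.5 * fs), int(0.5 * fs) + 1)
xcorr = [np.corrcoef(envelope, np.roll(neural, -k))[0, 1] for k in lags]
best = int(np.argmax(xcorr))
f, coh = coherence(envelope, neural, fs=fs, nperseg=int(4 * fs))

print(f"peak lagged correlation {xcorr[best]:.2f} at {lags[best] / fs * 1e3:.0f} ms")
print(f"mean coherence below 8 Hz: {coh[f < 8].mean():.2f}")

In this toy setup the correlation peaks near the built-in 100 ms lag, loosely mirroring the longer-latency (~100 ms) response component described in the abstract; adding stronger noise to the "neural" signal lowers both measures, which is the sense in which synchronization accuracy can index speech-in-noise performance.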
13
Monaural speech separation and recognition challenge
: Elsevier, 2011
BASE
14
Monaural speech separation and recognition challenge
In: ISSN: 0885-2308 ; EISSN: 1095-8363 ; Computer Speech and Language ; https://hal.archives-ouvertes.fr/hal-00598185 ; Computer Speech and Language, Elsevier, 2009, 24 (1), pp.1. ⟨10.1016/j.csl.2009.02.006⟩ (2009)
BASE
15
Monaural Speech Segregation by Integrating Primitive and Schema-Based Analysis
In: DTIC (2008)
BASE
16
Hearing vs. Listening: Attention Changes the Neural Representations of Auditory Percepts
Xiang, Juanjuan. - 2008
BASE
17
A Computational Auditory Scene Analysis System for Speech Segregation and Robust Speech Recognition
BASE
18
Isolating the Energetic Component of Speech-on-Speech Masking With Ideal Time-Frequency Segregation
In: DTIC (2006)
BASE
19
Speech recognition with amplitude and frequency modulations
In: Zeng, F G; Nie, K; Stickney, G S; Kong, Y Y; Vongphoe, M; Bhargave, A; et al.(2005). Speech recognition with amplitude and frequency modulations. Proceedings of the National Academy of Sciences of the United States of America, 102(7), 2293 - 2298. UC Irvine: Retrieved from: http://www.escholarship.org/uc/item/1tn280m7 (2005)
BASE
20
ARSTREAM: A Neural Network Model of Auditory Scene Analysis and Source Segregation
Cohen, Michael; Grossberg, Stephen; Wyse, Lonce. - : Boston University Center for Adaptive Systems and Department of Cognitive and Neural Systems, 2003
BASE


Hits by source type:
Catalogues: 0
Bibliographies: 1
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 25