
Search in the Catalogues and Directories

Hits 1 – 20 of 52

1
Integrating when and what information in the left parietal lobe allows language rule generalization
Orpella, Joan; Ripollés, Pablo; Ruzzoli, Manuela. - : Public Library of Science, 2020
BASE
2
Integrating when and what information in the left parietal lobe allows language rule generalization
In: PLoS Biol (2020)
BASE
3
Searching High and Low: Prosodic Breaks Disambiguate Relative Clauses
Fromont, Lauren A.; Soto-Faraco, Salvador; Biau, Emmanuel. - : Frontiers Media S.A., 2017
BASE
4
The interplay between semantic and phonological constraints during spoken-word comprehension
In: Psychophysiology, Wiley, 2015, 52 (1), pp. 46-58. ⟨10.1111/psyp.12285⟩ ; https://hal.univ-lille.fr/hal-01911767 ; ISSN: 0048-5772 ; EISSN: 1469-8986 (2015)
BASE
5
Synchronization by the hand: the sight of gestures modulates low-frequency activity in brain responses to continuous speech
Biau, Emmanuel; Soto-Faraco, Salvador. - : Frontiers Media S.A., 2015
BASE
6
Beat gestures and speech processing: when prosody extends to the speaker's hands
Biau, Emmanuel, 1985-. - : Universitat Pompeu Fabra, 2015
In: TDX (Tesis Doctorals en Xarxa) (2015)
BASE
7
Effect of attentional load on audiovisual speech perception: evidence from ERPs
Alsius, Agnès; Möttönen, Riikka; Sams, Mikko E.. - : Frontiers Media S.A., 2014
BASE
8
Effect of attentional load on audiovisual speech perception: evidence from ERPs
Alsius, Agnes; Möttönen, Riikka; Sams, Mikko E.. - : Frontiers Media, 2014
BASE
9
Neural correlates of audiovisual speech processing in a second language
BASE
10
Neural correlates of audiovisual speech processing in a second language
In: Brain & Language. - Orlando, Fla. [etc.] : Elsevier 126 (2013) 3, pp. 253-262
OLC Linguistik
11
Beat gestures modulate auditory integration in speech perception
In: Brain & Language. - Orlando, Fla. [etc.] : Elsevier 124 (2013) 2, pp. 143-152
OLC Linguistik
12
The speakers’ accent shapes the listeners’ phonological predictions during speech perception
In: Brain & Language. - Orlando, Fla. [etc.] : Elsevier 125 (2013) 1, pp. 82-93
OLC Linguistik
13
Visual information constrains early and late stages of spoken-word recognition in sentence context
In: International Journal of Psychophysiology, Elsevier, 2013, 89 (1), pp. 136-147. ⟨10.1016/j.ijpsycho.2013.06.016⟩ ; https://hal.univ-lille.fr/hal-01911769 ; ISSN: 0167-8760 (2013)
BASE
14
Cross-modal predictive mechanisms during speech perception
Sánchez García, Carolina, 1984-. - : Universitat Pompeu Fabra, 2013
In: TDX (Tesis Doctorals en Xarxa) (2013)
BASE
15
Perceptual load influences auditory space perception in the ventriloquist aftereffect
In: Cognition. - Amsterdam [etc.] : Elsevier 118 (2011) 1, pp. 62-74
BLLDB
OLC Linguistik
16
Cross-Modal Prediction in Speech Perception
Abstract: Speech perception often benefits from vision of the speaker's lip movements when they are available. One potential mechanism underlying this reported gain in perception arising from audio-visual integration is on-line prediction. In this study we address whether the preceding speech context in a single modality can improve audiovisual processing and whether this improvement is based on on-line information-transfer across sensory modalities. In the experiments presented here, during each trial, a speech fragment (context) presented in a single sensory modality (voice or lips) was immediately continued by an audiovisual target fragment. Participants made speeded judgments about whether voice and lips were in agreement in the target fragment. The leading single sensory context and the subsequent audiovisual target fragment could be continuous in either one modality only, both (context in one modality continues into both modalities in the target fragment) or neither modalities (i.e., discontinuous). The results showed quicker audiovisual matching responses when context was continuous with the target within either the visual or auditory channel (Experiment 1). Critically, prior visual context also provided an advantage when it was cross-modally continuous (with the auditory channel in the target), but auditory to visual cross-modal continuity resulted in no advantage (Experiment 2). This suggests that visual speech information can provide an on-line benefit for processing the upcoming auditory input through the use of predictive mechanisms. We hypothesize that this benefit is expressed at an early level of speech analysis.
Keyword: Research Article
URL: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3187777
https://doi.org/10.1371/journal.pone.0025198
http://www.ncbi.nlm.nih.gov/pubmed/21998642
BASE
17
Perceptual load influences auditory space perception in the ventriloquist aftereffect
BASE
18
Auditory perception : interactions with vision
In: Hearing (Oxford, 2010), pp. 271-296
MPI für Psycholinguistik
19
Narrowing of intersensory speech perception in infancy
Pons, Ferran; Lewkowicz, David J.; Soto-Faraco, Salvador. - : National Academy of Sciences, 2009
BASE
20
Perceptual and decisional contributions to audiovisual interactions in the perception of apparent motion: A signal detection study
In: Cognition. - Amsterdam [etc.] : Elsevier 102 (2007) 2, p. 299
OLC Linguistik


© 2013 - 2024 Lin|gu|is|tik