
Search in the Catalogues and Directories

Hits 1 – 20 of 52

1
Integrating when and what information in the left parietal lobe allows language rule generalization
Orpella, Joan; Ripollés, Pablo; Ruzzoli, Manuela. Public Library of Science, 2020
BASE
2
Integrating when and what information in the left parietal lobe allows language rule generalization
In: PLoS Biol (2020)
BASE
3
Searching High and Low: Prosodic Breaks Disambiguate Relative Clauses
Fromont, Lauren A.; Soto-Faraco, Salvador; Biau, Emmanuel. Frontiers Media S.A., 2017
BASE
4
The interplay between semantic and phonological constraints during spoken-word comprehension
In: Psychophysiology 52(1), Wiley, 2015, pp. 46–58. ISSN 0048-5772, EISSN 1469-8986. DOI: 10.1111/psyp.12285. https://hal.univ-lille.fr/hal-01911767
BASE
5
Synchronization by the hand: the sight of gestures modulates low-frequency activity in brain responses to continuous speech
Biau, Emmanuel; Soto-Faraco, Salvador. Frontiers Media S.A., 2015
BASE
6
Beat gestures and speech processing: when prosody extends to the speaker's hands
Biau, Emmanuel (1985–). Universitat Pompeu Fabra, 2015
In: TDX (Tesis Doctorals en Xarxa) (2015)
BASE
7
Effect of attentional load on audiovisual speech perception: evidence from ERPs
Alsius, Agnès; Möttönen, Riikka; Sams, Mikko E. Frontiers Media S.A., 2014
BASE
8
Effect of attentional load on audiovisual speech perception: evidence from ERPs
Alsius, Agnes; Möttönen, Riikka; Sams, Mikko E. Frontiers Media, 2014
BASE
9
Neural correlates of audiovisual speech processing in a second language
BASE
10
Neural correlates of audiovisual speech processing in a second language
In: Brain & Language 126(3), 2013, pp. 253–262. Elsevier, Orlando.
OLC Linguistik
11
Beat gestures modulate auditory integration in speech perception
In: Brain & Language 124(2), 2013, pp. 143–152. Elsevier, Orlando.
OLC Linguistik
12
The speakers’ accent shapes the listeners’ phonological predictions during speech perception
In: Brain & Language 125(1), 2013, pp. 82–93. Elsevier, Orlando.
OLC Linguistik
13
Visual information constrains early and late stages of spoken-word recognition in sentence context
In: International Journal of Psychophysiology 89(1), Elsevier, 2013, pp. 136–147. ISSN 0167-8760. DOI: 10.1016/j.ijpsycho.2013.06.016. https://hal.univ-lille.fr/hal-01911769
BASE
14
Cross-modal predictive mechanisms during speech perception
Sánchez García, Carolina (1984–). Universitat Pompeu Fabra, 2013
In: TDX (Tesis Doctorals en Xarxa) (2013)
Abstract: The present dissertation addresses the predictive mechanisms operating online during audiovisual speech perception. The idea that prediction mechanisms operate during speech perception at several linguistic levels (syntactic, semantic, phonological) has received increasing support in the recent literature. Yet most evidence concerns prediction phenomena within a single sensory modality, i.e. visual or auditory. In this thesis, I explore whether online prediction during speech perception can occur across sensory modalities. The results of this work provide evidence that visual articulatory information can be used to predict the subsequent auditory input during speech processing. In addition, evidence for cross-modal prediction was observed only in the observer's native language, not in unfamiliar languages, which led to the conclusion that well-established phonological representations are essential for online cross-modal prediction to take place. The last study of this thesis, using ERPs, revealed that visual articulatory information can exert an influence beyond phonological stages: the visual saliency of word onsets influences the stage of lexical selection, interacting with semantic processes during sentence comprehension. By demonstrating the existence of online cross-modal predictive mechanisms based on visual articulatory information, these results shed new light on how multisensory cues are used to speed up speech processing.
Keywords: Audiovisual speech; Event-related potentials; Multisensory integration; Phonology-based prediction; Prediction; Predictive coding; Speech perception
URL: http://hdl.handle.net/10803/293266
BASE
15
Perceptual load influences auditory space perception in the ventriloquist aftereffect
In: Cognition 118(1), 2011, pp. 62–74. Elsevier, Amsterdam.
BLLDB
OLC Linguistik
16
Cross-Modal Prediction in Speech Perception
Sánchez-García, Carolina; Alsius, Agnès; Enns, James T. Public Library of Science, 2011
BASE
17
Perceptual load influences auditory space perception in the ventriloquist aftereffect
BASE
18
Auditory perception : interactions with vision
In: Hearing (Oxford, 2010), pp. 271–296
MPI für Psycholinguistik
19
Narrowing of intersensory speech perception in infancy
Pons, Ferran; Lewkowicz, David J.; Soto-Faraco, Salvador. National Academy of Sciences, 2009
BASE
20
Perceptual and decisional contributions to audiovisual interactions in the perception of apparent motion: A signal detection study
In: Cognition 102(2), 2007, p. 299. Elsevier, Amsterdam.
OLC Linguistik


Hits by source type: Catalogues 9 · Bibliographies 11 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 34
© 2013–2024 Lin|gu|is|tik