41 | Beat gestures modulate auditory integration in speech perception
Source: BASE
42 | Audiovisual integration as conflict resolution: The conflict of the McGurk illusion
43 | Searching high and low: prosodic breaks disambiguate relative clauses
44 | Neural correlates of audiovisual speech processing in a second language
45 | Hand gestures as visual prosody: BOLD responses to audio–visual alignment are modulated by the communicative nature of the stimuli
46 | Synchronization by the hand: the sight of gestures modulates low-frequency activity in brain responses to continuous speech

Abstract:
During social interactions, speakers often produce spontaneous gestures to accompany their speech. These coordinated body movements convey communicative intentions and modulate how listeners perceive the message in a subtle but important way. In the present perspective, we focus on the role that congruent non-verbal information from beat gestures may play in the neural responses to speech. Whilst delta-theta oscillatory brain responses reflect the time-frequency structure of the speech signal, we argue that beat gestures promote phase resetting at relevant word onsets. This mechanism may facilitate the anticipation of associated acoustic cues relevant for prosodic/syllable-based segmentation in speech perception. We report recently published data supporting this hypothesis and discuss the potential of beats (and gestures in general) for further studies investigating continuous audiovisual (AV) speech processing through low-frequency oscillations.

Funding: This research was supported by the Spanish Ministry of Science and Innovation (PSI2013-42626-P), AGAUR Generalitat de Catalunya (2014SGR856) and the European Research Council (StG-2010 263145).

Keywords:
Audiovisual speech; Beats; EEG; Gestures; Low-frequency oscillations

URL: http://hdl.handle.net/10230/25186
DOI: https://doi.org/10.3389/fnhum.2015.00527
47 | The speakers’ accent shapes the listeners’ phonological predictions during speech perception
48 | Top-down attention regulates the neural expression of audiovisual integration
49 | Discriminating speech rhythms in audition, vision, and touch
50 | The interplay between semantic and phonological constraints during spoken-word comprehension
51 | Integrating when and what information in the left parietal lobule allows language rule generalization
52 | Age-related sensitive periods influence visual language discrimination in adults