Catalogue search
Hits 1 – 8 of 8
1. Suprathreshold differences in competing speech (Venezia et al., 2020) ...
   Venezia, Jonathan H.; Leek, Marjorie R.; Lindeman, Michael P. - ASHA journals, 2020 (BASE)
2. Suprathreshold differences in competing speech (Venezia et al., 2020) ...
   Venezia, Jonathan H.; Leek, Marjorie R.; Lindeman, Michael P. - ASHA journals, 2020 (BASE)
3. STMs for speech in impaired listeners (Venezia et al., 2019) ...
   Venezia, Jonathan H.; Allison-Graham Martin; Hickok, Gregory. - ASHA journals, 2019 (BASE)
4. STMs for speech in impaired listeners (Venezia et al., 2019) ...
   Venezia, Jonathan H.; Allison-Graham Martin; Hickok, Gregory. - ASHA journals, 2019 (BASE)
5. Timing in Audiovisual Speech Perception: A Mini Review and New Psychophysical Data
   Venezia, Jonathan H.; Thurman, Steven M.; Matchin, William; George, Sahara E.; Hickok, Gregory. - 2016
   Abstract:
Recent influential models of audiovisual speech perception suggest that visual speech aids perception by generating predictions about the identity of upcoming speech sounds. These models place stock in the assumption that visual speech leads auditory speech in time. However, it is unclear whether and to what extent temporally-leading visual speech information contributes to perception. Previous studies exploring audiovisual-speech timing have relied upon psychophysical procedures that require artificial manipulation of cross-modal alignment or stimulus duration. We introduce a classification procedure that tracks perceptually-relevant visual speech information in time without requiring such manipulations. Participants were shown videos of a McGurk syllable (auditory /apa/ + visual /aka/ = perceptual /ata/) and asked to perform phoneme identification (/apa/ yes-no). The mouth region of the visual stimulus was overlaid with a dynamic transparency mask that obscured visual speech in some frames but not others randomly across trials. Variability in participants' responses (∼35% identification of /apa/ compared to ∼5% in the absence of the masker) served as the basis for classification analysis. The outcome was a high resolution spatiotemporal map of perceptually-relevant visual features. We produced these maps for McGurk stimuli at different audiovisual temporal offsets (natural timing, 50-ms visual lead, and 100-ms visual lead). Briefly, temporally-leading (∼130 ms) visual information did influence auditory perception. Moreover, several visual features influenced perception of a single speech sound, with the relative influence of each feature depending on both its temporal relation to the auditory signal and its informational content.
   Keyword: Article
   URL: https://doi.org/10.3758/s13414-015-1026-y
        http://www.ncbi.nlm.nih.gov/pubmed/26669309
        http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4744562/
   (BASE)
6. Perception drives production across sensory modalities: A network for sensorimotor integration of visual speech
   Venezia, Jonathan H.; Fillmore, Paul; Matchin, William. - 2015 (BASE)
7. An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex
   Okada, Kayoko; Venezia, Jonathan H.; Matchin, William ...
   In: Okada, Kayoko; Venezia, Jonathan H.; Matchin, William; Saberi, Kourosh; Hickok, Gregory; & Alain, Claude. (2013). An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex. PLoS ONE, 8(6), e68959. doi:10.1371/journal.pone.0068959. UC Irvine: Retrieved from: http://www.escholarship.org/uc/item/85b624s0 (2013) (BASE)
8. An fMRI Study of Audiovisual Speech Perception Reveals Multisensory Interactions in Auditory Cortex
   Okada, Kayoko; Venezia, Jonathan H.; Matchin, William. - Public Library of Science, 2013 (BASE)
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings