
Sound-Driven Enhancement of Vision: Disentangling Detection-Level from Decision-Level Contributions

Alexis Pérez-Bellido, Salvador Soto-Faraco, Joan López-Moliner

Cross-modal enhancement can be mediated both by higher-order effects due to attention and decision making and by detection-level, stimulus-driven interactions. However, the contribution of each of these sources to behavioral improvements has not been conclusively determined and quantified separately. Here, we apply a psychophysical analysis based on Piéron functions in order to separate stimulus-dependent changes from those accounted for by decision-level contributions. Participants performed a simple visual speeded detection task on Gabor patches of different spatial frequencies and contrast values, presented with and without accompanying sounds. On one hand, we identified an additive cross-modal improvement in mean reaction times across all types of visual stimuli that is well explained by interactions not strictly based on stimulus-driven modulations (e.g., due to reduction of temporal uncertainty and motor times). On the other hand, we singled out an audio-visual benefit that strongly depended on stimulus features such as spatial frequency and contrast. This particular enhancement was selective to low-spatial-frequency visual stimuli, optimized for magnocellular sensitivity. We therefore conclude that the detection-level interactions and the decision-level processes in response selection that contribute to audio-visual enhancement can be separated online and are expressed in partly different aspects of visual processing.
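
The Piéron function referenced in the abstract models reaction time as a power-law function of stimulus intensity, RT(C) = t0 + k·C^(−β). The sketch below (not the authors' code; all data and parameter values are made up for illustration) shows how fitting this function to mean RTs at several contrast levels, with and without sound, can separate an additive, stimulus-independent speed-up (a shift in t0) from a contrast-dependent, detection-level gain (changes in k or β).

```python
# Illustrative sketch: fitting a Pieron function, RT(C) = t0 + k * C**(-beta),
# to mean reaction times at several contrast levels, once for visual-only
# trials and once for audio-visual trials. Data below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def pieron(contrast, t0, k, beta):
    """Pieron function: RT decreases as a power function of stimulus contrast."""
    return t0 + k * contrast ** (-beta)

# Hypothetical Michelson contrasts and mean RTs (seconds) for one observer.
contrast = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
rt_visual = np.array([0.52, 0.44, 0.39, 0.36, 0.34])       # visual-only
rt_audiovisual = np.array([0.48, 0.41, 0.37, 0.34, 0.32])  # with sound

p0 = (0.3, 0.05, 1.0)  # rough starting values for t0, k, beta
params_v, _ = curve_fit(pieron, contrast, rt_visual, p0=p0)
params_av, _ = curve_fit(pieron, contrast, rt_audiovisual, p0=p0)

for label, (t0, k, beta) in [("V", params_v), ("AV", params_av)]:
    print(f"{label}: t0={t0:.3f} s  k={k:.3f}  beta={beta:.2f}")

# Interpretation: a shift mainly in t0 points to decision/motor-level
# facilitation, whereas changes in k or beta point to stimulus-driven,
# detection-level gains.
```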