Levels of processing in auditory-visual integration

Can a sound make a light look brighter?

Image from Jaekl & Soto-Faraco, 2010

One defining principle of human information processing is its hierarchical organization, from the quick, automatic detection of simple features to the slower, more demanding extraction of complex patterns and objects. An important research question is where in this hierarchy multisensory integration effects take place. In the case of auditory modulation of visual processing, for example, sounds are thought to modulate early stages of cortical processing in V1 when they co-occur with a visual event. This would mean that we actually see (i.e. experience) visual events as brighter when they are accompanied by sounds. Alternatively, the actual experience of visual events could be the same with or without sound, and the evaluation we make of them (the decision stage) could be enhanced instead by priming or by criterion shifts at response stages. Neuroimaging studies have of course helped clarify this matter, but they also present researchers with a complex picture in which cross-modal interactions, and therefore the cause of behavioural phenomena, can be tagged to various stages of information processing.

Part of our research effort is aimed at settling this issue. We use a physiologically oriented psychophysical approach to audio-visual and audio-tactile detection. In one of our latest studies, we investigated multisensory enhancement of visual contrast detection in light of the fundamental division between the magnocellular (M-) and parvocellular (P-) visual pathways. Surprisingly, this important feature of the visual system has so far been largely neglected in multisensory research. Across a series of studies we have provided evidence for the preferential involvement of the magnocellular pathway in audio-visual enhancement during visual detection tasks, that is, tasks that require quick reactions to relatively simple events or the perception of near-threshold events. This would be consistent with early interactions between audition and vision (we really do see more clearly), at least under a limited set of conditions.
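The distinction between detection-level and decision-level contributions is usually formalized with signal detection theory: a genuine perceptual enhancement shows up as a change in sensitivity (d'), whereas priming or a response-stage shift shows up as a change in criterion (c). As an illustrative sketch only (the trial counts below are hypothetical, not data from the studies cited here), the two measures can be computed from hit and false-alarm counts like this:

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute sensitivity (d') and criterion (c) from trial counts,
    using a log-linear correction to avoid infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)            # detection-level sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # decision-level response bias
    return d_prime, criterion

# Hypothetical counts: visual-only vs. audio-visual condition
d_v, c_v = sdt_measures(60, 40, 20, 80)
d_av, c_av = sdt_measures(75, 25, 22, 78)
```

If d' increases in the audio-visual condition while c stays roughly constant, the sound improved detection itself; if only c shifts, the sound merely biased the observer's responses.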


Representative publications:

Pérez-Bellido A, Soto-Faraco S, López-Moliner J.  2013.  Sound-driven enhancement of vision: Disentangling detection from decisional level contributions. Journal of Neurophysiology. 109(4):1065-1077.

Jaekl P, Soto-Faraco S.  2010.  Audiovisual contrast enhancement is articulated primarily via the M-pathway. Brain Research. 1366:85-92.