Multidimensional processing of dynamic sounds: more than meets the ear
2011, Attention, Perception, & Psychophysics
Strong cross-modal interactions exist between visual and auditory processing. The relative contributions of perceptual versus decision-related processes to such interactions are only beginning to be understood. We used methodological and statistical approaches to control for potential decision-related contributions such as response interference, decisional criterion shift, and strategy selection. Participants were presented with rising-, falling-, and constant-amplitude sounds and were asked to detect a change (increase or decrease) in sound amplitude while ignoring an irrelevant visual cue, a disk that grew, shrank, or stayed constant in size. Across two experiments, testing context was manipulated by varying the grouping of visual cues during testing, and cross-modal congruency showed independent perceptual and decision-related effects. Whereas a change in testing context greatly affected criterion shifts, cross-modal effects on perceptual sensitivity remained relatively consistent. In general, participants were more sensitive to increases in sound amplitude and less sensitive to sounds paired with dynamic visual cues. As compared with incongruent visual cues, congruent cues enhanced detection of amplitude decreases, but not increases. These findings suggest that the relative contributions of perceptual and decisional processing, and the impacts of these processes on cross-modal interactions, can vary significantly depending on asymmetries in within-modal processing as well as consistencies in cross-modal dynamics.

Keywords: Cross-modal interactions · Motion in depth · Looming · Audiovisual · Visual capture

Successful exploration of the environment often involves evaluating and comparing information from multiple modalities and along multiple dimensions. Cross-modal interactions in perception have been studied using behavioral and physiological measures (Alais, Newell, & Mamassian, 2010; Calvert, Spence, & Stein, 2004).
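The distinction drawn above between perceptual sensitivity and decisional criterion shifts is the core of signal detection theory. As an illustration only (the paper's own analysis code is not shown here), a minimal sketch of the standard SDT computation of sensitivity d′ and criterion c from hit and false-alarm counts, using a common log-linear correction for extreme rates:

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute sensitivity (d') and criterion (c) from response counts.

    A log-linear correction (add 0.5 to each cell) avoids infinite
    z-scores when hit or false-alarm rates equal 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)          # perceptual sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # decisional bias
    return d_prime, criterion

# Hypothetical counts from 40 signal trials and 40 noise trials
d, c = sdt_measures(hits=30, misses=10, false_alarms=8, correct_rejections=32)
```

In this framework, a cross-modal effect that changes d′ reflects a change in perceptual processing, whereas one that changes only c reflects a shift in the observer's decision rule, which is the dissociation the experiments exploit.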
Intersensory conflicts are often introduced in behavioral studies of cross-modal interaction (Soto-Faraco, Spence, Lloyd, & Kingstone, 2004b; Spence, 2011). Participants are asked to attend to signals from a target modality (e.g., audition) while ignoring potentially conflicting signals from an irrelevant modality (e.g., vision). In general, these studies demonstrate that the integration of multimodal stimuli depends on the temporal (