Multidimensional processing of dynamic sounds: more than meets the ear
Abstract
Strong cross-modal interactions exist between visual and auditory processing. The relative contributions of perceptual versus decision-related processes to such interactions are only beginning to be understood. We used methodological and statistical approaches to control for potential decision-related contributions such as response interference, decisional criterion shift, and strategy selection. Participants were presented with rising-, falling-, and constant-amplitude sounds and were asked to detect change (an increase or decrease) in sound amplitude while ignoring an irrelevant visual cue: a disk that grew, shrank, or stayed constant in size. Across two experiments, testing context was manipulated by varying how visual cues were grouped during testing, and cross-modal congruency showed independent perceptual and decision-related effects. Whereas a change in testing context greatly affected criterion shifts, cross-modal effects on perceptual sensitivity remained relatively consistent. In general, participants were more sensitive to increases in sound amplitude and less sensitive to sounds paired with dynamic visual cues. Compared with incongruent visual cues, congruent cues enhanced detection of amplitude decreases, but not increases. These findings suggest that the relative contributions of perceptual and decisional processing, and the impacts of these processes on cross-modal interactions, can vary substantially depending on asymmetries in within-modal processing as well as consistencies in cross-modal dynamics.

Keywords: Cross-modal interactions · Motion in depth · Looming · Audiovisual · Visual capture

Successful exploration of the environment often involves evaluating and comparing information from multiple modalities and along multiple dimensions. Cross-modal interactions in perception have been studied using behavioral and physiological measures (Alais, Newell, & Mamassian, 2010; Calvert, Spence, & Stein, 2004). Intersensory conflicts are often introduced in behavioral studies of cross-modal interaction (Soto-Faraco, Spence, Lloyd, & Kingstone, 2004b; Spence, 2011): participants are asked to attend to signals from a target modality (e.g., audition) while ignoring potentially conflicting signals from an irrelevant modality (e.g., vision). In general, these studies demonstrate that the integration of multimodal stimuli depends on the temporal (
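The abstract separates perceptual sensitivity from decisional criterion shifts, a distinction typically quantified with signal detection theory (Macmillan & Creelman, 1991). As a point of reference only, the sketch below shows how sensitivity (d′) and criterion (c) are commonly computed from hit and false-alarm rates in a yes/no detection task; the rates, function name, and example values are illustrative assumptions, not data or analysis code from this study.

```python
# Minimal sketch of standard signal detection measures (cf. Macmillan & Creelman, 1991).
# Hit and false-alarm rates below are illustrative placeholders, not data from the study.
from scipy.stats import norm

def sdt_measures(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return sensitivity (d') and criterion (c) for a yes/no detection task."""
    z_hit = norm.ppf(hit_rate)          # z-transform of the hit rate
    z_fa = norm.ppf(fa_rate)            # z-transform of the false-alarm rate
    d_prime = z_hit - z_fa              # perceptual sensitivity
    criterion = -0.5 * (z_hit + z_fa)   # decisional criterion (response bias)
    return d_prime, criterion

# Example: a hypothetical listener detecting amplitude increases
d_prime, c = sdt_measures(hit_rate=0.85, fa_rate=0.20)
print(f"d' = {d_prime:.2f}, c = {c:.2f}")  # d' ≈ 1.88, c ≈ -0.10
```

On this view, a manipulation that changes c but not d′ is taken to act on decision-related processes, whereas a change in d′ reflects an effect on perceptual sensitivity.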
References
- Alais, D., & Burr, D. (2003). The "flash-lag" effect occurs in audition and cross-modally. Current Biology, 13, 59-63.
- Alais, D., & Burr, D. (2004a). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194. doi:10.1016/j.cogbrainres.2003.11.011
- Alais, D., & Burr, D. (2004b). The ventriloquist effect results from near-optimal bimodal integration. Current Biology, 14, 257-262. doi:10.1016/j.cub.2004.01.029
- Alais, D., Newell, F. N., & Mamassian, P. (2010). Multisensory processing in review: From physiology to behaviour. Seeing and Perceiving, 23, 3-38. doi:10.1163/187847510x488603
- Bach, D. R., Neuhoff, J. G., Perrig, W., & Seifritz, E. (2009). Looming sounds as warning signals: The function of motion cues. International Journal of Psychophysiology, 74, 28-33. doi:10.1016/j.ijpsycho.2009.06.004
- Bach, D. R., Schachinger, H., Neuhoff, J. G., Esposito, F., Di Salle, F., Lehmann, C., et al. (2008). Rising sound intensity: An intrinsic warning cue activating the amygdala. Cerebral Cortex, 18, 145-150. doi:10.1093/cercor/bhm040
- Bertelson, P. (1998). Starting from the ventriloquist: The perception of multimodal events. In M. Sabourin, C. Fergus, & M. Robert (Eds.), Advances in psychological science: Vol. 2. Biological and cognitive aspects (pp. 419-439). Hove: Psychology Press.
- Bertelson, P., & Aschersleben, G. (1998). Automatic visual bias of perceived auditory location. Psychonomic Bulletin & Review, 5, 482-489.
- Bertelson, P., & de Gelder, B. (2004). The psychology of multimodal perception. In C. Spence & J. Driver (Eds.), Crossmodal space and crossmodal attention (pp. 151-177). Oxford: Oxford University Press.
- Bertelson, P., & Radeau, M. (1976). Ventriloquism, sensory interaction, and response bias: Remarks on the paper by Choe, Welch, Gilford, and Juola. Perception & Psychophysics, 19, 531-535.
- Brooks, A., van der Zwan, R., Billard, A., Petreska, B., Clarke, S., & Blanke, O. (2007). Auditory motion affects visual biological motion processing. Neuropsychologia, 45, 523-530. doi:10.1016/j.neuropsychologia.2005.12.012
- Calvert, G., Spence, C., & Stein, B. E. (2004). The handbook of multisensory processes. Cambridge: MIT Press.
- Cappe, C., Thut, G., Romei, V., & Murray, M. M. (2009). Selective integration of auditory-visual looming cues by humans. Neuropsychologia, 47, 1045-1052. doi:10.1016/j.neuropsychologia.2008.11.003
- Choe, C. S., Welch, R. B., Gilford, R. M., & Juola, J. F. (1975). Ventriloquist effect: Visual dominance or response bias. Perception & Psychophysics, 18, 55-60.
- Ernst, M. O., & Bülthoff, H. H. (2004). Merging the senses into a robust percept. Trends in Cognitive Sciences, 8, 162-169. doi:10.1016/j.tics.2004.02.002
- Forster, K. I., & Forster, J. C. (2003). DMDX: A Windows display program with millisecond accuracy. Behavior Research Methods, Instruments, & Computers, 35, 116-124.
- Gallace, A., & Spence, C. (2006). Multisensory synesthetic interactions in the speeded classification of visual size. Perception & Psychophysics, 68, 1191-1203.
- Ghazanfar, A. A., Neuhoff, J. G., & Logothetis, N. K. (2002). Auditory looming perception in rhesus monkeys. Proceedings of the National Academy of Sciences, 99, 15755-15757. doi:10.1073/pnas.242469699
- Hall, D. A., & Moore, D. R. (2003). Auditory neuroscience: The salience of looming sounds. Current Biology, 13, R91-R93.
- Harrison, N. R., Wuerger, S., & Meyer, G. F. (2010). Reaction time facilitation for horizontally moving auditory-visual stimuli. Journal of Vision, 10, 1-21.
- Helbig, H. B., & Ernst, M. O. (2007). Optimal integration of shape information from vision and touch. Experimental Brain Research, 179, 595-606. doi:10.1007/s00221-006-0814-y
- Kitagawa, N., & Ichihara, S. (2002). Hearing visual motion in depth. Nature, 416, 172-174.
- Kitajima, N., & Yamashita, Y. (1999). Dynamic capture of sound motion by light stimuli moving in three-dimensional space. Perceptual and Motor Skills, 89, 1139-1158.
- Laurienti, P. J., Kraft, R. A., Maldjian, J. A., Burdette, J. H., & Wallace, M. T. (2004). Semantic congruence is a critical factor in multisensory behavioral performance. Experimental Brain Research, 158, 405-414. doi:10.1007/s00221-004-1913-2
- Liu, E. H., Church, B. A., & Mercado, E., III. (2008, November). Congruent audiovisual dynamics facilitate perception of looming stimuli. Poster presented at the 7th Annual Auditory Perception, Cognition, and Action Meeting, Chicago, IL.
- Macmillan, N. A., & Creelman, C. D. (1991). Detection theory: A user's guide. Cambridge: Cambridge University Press.
- Maier, J. X., Chandrasekaran, C., & Ghazanfar, A. A. (2008). Integration of bimodal looming signals through neuronal coherence in the temporal lobe. Current Biology, 18, 963-968. doi:10.1016/j.cub.2008.05.043
- Maier, J. X., & Ghazanfar, A. A. (2007). Looming biases in monkey auditory cortex. Journal of Neuroscience, 27, 4093-4100. doi:10.1523/jneurosci.0330-07.2007
- Maier, J. X., Neuhoff, J. G., Logothetis, N. K., & Ghazanfar, A. A. (2004). Multisensory integration of looming signals by rhesus monkeys. Neuron, 43(2), 177-181.
- Martino, G., & Marks, L. E. (2001). Synesthesia: Strong and weak. Current Directions in Psychological Science, 10, 61-65.
- Melara, R. D., Marks, L. E., & Potts, B. C. (1993). Early-holistic processing or dimensional similarity. Journal of Experimental Psychology: Human Perception and Performance, 19, 1114-1120.
- Meredith, M. A., & Stein, B. E. (1983). Interactions among converging sensory inputs in the superior colliculus. Science, 221, 389-391.
- Meyer, G. F., & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. NeuroReport, 12, 2557-2560.
- Meyer, G. F., Wuerger, S., & Greenlee, M. (2011). Interactions between auditory and visual semantic stimulus classes: Evidence for common processing networks for speech and body actions. Journal of Cognitive Neuroscience, 23, 2291-2308. doi:10.1162/jocn.2010.21593
- Meyer, G. F., Wuerger, S. M., Rohrbein, F., & Zetzsche, C. (2005). Low-level integration of auditory and visual motion signals requires spatial co-localisation. Experimental Brain Research, 166, 538-547. doi:10.1007/s00221-005-2394-7
- Neuhoff, J. G. (1998). Perceptual bias for rising tones. Nature, 395, 123-124.
- Neuhoff, J. G. (2001). An adaptive bias in the perception of looming auditory motion. Ecological Psychology, 13, 87-110.
- Rapp, B., & Hendel, S. K. (2003). Principles of cross-modal competition: Evidence from deficits of attention. Psychonomic Bulletin & Review, 10, 210-219.
- Regan, D., & Beverley, K. (1978). Looming detectors in the human visual pathway. Vision Research, 18, 415-421.
- Romei, V., Murray, M. M., Cappe, C., & Thut, G. (2009). Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Current Biology, 19, 1799-1805. doi:10.1016/j.cub.2009.09.027
- Rosenblum, L. D., Carello, C., & Pastore, R. E. (1987). Relative effectiveness of three stimulus variables for locating a moving sound source. Perception, 16, 175-186.
- Rosenblum, L. D., Wuestefeld, A. P., & Saldana, H. M. (1993). Auditory looming perception: Influences on anticipatory judgments. Perception, 22, 1467-1482.
- Ross, L. A., Saint-Amour, D., Leavitt, V. M., Javitt, D. C., & Foxe, J. J. (2007). Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments. Cerebral Cortex, 17, 1147-1153. doi:10.1093/cercor/bhl024
- Sanabria, D., Soto-Faraco, S., & Spence, C. (2007a). Spatial attention and audiovisual interactions in apparent motion. Journal of Experimental Psychology: Human Perception and Performance, 33, 927-937. doi:10.1037/0096-1523.33.4.927
- Sanabria, D., Spence, C., & Soto-Faraco, S. (2007b). Perceptual and decisional contributions to audiovisual interactions in the perception of apparent motion: A signal detection study. Cognition, 102, 299-310. doi:10.1016/j.cognition.2006.01.003
- Saygin, A. P., Driver, J., & de Sa, V. R. (2008). In the footsteps of biological motion and multisensory perception: Judgments of audiovisual temporal relations are enhanced for upright walkers. Psychological Science, 19, 469-475. doi:10.1111/j.1467-9280.2008.02111.x
- Schirillo, J. A. (2011). Cross-modal detection using various temporal and spatial configurations. Attention, Perception, & Psychophysics, 73, 237-246. doi:10.3758/s13414-010-0012-7
- Schouten, B., Troje, N. F., Vroomen, J., & Verfaillie, K. (2011). The effect of looming and receding sounds on the perceived in-depth orientation of depth-ambiguous biological motion figures. PLoS One, 6, e14725. doi:10.1371/journal.pone.0014725
- Seifritz, E., Neuhoff, J. G., Bilecen, D., Scheffler, K., Mustovic, H., Schachinger, H., et al. (2002). Neural processing of auditory looming in the human brain. Current Biology, 12, 2147-2151.
- Shams, L., Kamitani, Y., & Shimojo, S. (2000). What you see is what you hear. Nature, 408, 788.
- Simon, J. R., & Berbaum, K. (1990). Effect of conflicting cues on information processing: The 'Stroop effect' vs. the 'Simon effect'. Acta Psychologica, 73, 159-170.
- Soto-Faraco, S., & Kingstone, A. (2004). Multisensory integration of dynamic information. In G. A. Calvert, C. Spence, & B. E. Stein (Eds.), The handbook of multisensory processes (pp. 49-68). Cambridge: MIT Press.
- Soto-Faraco, S., Kingstone, A., & Spence, C. (2003). Multisensory contributions to the perception of motion. Neuropsychologia, 41, 1847-1862. doi:10.1016/s0028-3932(03)00185-4
- Soto-Faraco, S., Spence, C., & Kingstone, A. (2004a). Cross-modal dynamic capture: Congruency effects in the perception of motion across sensory modalities. Journal of Experimental Psychology: Human Perception and Performance, 30, 330-345. doi:10.1037/0096-1523.30.2.330
- Soto-Faraco, S., Spence, C., & Kingstone, A. (2005). Assessing automaticity in the audiovisual integration of motion. Acta Psychologica, 118, 71-92. doi:10.1016/j.actpsy.2004.10.008
- Soto-Faraco, S., Spence, C., Lloyd, D., & Kingstone, A. (2004b). Moving multisensory research along: Motion perception across sensory modalities. Current Directions in Psychological Science, 13, 29-32.
- Spence, C. (2011). Crossmodal correspondences: A tutorial. Attention, Perception, & Psychophysics, 73, 971-995. doi:10.3758/s13414-010-0073-7
- Stein, B. E., & Meredith, M. A. (1993). The merging of the senses. Cambridge: MIT Press.
- Talsma, D., Senkowski, D., Soto-Faraco, S., & Woldorff, M. G. (2010). The multifaceted interplay between attention and multisensory integration. Trends in Cognitive Sciences, 14, 400-410. doi:10.1016/j.tics.2010.06.008
- Watanabe, K., & Shimojo, S. (2001). When sound affects vision: Effects of auditory grouping on visual motion perception. Psychological Science, 12, 109-116.
- Welch, R. B., & Warren, D. H. (1980). Immediate perceptual response to intersensory discrepancy. Psychological Bulletin, 88, 638-667.
- Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and visual motion signals at threshold. Perception & Psychophysics, 65, 1188-1196.
- Zahorik, P. (2002). Assessing auditory distance perception using virtual acoustics. Journal of the Acoustical Society of America, 111, 1832-1846. doi:10.1121/1.1458027
- Zetzsche, C., Röhrbein, F., Hofbauer, M., & Schill, K. (2002). Audio- visual sensory interactions and the statistical covariance of the natural environment. In A. Calvo-Manzano, A. Perez-Lopez, & J. S. Santiago (Eds.), Proceedings of the Forum Acusticum. Sevilla.