A single auditory tone alters the perception of multiple visual events

Bouncing or streaming? Exploring the influence of auditory cues on the interpretation of ambiguous visual motion

Experimental Brain Research, 2004

When looking at two identical objects moving toward each other on a two-dimensional visual display, two different events can be perceived: the objects can either be seen to bounce off each other, or else to stream through one another. Previous research has shown that the large bias normally seen toward the streaming percept can be modulated by the presentation of an auditory event at the moment of coincidence. However, previous behavioral research on this crossmodal effect has always relied on subjective report. In the present experiment, we used a novel experimental design to provide a more objective/implicit measure of the effect of an auditory cue on visual motion perception. In our study, two disks moved toward each other, with the point of coincidence hidden behind an occluder. When emerging from behind the occluder, the disks (one red, the other blue) could either follow the same trajectory (streaming) or else move in the opposite direction (bouncing). Participants made speeded discrimination responses regarding the side from which one of the disks emerged from behind the occluder. Participants responded more rapidly on streaming trials when no sound was presented and on bouncing trials when the sound was presented at the moment of coincidence. These results provide the first empirical demonstration of the auditory modulation of an ambiguous visual motion display using an implicit/objective behavioral measure of perception.
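The logic of this implicit measure can be made concrete with a small simulation. The sketch below uses simulated data (not the study's results; condition labels, effect sizes, and trial counts are assumptions) to show the crossover pattern expected in mean response times if a tone at coincidence shifts the percept toward bouncing.

```python
# Simulated illustration of the implicit reaction-time logic described above:
# responses should be faster when the actual post-occlusion event matches the
# percept induced by the (absent or present) sound. All numbers are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
rows = []
for sound in ("none", "tone"):
    for event in ("stream", "bounce"):
        # Assumed ~60 ms congruency benefit: 'stream' is congruent with no
        # sound, 'bounce' is congruent with a tone at the moment of coincidence.
        congruent = (sound == "none" and event == "stream") or \
                    (sound == "tone" and event == "bounce")
        mean_ms = 520 if congruent else 580
        rows += [{"sound": sound, "event": event, "rt": rt}
                 for rt in rng.normal(mean_ms, 40, size=50)]

trials = pd.DataFrame(rows)
cell_means = trials.groupby(["sound", "event"])["rt"].mean().unstack()
print(cell_means.round(1))
# Faster 'stream' responses without a sound and faster 'bounce' responses with
# a tone produce the crossover interaction used as the implicit measure.
```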

Perception of causality and synchrony dissociate in the audiovisual bounce-inducing effect (ABE)

Cognition

A sound can cause two streaming visual objects to appear to bounce (the audiovisual bounce-inducing effect, ABE). Here we examined whether the stream/bounce percept affects the perception of audiovisual synchrony. Participants saw two disks that either clearly streamed, clearly bounced, or were ambiguous, and heard a sound around the point of contact (POC). They reported, on each trial, whether they perceived the disks to 'stream' or 'bounce', and whether the sound was 'synchronous' or 'asynchronous' with the POC. Results showed that the optimal sound timing for inducing a bounce was before the POC (−59 msec), whereas perceived audiovisual synchrony was maximal when the sound came after the POC (+16 msec). The range of temporal asynchronies perceived as 'synchronous', the temporal binding window (TBW), was wider when the disks were perceived as bouncing than as streaming, with no difference between ambiguous and non-ambiguous visual displays. These results demonstrate (1) that causality differs from synchrony, (2) that causality widens the TBW, and (3) that the ABE is perceptually real.

Auditory transients do not affect visual sensitivity in discriminating between objective streaming and bouncing events

Journal of Vision, 2012

With few exceptions, the sound-induced bias toward bouncing characteristic of the stream/bounce effect has been demonstrated via subjective responses, leaving open the question whether perceptual factors, decisional factors, or some combination of the two underlie the illusion. We addressed this issue directly, using a novel stimulus and signal detection theory to independently characterize observers' sensitivity (d′) and criterion (c) when discriminating between objective streaming and bouncing events in the presence or absence of a brief sound at the point of coincidence. We first confirmed that sound-induced motion reversals persist despite rendering the targets visually distinguishable by differences in texture density. Sound-induced bouncing persisted for targets differing by as many as nine just-noticeable differences (JNDs). We then exploited this finding in our signal detection paradigm, in which observers discriminated between objective streaming and bouncing events. We failed to find any difference in sensitivity (d′) between the sound and no-sound conditions, but we did observe a significantly more liberal criterion (c) in the sound condition than in the no-sound condition. The results suggest that the auditory-induced bias toward bouncing in this context is attributable to a sound-induced shift in criterion, implicating decisional rather than perceptual processes in determining responses to these displays.
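For readers unfamiliar with these signal detection measures, the sketch below shows the standard equal-variance formulas, d′ = z(H) − z(F) and c = −(z(H) + z(F)) / 2, where H and F are the hit and false-alarm rates (e.g., 'bounce' responses to objective bouncing versus objective streaming events). The rates in the example are placeholders, not values from the study; they merely illustrate the reported pattern of similar sensitivity but a more liberal criterion with a sound.

```python
# Minimal sketch of equal-variance signal detection measures: d' and criterion c.
from scipy.stats import norm

def dprime_and_criterion(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return (d', c) given hit and false-alarm rates for one condition."""
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_h - z_f, -(z_h + z_f) / 2

# Placeholder rates (hypothetical): roughly equal d', but a more liberal
# (more negative) criterion in the sound condition.
print(dprime_and_criterion(0.80, 0.30))  # no-sound condition (hypothetical)
print(dprime_and_criterion(0.90, 0.45))  # sound condition (hypothetical)
```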

Audiovisual Integration: An Investigation of the ‘Streaming-bouncing’ Phenomenon

Journal of Physiological Anthropology and Applied Human Science, 2004

Temporal aspects of the perceptual integration of audiovisual information were investigated by utilizing the visual 'streaming-bouncing' phenomenon. When two identical visual objects move towards each other, coincide, and then move away from each other, the objects can either be seen as streaming past one another or bouncing off each other. Although the streaming percept is dominant, the bouncing percept can be induced by presenting an auditory stimulus during the visual coincidence of the moving objects. Here we show that the bounce-inducing effect of the auditory stimulus is strongest when its onset and offset occur in close temporal proximity to the onset and offset of the period of visual coincidence of the moving objects. When the duration of the auditory stimulus exceeds this period, the visual bouncing percept disappears. Implications for a temporal window of audiovisual integration and for the design of effective audiovisual warning signals are discussed.

Auditory motion affects visual motion perception in a speeded discrimination task

Experimental Brain Research, 2007

Transient auditory stimuli have been shown to influence the perception of ambiguous 2D visual motion displays (the bouncing-disks effect; e.g. Sekuler et al. in Nature 385:308, 1997). The question addressed here was whether continuous moving auditory stimuli can also influence visual motion perception under the same experimental conditions. In Experiment 1, we used a modification of Sanabria et al.'s (Exp Brain Res 157:537–541, 2004) paradigm (involving an indirect behavioural measure of the bouncing-disks effect), in which the 2D visual display was presented together with either a brief tone, a continuous moving sound, or no auditory stimulation at all. Crucially, the results showed that, in addition to the effect of the brief tone on bouncing trials, the presence of the continuous moving sound speeded up participants' responses on streaming trials as compared to the brief-tone and no-sound conditions. The results of a second experiment revealed that the effect of the continuous moving sound reported in Experiment 1 was not caused simply by the presence of continuous auditory stimulation per se.

Stream/bounce event perception reveals a temporal limit of motion correspondence based on surface feature over space and time

i-Perception, 2011

We examined how stream/bounce event perception is affected by motion correspondence based on the surface features of moving objects passing behind an occluder. In the stream/bounce display, two identical objects moving across each other in a two-dimensional display can be perceived as either streaming through or bouncing off each other at coincidence. Here, surface features such as colour (Experiments 1 and 2) or luminance (Experiment 3) were switched between the two objects at coincidence. The moment of coincidence was hidden from observers by an occluder. Additionally, the duration for which the moving objects remained visible after the feature switch at coincidence was manipulated. The results revealed that a post-coincidence duration of approximately 200 ms was required for the visual system to stabilize stream/bounce judgments by determining motion correspondence between the objects across the occlusion on the basis of the surface feature. The critical duration was similar across object motion speeds and surface-feature types. Moreover, control experiments (Experiments 4a–4c) showed that a cognitive bias based on feature (colour/luminance) congruency across the occlusion could not fully account for the effects of surface features on the stream/bounce judgments. We discuss the roles of motion correspondence, visual feature processing, and attentive tracking in stream/bounce judgments.

Direction of Visual Apparent Motion Driven Solely by Timing of a Static Sound

Current Biology, 2008

In temporal ventriloquism, auditory events can illusorily attract the perceived timing of a visual onset. We investigated whether the timing of a static sound can also influence spatio-temporal processing of visual apparent motion, induced here by visual bars alternating between opposite hemifields. Perceived direction typically depends on the relative interval in timing between visual left-right and right-left flashes (e.g., rightwards motion dominating when left-to-right interflash intervals are shortest [4]). In our new multisensory condition, interflash intervals were equal, but auditory beeps could slightly lag the right flash yet slightly lead the left flash, or vice versa. This auditory timing strongly influenced perceived visual motion direction, despite providing no spatial auditory motion signal whatsoever. Moreover, prolonged adaptation to such auditorily driven apparent motion produced a robust visual motion aftereffect in the opposite direction when measured in subsequent silence. Control experiments argued against accounts in terms of possible auditory grouping or possible attention capture. We suggest that the motion arises because the sounds change perceived visual timing, as we separately confirmed. Our results provide a new demonstration of multisensory influences on sensory-specific perception, with the timing of a static sound influencing spatio-temporal processing of visual motion direction.
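A rough numerical sketch can make the proposed mechanism concrete. Assuming, purely for illustration, that perceived flash timing is pulled some fraction of the way toward the nearby beep, equal physical interflash intervals become unequal perceived intervals, which is exactly the cue that normally determines apparent motion direction. All parameter values below are illustrative assumptions, not the study's stimuli.

```python
# Hypothetical sketch: a static beep that lags right-hemifield flashes and
# leads left-hemifield flashes shifts their perceived times (temporal
# ventriloquism), making perceived intervals unequal even though the
# physical interflash intervals are identical.
CAPTURE = 0.5   # assumed fraction of the audiovisual lag absorbed by vision
PERIOD = 500.0  # ms between successive flashes (equal physical intervals)
DELTA = 50.0    # ms: beep lags each right flash and leads each left flash

flashes = [("right", 0.0), ("left", PERIOD), ("right", 2 * PERIOD), ("left", 3 * PERIOD)]
beep_time = {t: t + (DELTA if side == "right" else -DELTA) for side, t in flashes}

# Perceived flash time is pulled toward the nearby beep by the assumed fraction.
perceived = [(side, t + CAPTURE * (beep_time[t] - t)) for side, t in flashes]

for (s1, t1), (s2, t2) in zip(perceived, perceived[1:]):
    print(f"{s1}->{s2}: perceived interval {t2 - t1:.0f} ms")
# With this beep timing the right->left intervals shrink and the left->right
# intervals grow, so the shorter (right->left) step should dominate perceived
# motion direction; reversing the beep timing reverses the asymmetry.
```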

Beyond perceptual modality: Auditory effects on visual perception

Acoustical Science and Technology, 2001

Three sets of new findings regarding the modulation of visual perception by auditory stimuli are reviewed. First, we show that visual temporal resolution can be either improved or degraded by accompanying sounds, depending on the sequence of, and delay between, the auditory and visual stimuli. Second, a single visual flash can be perceived as multiple flashes when accompanied by multiple sounds. Third, an ambiguous motion display consisting of two objects moving toward each other is perceived as streaming when a sound is absent or unsynchronized, but as bouncing when a synchronized sound is presented. Based on these findings, we argue, against the traditional belief in visual dominance, that audition can modify vision, particularly when it provides strong transient signals.

Visual motion disambiguation by a subliminal sound

Consciousness and Cognition, 2008

There is growing interest in the effect of sound on visual motion perception. One well-studied display involves two identical objects moving towards each other on a two-dimensional visual display, which can be seen either to bounce off or to stream through each other. Previous studies show that the large bias normally seen toward the streaming percept can be modulated by the presentation of an auditory event at the moment of coincidence. However, no reports to date provide sufficient evidence to indicate whether the sound's bounce-inducing effect is due to a perceptual binding process or merely to an explicit inference arising because the transient auditory stimulus resembles the physical collision of two objects. In the present study, we used a novel experimental design in which a subliminal sound was presented either 150 ms before, at, or 150 ms after the moment of coincidence of two disks moving towards each other. The results showed an increased perception of bouncing (rather than streaming) when the subliminal sound was presented at or 150 ms after the moment of coincidence, compared to when no sound was presented. These findings provide the first empirical demonstration that activation of the human auditory system that does not reach consciousness affects the perception of an ambiguous visual motion display.

Multidimensional processing of dynamic sounds: more than meets the ear

Attention, Perception, & Psychophysics, 2011

Strong cross-modal interactions exist between visual and auditory processing. The relative contributions of perceptual versus decision-related processes to such interactions are only beginning to be understood. We used methodological and statistical approaches to control for potential decision-related contributions such as response interference, decisional criterion shifts, and strategy selection. Participants were presented with rising-, falling-, and constant-amplitude sounds and were asked to detect changes (increases or decreases) in sound amplitude while ignoring an irrelevant visual cue: a disk that grew, shrank, or stayed constant in size. Across two experiments, testing context was manipulated by varying the grouping of visual cues during testing, and cross-modal congruency showed independent perceptual and decision-related effects. Whereas a change in testing context greatly affected criterion shifts, cross-modal effects on perceptual sensitivity remained relatively consistent. In general, participants were more sensitive to increases in sound amplitude and less sensitive to sounds paired with dynamic visual cues. As compared with incongruent visual cues, congruent cues enhanced detection of amplitude decreases, but not increases. These findings suggest that the relative contributions of perceptual and decisional processing, and the impacts of these processes on cross-modal interactions, can vary significantly depending on asymmetries in within-modal processing as well as consistencies in cross-modal dynamics.

Keywords: cross-modal interactions, motion in depth, looming, audiovisual, visual capture