The effect of visuo-haptic congruency on haptic spatial matching.

Noninformative Vision Improves Haptic Spatial Perception

Current Biology, 2002

The University of Nottingham, University Park, Nottingham NG7 2RD, United Kingdom

Summary

Previous studies have attempted to map somatosensory space via haptic matching tasks [1-3] and have shown that individuals make large and systematic matching errors, the magnitude and angular direction of which vary systematically through the workspace [2, 3]. Based upon such demonstrations, it has been suggested that haptic space is non-Euclidian. This conclusion assumes that spatial perception is modality specific, and it largely ignores the fact that tactile matching tasks involve active, exploratory arm movements. Here we demonstrate that, when individuals match two bar stimuli (i.e., make them parallel) in circumstances favoring extrinsic (visual) coordinates, providing noninformative visual information significantly increases the accuracy of haptic perception. In contrast, when individuals match the same bar stimuli in circumstances favoring the coding of movements in intrinsic (limb-based) coordinates, providing identical noninformative visual information either has no effect or leads to decreased accuracy of haptic perception. These results are consistent with optimal integration models of sensory integration [13-15], in which the weighting given to visual and somatosensory signals depends upon the precision of the visual and somatosensory information, and provide important evidence for the task-dependent integration of visual and somatosensory signals during the construction of a representation of peripersonal space.

…appear to be planned in limb-based coordinates [11, 16-18]. We investigated this issue with respect to haptic perception by examining how noninformative vision (i.e., visual information that is not directly relevant to the task) influences the haptic perception of spatial relations. Specifically, we investigated haptic perception of whether two bars feel parallel to one another. We reasoned that instructing participants to make parallel two bars positioned in different locations within the workspace necessarily favors the construction of a representation of peripersonal space that specifies object locations in extrinsic (e.g., world-based or eye-centered) coordinates (i.e., veridical performance is achieved when the reference and test bars are in the same orientation within Cartesian coordinates). However, because the reference and test bars are in separate regions of the workspace, this will produce quite different postural configurations of the reference and test limbs and will therefore not favor the use of a representation of peripersonal space based upon limb-based (postural) coordinates. Based upon our previous studies [11, 18], we hypothesized that providing noninformative vision should result in a strategic combination of visual and proprioceptive signals that favors the construction of a representation of peripersonal space in extrinsic coordinates, and we tested this by using a haptic perception task in which participants were either blindfolded or else provided with noninformative vision (i.e., the entire workspace was covered by an opaque board). Note that in each condition participants were free to orient their head and eyes toward the reference and test bars. Thus, we acknowledge that spatial attention may be a component of noninformative vision as defined in this paper. Statistical analyses confirmed that the magnitude and…
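The optimal integration models cited here weight each sensory signal by its reliability. As a rough illustration (not the authors' analysis, and with invented numbers), the standard maximum-likelihood combination of two independent Gaussian cues can be sketched as:

```python
import numpy as np

def fuse(estimates, sigmas):
    """Maximum-likelihood fusion of independent Gaussian cues.

    Each cue is weighted by its precision (inverse variance), so the
    more reliable signal dominates the combined estimate.
    """
    estimates = np.asarray(estimates, dtype=float)
    precisions = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = precisions / precisions.sum()
    fused = float(weights @ estimates)
    fused_sigma = float(np.sqrt(1.0 / precisions.sum()))
    return fused, fused_sigma, weights

# Illustrative values only: a visual orientation cue (reliable) and a
# proprioceptive cue (noisier) about a bar's orientation, in degrees.
est, sd, w = fuse([10.0, 20.0], [2.0, 4.0])
# Visual weight 0.8, proprioceptive weight 0.2 -> fused estimate 12.0 deg.
```

On this account, making vision less precise (or the posture more informative) shifts the weights, which is one way to read the task-dependent effects reported above.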

Does the Integration of Haptic and Visual Cues Reduce the Effect of a Biased Visual Reference Frame on the Subjective Head Orientation

Background: The selection of appropriate frames of reference (FOR) is a key factor in the elaboration of spatial perception and the production of robust interaction with our environment. The extent to which we perceive the head axis orientation (subjective head orientation, SHO) with both accuracy and precision likely contributes to the efficiency of these spatial interactions. A first goal of this study was to investigate the relative contribution of both the visual and egocentric (centre-of-mass) FORs to SHO processing. A second goal was to investigate humans' ability to process SHO in various sensory response modalities (visual, haptic, and visuo-haptic), and the way these modalities modify the reliance on either the visual or the egocentric FOR. A third goal was to ask whether subjects combined visual and haptic cues optimally to increase SHO certainty and to decrease the disruptive effect of the biased FOR.
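The "optimal combination" question in this abstract is usually tested against the maximum-likelihood prediction that bimodal variability falls below either unimodal variability. A minimal sketch, with hypothetical SDs rather than the study's data:

```python
import math

def predicted_bimodal_sd(sd_visual, sd_haptic):
    """MLE prediction for combined visuo-haptic judgments:
    sigma_vh^2 = (sigma_v^2 * sigma_h^2) / (sigma_v^2 + sigma_h^2),
    which is always below both unimodal variances."""
    var_v, var_h = sd_visual ** 2, sd_haptic ** 2
    return math.sqrt(var_v * var_h / (var_v + var_h))

# Hypothetical unimodal SDs (degrees) for subjective head orientation.
sd_vh = predicted_bimodal_sd(3.0, 4.0)
# Predicted sd_vh = 2.4 deg, lower than both 3.0 and 4.0; observed
# bimodal variability is then compared against this benchmark.
```

A bimodal SD matching this prediction is the usual criterion for calling the combination "optimal"; a bimodal SD no better than the best single cue suggests cue switching instead.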

Visual and motor deviation-related representations in the haptic parallel matching task: an fMRI-TMS study

2012

The present research draws on the finding by Kaas and colleagues (Kaas, van Mier, & Goebel, 2007a) of visual cortex activation (left parieto-occipital cortex, l-POC) during a purely haptic task. The aim was to assess a possible functional role of this region in the performance of the haptic parallel matching task (Kappers & Koenderink, 1999). The methodology combined functional magnetic resonance imaging (fMRI), to localize the active region, with transcranial magnetic stimulation (TMS), to hamper its function. Contrary to expectations, the previous fMRI findings could not be replicated (Kaas et al., 2007a), but an interesting deviation-related activation in the motor and parietal cortex during the exploration phase of the task was found. This suggests a possible sensorimotor representation of haptic space. In the TMS part of the experiment, a continuous theta burst stimulation (cTBS) protocol (Huang, Edwards, Rounis, Bhatia, & Rothwell, 2005) was administered over l-POC at the Talairach coordinates from the paper by Kaas and colleagues, yielding a significant effect on performance for the 90° orientation alone. Further considerations concerning the data suggest a possible TMS follow-up relying on the present fMRI findings.

The role of contextual cues in the haptic perception of orientations and the oblique effect

Psychonomic Bulletin & Review, 2005

Blindfolded right-handed participants were asked to position, with the right hand, a frontoparallel rod to one of three orientations: vertical (0°), left 45° oblique, and right 45° oblique. Simultaneously, three different backgrounds were explored with the left hand: smooth, congruent stripes (parallel to the orientation to be produced), or incongruent stripes (tilted relative to the orientation to be produced). The analysis of variable errors showed that the oblique effect (higher precision for the vertical orientation than for the oblique orientations) was weakened in the presence of contextual cues, because of an improvement in oblique precision. Moreover, the analysis of constant errors revealed that the perception of orientations erred in the direction of the stripes, similar to the effect found with vision, where visual contextual cues (a tilted frame or lines) divert the perception of the vertical. These results are discussed in relation to a pattern-centric frame of reference hypothesis or as a congruency effect.
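Constant and variable errors of the kind analyzed here are simple descriptive statistics of the signed production errors. A small sketch with invented trial values, not the study's data:

```python
import statistics

def constant_and_variable_error(signed_errors_deg):
    """Constant error = mean signed deviation (systematic bias, e.g.,
    toward the stripes); variable error = SD of the settings (precision)."""
    ce = statistics.fmean(signed_errors_deg)
    ve = statistics.stdev(signed_errors_deg)
    return ce, ve

# Hypothetical signed errors (deg) for one participant producing a
# 45-degree oblique while exploring incongruent stripes.
ce, ve = constant_and_variable_error([3.0, 5.0, 4.0, 6.0, 2.0])
# ce = 4.0 (settings pulled toward the stripes); ve ~ 1.58 (precision).
```

In this framework the oblique effect shows up as a larger variable error for oblique than for vertical targets, while the stripe-induced deviation shows up in the constant error.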

Measures of Spatial Orientation: Spatial Bias Analogs in Visual and Haptic Tasks

eneuro

The primary sensory modality used to probe spatial orientation varies among psychophysical tasks. In the subjective visual vertical (SVV) task, a visual stimulus is used to measure perceived vertical orientation, while a haptic stimulus is used in the subjective haptic vertical (SHV) task. Here we examined the disparity between SHV and SVV task results and asked whether it could be related to biases in probing different spatial estimates with each task. Forty-two healthy volunteers (mean ± SD age, 25 ± 10 years; 19 females; 21 left handed) were recruited. The effect of the task used to measure spatial orientation was calculated as the difference between SHV and SVV values, with the head upright and tilted 20° laterally. There was a task bias regardless of head position, related to the hand used in the haptic task but not to handedness (mean head upright ± SEM: left hand, −3.7 ± 1.1°; right hand, 7.9 ± 1.0°). When this task bias was subtracted out, there was a similar spatial bias using each hand in the SH...
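The task bias described here is simply the SHV − SVV difference, which can then be subtracted from the haptic settings to expose the spatial estimate shared by both tasks. A toy sketch with hypothetical values (only the sign and rough magnitude follow the abstract):

```python
def task_bias(shv_deg, svv_deg):
    """Task effect: difference between haptic (SHV) and visual (SVV)
    vertical settings for the same participant and head position."""
    return shv_deg - svv_deg

def remove_task_bias(shv_deg, bias_deg):
    """Subtract the task bias so the residual reflects the spatial
    estimate common to both tasks."""
    return shv_deg - bias_deg

# Hypothetical settings (deg, positive = clockwise), head upright:
svv = 1.0
shv_right = 8.9                          # right-hand haptic setting
bias = task_bias(shv_right, svv)         # 7.9 deg, right-hand task bias
corrected = remove_task_bias(shv_right, bias)  # back to the shared 1.0
```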

The structure of frontoparallel haptic space is task dependent

Perception & psychophysics, 2006

In three experiments, we investigated the structure of frontoparallel haptic space. In the first experiment, we asked blindfolded participants to rotate a matching bar so that it felt parallel to a reference bar; the bars could be at various positions in the frontoparallel plane. Large systematic errors were observed, in which orientations that were perceived to be parallel were not physically parallel. In two subsequent experiments, we investigated the origin of these errors. In Experiment 2, we asked participants to verbally report the orientation of haptically presented bars. In this task, participants made errors that were considerably smaller than those made in Experiment 1. In Experiment 3, we asked participants to set bars to a verbally instructed orientation, and they also made errors significantly smaller than those observed in Experiment 1. The data suggest that the errors in the matching task originate from the transfer of the reference orientation to the matching-bar p...

Left-right judgment of haptic stimuli representing the human hand

The handedness recognition of visually perceived body parts engages motor representations that are constrained by the same biomechanical factors that limit the execution of real movements. In the present study, we used small plastic cutouts that represented the human hand to investigate the properties of mental images generated during their haptic exploration. Our working hypothesis was that any handedness recognition task that involves body parts depends on motor imagery. Forty-four blindfolded, right-handed volunteers participated in a handedness evaluation experiment using their index finger to explore either the back or palm view of a haptic stimulus that represented the human hand. The stimuli were presented in four different orientations, and we measured the subjects’ response times. Our results showed that stimulus configurations that resemble awkward positions of the human hand are associated with longer response times (p < .006), indicating that the haptic exploration of stimuli that represent body parts also leads to motor imagery that is constrained by biomechanical factors. Keywords: haptic exploration, motor imagery, handedness recognition, mirror neurons, mental rotation.

The role of haptic information in shaping coordination dynamics: Inertial frame of reference hypothesis

Human Movement Science, 2012

Current research suggests that non-visual perception of the spatial orientation of body segments is tied to vectors representative of their mass moment distribution (v_mm). Our question was whether the relative orientation of the v_mm of the right and left hands (Δv_mm = v_mm,left − v_mm,right) constitutes haptic information supporting bimanual coordination and, if so, how it contributes to coordination dynamics. Blindfolded participants coordinated the motions of a pair of cross-shaped, hand-held pendulums that were either symmetrically loaded (Δv_mm = 0) or asymmetrically loaded (Δv_mm ≠ 0). The sign and magnitude of Δv_mm, in particular of the first moment vector, systematically affected the pattern of coordination (indexed by mean relative phase φ), but not its stability. These results suggest (1) that Δv_mm specifies a frame of reference about which coordination is organized, and (2) that the changes in pattern were a function of the experimentally induced biases in this perceptual frame of reference and not of a functional asymmetry akin to detuning. The implications of the findings for the understanding of perceptual regulation of interlimb coordination are discussed.
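Mean relative phase, the coordination index used here, is conventionally computed as a circular mean of the pointwise phase difference between the two limbs, so that wrap-around at ±180° is handled correctly. A minimal sketch with synthetic phase series (the study's preprocessing may differ):

```python
import numpy as np

def mean_relative_phase(theta_left, theta_right):
    """Mean relative phase (deg) between two phase-angle time series
    (rad), via the circular mean of their pointwise difference."""
    rel = np.unwrap(theta_left) - np.unwrap(theta_right)
    return float(np.degrees(np.angle(np.mean(np.exp(1j * rel)))))

# Synthetic example: the right pendulum lags the left by 10 deg, a small
# shift of the in-phase pattern of the kind a nonzero Dv_mm induces.
t = np.linspace(0.0, 10.0, 1000)
phi = mean_relative_phase(2 * np.pi * t,
                          2 * np.pi * t - np.radians(10.0))
# phi is ~10 deg: a biased but otherwise stable in-phase pattern.
```

Stability would then be indexed separately, e.g., by the circular spread of the pointwise relative phase rather than by its mean.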

Errors in visuo-haptic and haptic-haptic location matching are stable over long periods of time

Acta Psychologica, 2016

People make systematic errors when they move their unseen dominant hand to a visual target (visuo-haptic matching) or to their other unseen hand (haptic-haptic matching). Why they make such errors is still unknown. A key question in determining the reason is to what extent individual participants' errors are stable over time. To examine this, we developed a method to quantify the consistency. With this method, we studied the stability of systematic matching errors across time intervals of at least a month. Within this time period, individual subjects' matches were as consistent as one could expect on the basis of the variability in the individual participants' performance within each session. Thus individual participants make quite different systematic errors, but in similar circumstances they make the same errors across long periods of time.
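One way to cast the consistency question is to compare the session-to-session change in each target's mean matching error with the change expected from within-session variability alone; the authors' exact method may differ. A sketch on synthetic data:

```python
import numpy as np

def consistency_ratio(errors_a, errors_b):
    """errors_a, errors_b: per-trial matching errors (targets x trials)
    for the same targets in two sessions. The variance of the
    session-to-session change in mean error is compared with the value
    expected if only within-session noise separated the sessions;
    a ratio near 1 means the systematic errors are stable over time."""
    a, b = np.asarray(errors_a, float), np.asarray(errors_b, float)
    observed = (a.mean(axis=1) - b.mean(axis=1)).var(ddof=1)
    expected = (a.var(axis=1, ddof=1) / a.shape[1]
                + b.var(axis=1, ddof=1) / b.shape[1]).mean()
    return observed / expected

# Synthetic data: 50 targets x 20 trials; each target keeps its own
# systematic error (bias) across sessions, plus independent trial noise.
rng = np.random.default_rng(0)
bias = rng.normal(0.0, 3.0, size=(50, 1))
ratio = consistency_ratio(bias + rng.normal(0.0, 1.0, (50, 20)),
                          bias + rng.normal(0.0, 1.0, (50, 20)))
# ratio stays near 1 when the per-target biases are unchanged.
```

Errors that drift between sessions would inflate the observed between-session variance and push the ratio well above 1.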