Correcting for Visuo-Haptic Biases in 3D Haptic Guidance

Adjusting Haptic Guidance to Idiosyncratic Visuo-Haptic Matching Errors Improves Perceptual Consistency in Reaching

IEEE Transactions on Human-Machine Systems, 2016

When subjects reach for a visual target with their unseen hand, they make systematic errors (visuo-haptic matching errors). These matching errors are idiosyncratic and consistent over time. Therefore, it might be useful to compensate for these subject-specific matching errors in the design of haptic guidance, to make the guidance perceptually consistent with the visual information. In this study, we investigated whether compensating for visuo-haptic matching errors results in better perceptual consistency in a reaching task. Subjects (N = 12) had to reach for visual targets with the handle of a haptic device (PHANToM Premium 3.0/6DoF) held in their unseen dominant hand, either without guidance, with haptic guidance toward the target position, or with haptic guidance toward the position they would reach for according to their idiosyncratic visuo-haptic matching error. We found that the distance between the aiming point of the guidance and the reached end position was smaller for guidance toward the idiosyncratic matched positions, suggesting greater perceptual consistency. Adjusting for idiosyncratic visuo-haptic matching errors thus seems to have benefits over guidance toward the visual target position.
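
The core idea lends itself to a compact sketch. The following is a minimal illustration (not the authors' code; the stiffness value and all names are assumptions) of spring-like guidance whose aiming point is shifted by a subject's previously measured matching error:

```python
import numpy as np

K = 50.0  # guidance stiffness in N/m (assumed value)

def guidance_force(hand_pos, visual_target, matching_error, compensate=True):
    """Guidance force pulling the hand toward the aiming point.

    hand_pos, visual_target, matching_error: 3D vectors in metres.
    matching_error: the subject's systematic offset when matching the
    visual target with the unseen hand (measured in a calibration block).
    """
    aiming_point = visual_target + matching_error if compensate else visual_target
    return K * (aiming_point - hand_pos)

# Example: a subject who systematically reaches 2 cm to the left
hand = np.array([0.00, 0.00, 0.00])
target = np.array([0.10, 0.00, 0.00])
error = np.array([-0.02, 0.00, 0.00])
print(guidance_force(hand, target, error))         # compensated guidance
print(guidance_force(hand, target, error, False))  # guidance to visual target
```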

Haptic Guidance Needs to Be Intuitive Not Just Informative to Improve Human Motor Accuracy

PLOS ONE, 2016

Humans make both random and systematic errors when reproducing learned movements. Intuitive haptic guidance that assists in making the movements reduces such errors. Our study examined whether any additional haptic information about the location of the target reduces errors in a position reproduction task, or whether the haptic guidance needs to be assistive to do so. Holding a haptic device, subjects made reaches to visible targets without time constraints. They did so in a no-guidance condition, and in guidance conditions in which the direction of the force with respect to the target differed, but the force scaled with the distance to the target in the same way. We examined whether guidance forces directed towards the target would reduce subjects' errors in reproducing a prior position to the same extent as forces rotated by 90 or 180 degrees, as one might expect, given that the forces provide the same information in all three cases. Without vision of the arm, both accuracy and precision were significantly better with guidance directed towards the target than in all other conditions. The errors with rotated guidance did not differ from those without guidance. Not surprisingly, the movements tended to be faster when the guidance forces directed the reaches towards the target. This study shows that haptic guidance significantly improved motor performance when it was intuitive to use, while non-intuitively presented information did not lead to any improvement and seemed to be ignored, even in our simple paradigm with static targets and no time constraints.
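
A minimal sketch of the rotated-guidance manipulation (force gain and names are assumptions, not taken from the paper): the force magnitude always scales with the distance to the target, while its direction is rotated by 0, 90, or 180 degrees in the movement plane, so all three conditions carry the same positional information.

```python
import numpy as np

K = 30.0  # force gain in N/m (assumed)

def rotated_guidance(hand_pos, target, angle_deg):
    """2D guidance force whose direction is rotated relative to the target vector."""
    to_target = target - hand_pos               # vector toward the target
    a = np.radians(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)],
                    [np.sin(a),  np.cos(a)]])
    return K * rot @ to_target                  # same magnitude, rotated direction

hand, target = np.array([0.0, 0.0]), np.array([0.1, 0.0])
for angle in (0, 90, 180):                      # toward, orthogonal, away
    print(angle, rotated_guidance(hand, target, angle))
```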

The effect of visuo-haptic congruency on haptic spatial matching

Experimental Brain …, 2007

Eye-hand coordination is crucial for everyday visuo-haptic object manipulation. Noninformative vision has been reported to improve haptic spatial tasks relying on world-based reference frames. The current study investigated whether the degree of visuo-haptic congruity systematically affects haptic task performance. Congruent and parametrically varied incongruent visual orientation cues were presented while participants manually explored the orientation of a reference bar stimulus. Participants were asked to haptically match this reference orientation by turning a test bar either to a parallel or to a mirrored orientation, depending on the instruction. While parallel matching can only be performed correctly in a world-based frame, mirror matching (in the mid-sagittal plane) can also be achieved in a body-centered frame. We found that visuo-haptic incongruence affected parallel but not mirror matching responses, in both size and direction. Parallel matching did not improve when congruent visual orientation cues were provided throughout a run, and mirror matching even deteriorated. These results show that there is no positive effect of visual input on haptic performance per se. Tasks that favor a body-centered frame are immune to incongruent visual input, while such input parametrically modulates performance on world-based haptic tasks.
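
The two matching rules differ only in the reference frame they require, which a toy sketch makes concrete (orientation conventions are assumed: degrees from the mid-sagittal axis, positive clockwise from above):

```python
def parallel_match(reference_deg):
    """World-based rule: reproduce the same orientation in space."""
    return reference_deg

def mirror_match(reference_deg):
    """Body-centered rule: reflect the orientation about the mid-sagittal plane."""
    return -reference_deg

for ref in (20.0, -35.0):
    print(ref, parallel_match(ref), mirror_match(ref))
```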

Errors in visuo-haptic and haptic-haptic location matching are stable over long periods of time

Acta Psychologica, 2016

People make systematic errors when they move their unseen dominant hand to a visual target (visuo-haptic matching) or to their other unseen hand (haptic-haptic matching). Why they make such errors is still unknown. A key question in determining the reason is to what extent individual participants' errors are stable over time. To examine this, we developed a method to quantify this consistency. With this method, we studied the stability of systematic matching errors across time intervals of at least a month. Within this time period, individual subjects' matches were as consistent as one could expect on the basis of the variability in each participant's performance within a session. Thus, individual participants make quite different systematic errors, but in similar circumstances each makes the same errors across long periods of time.
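
One way to quantify such consistency (a sketch of the general logic, not necessarily the authors' exact measure) is to compare the spread of a participant's mean matching errors across sessions with the trial-to-trial variability within each session; a ratio near 1 means the matches are as consistent across sessions as within-session noise allows.

```python
import numpy as np

def consistency_ratio(errors_by_session):
    """errors_by_session: list of 1D arrays, one array of matching errors
    (e.g., lateral error in cm) per session."""
    session_means = np.array([e.mean() for e in errors_by_session])
    between = session_means.std(ddof=1)  # across-session spread of the means
    # expected spread of a session mean given only within-session noise
    within = np.mean([e.std(ddof=1) / np.sqrt(len(e)) for e in errors_by_session])
    return between / within

rng = np.random.default_rng(0)
bias = 2.0                                           # stable idiosyncratic bias (cm)
sessions = [bias + rng.normal(0, 1.0, 40) for _ in range(3)]
print(consistency_ratio(sessions))                   # ~1 for a stable bias
```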

Haptic texture affects the kinematics of pointing movements, but not of eye movements

Neuroreport, 2003

Discrepant findings on the degree of eye-hand coupling suggest its dependence on the task. One task characteristic modulating this coupling may be the relevance of certain target attributes for each motor system. We tested this assumption by comparing eye and hand movements towards targets of different haptic texture, a target attribute which is behaviourally relevant only to the hand, not the eye. Pointing to a slippery target (fur) resulted in longer hand movement time than to a rougher target (sandpaper). This effect was due to an increased ratio of time spent in deceleration. In contrast, eye movement time was invariant across different haptic target textures. Thus, information about target texture is used differently by eye and hand.
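
The key kinematic measure here, the fraction of movement time spent decelerating, is easy to compute from a sampled speed profile. A minimal sketch (variable names and the synthetic profile are assumptions):

```python
import numpy as np

def deceleration_ratio(speed, dt):
    """speed: 1D array of hand speeds for one movement; dt: sample interval (s)."""
    peak = int(np.argmax(speed))           # acceleration phase ends at peak speed
    movement_time = len(speed) * dt
    decel_time = (len(speed) - peak) * dt
    return movement_time, decel_time / movement_time

# Synthetic bell-shaped profile peaking early, i.e. a long deceleration phase
t = np.linspace(0, 1, 200)
speed = np.exp(-((t - 0.35) / 0.2) ** 2)
print(deceleration_ratio(speed, dt=1 / 200))
```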

Misjudgment of direction contributes to curvature in movements toward haptically defined targets

Journal of Experimental Psychology: Human Perception and Performance, 2014

The trajectories of arm movements toward visually defined targets are curved, even if participants try to move in a straight line. A factor contributing to this curvature may be that participants systematically misjudge the direction to the target, and try to achieve a straight path by always moving in the perceived direction of the target. If so, the relation between perception of direction and initial movement direction should be present not only for movements toward visually defined targets, but also for movements toward haptically defined targets. To test whether this is so, we compared errors in the initial movement direction when moving as straight as possible toward haptically defined targets with errors in a pointer-setting task toward the same targets. We found a modest correlation between perception of direction and initial movement direction for movements toward haptically defined targets. The amount of correlation depended on the geometry of the task.
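
The comparison boils down to correlating two sets of angular errors across targets. A minimal sketch (sample counts, names, and the synthetic data are assumptions): estimate the initial movement direction from the early part of each trajectory, express both measures as errors relative to the true target direction, and correlate them.

```python
import numpy as np

def initial_direction(traj, n_early=10):
    """traj: (N, 2) array of hand positions; direction over the first samples."""
    d = traj[n_early] - traj[0]
    return np.degrees(np.arctan2(d[1], d[0]))

def direction_error_correlation(movement_errors, pointer_errors):
    """Pearson correlation between the two sets of angular errors."""
    return np.corrcoef(movement_errors, pointer_errors)[0, 1]

traj = np.cumsum(np.tile([0.002, 0.001], (50, 1)), axis=0)  # straight synthetic path
print(initial_direction(traj))                              # ~26.6 degrees

rng = np.random.default_rng(1)
shared_bias = rng.normal(0, 5, 20)                 # common misjudged direction
move_err = shared_bias + rng.normal(0, 4, 20)      # initial-direction errors
point_err = shared_bias + rng.normal(0, 4, 20)     # pointer-setting errors
print(direction_error_correlation(move_err, point_err))  # modest positive r
```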

A brief glimpse at a haptic target is sufficient for multisensory integration in reaching movements

2021

Goal-directed aiming movements toward visuo-haptic targets (i.e., targets that are both seen and handheld) are generally more precise than those toward visual-only or haptic-only targets. This multisensory advantage stems from a continuous inflow of haptic and visual target information during the movement planning and execution phases. However, in everyday life, multisensory movements often occur without the support of continuous visual information. Here we investigated whether, and to what extent, limiting visual information to the initial stage of the action still leads to a multisensory advantage. Participants were asked to reach toward a handheld target while vision was briefly provided during the movement planning phase (50 ms, 100 ms, or 200 ms of vision before movement onset), during the planning and early execution phases (400 ms of vision), or during the entire movement. Additional conditions were performed in which only haptic target information was provided, or in which only vision was provided, either briefly (50 ms, 100 ms, 200 ms, or 400 ms) or throughout the entire movement. Results showed that 50 ms of vision before movement onset was sufficient to trigger a direction-specific visuo-haptic integration process that increased endpoint precision. We conclude that, when continuous visual support is not available, endpoint precision is determined by the less recent but most reliable multisensory information rather than by the latest unisensory (haptic) inputs.
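
The precision advantage described here is commonly benchmarked against maximum-likelihood cue integration (a standard model in this literature, not a formula taken from this paper): the integrated visuo-haptic estimate is a reliability-weighted average whose variance falls below either unisensory variance.

```latex
% Standard maximum-likelihood cue combination (benchmark model):
\hat{x}_{VH} = w_V \hat{x}_V + w_H \hat{x}_H, \qquad
w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2}, \quad
w_H = \frac{\sigma_V^2}{\sigma_V^2 + \sigma_H^2},
\qquad
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2}
\le \min\!\left(\sigma_V^2, \sigma_H^2\right)
```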

Does the Integration of Haptic and Visual Cues Reduce the Effect of a Biased Visual Reference Frame on the Subjective Head Orientation?

Background: The selection of appropriate frames of reference (FOR) is a key factor in the elaboration of spatial perception and the production of robust interaction with our environment. The extent to which we perceive the head axis orientation (subjective head orientation, SHO) with both accuracy and precision likely contributes to the efficiency of these spatial interactions. A first goal of this study was to investigate the relative contribution of both the visual and the egocentric (centre-of-mass) FOR to SHO processing. A second goal was to investigate humans' ability to process the SHO in various sensory response modalities (visual, haptic, and visuo-haptic), and the way these modalities modify reliance on either the visual or the egocentric FOR. A third goal was to examine whether subjects combined visual and haptic cues optimally to increase SHO certainty and to decrease the disruptive effect of a biased FOR.

Haptic discrimination of force direction and the influence of visual information

ACM Transactions on Applied Perception, 2006

Despite a wealth of literature on discrimination thresholds for displacement, force magnitude, stiffness, and viscosity, there is currently a lack of data on our ability to discriminate force directions. Such data are needed in designing haptic rendering algorithms where force direction, as well as force magnitude, is used to encode information such as surface topography. Given that haptic information is typically presented in addition to visual information in a data perceptualization system, it is also important to investigate the extent to which the congruency of visual information affects force-direction discrimination. In this article, the authors report an experiment on the discrimination threshold of force directions under three display conditions: haptics alone (H), haptics plus congruent vision (HVcong), and haptics plus incongruent vision (HVincong). Average force-direction discrimination thresholds were found to be 18.4°, 25.6°, and 31.9° for the HVcong, H, and HVincong conditions, respectively. The results show that the congruency of visual information significantly affected haptic discrimination of force directions, and that force-direction discrimination thresholds did not seem to depend on the reference force direction. The implications of the results for designing haptic virtual environments, especially when the numbers of sensors and actuators in a haptic display do not match, are discussed.
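
The quantity behind these thresholds is simply the angle between a reference and a comparison force vector. A minimal sketch (all names assumed; the 18.4° comparison uses the HVcong threshold quoted above purely for illustration), of the kind one might use when deciding how finely force direction can encode surface topography:

```python
import numpy as np

def angle_between(f1, f2):
    """Angle in degrees between two 3D force vectors."""
    cos = np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

reference = np.array([0.0, 0.0, 1.0])          # reference force direction
comparison = np.array([0.0, np.sin(np.radians(20)), np.cos(np.radians(20))])
delta = angle_between(reference, comparison)
print(delta, delta > 18.4)                     # above the HVcong threshold?
```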