Haptic texture affects the kinematics of pointing movements, but not of eye movements
Related papers
The equivalence of vision and haptics when matched spatiotemporally
2009
Two experiments are described in which the task was to identify a set of stimuli that were presented both haptically and visually such that the spatiotemporal display of information was matched. The stimuli were pre-recorded movement pathways followed during the exploration of capital letters. In the haptic tasks explorers were guided along the pathways by two mechanical devices: the Tactile Display System (Experiment 1) and the Phantom (Experiment 2). In the visual tasks the same pathways were plotted on a monitor. Both the visual and haptic tasks were undertaken in moving window and stationary window modes. In the former conditions the participant's hand was moved along the pathway (haptic condition) or the participant watched a 1 cm segment of line progressively trace out the pathway (visual condition). In the stationary window tasks a tactile stimulus was moved underneath a stationary fingertip, or the stimulus was moved behind a 1 cm motionless window. In terms of latency to stimulus identification, the moving window conditions yielded better performance than the stationary window conditions. However, performance in the haptic and visual conditions did not differ significantly, indicating modal equivalence when information is presented to both senses in a matched manner.
Keep an eye on your hands: on the role of visual mechanisms in processing of haptic space
Cognitive Processing, 2008
The present paper reviews research on haptic orientation processing. Central is a task in which a test bar has to be set parallel to a reference bar at another location. Introducing a delay between inspecting the reference bar and setting the test bar leads to a surprising improvement. Moreover, offering visual background information also elevates performance. Interestingly, (congenitally) blind individuals show this improvement over time only weakly or not at all, and, in parallel, they appear to benefit less from spatial imagery processing. Together, these findings point strongly to an important role for visual processing mechanisms in the perception of haptic inputs.
A behavioral adaptation approach to identifying visual dependence of haptic perception
2007
Both haptic and visual senses play a role in how we explore our environment. Previous studies have shown that vision plays a very strong role in the perception of object stiffness, yet quantification of the contributions of both haptic and visual feedback remains elusive. This study uses a behavioral adaptation approach to better understand how humans perceive stiffness. Subjects make targeted reaches across a virtual surface of varying stiffness, adapting to the new environment. The cursor marking the hand's position is visually distorted to make the surface seem stiffer for one group and less stiff for another, and is left undistorted for the control group. Area Reaching Deviation (ARD) and post-adaptation interface forces, both used in previous studies, were the two outcome measures used to determine differences between groups. We compare the slopes of the post-adaptation force-stiffness relations to quantify the effect of visual distortion. Our results indicate that making a stiff surface look more compliant has a greater effect than making a compliant surface look stiffer.
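As a rough illustration of the slope comparison this abstract describes, a minimal sketch follows; the variable names, units, and data points are invented for illustration and are not taken from the study. Each group's post-adaptation force-stiffness relation is summarized by a least-squares slope:

```python
import numpy as np

def force_stiffness_slope(stiffness, peak_force):
    """Least-squares slope of post-adaptation interface force vs. surface stiffness."""
    slope, _intercept = np.polyfit(stiffness, peak_force, deg=1)
    return slope

# Simulated (illustrative) post-adaptation data for two groups.
stiffness = np.array([100.0, 200.0, 300.0, 400.0])     # surface stiffness, N/m
force_control = np.array([1.1, 2.0, 3.1, 4.0])         # control group forces, N
force_looks_soft = np.array([0.9, 1.5, 2.2, 2.8])      # "looks more compliant" group, N

print(force_stiffness_slope(stiffness, force_control))     # ~0.0098 N per N/m
print(force_stiffness_slope(stiffness, force_looks_soft))  # ~0.0064, a shallower slope
```

A shallower force-stiffness slope in a distorted group, relative to control, would indicate that the visual distortion changed how forcefully subjects engaged the surface after adaptation.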
Haptic and Tactile Feedback in Directed Movements
Proceedings of GOTHI …, 2005
When compared to other kinds of perception, haptic perception has several special aspects. This paper proposes definitions of haptic perception and tactile interaction. These definitions are designed to support the process of developing interactive systems with haptic perception and tactile interfaces. To give an impression of the complexity of the guidance needed, the difficulty of coding tactile information is further illustrated by example.
What aspects of vision facilitate haptic processing?
Brain and Cognition, 2005
We investigate how vision affects haptic performance when task-relevant visual cues are reduced or excluded. The task was to remember the spatial location of six landmarks that were explored by touch in a tactile map. Here, we use specially designed spectacles that simulate residual peripheral vision, tunnel vision, diffuse light perception, and total blindness. Results for target locations differed, suggesting additional effects from adjacent touch cues. These are discussed. Touch with full vision was most accurate, as expected. Peripheral and tunnel vision, which reduce visuo-spatial cues, differed in error pattern. Both were less accurate than full vision, and significantly more accurate than touch with diffuse light perception, and touch alone. The important finding was that touch with diffuse light perception, which excludes spatial cues, did not differ from touch without vision in performance accuracy, nor in location error pattern. The contrast between spatially relevant versus spatially irrelevant vision provides new, rather decisive, evidence against the hypothesis that vision affects haptic processing even if it does not add task-relevant information. The results support optimal integration theories, and suggest that spatial and non-spatial aspects of vision need explicit distinction in bimodal studies and theories of spatial integration.
Haptic guidance of overt visual attention
Attention, Perception, & Psychophysics, 2014
Research has shown that information accessed from one sensory modality can influence perceptual and attentional processes in another modality. Here, we demonstrated a novel crossmodal influence of haptic-shape information on visual attention. Participants visually searched for a target object (e.g., an orange) presented among distractor objects, fixating the target as quickly as possible. While searching for the target, participants held (never viewed and out of sight) an item of a specific shape in their hands. In two experiments, we demonstrated that the time for the eyes to reach a target, a measure of overt visual attention, was reduced when the shape of the held item (e.g., a sphere) was consistent with the shape of the visual target (e.g., an orange), relative to when the held shape was unrelated to the target (e.g., a hockey puck) or when no shape was held. This haptic-to-visual facilitation occurred despite the fact that the held shapes were not predictive of the visual targets' shapes, suggesting that the crossmodal influence occurred automatically, reflecting shape-specific haptic guidance of overt visual attention. Keywords: Multisensory processing; Visual search; Attention
Hand movements: A window into haptic object recognition
Cognitive Psychology, 1987
Two experiments establish links between desired knowledge about objects and hand movements during haptic object exploration. Experiment 1 used a match-to-sample task, in which blindfolded subjects were directed to match objects on a particular dimension (e.g., texture). Hand movements during object exploration were reliably classified as "exploratory procedures," each procedure defined by its invariant and typical properties. The movement profile, i.e., the distribution of exploratory procedures, was directly related to the desired object knowledge that was required for the match. Experiment 2 addressed the reasons for the specific links between exploratory procedures and knowledge goals. Hand movements were constrained, and performance on various matching tasks was assessed. The procedures were considered in terms of their necessity, sufficiency, and optimality of performance for each task. The results establish that in free exploration, a procedure is generally used to acquire information about an object property not because it is merely sufficient, but because it is optimal or even necessary. Hand movements can serve as "windows" through which it is possible to learn about the underlying representation of objects in memory and the processes by which such representations are derived and utilized.
Haptic selective attention by foot and by hand
Neuroscience Letters, 2007
Nonvisual perceptions of a wielded object's spatial properties are based on the quantities expressing the object's mass distribution, quantities that are invariant during the wielding. The mechanoreceptors underlying the kind of haptic perception involved in wielding -referred to as effortful, kinesthetic, or dynamic touch -are those embedded in the muscles, tendons, and ligaments. The present experiment's focus was the selectivity of this muscle-based form of haptic perception. For an occluded rod grasped by the hand at some intermediate position along its length, participants can attend to and report selectively the rod's full length, its partial lengths (fore or aft of the hand), and the position of the grip. The present experiment evaluated whether participants could similarly attend selectively when wielding by foot. For a given rod attached to and wielded by foot or attached to (i.e. grasped) and wielded by hand, participants reported (by magnitude production) the rod's whole length or fractional length leftward of the point of attachment. On measures of mean perceived length, accuracy, and reliability, the degree of differentiation of partial from full extent achieved by means of the foot matched that achieved by means of the hand. Despite their neural, anatomical, and experiential differences, the lower and upper limbs seem to abide by the same principles of selective muscle-based perception and seem to express this perceptual function with equal facility.
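The wielding-invariant quantities this abstract alludes to are, in the dynamic-touch literature, moments of the object's mass distribution about the point of attachment, such as the moment of inertia. A minimal sketch follows, assuming a uniform rod; the dimensions and grasp position are illustrative, not stimuli from the experiment:

```python
import numpy as np  # imported for consistency with the other sketches; plain math suffices

def rod_moment_of_inertia(length_m, mass_kg, grasp_m):
    """Moment of inertia of a uniform rod about a grasp point
    located grasp_m from one end (parallel-axis theorem)."""
    i_center = mass_kg * length_m**2 / 12.0   # about the rod's center of mass
    d = abs(length_m / 2.0 - grasp_m)         # offset from center to grasp point
    return i_center + mass_kg * d**2          # shift the axis to the grasp point

# A 1.2 m, 0.3 kg rod grasped 0.4 m from its left end. The fore and aft
# segments contribute different partial moments about the grasp point,
# one candidate basis for selectively perceiving partial vs. full length.
print(rod_moment_of_inertia(1.2, 0.3, 0.4))   # ~0.048 kg*m^2
```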
Haptic discrimination of force direction and the influence of visual information
ACM Transactions on Applied Perception, 2006
Despite a wealth of literature on discrimination thresholds for displacement, force magnitude, stiffness, and viscosity, there is currently a lack of data on our ability to discriminate force directions. Such data are needed in designing haptic rendering algorithms where force direction, as well as force magnitude, is used to encode information such as surface topography. Given that haptic information is typically presented in addition to visual information in a data perceptualization system, it is also important to investigate the extent to which the congruency of visual information affects force-direction discrimination. In this article, the authors report an experiment on the discrimination threshold of force directions under three display conditions: haptics alone (H), haptics plus congruent vision (HVcong), and haptics plus incongruent vision (HVincong). Average force-direction discrimination thresholds were found to be 18.4°, 25.6°, and 31.9° for the HVcong, H, and HVincong conditions, respectively. The results show that the congruency of visual information significantly affected haptic discrimination of force directions, and that the force-direction discrimination thresholds did not seem to depend on the reference force direction. The implications of the results for designing haptic virtual environments, especially when the numbers of sensors and actuators in a haptic display do not match, are discussed.
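To make the threshold quantities concrete: a force-direction deviation is naturally expressed as the angle between a reference force vector and a comparison force vector. A minimal sketch (the vectors below are illustrative, not the experiment's stimuli, and this is not the authors' code):

```python
import numpy as np

def angle_between_deg(f_ref, f_cmp):
    """Angle in degrees between two 3-D force vectors."""
    cos_theta = np.dot(f_ref, f_cmp) / (np.linalg.norm(f_ref) * np.linalg.norm(f_cmp))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))  # clip guards rounding

f_ref = np.array([0.0, 0.0, 1.0])                  # unit reference force direction
f_cmp = np.array([np.sin(np.radians(18.4)), 0.0,   # comparison deviated by 18.4 deg,
                  np.cos(np.radians(18.4))])       # the reported HVcong threshold
print(angle_between_deg(f_ref, f_cmp))             # ~18.4
```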