Spatial eye–hand coordination during bimanual reaching is not systematically coded in either LIP or PRR
Journal of Neurophysiology, 2014
Studies of visually guided unimanual reaching have established that a saccade usually precedes each reach and that the reaction times (RTs) for the saccade and reach are highly correlated. The correlation of eye and hand RT is commonly taken as a measure of eye-hand coordination and is thought to assist visuospatial guidance of the hand. We asked what happens during a bimanual reach task. As with a unimanual reach, a saccade was executed first. Although latencies were fastest on unimanual trials, eye and hand RT correlation was identical whether just one or both hands reached to a single target. The average correlation was significantly reduced, however, when each hand reached simultaneously to a different target. We considered three factors that might explain the drop. We found that correlation strength depended on which hand reached first and on which hand reached to the same target as the saccade. Surprisingly, these two factors were largely independent, and the identity of the h...
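The eye-hand RT correlation used here is, at its simplest, a per-condition Pearson correlation between saccade and reach reaction times on matched trials. A minimal sketch with synthetic RTs, where the means, spreads, and shared-fluctuation model are assumptions for illustration, not values from the study:

```python
import numpy as np

# Synthetic reaction times (ms), for illustration only: a shared timing
# fluctuation drives both effectors, producing correlated saccade and
# reach RTs on matched trials.
rng = np.random.default_rng(0)
n_trials = 200
shared = rng.normal(0, 25, n_trials)                  # common fluctuation
saccade_rt = 180 + shared + rng.normal(0, 10, n_trials)
reach_rt = 260 + shared + rng.normal(0, 10, n_trials)

def rt_correlation(eye_rt, hand_rt):
    """Pearson correlation between per-trial saccade and reach RTs."""
    return np.corrcoef(eye_rt, hand_rt)[0, 1]

r = rt_correlation(saccade_rt, reach_rt)
print(f"eye-hand RT correlation: r = {r:.2f}")
```

Computed separately for each condition (unimanual, bimanual same-target, bimanual different-target), this statistic is what drops when the two hands reach to different targets.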
Early coding of visuomanual coordination during reaching in parietal area PEc
Journal of Neurophysiology, 2001
The parietal mechanisms of eye-hand coordination during reaching were studied by recording neural activity in area PEc while monkeys performed different tasks aimed at assessing the influence of retinal, hand-related, and eye-related signals on neural activity. The tasks consisted of 1) reaching to foveated targets; 2) reaching to extrafoveal targets with constant eye position; and 3) saccadic eye movements toward, and holding of eye position on, peripheral targets identical to those of the reaching tasks. In all tasks, hand and/or eye movements were made from a central position to eight peripheral targets. A conventional visual fixation paradigm was used as a control task to assess the location and extent of neurons' visual receptive fields. A large proportion of cells in area PEc displayed significant relationships to hand movement direction and position. Many of them were also related to eye position. Relationships to saccadic eye movements were found for a smaller proportion of cells. ...
Multiple Levels of Representation of Reaching in the Parieto-frontal Network
Cerebral Cortex, 2003
In daily life, hand and eye movements occur in different contexts. Hand movements can be made to a visual target shortly after its presentation, or after a longer delay; alternatively, they can be made to a memorized target location. In both instances, the hand can move in a visually structured scene under normal illumination, which allows visual monitoring of its trajectory, or in darkness. Across these conditions, movement can be directed to points in space already foveated, or to extrafoveal ones, thus requiring different forms of eye-hand coordination. The ability to adapt to these different contexts by meeting their demands probably resides in the high degree of flexibility of the operations that govern cognitive visuomotor behavior. The neurophysiological substrates of these processes include, among others, the context-dependent nature of neural activity, and a transitory, or task-dependent, affiliation of neurons to the assemblies underlying different forms of sensorimotor behavior. Moreover, the ability to make independent or combined eye and hand movements in the appropriate order and time sequence must reside in a process that encodes retinal-, eye-, and hand-related inputs in a spatially congruent fashion. This process, in fact, requires exact knowledge of where the eye and the hand are at any given time, although we have little or no conscious experience of where they are at any instant. How this information is reflected in the activity of cortical neurons remains a central question for understanding the mechanisms underlying the planning of eye-hand movements in the cerebral cortex. In the last 10 years, psychophysical analyses in humans, as well as neurophysiological studies in monkeys, have provided new insights into the mechanisms of different forms of oculo-manual action. These studies have also offered preliminary hints as to the cortical substrates of eye-hand coordination.
In this review, we will highlight some of the results obtained as well as some of the questions raised, focusing on the role of eye- and hand-tuning signals in cortical neural activity. This choice rests on the crucial role this information plays in the specification of movement and in coordinate transformations.
The parietal reach region is limb specific and not involved in eye-hand coordination
Journal of Neurophysiology, 2013
Primates frequently reach toward visual targets. Neurons in early visual areas respond to stimuli in the contralateral visual hemifield and without regard to which limb will be used to reach toward that target. In contrast, neurons in motor areas typically respond when reaches are performed using the contralateral limb and with minimal regard to the visuospatial location of the target. The parietal reach region (PRR) is located early in the visuomotor processing hierarchy. PRR neurons are significantly modulated when targets for either limb or eye movement appear, similar to early sensory areas; however, they respond to targets in either visual field, similar to motor areas. The activity could reflect the subject's attentional locus, movement of a specific effector, or a related function, such as coordinating eye-arm movements. To examine the role of PRR in the visuomotor pathway, we reversibly inactivated PRR. Inactivation effects were specific to contralateral limb movements, ...
The circuits that drive visually guided eye and arm movements transform generic visual inputs into effector-specific motor commands. As part of the effort to elucidate these circuits, the primate lateral intraparietal area (LIP) has been interpreted as a priority map for saccades (oculomotor-specific) or a salience map of space (not effector-specific). It has also been proposed as a locus for eye-hand coordination. We reversibly inactivated LIP while monkeys performed memory-guided saccades and reaches. Coordinated saccade and reach reaction times were similarly impaired, consistent with a nonspecific role. However, reaches made without an accompanying saccade remained intact, and the relative temporal coupling of saccades and reaches was unchanged. These results suggest that LIP contributes to saccade planning but not to reach planning. Coordinated reaches are delayed as a result of an eye-hand coordination mechanism, located outside of LIP, that actively delays reaches until shortly after the onset of an associated saccade. We conclude with a discussion of how to reconcile specificity for saccades with a possible role in directing attention.
Combination of hand and gaze signals during reaching: activity in parietal area 7 m of the monkey
Journal of Neurophysiology, 1997
The role of area 7 m has been studied by recording the activity of single neurons of monkeys trained to fixate and reach toward peripheral targets. The target was randomly selected from eight possible locations on a virtual circle, of radius 30 degrees visual angle from a central target. Three tasks were employed to dissociate hand- from eye-related contributions. In the first task, animals looked and reached to the peripheral target. In a second task, the animal reached to the peripheral target while maintaining fixation on the central target. In the third task, the monkey maintained fixation on peripheral targets that were spatially coincident with those of the reaching tasks. The results show that cell activity in area 7 m relates, for some cells to eye position, for others to hand position and movement, and for the majority of cells to a combination of visuomanual and oculomotor information. This area, therefore, seems to perform an early combination of information in the proces...
Coordinated control of eye and hand movements in dynamic reaching
2002
In the present study, we integrated two recent and, at first sight, contradictory findings regarding the question of whether saccadic eye movements can be generated to a newly presented target during an ongoing hand movement. Saccades were measured during so-called adaptive and sustained pointing conditions. In the adaptive pointing condition, subjects had to direct both their gaze and arm movements to a displaced target location. The results showed that the eyes could fixate the new target during pointing. In addition, these corrective saccades were temporally coupled to changes in arm movement trajectory when reaching to the new target. In the sustained pointing condition, however, the same subjects had to point to the initial target while trying to deviate their gaze to a new target that appeared during pointing. It was found that the eyes could not fixate the new target before the hand reached the initial target location. Together, the results indicate that ocular gaze is always forced to follow the target intended by a manual arm movement. A neural mechanism is proposed that couples ocular gaze to the target of an arm movement. Specifically, the mechanism includes a reach neuron layer alongside the well-known saccadic layer in the primate superior colliculus. Such a tight, subcortical coupling of ocular gaze to the target of a reaching movement can explain the contrasting behavior of the eyes depending on whether the eye and hand share the same target position or attempt to move to different locations.
The Journal of Neuroscience, 2009
Neural activity was recorded in area PE (dorsorostral part of Brodmann's area 5) of the posterior parietal cortex while monkeys performed arm reaching toward memorized targets located at different distances from the body. For any given distance, arm movements were performed while the animal kept binocular eye fixation constant. Under these conditions, the activity of a large proportion (36%) of neurons was modulated by reach distance during the memory period. By varying binocular eye position (vergence angle) and initial hand position, we found that the reaching-related activity of most neurons (61%) was influenced by changing the starting position of the hand, whereas that of a smaller, although substantial, population (13%) was influenced by changes of binocular eye position (i.e., by the angle of vergence). Furthermore, the modulation of the neural activity was better explained expressing the reach movement end-point, corresponding to the memorized target location, in terms ...
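The vergence angle manipulated above is fixed by simple binocular geometry: nearer targets require larger vergence, which is why it can serve as a distance signal. A minimal sketch, assuming an interocular distance of ~7 cm and a target on the midline (both values are illustrative, not from the study):

```python
import math

# Illustrative geometry only: the vergence angle required to binocularly
# fixate a midline target at distance d, for an assumed interocular distance.
IOD = 0.07  # interocular distance in metres (assumption, ~7 cm)

def vergence_deg(d):
    """Vergence angle (deg) needed to fixate a midline target at distance d (m)."""
    return 2 * math.degrees(math.atan((IOD / 2) / d))

for d in (0.2, 0.4, 0.8):
    print(f"target at {d:.1f} m -> vergence {vergence_deg(d):.1f} deg")
```

Halving target distance roughly doubles the required vergence in this near range, so a neuron modulated by vergence angle carries usable information about reach distance from the body.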
Unconstrained reaching modulates eye–hand coupling
Experimental Brain Research, 2013
Eye-hand coordination is a crucial element of goal-directed movements. However, few studies have looked at the extent to which unconstrained movements of the eyes and hand made to targets influence each other. We studied human participants who moved either their eyes, or both their eyes and hand, to one of three static or flashed targets presented in 3D space. The eyes were directed, and the hand located, at a common start position on either the right or left side of the body. We found that the velocity and scatter of memory-guided saccades (flashed targets) differed significantly when produced in combination with a reaching movement compared with when produced alone. Specifically, when accompanied by a reach, peak saccadic velocities were lower than when the eye moved alone. Peak saccade velocities, as well as latencies, were also highly correlated with those of reaching movements, especially for the briefly flashed targets compared with the continuously visible target. The scatter of saccade endpoints was greater when saccades were produced with a reaching movement than when produced without, and the scatter sizes for saccades and reaches were weakly correlated. These findings suggest that saccades and reaches made to 3D targets are weakly to moderately coupled both temporally and spatially, and that this is partly the result of the arm movement influencing the eye movement. Taken together, this study provides further evidence that the oculomotor and arm motor systems interact above and beyond any common target representations shared by the two motor systems.
Eye-hand coordination: saccades are faster when accompanied by a coordinated arm movement
Journal of Neurophysiology, 2002
When primates reach for an object, they very often direct an eye movement toward the object as well. This pattern of directing both eye and limb movements to the same object appears to be fundamental to eye-hand coordination. We investigated interactions between saccades and reaching movements in a rhesus monkey model system. The amplitude and peak velocity of isolated eye movements are positively correlated with one another. This relationship is called the main sequence. We now report that the main sequence relationship for saccades is changed during coordinated eye and arm movements. In particular, peak eye velocity is approximately 4% faster for the same size saccade when the saccade is accompanied by a coordinated arm movement. Saccade duration is reduced by an equivalent amount. The main sequence relationship is unperturbed when the arm moves simultaneously but in the opposite direction as the eyes, suggesting that eye and arm movements must be tightly coordinated to produce th...
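The reported ~4% velocity increase can be pictured as a change in the slope of a locally linearized main sequence (peak velocity vs. amplitude). A minimal sketch with synthetic data, where the 4% gain is built in by construction and every other number is an assumption for illustration, not a value from the study:

```python
import numpy as np

# Synthetic main-sequence data: peak eye velocity (deg/s) vs. saccade
# amplitude (deg), with and without a coordinated arm movement.
rng = np.random.default_rng(1)
amplitude = rng.uniform(5, 30, 150)                      # saccade sizes (deg)
v_eye_alone = 30 * amplitude + rng.normal(0, 15, 150)    # linearized main sequence
v_with_arm = 1.04 * 30 * amplitude + rng.normal(0, 15, 150)  # 4% gain built in

# Least-squares slope through the origin in each condition
slope_alone = amplitude @ v_eye_alone / (amplitude @ amplitude)
slope_arm = amplitude @ v_with_arm / (amplitude @ amplitude)

speedup = 100 * (slope_arm / slope_alone - 1)
print(f"peak velocity gain with coordinated reach: {speedup:.1f}%")
```

Comparing slopes, rather than raw velocities, controls for the fact that larger saccades are intrinsically faster, which is the same logic as asking whether velocity changes "for the same size saccade."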