Combination of hand and gaze signals during reaching: activity in parietal area 7m of the monkey
Related papers
European Journal of Neuroscience, 2008
Parietal area V6A contains neurons modulated by the direction of gaze as well as neurons able to code the direction of arm movement. The present study aimed to disentangle the gaze effect from the effect of reaching activity upon single V6A neurons. To this purpose, we used a visuomotor task in which the direction of arm movement remained constant while the animal changed the direction of gaze. Gaze direction modulated reach-related activity in about two-thirds of tested neurons. In several cases, modulations were not due to the eye-position signal per se, the apparent eye-position modulation being just an epiphenomenon. The real modulating factor was the location of the reaching target with respect to the point gazed at by the animal, that is, the retinotopic coordinates towards which the action of reaching occurred. Comparison of the neural discharge of the same cell during execution of foveated and non-foveated reaching movements, performed towards the same or different spatial locations, confirmed that in a part of V6A neurons reaching activity is coded retinocentrically. In other neurons, reaching activity is coded spatially, depending on the direction of the reaching movement regardless of where the animal was looking. The majority of V6A reaching neurons use a system that encompasses both of these reference frames. These results are in line with the view of a progressive visuomotor transformation in the dorsal visual stream, which changes the frame of reference from the retinocentric one, typically used by the visual system, to the arm-centred one, typically used by the motor system.
The Journal of neuroscience : the official journal of the Society for Neuroscience, 2009
Neural activity was recorded in area PE (dorsorostral part of Brodmann's area 5) of the posterior parietal cortex while monkeys performed arm reaching toward memorized targets located at different distances from the body. For any given distance, arm movements were performed while the animal kept binocular eye fixation constant. Under these conditions, the activity of a large proportion (36%) of neurons was modulated by reach distance during the memory period. By varying binocular eye position (vergence angle) and initial hand position, we found that the reaching-related activity of most neurons (61%) was influenced by changing the starting position of the hand, whereas that of a smaller, although substantial, population (13%) was influenced by changes of binocular eye position (i.e., by the angle of vergence). Furthermore, the modulation of the neural activity was better explained by expressing the reach movement end-point, corresponding to the memorized target location, in terms ...
Early coding of reaching in the parietooccipital cortex
Journal of neurophysiology, 2000
Neural activity was recorded in the parietooccipital cortex while monkeys performed different tasks aimed at investigating visuomotor interactions of retinal, eye, and arm-related signals on neural activity. The tasks were arm reaching 1) to foveated targets; 2) to extrafoveal targets, with constant eye position; 3) within an instructed-delay paradigm, under both light and darkness; 4) saccadic eye movements toward, and static eye holding on, peripheral targets; and 5) visual fixation and stimulation. The activity of many cells was modulated during arm reaction (68%) and movement time (58%), and during static holding of the arm in space (64%), when eye position was kept constant. Eye position influenced the activity of many cells during hand reaction (45%) and movement time (51%), and during holding of static hand position (69%). Many cells (56%) were also modulated during preparation for hand movement, in the delayed reach task. Modulation was present also in the dark in 59% of cells duri...
Early coding of visuomanual coordination during reaching in parietal area PEc
Journal of neurophysiology, 2001
The parietal mechanisms of eye-hand coordination during reaching were studied by recording neural activity in area PEc while monkeys performed different tasks, aimed at assessing the influence of retinal, hand-, and eye-related signals on neural activity. The tasks consisted of 1) reaching to foveated and 2) to extra-foveal targets, with constant eye position; and 3) saccadic eye movement toward, and holding of eye position on, peripheral targets, the same as those of the reaching tasks. In all tasks, hand and/or eye movements were made from a central position to eight peripheral targets. A conventional visual fixation paradigm was used as a control task, to assess the location and extent of the visual receptive field of neurons. A large proportion of cells in area PEc displayed significant relationships to hand movement direction and position. Many of them were also related to eye position. Relationships to saccadic eye movements were found for a smaller proportion of cells. ...
Multiple Levels of Representation of Reaching in the Parieto-frontal Network
Cerebral Cortex, 2003
In daily life, hand and eye movements occur in different contexts. Hand movements can be made to a visual target shortly after its presentation, or after a longer delay; alternatively, they can be made to a memorized target location. In both instances, the hand can move in a visually structured scene under normal illumination, which allows visual monitoring of its trajectory, or in darkness. Across these conditions, movement can be directed to points in space already foveated, or to extrafoveal ones, thus requiring different forms of eye-hand coordination. The ability to adapt to these different contexts by providing successful answers to their demands probably resides in the high degree of flexibility of the operations that govern cognitive visuomotor behavior. The neurophysiological substrates of these processes include, among others, the context-dependent nature of neural activity, and a transitory, or task-dependent, affiliation of neurons to the assemblies underlying different forms of sensorimotor behavior. Moreover, the ability to make independent or combined eye and hand movements in the appropriate order and time sequence must reside in a process that encodes retinal-, eye- and hand-related inputs in a spatially congruent fashion. This process, in fact, requires exact knowledge of where the eye and the hand are at any given time, although we have little or no conscious experience of where they are at any instant. How this information is reflected in the activity of cortical neurons remains a central question for understanding the mechanisms underlying the planning of eye-hand movement in the cerebral cortex. In the last 10 years, psychophysical analyses in humans, as well as neurophysiological studies in monkeys, have provided new insights on the mechanisms of different forms of oculo-manual actions. These studies have also offered preliminary hints as to the cortical substrates of eye-hand coordination.
In this review, we will highlight some of the results obtained as well as some of the questions raised, focusing on the role of eye- and hand-tuning signals in cortical neural activity. This choice rests on the crucial role this information exerts in the specification of movement and coordinate transformation.
'Arm-reaching' neurons in the parietal area V6A of the macaque monkey
European Journal of Neuroscience, 2001
In previous experiments we have found that several cells of area V6A in the macaque superior parietal lobule were activated by small and stereotyped movements of the arms (C. Galletti, P. Fattori, D. F. Kutz & P. P. Battaglini, Eur. J. Neurosci., 1997, 9, 410). This behaviour was not accounted for by retinal information, nor by somatosensory inputs from the arms. We now want to investigate whether V6A neurons are modulated by purposeful movements aimed at reaching visual targets or targets located outside the field of view. V6A neuronal activity was collected while monkeys performed arm movements during an instructed-delay reaching task in darkness. The task required the animal to reach out for a visual target in the peripersonal space and to bring the hand back to its body. Quantitative analysis of neuronal activity carried out on 55 V6A neurons showed that: (i) the great majority of neurons (71%) was significantly modulated during the execution of arm movements; (ii) 30% of neurons were significantly modulated during preparation of reaching; and (iii) modulations during both execution and preparation of reaching occurred in the absence of any visual feedback and were not due to eye movements. V6A reach-related neurons could be useful in guiding the hand to reach its target with or without visual feedback.
European Journal of Neuroscience, 2005
We recorded neural activity from the medial parieto-occipital area V6A while three monkeys performed an instructed-delay reaching task in the dark. Targets to be reached were in different spatial positions. Neural discharges were recorded during reaching movements directed outward from the body (towards visual objects), during the holding phase (when the hand was on the target) and during inward movements of the hand towards the home button (which was near the body and outside the field of view). Reach-related activity was observed in the majority of 207 V6A cells, during outward (78%) and inward (65%) movements as well as during the holding phase (62%). Most V6A reaching neurons (84%) were modulated in more than one phase of the task. The reach-related activity in V6A could depend on somatosensory inputs and/or on corollary discharges from the dorsal premotor cortex. Although visual and oculomotor inputs are known to have a strong influence on V6A activity, we excluded the possibility that the reach-related activity which we observed was due to visual stimulation and/or oculomotor activity. Reach-related activity for movements towards different locations was spatially modulated during outward (40%) and inward (47%) reaching movements. The position of the hand/arm in space modulated about 40% of V6A cells. Preferred reach directions and spatial locations were represented uniformly across the workspace. These data suggest that V6A reach-related neurons are able to code the direction of movement of the arm and the position of the hand/arm in space. We suggest that the V6A reach-related neurons are involved in the guidance of goal-directed arm movements, whether these actions are visually guided or not.
Neuronal activity related to eye-hand coordination in the primate premotor cortex
Experimental brain research, 1999
To test the functional implications of gaze signals that we previously reported in the dorsal premotor cortex (PMd), we trained two rhesus monkeys to point to visual targets presented on a touch screen while controlling their gaze orientation. Each monkey had to perform four different tasks. To initiate a trial, the monkey had to put his hand on a starting position at the center of the touch screen and fixate a fixation point. In one task, the animal had to make a reaching movement to a peripheral target randomly presented at one of eight possible locations on a circle while maintaining fixation at the center of this virtual circle (central fixation + reaching). In the second task, the monkey maintained fixation at the location of the upcoming peripheral target and, later, reached to that location. After a delay, the target was turned on and the monkey made a reaching arm movement (target fixation + reaching). In the third task, the monkey made a saccade to the target without any arm movement (saccade). Finally, in the fourth task, the monkey first made a saccade to the target, then reached to it after a delay (saccade + reaching). This design allowed us to examine the contribution of the oculomotor context to arm-related neuronal activity in PMd. We analyzed the effects of the task type on neuronal activity and found that many cells showed a task effect during the signal (26/60; 43%), set (16/49; 33%) and/or movement (15/54; 28%) epochs, depending on the oculomotor history. These findings, together with previously published data, suggest that PMd codes limb-movement direction in a gaze-dependent manner and may, thus, play an important role in the brain mechanisms of eye-hand coordination during visually guided reaching.