Role of the rostral superior colliculus in gaze anchoring during reach movements

Combination of hand and gaze signals during reaching: activity in parietal area 7m of the monkey

Journal of Neurophysiology, 1997

The role of area 7m has been studied by recording the activity of single neurons in monkeys trained to fixate and reach toward peripheral targets. The target was randomly selected from eight possible locations on a virtual circle of radius 30 degrees of visual angle, centered on a central target. Three tasks were employed to dissociate hand- from eye-related contributions. In the first task, animals looked and reached to the peripheral target. In the second task, the animal reached to the peripheral target while maintaining fixation on the central target. In the third task, the monkey maintained fixation on peripheral targets that were spatially coincident with those of the reaching tasks. The results show that cell activity in area 7m relates, for some cells, to eye position; for others, to hand position and movement; and, for the majority of cells, to a combination of visuomanual and oculomotor information. This area, therefore, seems to perform an early combination of information in the proces...

Reaching activity in parietal area V6A of macaque: eye influence on arm activity or retinocentric coding of reaching movements?

European Journal of Neuroscience, 2008

Parietal area V6A contains neurons modulated by the direction of gaze as well as neurons able to code the direction of arm movement. The present study aimed to disentangle the gaze effect from the effect of reaching activity on single V6A neurons. To this end, we used a visuomotor task in which the direction of arm movement remained constant while the animal changed the direction of gaze. Gaze direction modulated reach-related activity in about two-thirds of tested neurons. In several cases, the modulations were not due to the eye-position signal per se; the apparent eye-position modulation was just an epiphenomenon. The real modulating factor was the location of the reaching target with respect to the point gazed at by the animal, that is, the retinotopic coordinates toward which the act of reaching occurred. Comparison of the neural discharge of the same cell during execution of foveated and non-foveated reaching movements, performed toward the same or different spatial locations, confirmed that in a subset of V6A neurons reaching activity is coded retinocentrically. In other neurons, reaching activity is coded spatially, depending on the direction of the reaching movement regardless of where the animal was looking. The majority of V6A reaching neurons use a system that encompasses both of these reference frames. These results are in line with the view of a progressive visuomotor transformation in the dorsal visual stream, which changes the frame of reference from the retinocentric one, typically used by the visual system, to the arm-centred one, typically used by the motor system.

Reaching in depth: hand position dominates over binocular eye position in the rostral superior parietal lobule

The Journal of Neuroscience, 2009

Neural activity was recorded in area PE (dorsorostral part of Brodmann's area 5) of the posterior parietal cortex while monkeys performed arm reaching toward memorized targets located at different distances from the body. For any given distance, arm movements were performed while the animal kept binocular eye fixation constant. Under these conditions, the activity of a large proportion (36%) of neurons was modulated by reach distance during the memory period. By varying binocular eye position (vergence angle) and initial hand position, we found that the reaching-related activity of most neurons (61%) was influenced by changing the starting position of the hand, whereas that of a smaller, although substantial, population (13%) was influenced by changes of binocular eye position (i.e., by the angle of vergence). Furthermore, the modulation of the neural activity was better explained by expressing the reach movement end-point, corresponding to the memorized target location, in terms ...

Coordinated control of eye and hand movements in dynamic reaching

2002

In the present study, we integrated two recent, at first sight contradictory, findings on whether saccadic eye movements can be generated to a newly presented target during an ongoing hand movement. Saccades were measured under so-called adaptive and sustained pointing conditions. In the adaptive pointing condition, subjects had to direct both their gaze and arm movements to a displaced target location. The results showed that the eyes could fixate the new target during pointing. In addition, these corrective saccades were temporally coupled with changes in arm movement trajectories when reaching to the new target. In the sustained pointing condition, however, the same subjects had to point to the initial target while trying to shift their gaze to a new target that appeared during pointing. It was found that the eyes could not fixate the new target before the hand reached the initial target location. Together, the results indicate that ocular gaze is always forced to follow the target intended by a manual arm movement. A neural mechanism is proposed that couples ocular gaze to the target of an arm movement. Specifically, the mechanism includes a reach neuron layer alongside the well-known saccadic layer in the primate superior colliculus. Such a tight, subcortical coupling of ocular gaze to the target of a reaching movement can explain the contrasting behavior of the eyes depending on whether the eye and hand share the same target position or attempt to move to different locations.

Early coding of visuomanual coordination during reaching in parietal area PEc

Journal of Neurophysiology, 2001

The parietal mechanisms of eye-hand coordination during reaching were studied by recording neural activity in area PEc while monkeys performed different tasks aimed at assessing the influence of retinal, hand-, and eye-related signals on neural activity. The tasks consisted of 1) reaching to foveated and 2) to extrafoveal targets, with constant eye position; and 3) saccadic eye movement toward, and holding of eye position on, peripheral targets, the same as those of the reaching tasks. In all tasks, hand and/or eye movements were made from a central position to eight peripheral targets. A conventional visual fixation paradigm was used as a control task to assess the location and extent of the visual receptive fields of neurons. A large proportion of cells in area PEc displayed significant relationships to hand movement direction and position. Many of them were also related to eye position. Relationships to saccadic eye movements were found for a smaller proportion of cells. ...

Early coding of reaching in the parietooccipital cortex

Journal of Neurophysiology, 2000

Neural activity was recorded in the parietooccipital cortex while monkeys performed different tasks aimed at investigating visuomotor interactions of retinal, eye-, and arm-related signals on neural activity. The tasks were arm reaching 1) to foveated targets; 2) to extrafoveal targets, with constant eye position; 3) within an instructed-delay paradigm, under both light and darkness; 4) saccadic eye movements toward, and static eye holding on, peripheral targets; and 5) visual fixation and stimulation. The activity of many cells was modulated during arm reaction (68%) and movement time (58%), and during static holding of the arm in space (64%), when eye position was kept constant. Eye position influenced the activity of many cells during hand reaction (45%) and movement time (51%), and during static holding of hand position (69%). Many cells (56%) were also modulated during preparation for hand movement in the delayed reach task. Modulation was also present in the dark in 59% of cells duri...

Multiple Levels of Representation of Reaching in the Parieto-frontal Network

Cerebral Cortex, 2003

In daily life, hand and eye movements occur in different contexts. Hand movements can be made to a visual target shortly after its presentation, or after a longer delay; alternatively, they can be made to a memorized target location. In both instances, the hand can move in a visually structured scene under normal illumination, which allows visual monitoring of its trajectory, or in darkness. Across these conditions, movement can be directed to points in space already foveated, or to extrafoveal ones, thus requiring different forms of eye-hand coordination. The ability to adapt to these different contexts by providing successful answers to their demands probably resides in the high degree of flexibility of the operations that govern cognitive visuomotor behavior. The neurophysiological substrates of these processes include, among others, the context-dependent nature of neural activity and a transitory, or task-dependent, affiliation of neurons to the assemblies underlying different forms of sensorimotor behavior. Moreover, the ability to make independent or combined eye and hand movements in the appropriate order and time sequence must reside in a process that encodes retinal-, eye-, and hand-related inputs in a spatially congruent fashion. This process, in fact, requires exact knowledge of where the eye and the hand are at any given time, although we have little or no conscious experience of where they are at any instant. How this information is reflected in the activity of cortical neurons remains a central question for understanding the mechanisms underlying the planning of eye-hand movements in the cerebral cortex. In the last 10 years, psychophysical analyses in humans, as well as neurophysiological studies in monkeys, have provided new insights into the mechanisms of different forms of oculo-manual action. These studies have also offered preliminary hints as to the cortical substrates of eye-hand coordination.
In this review, we will highlight some of the results obtained, as well as some of the questions raised, focusing on the role of eye- and hand-tuning signals in cortical neural activity. This choice rests on the crucial role this information plays in the specification of movement and in coordinate transformation.