Reaching in depth: hand position dominates over binocular eye position in the rostral superior parietal lobule

Spatial tuning of reaching activity in the medial parieto-occipital cortex (area V6A) of macaque monkey

European Journal of Neuroscience, 2005

We recorded neural activity from the medial parieto-occipital area V6A while three monkeys performed an instructed-delay reaching task in the dark. Targets to be reached were in different spatial positions. Neural discharges were recorded during reaching movements directed outward from the body (towards visual objects), during the holding phase (when the hand was on the target) and during inward movements of the hand towards the home button (which was near the body and outside the field of view). Reach-related activity was observed in the majority of 207 V6A cells, during outward (78%) and inward (65%) movements as well as during the holding phase (62%). Most V6A reaching neurons (84%) were modulated in more than one phase of the task. The reach-related activity in V6A could depend on somatosensory inputs and/or on corollary discharges from the dorsal premotor cortex. Although visual and oculomotor inputs are known to have a strong influence on V6A activity, we excluded the possibility that the reach-related activity which we observed was due to visual stimulation and/or oculomotor activity. Reach-related activity for movements towards different locations was spatially modulated during outward (40%) and inward (47%) reaching movements. The position of the hand/arm in space modulated about 40% of V6A cells. Preferred reach directions and spatial locations were represented uniformly across the workspace. These data suggest that V6A reach-related neurons are able to code the direction of movement of the arm and the position of the hand/arm in space. We suggest that the V6A reach-related neurons are involved in the guidance of goal-directed arm movements, whether these actions are visually guided or not.

Early coding of reaching in the parietooccipital cortex

Journal of Neurophysiology, 2000

Neural activity was recorded in the parietooccipital cortex while monkeys performed different tasks aimed at investigating visuomotor interactions of retinal, eye, and arm-related signals on neural activity. The tasks were arm reaching 1) to foveated targets; 2) to extrafoveal targets, with constant eye position; 3) within an instructed-delay paradigm, under both light and darkness; 4) saccadic eye movements toward, and static eye holding on, peripheral targets; and 5) visual fixation and stimulation. The activity of many cells was modulated during arm reaction (68%) and movement time (58%), and during static holding of the arm in space (64%), when eye position was kept constant. Eye position influenced the activity of many cells during hand reaction (45%) and movement time (51%), and during holding of a static hand position (69%). Many cells (56%) were also modulated during preparation for hand movement in the delayed reach task. Modulation was present also in the dark in 59% of cells duri...

Combination of hand and gaze signals during reaching: activity in parietal area 7m of the monkey

Journal of Neurophysiology, 1997

The role of area 7m has been studied by recording the activity of single neurons of monkeys trained to fixate and reach toward peripheral targets. The target was randomly selected from eight possible locations on a virtual circle of radius 30 degrees of visual angle from a central target. Three tasks were employed to dissociate hand- from eye-related contributions. In the first task, animals looked and reached to the peripheral target. In the second task, the animal reached to the peripheral target while maintaining fixation on the central target. In the third task, the monkey maintained fixation on peripheral targets that were spatially coincident with those of the reaching tasks. The results show that cell activity in area 7m relates, for some cells, to eye position; for others, to hand position and movement; and for the majority of cells, to a combination of visuomanual and oculomotor information. This area, therefore, seems to perform an early combination of information in the proces...

Multiple Levels of Representation of Reaching in the Parieto-frontal Network

Cerebral Cortex, 2003

In daily life, hand and eye movements occur in different contexts. Hand movements can be made to a visual target shortly after its presentation, or after a longer delay; alternatively, they can be made to a memorized target location. In both instances, the hand can move in a visually structured scene under normal illumination, which allows visual monitoring of its trajectory, or in darkness. Across these conditions, movement can be directed to points in space already foveated, or to extrafoveal ones, thus requiring different forms of eye-hand coordination. The ability to adapt to these different contexts by providing successful answers to their demands probably resides in the high degree of flexibility of the operations that govern cognitive visuomotor behavior. The neurophysiological substrates of these processes include, among others, the context-dependent nature of neural activity, and a transitory, or task-dependent, affiliation of neurons to the assemblies underlying different forms of sensorimotor behavior. Moreover, the ability to make independent or combined eye and hand movements in the appropriate order and time sequence must reside in a process that encodes retinal-, eye- and hand-related inputs in a spatially congruent fashion. This process, in fact, requires exact knowledge of where the eye and the hand are at any given time, although we have little or no conscious experience of where they are at any instant. How this information is reflected in the activity of cortical neurons remains a central question to understanding the mechanisms underlying the planning of eye-hand movement in the cerebral cortex. In the last 10 years, psychophysical analyses in humans, as well as neurophysiological studies in monkeys, have provided new insights into the mechanisms of different forms of oculo-manual actions. These studies have also offered preliminary hints as to the cortical substrates of eye-hand coordination.
In this review, we will highlight some of the results obtained as well as some of the questions raised, focusing on the role of eye- and hand-tuning signals in cortical neural activity. This choice rests on the crucial role this information exerts in the specification of movement and in coordinate transformations.

Reaching activity in parietal area V6A of macaque: eye influence on arm activity or retinocentric coding of reaching movements?

European Journal of Neuroscience, 2008

Parietal area V6A contains neurons modulated by the direction of gaze as well as neurons able to code the direction of arm movement. The present study aimed to disentangle the gaze effect from the effect of reaching activity on single V6A neurons. To this purpose, we used a visuomotor task in which the direction of arm movement remained constant while the animal changed the direction of gaze. Gaze direction modulated reach-related activity in about two-thirds of the tested neurons. In several cases, modulations were not due to the eye-position signal per se, the apparent eye-position modulation being just an epiphenomenon. The real modulating factor was the location of the reaching target with respect to the point gazed at by the animal, that is, the retinotopic coordinates towards which the action of reaching occurred. Comparison of the neural discharge of the same cell during execution of foveated and non-foveated reaching movements, performed towards the same or different spatial locations, confirmed that in a subset of V6A neurons reaching activity is coded retinocentrically. In other neurons, reaching activity is coded spatially, depending on the direction of the reaching movement regardless of where the animal was looking. The majority of V6A reaching neurons use a system that encompasses both of these reference frames. These results are in line with the view of a progressive visuomotor transformation in the dorsal visual stream, which changes the frame of reference from the retinocentric one, typically used by the visual system, to the arm-centred one, typically used by the motor system.

Parieto-frontal coding of reaching: an integrated framework

Experimental Brain Research, 1999

In the last few years, anatomical and physiological studies have provided new insights into the organization of the parieto-frontal network underlying visually guided arm-reaching movements in at least three domains. (1) Network architecture. It has been shown that the different classes of neurons encoding information relevant to reaching are not confined within individual cortical areas, but are common to different areas, which are generally linked by reciprocal association connections.

'Arm-reaching' neurons in the parietal area V6A of the macaque monkey

European Journal of Neuroscience, 2001

In previous experiments we have found that several cells of area V6A in the macaque superior parietal lobule were activated by small and stereotyped movements of the arms (C. Galletti, P. Fattori, D. F. Kutz & P. P. Battaglini, Eur. J. Neurosci., 1997, 9, 410). This behaviour was not accounted for by retinal information, nor by somatosensory inputs from the arms. We now want to investigate whether V6A neurons are modulated by purposeful movements aimed at reaching visual targets or targets located outside the field of view. V6A neuronal activity was collected while monkeys performed arm movements during an instructed-delay reaching task in darkness. The task required the animal to reach out for a visual target in the peripersonal space and to bring the hand back to its body. Quantitative analysis of neuronal activity carried out on 55 V6A neurons showed that: (i) the great majority of neurons (71%) was significantly modulated during the execution of arm movements; (ii) 30% of neurons were significantly modulated during preparation of reaching; and (iii) modulations during both execution and preparation of reaching occurred in the absence of any visual feedback and were not due to eye movements. V6A reach-related neurons could be useful in guiding the hand to reach its target with or without visual feedback.

Cortical Networks for Visual Reaching: Physiological and Anatomical Organization of Frontal and Parietal Lobe Arm Regions

Cerebral Cortex, 1996

The functional and structural properties of the dorsolateral frontal lobe and posterior parietal proximal arm representations were studied in macaque monkeys. Physiological mapping of primary motor (MI), dorsal premotor (PMd), and posterior parietal (area 5) cortices was performed in behaving monkeys trained in an instructed-delay reaching task. The parietofrontal corticocortical connectivities of these same areas were subsequently examined anatomically by means of retrograde tracing techniques. Signal-, set-, movement-, and position-related directional neuronal activities were distributed nonuniformly within the task-related areas in both frontal and parietal cortices. Within the frontal lobe, moving caudally from PMd to MI, the activity that signals for the visuospatial events leading to target localization decreased, while the activity more directly linked to movement generation increased. Physiological recordings in the superior parietal lobule revealed a gradient-like distribution of functional properties similar to that observed in the frontal lobe. Signal- and set-related activities were encountered more frequently in the intermediate and ventral part of the medial bank of the intraparietal sulcus (IPS), in area MIP. Movement- and position-related activities were distributed more uniformly within the superior parietal lobule (SPL), in both dorsal area 5 and in MIP. Frontal and parietal regions sharing similar functional properties were preferentially connected through their association pathways. As a result of this study, area MIP, and possibly areas MDP and 7m as well, emerge as the parietal nodes by which visual information may be relayed to the frontal lobe arm region. These parietal and frontal areas, along with their association connections, represent a potential cortical network for visual reaching. The architecture of this network is ideal for coding reaching as the result of a combination between visual and somatic information.

Dorsal Premotor Neurons Encode the Relative Position of the Hand, Eye, and Goal during Reach Planning

When reaching to grasp an object, we often move our arm and orient our gaze together. How are these movements coordinated? To investigate this question, we studied neuronal activity in the dorsal premotor area (PMd) and the medial intraparietal area (area MIP) of two monkeys while systematically varying the starting position of the hand and eye during reaching. PMd neurons encoded the relative position of the target, hand, and eye. MIP neurons encoded target location with respect to the eye only. These results indicate that whereas MIP encodes target locations in an eye-centered reference frame, PMd uses a relative position code that specifies the differences in locations between all three variables. Such a relative position code may play an important role in coordinating hand and eye movements by computing their relative position.