Multiplexed and flexible neural coding in sensory, parietal, and frontal cortices during goal-directed virtual navigation
Related papers
2021
We do not understand how neural nodes operate and coordinate within the recurrent action-perception loops that characterize naturalistic self-environment interactions. Here, we record single-unit spiking activity and local field potentials (LFPs) simultaneously from the dorsomedial superior temporal area (MSTd), parietal area 7a, and dorsolateral prefrontal cortex (dlPFC) as monkeys navigate in virtual reality to “catch fireflies.” This task requires animals to actively sample from a closed-loop virtual environment while concurrently computing continuous latent variables: (i) the distance and angle travelled (i.e., path integration) and (ii) the distance and angle to a memorized firefly location (i.e., a hidden spatial goal). We observed a patterned mixed selectivity, with the prefrontal cortex most prominently coding for latent variables, parietal cortex coding for sensorimotor variables, and MSTd most often coding for eye movements. However, even the traditionally considered senso...
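The two latent variables named in the abstract above can be made concrete with a small dead-reckoning sketch. This is a minimal illustration, not the authors' analysis code; the function name integrate_path, the time step dt, and the example goal location are assumptions introduced here.

```python
import numpy as np

def integrate_path(lin_vel, ang_vel, dt, goal_xy):
    """Accumulate the distance and angle travelled (path integration) and the
    distance and angle to a remembered goal, from linear (m/s) and angular
    (rad/s) velocity traces sampled every dt seconds."""
    heading = np.cumsum(ang_vel) * dt                   # angle travelled
    x = np.cumsum(lin_vel * np.cos(heading)) * dt       # dead-reckoned position
    y = np.cumsum(lin_vel * np.sin(heading)) * dt
    dist_travelled = np.cumsum(np.abs(lin_vel)) * dt    # distance travelled
    dx, dy = goal_xy[0] - x, goal_xy[1] - y              # vector to the hidden goal
    dist_to_goal = np.hypot(dx, dy)
    angle_to_goal = np.arctan2(dy, dx) - heading         # goal angle relative to heading
    return dist_travelled, heading, dist_to_goal, angle_to_goal

# Example: 5 s of steady forward motion with a gentle turn, goal at (2, 1) m (hypothetical values).
t = np.arange(0, 5, 0.01)
outputs = integrate_path(np.full_like(t, 0.5), np.full_like(t, 0.1), 0.01, (2.0, 1.0))
```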
Widespread coding of navigational variables in prefrontal cortex
To navigate, we must represent information about our place in the environment. Traditional research highlights the role of the hippocampal complex in this process. Spurred by recent research highlighting the widespread cortical encoding of cognitive and motor variables previously thought to have localized function, we hypothesized that navigational variables would likewise be encoded widely, especially in the prefrontal cortex, which is often associated with control of volitional behavior. We recorded neural activity from six prefrontal structures while macaques performed a foraging task in an open enclosure. In all six regions, we found strong encoding of allocentric position, head direction, egocentric boundary distance, and linear and angular velocity. These encodings were not accounted for by distance or time to reward. Strength of coding of all variables increases along a ventral-to-dorsal gradient. Together, these results argue that encoding of navigational variables is ...
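The abstract does not specify the fitting procedure, so the sketch below is only a generic cross-validated linear encoding model of the kind commonly used to quantify how strongly binned spike counts encode navigational variables (position, head direction, boundary distance, linear and angular velocity). The function name, fold count, and z-scoring choices are assumptions, not details taken from the paper.

```python
import numpy as np

def encoding_strength(spike_counts, variables, n_folds=5):
    """Cross-validated R^2 of a linear encoding model predicting one neuron's
    binned spike counts from a (time bins x variables) design matrix."""
    X = (variables - variables.mean(0)) / variables.std(0)   # z-score each variable
    X = np.column_stack([np.ones(len(X)), X])                # add an intercept column
    y = spike_counts.astype(float)
    folds = np.array_split(np.arange(len(y)), n_folds)
    ss_res, ss_tot = 0.0, 0.0
    for test in folds:
        train = np.setdiff1d(np.arange(len(y)), test)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        pred = X[test] @ beta
        ss_res += np.sum((y[test] - pred) ** 2)
        ss_tot += np.sum((y[test] - y[train].mean()) ** 2)
    return 1.0 - ss_res / ss_tot                             # higher = stronger encoding
```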
Visual Neuroscience, 2013
Many complex behaviors rely on guidance from sensations. To perform these behaviors, the motor system must decode information relevant to the task from the sensory system. However, identifying the neurons responsible for encoding the appropriate sensory information remains a difficult problem for neurophysiologists. A key step toward identifying candidate systems is finding neurons or groups of neurons capable of representing the stimuli adequately to support behavior. A traditional approach involves quantitatively measuring the performance of single neurons and comparing this to the performance of the animal. One of the strongest pieces of evidence in support of a neuronal population being involved in a behavioral task comes from the signals being sufficient to support behavior. Numerous experiments using perceptual decision tasks show that visual cortical neurons in many areas have this property. However, most visually guided behaviors are not categorical but continuous and dynami...
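The single-neuron-versus-animal comparison described above is typically made by converting a neuron's trial-by-trial responses into an ideal-observer (neurometric) measure and comparing it with psychometric performance. The sketch below shows one standard ROC-based way to do this; the function name and the Poisson example values are illustrative assumptions, not taken from the article.

```python
import numpy as np
from scipy.stats import rankdata

def roc_auc(resp_pref, resp_null):
    """Area under the ROC curve for spike counts on 'preferred' vs. 'null'
    trials: the probability that an ideal observer correctly classifies the
    stimulus from a single trial's response (0.5 = chance, 1.0 = perfect)."""
    all_resp = np.concatenate([resp_pref, resp_null])
    ranks = rankdata(all_resp)                        # average ranks handle tied counts
    u = ranks[: len(resp_pref)].sum() - len(resp_pref) * (len(resp_pref) + 1) / 2
    return u / (len(resp_pref) * len(resp_null))      # Mann-Whitney form of the AUC

# Example: a weakly selective neuron, to be compared against the animal's accuracy.
rng = np.random.default_rng(0)
neuron_performance = roc_auc(rng.poisson(12, 200), rng.poisson(10, 200))
```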
Large-Scale Visuomotor Integration in the Cerebral Cortex
Cerebral Cortex, 2007
Efficient visuomotor behavior depends on integrated processing by the visual and motor systems of the cerebral cortex. Yet, many previous cortical neurophysiology studies have examined the visual and motor modalities in isolation, largely ignoring questions of large-scale cross-modal integration. To address this issue, we analyzed event-related local field potentials simultaneously recorded from multiple visual, motor, and executive cortical sites in monkeys performing a visuomotor pattern discrimination task. The timing and cortical location of four aspects of event-related activities were examined: stimulus-evoked activation onset, stimulus-specific processing, stimulus category-specific processing, and response-specific processing. Activations appeared earliest in striate cortex and rapidly thereafter in other visual areas. Stimulus-specific processing began early in most visual cortical areas, some at activation onset. Early onset latencies were also observed in motor, premotor, and prefrontal areas, some as early as in striate cortex, but these early-activating frontal sites did not show early stimulus-specific processing. Response-specific processing began around 150 ms poststimulus in widespread cortical areas, suggesting that perceptual decision formation and response selection arose through concurrent processes of visual, motor, and executive areas. The occurrence of stimulus-specific and stimulus category-specific differences after the onset of response-specific processing suggests that sensory and motor stages of visuomotor processing overlapped in time.
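As a concrete illustration of the first measure listed above (stimulus-evoked activation onset), the sketch below estimates an onset latency from an event-related LFP as the first sustained post-stimulus deviation from baseline. The threshold-and-duration criterion, function name, and parameter values are assumptions for illustration; the article's own latency procedure may differ.

```python
import numpy as np

def onset_latency(erp, times, baseline=(-0.2, 0.0), n_sd=3.0, min_dur=5):
    """Return the first post-stimulus time at which the event-related LFP
    deviates from the baseline mean by more than n_sd baseline standard
    deviations for at least min_dur consecutive samples (NaN if never)."""
    base = erp[(times >= baseline[0]) & (times < baseline[1])]
    threshold = n_sd * base.std()
    post = times >= 0
    above = np.abs(erp[post] - base.mean()) > threshold
    for i in range(len(above) - min_dur + 1):         # first sustained threshold crossing
        if above[i : i + min_dur].all():
            return times[post][i]
    return np.nan
```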
Navigational path integration by cortical neurons: origins in higher-order direction selectivity
Journal of neurophysiology, 2015
Navigation relies on the neural processing of sensory cues about observer self-movement and spatial location. Neurons in macaque dorsal medial superior temporal cortex (MSTd) respond to visual and vestibular self-movement cues, potentially contributing to navigation and orientation. We moved monkeys on circular paths around a room while recording the activity of MSTd neurons. MSTd neurons show a variety of sensitivities to the monkey's heading direction, circular path through the room, and place in the room. Changing visual cues alters the relative prevalence of those response properties. Disrupting the continuity of self-movement paths through the environment disrupts path selectivity in a manner linked to the time course of single neuron responses. We hypothesize that sensory cues interact with the spatial and temporal integrative properties of MSTd neurons to derive path selectivity for navigational path integration supporting spatial orientation.
Task-Dependent Changes in the Large-Scale Dynamics and Necessity of Cortical Regions
Neuron, 2019
Highlights: Mice navigated in virtual reality while performing one of three related tasks; all of dorsal cortex contributed to memory-dependent but not visually guided navigation; higher task demands induced decorrelation of whole-cortex Ca2+ activity patterns; and a modular RNN model suggested that differences in computations can explain the results.
Selectivity of Local Field Potentials in Macaque Inferior Temporal Cortex
While single neurons in inferior temporal (IT) cortex show differential responses to distinct complex stimuli, little is known about the responses of populations of neurons in IT. We recorded single electrode data, including multi-unit activity (MUA) and local field potentials (LFP), from 618 sites in the inferior temporal cortex of macaque monkeys while the animals passively viewed 78 different pictures of complex stimuli. The LFPs were obtained by low-pass filtering the extracellular electrophysiological signal with a corner frequency of 300 Hz. As reported previously, we observed that spike counts from MUA showed selectivity for some of the pictures. Strikingly, the LFP data, which are thought to constitute an average over large numbers of neurons, also showed significantly selective responses. The LFP responses were less selective than the MUA responses, both in terms of the proportion of selective sites and in the selectivity of each site. We observed little overlap between the selectivity of MUA and LFP recordings from the same electrode. To assess the spatial organization of selective responses, we compared the selectivity of nearby sites recorded along the same penetration with that of sites recorded from different penetrations. We observed that MUA selectivity was correlated on spatial scales up to 800 µm, while LFP selectivity was correlated over a larger spatial extent, with significant correlations between sites separated by several mm. Our data support the idea that there is some topographical arrangement to the organization of selectivity in inferior temporal cortex and that this organization may be relevant for the representation of object identity in IT. Note: Gabriel Kreiman and Chou Hung contributed equally to this work. This report describes research done within the Center for Biological and Computational Learning in the Department of Brain and Cognitive Sciences and at the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. This research is sponsored by DARPA, ONR, and a Whiteman fellowship to G.K.
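The 300 Hz corner frequency mentioned above corresponds to a standard digital low-pass filtering step. The snippet below is a generic sketch of that step, not the authors' code; the zero-phase Butterworth design, the filter order, and the 30 kHz example sampling rate are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_lfp(raw, fs, corner_hz=300.0, order=4):
    """Low-pass filter a broadband extracellular trace (sampled at fs Hz) with a
    zero-phase Butterworth filter to obtain the LFP band below corner_hz."""
    b, a = butter(order, corner_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, raw)   # filtfilt avoids introducing a phase lag

# Example: one second of a hypothetical 30 kHz broadband recording.
fs = 30000.0
raw = np.random.default_rng(0).standard_normal(int(fs))  # noise as a stand-in signal
lfp = extract_lfp(raw, fs)
```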
To accurately guide one's actions online, the brain predicts sensory action feedback ahead of time based on internal models, which can be updated by sensory prediction errors. The underlying operations can be experimentally investigated in sensorimotor adaptation tasks, in which moving under perturbed sensory action feedback requires internal model updates. Here we altered healthy participants' visual hand movement feedback in a virtual reality setup, while assessing brain activity with functional magnetic resonance imaging (fMRI). Participants tracked a continually moving virtual target object with a photorealistic, three-dimensional (3D) virtual hand controlled online via a data glove. During the continuous tracking task, the virtual hand's movements (i.e., visual movement feedback) were repeatedly and periodically delayed, which participants had to compensate for to maintain accurate tracking. This realistic task design allowed us to simultaneously investigate processes likely operating at several levels of the brain's motor control hierarchy. fMRI revealed that the length of visual feedback delay was parametrically reflected by activity in the inferior parietal cortex and posterior temporal cortex. Unpredicted changes in visuomotor mapping (at transitions from synchronous to delayed visual feedback periods or vice versa) activated biological motion-sensitive regions in the lateral occipitotemporal cortex (LOTC). Activity in the posterior parietal cortex (PPC), focused on the contralateral anterior intraparietal sulcus (aIPS), correlated with tracking error, and this correlation was stronger in participants with higher tracking performance. Our results are in line with recent proposals of a widespread cortical motor control hierarchy, where temporoparietal regions seem to evaluate visuomotor congruence and thus possibly ground a self-attribution of movements, the LOTC likely processes early visual prediction errors, and the aIPS computes action goal errors and possibly corresponding motor corrections.
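The tracking-error correlate reported above can be illustrated with a simple per-frame error measure of the kind that could enter a parametric fMRI model. The sketch below is only a plausible way to compute such a quantity under assumed names and sampling parameters; it does not reproduce the study's actual analysis pipeline.

```python
import numpy as np

def tracking_error(hand_xyz, target_xyz):
    """Frame-by-frame Euclidean distance between the virtual hand and the
    moving target, given (frames x 3) position arrays in the same units."""
    return np.linalg.norm(hand_xyz - target_xyz, axis=1)

# Example: one hypothetical 10 s trial sampled at 60 Hz.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 600)
target = np.column_stack([np.sin(t), np.cos(t), np.zeros_like(t)])
hand = target + 0.02 * rng.standard_normal(target.shape)   # small, noisy tracking error
err = tracking_error(hand, target)
mean_err = err.mean()   # one summary value per trial/block for a parametric regressor
```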