The neuronal encoding of information in the brain
Related papers
Information-Theoretical Analysis of the Neural Code in the Rodent Temporal Lobe
Entropy
In the study of the neural code, information-theoretical methods have the advantage of making no assumptions about the probabilistic mapping between stimuli and responses. In the sensory domain, several methods have been developed to quantify the amount of information encoded in neural activity, without necessarily identifying the specific stimulus or response features that instantiate the code. As a proof of concept, here we extend those methods to the encoding of kinematic information in a navigating rodent. We estimate the information encoded in two well-characterized codes, mediated by the firing rate of neurons and by the phase of firing with respect to the theta-filtered local field potential. In addition, we consider a novel code, mediated by the delta-filtered local field potential. We find that all three codes transmit significant amounts of kinematic information, and informative neurons tend to employ a combination of codes. Cells tend to encode conjunctions of kinem...
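The estimators in this line of work are typically plug-in (histogram) estimates of mutual information between a discretized behavioral variable and a neural response. A minimal sketch in Python; the speed bins, Poisson rates, and sample size are hypothetical choices for illustration, not values from the paper:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in (histogram) estimate of I(X;Y) in bits from paired discrete samples."""
    xi = np.unique(x, return_inverse=True)[1]
    yi = np.unique(y, return_inverse=True)[1]
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)          # joint histogram of (x, y) pairs
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
speed_bin = rng.integers(0, 4, size=5000)   # binned running speed (hypothetical)
rate = rng.poisson(lam=1.0 + speed_bin)     # speed-modulated spike counts
print(mutual_information(speed_bin, rate))  # > 0 for an informative rate code
```

Note that the plug-in estimate is biased upward for small samples, which is why studies of this kind usually apply bias corrections or shuffling controls.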
Combinatorial coding in neural populations
2008
To evaluate the nature of the neural code in the cerebral cortex, we have used a combination of theory and experiment to assess how information is represented in a realistic cortical population response. We have shown how a sensory stimulus could be estimated on a biologically-realistic time scale, given brief individual responses from a population of neurons with similar response properties. For neurons in extrastriate motion area MT, a combinatorial code, one that keeps track of the cell identity of action potentials and silences in individual neurons across the population, carries twice as much information about visual motion as does spike count averaged over the same group of cells. The combinatorial code is more informative because of the diverse firing rate dynamics of MT neurons in response to constant motion stimuli, and is robust to neuron-neuron correlations. We provide a theoretical motivation for these observations that challenges commonly held ideas about the nature of cortical coding at the level of single neurons and neural populations.
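The gap between a combinatorial (labeled-line) code and a pooled spike count can be illustrated with a small simulation. The cell count, tuning, and Poisson rates below are hypothetical, and the plug-in estimator is biased for small samples, but the ordering (labeled-line >= pooled) follows exactly from the data-processing inequality, since the pooled count is a function of the labeled response:

```python
import numpy as np

def mi_bits(stim, resp_labels):
    """Plug-in mutual information (bits) between stimulus and a discrete response label."""
    s = np.unique(stim, return_inverse=True)[1]
    r = np.unique(resp_labels, return_inverse=True)[1]
    joint = np.zeros((s.max() + 1, r.max() + 1))
    np.add.at(joint, (s, r), 1)
    joint /= joint.sum()
    px, py = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n_trials, n_cells = 4000, 3
stim = rng.integers(0, 2, n_trials)                  # two motion directions
# hypothetical cells with diverse tuning: each prefers one of the two directions
prefs = np.arange(n_cells) % 2
lam = np.where(stim[:, None] == prefs[None, :], 3.0, 1.0)
counts = rng.poisson(lam)                            # (trials, cells) spike counts

pooled = counts.sum(axis=1)                          # population spike count
labeled = ["-".join(map(str, row)) for row in counts]  # keeps cell identity
print(mi_bits(stim, pooled), mi_bits(stim, labeled))
# labeled-line carries at least as much information as the pooled count
```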
Journal of Computational Neuroscience, 1997
To analyze the information provided about individual visual stimuli in the responses of single neurons in the primate temporal lobe visual cortex, neuronal responses to a set of 65 visual stimuli were recorded in macaques performing a visual fixation task and analyzed using information theoretical measures. The population of neurons analyzed responded primarily to faces. The stimuli included 23 faces and 42 nonface images of real-world scenes, so that the function of this brain region could be analyzed when it was processing relatively natural scenes.
Information theory and neural coding
Nature neuroscience, 1999
Information theory quantifies how much information a neural response carries about the stimulus. This can be compared to the information transferred in particular models of the stimulus-response function and to maximum possible information transfer. Such comparisons are crucial because they validate assumptions present in any neurophysiological analysis. Here we review information-theory basics before demonstrating its use in neural coding. We show how to use information theory to validate simple stimulus-response models of neural coding of dynamic stimuli. Because these models require specification of spike timing precision, they can reveal which time scales contain information in neural coding. This approach shows that dynamic stimuli can be encoded efficiently by single neurons and that each spike contributes to information transmission. We argue, however, that the data obtained so far do not suggest a temporal code, in which the placement of spikes relative to each other yields ...
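The time-scale point can be made concrete with a toy latency code: a single spike whose timing depends on the stimulus carries no information when read with one coarse bin, and progressively more as the bin width shrinks toward the timing jitter. The latencies, jitter, and analysis window below are hypothetical illustration values:

```python
import numpy as np

def mi_bits(x, y):
    """Plug-in mutual information (bits) between two discrete sample vectors."""
    xi = np.unique(x, return_inverse=True)[1]
    yi = np.unique(y, return_inverse=True)[1]
    j = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(j, (xi, yi), 1)
    j /= j.sum()
    px, py = j.sum(1, keepdims=True), j.sum(0, keepdims=True)
    nz = j > 0
    return float((j[nz] * np.log2(j[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(2)
n = 4000
stim = rng.integers(0, 2, n)
# one spike per trial; its latency depends on the stimulus (10 vs 30 ms, 5 ms jitter)
latency = np.where(stim == 0, 10.0, 30.0) + rng.normal(0.0, 5.0, n)
latency = np.clip(latency, 0.0, 39.99)

# read the spike train at three temporal resolutions (bin width in ms)
info = {dt: mi_bits(stim, (latency // dt).astype(int)) for dt in (40.0, 20.0, 5.0)}
print(info)  # one coarse bin destroys the latency information; finer bins recover it
```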
An information-theoretic study of neuronal spike correlations in the mammalian cerebral cortex
We have used information theory to examine whether stimulus-dependent correlation could contribute to the neural coding of orientation and contrast by pairs of V1 cells. To this end, we have used a modified version of the method of information components. This analysis revealed that although synchrony is prevalent and informative, the additional information it provides is frequently offset by the redundancy arising from the similar tuning properties of the two cells. Thus, coding is roughly independent with weak synergy or redundancy arising depending on the similarity in tuning and the temporal precision of the analysis.
A critical assessment of different measures of the information carried by correlated neuronal firing
Biosystems, 2002
Information theoretic measures have been proposed as a quantitative framework to clarify the role of correlated neuronal activity in the brain. In this paper we review some recent methods that allow precise assessments of the role of correlation in stimulus coding and decoding by the nervous system. We present new results that make explicit links between types of encoding and decoding mechanisms based on correlations. We illustrate the concepts by showing that the spike trains of pairs of neurons in rat somatosensory cortex can be decoded almost perfectly without including knowledge of correlation in the read-out model, although in this neural system correlations between spike times contribute appreciably to stimulus encoding.
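The synergy/redundancy question discussed in these papers is often summarized by the sign of delta_I = I(R1,R2;S) - I(R1;S) - I(R2;S), which is negative for redundant pairs. A sketch with two conditionally independent, identically tuned model cells (hypothetical Poisson rates, not data from either paper), for which similar tuning produces redundancy:

```python
import numpy as np

def mi_bits(x, y):
    """Plug-in mutual information (bits) between two discrete sample vectors."""
    xi = np.unique(x, return_inverse=True)[1]
    yi = np.unique(y, return_inverse=True)[1]
    j = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(j, (xi, yi), 1)
    j /= j.sum()
    px, py = j.sum(1, keepdims=True), j.sum(0, keepdims=True)
    nz = j > 0
    return float((j[nz] * np.log2(j[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(3)
n = 6000
stim = rng.integers(0, 2, n)
# two cells with identical tuning, conditionally independent given the stimulus
r1 = rng.poisson(1.0 + 2.0 * stim)
r2 = rng.poisson(1.0 + 2.0 * stim)
pair = [f"{a},{b}" for a, b in zip(r1, r2)]          # joint response label

delta = mi_bits(stim, pair) - mi_bits(stim, r1) - mi_bits(stim, r2)
print(delta)  # negative: similar tuning makes the pair redundant
```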
On the Information in Spike Timing: Neural Codes Derived from Polychronous Groups
2018 Information Theory and Applications Workshop (ITA), 2018
There is growing evidence regarding the importance of spike timing in neural information processing, with even a small number of spikes carrying information, but computational models lag significantly behind those for rate coding. Experimental evidence on neuronal behavior is consistent with the dynamical and state dependent behavior provided by recurrent connections. This motivates the minimalistic abstraction investigated in this paper, aimed at providing insight into information encoding in spike timing via recurrent connections. We employ information-theoretic techniques for a simple reservoir model which encodes input spatiotemporal patterns into a sparse neural code, translating the polychronous groups introduced by Izhikevich into codewords on which we can perform standard vector operations. We show that the distance properties of the code are similar to those for (optimal) random codes. In particular, the code meets benchmarks associated with both linear classification and capacity, with the latter scaling exponentially with reservoir size.
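The distance properties benchmarked in this kind of analysis can be previewed with random sparse binary codewords: two words of equal weight k have even pairwise Hamming distance at most 2k, and random sparse codes typically maintain a large minimum distance, which is what makes them easy to classify. The reservoir size, codeword count, and sparsity below are hypothetical, and this sketch uses i.i.d. random words rather than polychronous groups:

```python
import numpy as np

rng = np.random.default_rng(4)
n_words, n_neurons, k = 50, 400, 20  # 50 codewords, 400 neurons, 20 active per word
# each input pattern activates a sparse random subset of the reservoir (hypothetical)
codes = np.zeros((n_words, n_neurons), dtype=int)
for w in range(n_words):
    codes[w, rng.choice(n_neurons, size=k, replace=False)] = 1

# pairwise Hamming distances between codewords
d = (codes[:, None, :] != codes[None, :, :]).sum(-1)
off = d[~np.eye(n_words, dtype=bool)]        # off-diagonal (distinct-pair) distances
print(off.min(), off.mean())  # large minimum distance -> codewords separate easily
```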