Measures of statistical dispersion based on Entropy and Fisher information
Related papers
Entropy and local uncertainty of data from sensory neurons
Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2001
We present an empirical comparison between neural interspike interval sequences obtained from two different kinds of sensory receptors. Both differ in their internal structure as well as in the strength of correlations and the degree of predictability found in the respective spike trains. As a further tool in this context, we suggest the local uncertainty, which assigns a well-defined predictability to individual spikes. The local uncertainty is demonstrated to reveal significant patterns within the interspike interval sequences, even when their overall structure is (almost) random. Our approach is based on the concepts of symbolic dynamics and information theory.
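As a rough illustration of the idea, the sketch below symbolizes an ISI sequence around its median and assigns each spike the surprise of its symbol given the preceding context. This is only a simplified stand-in; the paper's actual partition of the intervals and choice of context length may differ.

```python
import numpy as np
from collections import defaultdict

def local_uncertainty(isi, order=3):
    """Assign a local uncertainty (in bits) to each spike from its ISI context.

    Minimal sketch: ISIs are symbolized as 0/1 around the median, the
    conditional distribution of the next symbol given the preceding
    `order` symbols is estimated by counting, and the local uncertainty
    of a spike is -log2 of the estimated probability of the symbol that
    actually occurred.
    """
    symbols = (np.asarray(isi) > np.median(isi)).astype(int)
    counts = defaultdict(lambda: np.zeros(2))
    for t in range(order, len(symbols)):
        counts[tuple(symbols[t - order:t])][symbols[t]] += 1

    u = np.full(len(symbols), np.nan)
    for t in range(order, len(symbols)):
        p = counts[tuple(symbols[t - order:t])]
        u[t] = -np.log2(p[symbols[t]] / p.sum())
    return u

# Example: a weakly patterned ISI sequence
rng = np.random.default_rng(0)
isi = rng.exponential(1.0, 2000) + 0.2 * np.sin(np.arange(2000) / 5)
print(np.nanmean(local_uncertainty(isi)))   # mean uncertainty per spike (bits)
```

Spikes that are well predicted by their recent past receive a low local uncertainty, while spikes occurring in rarely seen contexts receive a high one.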
Variability Measures of Positive Random Variables
PLoS ONE, 2011
During the stationary part of the neuronal spiking response, the stimulus can be encoded in the firing rate, but also in the statistical structure of the interspike intervals. We propose and discuss two information-based measures of statistical dispersion of the interspike interval distribution: the entropy-based dispersion and the Fisher information-based dispersion. The measures are compared with the frequently used concept of standard deviation. It is shown that the standard deviation is not well suited to quantify some aspects of dispersion that are often expected intuitively, such as the degree of randomness. The proposed dispersion measures are not entirely independent, though each describes the interspike intervals from a different point of view. The new methods are applied to common models of neuronal firing and to both simulated and experimental data.
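The following sketch contrasts the standard deviation with an entropy-based notion of spread on simulated ISI data. The histogram entropy estimator and the missing normalization constant are simplifications of my own, not the paper's exact dispersion coefficients.

```python
import numpy as np

def differential_entropy(x, bins=50):
    """Histogram-based plug-in estimate of differential entropy (nats).

    Rough sketch only; the paper uses more careful estimation and a specific
    normalization for its dispersion coefficients.
    """
    p, edges = np.histogram(x, bins=bins, density=True)
    widths = np.diff(edges)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]) * widths[nz])

rng = np.random.default_rng(1)
mean_isi = 1.0
exponential_isi = rng.exponential(mean_isi, 100_000)            # "maximally random" ISIs
regular_isi = rng.normal(mean_isi, 0.05 * mean_isi, 100_000)    # nearly periodic firing

for name, isi in [("exponential", exponential_isi), ("regular", regular_isi)]:
    sd = np.std(isi)
    h = differential_entropy(isi)
    # exp(h) grows with the effective support of the ISI density and is the
    # basis of the entropy-based dispersion (up to a constant factor).
    print(f"{name:12s}  SD = {sd:5.3f}   exp(h) = {np.exp(h):5.3f}")
```

The exponential and nearly periodic trains separate cleanly on both quantities; the abstract's point is that on less well-behaved ISI distributions the two can disagree, with the entropy-based measure tracking randomness rather than spread around the mean.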
Neuronal Data Analysis Based on the Empirical Cumulative Entropy
Lecture Notes in Computer Science, 2012
We propose the empirical cumulative entropy as a variability measure suitable to describe the information content in neuronal firing data. Some useful characteristics and an application to a real dataset are also discussed.
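A minimal implementation of the empirical cumulative entropy, built from the empirical CDF and the spacings of the order statistics, might look as follows (illustrative sketch, not the authors' code):

```python
import numpy as np

def empirical_cumulative_entropy(x):
    """Empirical cumulative entropy: -sum_j (j/n) log(j/n) * (X_(j+1) - X_(j)).

    Sketch of the standard estimator obtained by plugging the empirical CDF
    into the cumulative entropy -integral F(x) log F(x) dx.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n)          # ranks 1 .. n-1
    u = j / n                    # empirical CDF value on each spacing
    spacings = np.diff(x)        # order-statistic spacings X_(j+1) - X_(j)
    return -np.sum(u * np.log(u) * spacings)

rng = np.random.default_rng(2)
print(empirical_cumulative_entropy(rng.exponential(1.0, 5000)))
```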
Time resolution dependence of information measures for spiking neurons: scaling and universality
Frontiers in Computational Neuroscience, 2015
The mutual information between stimulus and spike-train response is commonly used to monitor neural coding efficiency, but neuronal computation broadly conceived requires more refined and targeted information measures of input-output joint processes. A first step towards that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate). We calculate these for spike trains generated by a variety of noise-driven integrate-and-fire neurons as a function of time resolution and for alternating renewal processes. We show that their time-resolution dependence reveals coarse-grained structural properties of interspike interval statistics; e.g., τ-entropy rates that diverge less quickly than the firing rate indicate interspike interval correlations. We also find evidence that the excess entropy and regularized statistical complexity of different types of integrate-and-fire neurons are universal in the continuous-time limit in the sense that they do not depend on mechanism details. This suggests a surprising simplicity in the spike trains generated by these model neurons. Interestingly, neurons with gamma-distributed ISIs and neurons whose spike trains are alternating renewal processes do not fall into the same universality class. These results lead to two conclusions. First, the dependence of information measures on time resolution reveals mechanistic details about spike train generation. Second, information measures can be used as model selection tools for analyzing spike train processes.
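To illustrate the kind of time-resolution dependence studied here, the sketch below bins a spike train at resolution τ and computes a crude plug-in word-entropy rate for several τ. The paper's estimators and information measures are considerably more refined; this is only a toy version.

```python
import numpy as np
from collections import Counter

def plugin_entropy_rate(spike_times, tau, word_len=8, t_max=None):
    """Crude plug-in estimate of the tau-entropy rate (bits per second).

    The spike train is binarized in bins of width tau, block entropies of
    words of length `word_len` are estimated by counting, and the rate is
    H(word_len) - H(word_len - 1) divided by tau.
    """
    t_max = t_max if t_max is not None else spike_times[-1]
    bins = (np.histogram(spike_times, bins=int(t_max / tau),
                         range=(0, t_max))[0] > 0).astype(int)

    def block_entropy(L):
        words = Counter(tuple(bins[i:i + L]) for i in range(len(bins) - L + 1))
        p = np.array(list(words.values()), dtype=float)
        p /= p.sum()
        return -np.sum(p * np.log2(p))

    return (block_entropy(word_len) - block_entropy(word_len - 1)) / tau

# Example: Poisson spike train at 20 Hz, entropy rate at several resolutions
rng = np.random.default_rng(3)
spikes = np.cumsum(rng.exponential(1 / 20, 20_000))
for tau in (0.01, 0.005, 0.002):
    print(f"tau = {tau:5.3f} s   h(tau) ~ {plugin_entropy_rate(spikes, tau):7.1f} bits/s")
```

For a Poisson train the τ-entropy rate grows roughly like the firing rate times log(1/τ) as τ shrinks; deviations from that scaling are the kind of signature of ISI correlations discussed in the abstract.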
Neurocomputing, 2003
We define a biological communication system at the level of a single neuron and quantify the temporal variability of afferent and efferent impulse patterns by means of an interval entropy measure. Two signal transmission conditions which bound a physiologically plausible range of transmission possibilities are explored. The number of efferent synapses is predicted by matching estimates of the mean efferent entropy to the total afferent entropy.
Estimating Information Rates with Confidence Intervals in Neural Spike Trains
Neural Computation, 2007
Information theory provides a natural set of statistics to quantify the amount of knowledge a neuron conveys about a stimulus. A related work (Kennel, Shlens, Abarbanel, & Chichilnisky, 2005) demonstrated how to reliably estimate, with a Bayesian confidence interval, the entropy rate from a discrete, observed time series. We extend this method to measure the rate of novel information that a neural spike train encodes about a stimulus—the average and specific mutual information rates. Our estimator makes few assumptions about the underlying neural dynamics, shows excellent performance in experimentally relevant regimes, and uniquely provides confidence intervals bounding the range of information rates compatible with the observed spike train. We validate this estimator with simulations of spike trains and highlight how stimulus parameters affect its convergence in bias and variance. Finally, we apply these ideas to a recording from a guinea pig retinal ganglion cell and compare resul...
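A much simpler stand-in for the idea of attaching an uncertainty interval to an entropy estimate is a percentile bootstrap over words, sketched below. Note that the paper's method is a Bayesian interval derived from a context-tree weighting model, which this toy version does not reproduce; it also ignores most temporal dependence by resampling words independently.

```python
import numpy as np
from collections import Counter

def word_entropy(bins, L):
    """Plug-in entropy (bits) of length-L words of a binary sequence."""
    words = [tuple(bins[i:i + L]) for i in range(len(bins) - L + 1)]
    p = np.array(list(Counter(words).values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p)), words

def bootstrap_entropy_ci(bins, L=6, n_boot=200, alpha=0.05, seed=0):
    """Rough percentile-bootstrap interval for the plug-in word entropy."""
    rng = np.random.default_rng(seed)
    h, words = word_entropy(bins, L)
    n = len(words)
    samples = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                      # resample words with replacement
        p = np.array(list(Counter(words[i] for i in idx).values()), dtype=float)
        p /= p.sum()
        samples.append(-np.sum(p * np.log2(p)))
    lo, hi = np.percentile(samples, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return h, lo, hi

rng = np.random.default_rng(4)
bins = (rng.random(20_000) < 0.1).astype(int)   # Bernoulli "spike train", 10% of bins occupied
print(bootstrap_entropy_ci(bins))
```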
From the Entropy to the Statistical Structure of Spike Trains
2006 IEEE International Symposium on Information Theory, 2006
We use statistical estimates of the entropy rate of individual spike train data in order to make inferences about the underlying structure of the spike train itself. We first examine a number of different parametric and nonparametric estimators (some known and some new), including the "plug-in" method, several versions of Lempel-Ziv-based compression algorithms, a maximum likelihood estimator tailored to renewal processes, and the natural estimator derived from the Context-Tree Weighting method (CTW). The theoretical properties of these estimators are examined, several new theoretical results are developed, and all estimators are systematically applied to various types of simulated data under different conditions.
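One of the estimator families mentioned, Lempel-Ziv-based compression, can be illustrated with the classic LZ76 phrase-counting scheme (Kaspar-Schuster algorithm). This is a generic sketch, not the specific algorithms evaluated in the paper.

```python
import numpy as np

def lz76_phrases(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of a 0/1 sequence
    (Kaspar-Schuster algorithm)."""
    n = len(s)
    i, k, l = 0, 1, 1
    k_max, c = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:              # no longer match found: close the phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def lz_entropy_rate(spike_times, tau, t_max=None):
    """Lempel-Ziv-based entropy-rate estimate (bits per second) of a spike
    train binarized at resolution tau: c(n) * log2(n) / n per bin."""
    t_max = t_max if t_max is not None else spike_times[-1]
    bins = (np.histogram(spike_times, bins=int(t_max / tau),
                         range=(0, t_max))[0] > 0).astype(int)
    n = len(bins)
    return lz76_phrases(list(bins)) * np.log2(n) / n / tau

rng = np.random.default_rng(5)
spikes = np.cumsum(rng.exponential(1 / 30, 2_000))   # short 30 Hz Poisson train (keeps the naive parsing fast)
print(lz_entropy_rate(spikes, tau=0.005))
```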
Correlated noise and memory effects in neural firing statistics
This paper discusses two problems at the forefront of neurobiology and of noise research. They arise from non-renewal firing processes in nerve cells, due to various forms of memory. The combination of short and long-term correlations between firing intervals has been shown to enhance information transfer, namely by causing a minimal variability in the spike count distribution at a specific counting time. The first problem concerns first passage time calculations in a model that combines these two forms of correlations. It is a two-dimensional leaky integrate-and-fire (LIF) model in which the threshold is also a dynamical variable. The second problem concerns the effect of long-range correlations on neuron firing statistics. We show new results on the interspike interval densities as well as the spike count Fano factor for the perfect integrate-and-fire (PIF) model forced by a slow (long-correlation time) Ornstein-Uhlenbeck process, which is a simplification of the previous model. These theoretical results are obtained using a quasi-static noise approximation. There remain, however, many exciting challenges in relating correlations with signal detection in neurobiological systems, some of which are highlighted in our paper.
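The second setup can be illustrated by simulating a perfect integrate-and-fire neuron driven by a slow Ornstein-Uhlenbeck input and computing the spike-count Fano factor over increasing counting windows. All parameters below are hypothetical and the simulation is a minimal sketch, not the paper's model or its quasi-static calculation.

```python
import numpy as np

def simulate_pif_ou(mu=20.0, theta=1.0, tau_ou=5.0, sigma=5.0,
                    dt=1e-3, t_total=1000.0, seed=0):
    """Perfect integrate-and-fire neuron driven by a slow Ornstein-Uhlenbeck input.

    Sketch: dV/dt = mu + eta(t), where eta is an OU process with correlation
    time tau_ou and stationary standard deviation sigma; a spike is emitted
    and V reset to 0 whenever V reaches the threshold theta.
    """
    rng = np.random.default_rng(seed)
    n_steps = int(t_total / dt)
    kicks = sigma * np.sqrt(2 * dt / tau_ou) * rng.standard_normal(n_steps)
    v, eta = 0.0, 0.0
    spike_times = []
    for i in range(n_steps):
        eta += -eta / tau_ou * dt + kicks[i]      # Euler-Maruyama step of the OU input
        v += (mu + eta) * dt                      # perfect (non-leaky) integration
        if v >= theta:
            spike_times.append(i * dt)
            v = 0.0
    return np.array(spike_times)

def fano_factor(spike_times, window, t_total):
    """Spike-count Fano factor Var(N)/E(N) over counting windows of a given length."""
    counts = np.histogram(spike_times, bins=int(t_total / window),
                          range=(0, t_total))[0]
    return counts.var() / counts.mean()

spikes = simulate_pif_ou()
for T in (0.1, 1.0, 10.0):
    print(f"window {T:5.1f} s   Fano factor = {fano_factor(spikes, T, 1000.0):.2f}")
```

Because the input fluctuations are slow, the spike counts become over-dispersed (Fano factor above 1) at long counting windows, which is the regime where a quasi-static treatment of the noise is natural.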