Martin Stemmler | Ludwig-Maximilians-Universität München
Papers by Martin Stemmler
BMC Neuroscience, 2011
When a rat explores its environment, grid cells in the medial entorhinal cortex show increased activity at specific locations that constitute a regular hexagonal grid. As the rat enters and progresses through one of these "grid fields" on a linear track, spikes occur at successively earlier phases of the LFP theta rhythm. This phenomenon is called phase precession. For rats foraging in two-dimensional environments, however, phase precession has not yet been quantified. Unlike on the linear track, a rat does not repeat the same path over and over again in an open arena, so pooling different runs to assess phase precession becomes fraught with difficulty. Instead, we analyze grid cell spike trains recorded by [1] on a run-by-run basis, and do the same for linear track data (from [2]). Surprisingly, even on the linear track, phase precession during single runs is stronger than the average phase precession in the pooled data. We show that a grid cell spike's theta phase allows one to estimate the animal's position on a linear track to within about 10% of the size of a typical grid field. The spectrum of grid cell spike trains recorded in a two-dimensional environment reveals a peak that is shifted by approximately 1 Hz relative to the peak in the LFP theta rhythm, indicative of robust phase precession. We investigate how spike phase depends on the radial distance from the grid field center and how phase precession depends on whether the path is transverse or tangential to the grid field, in order to create a model for decoding position in two dimensions from spike phases.
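The single-run decoding claim lends itself to a small numerical sketch. Assuming an idealized linear phase-position relationship within one grid field (the field size, phase slope, and phase noise below are illustrative placeholders, not parameters estimated from the data in [1] or [2]), one can invert spike phase to position and check the resulting error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not fitted values): a 40 cm grid field in which the
# preferred theta phase falls linearly from 360 deg at field entry to 0 deg at
# field exit, plus Gaussian noise on each spike phase.
FIELD_SIZE_CM = 40.0
PHASE_NOISE_SD_DEG = 40.0

def phase_from_position(x_cm):
    """Idealized phase precession: position within the field -> expected theta phase."""
    frac = np.clip(x_cm / FIELD_SIZE_CM, 0.0, 1.0)
    return 360.0 * (1.0 - frac)

def position_from_phase(phase_deg):
    """Invert the linear phase-position model to estimate position."""
    return FIELD_SIZE_CM * (1.0 - phase_deg / 360.0)

# Simulate single-run spikes at random positions in the field and decode them.
true_pos = rng.uniform(0.0, FIELD_SIZE_CM, size=1000)
phases = np.clip(phase_from_position(true_pos)
                 + rng.normal(0.0, PHASE_NOISE_SD_DEG, size=true_pos.size),
                 0.0, 360.0)
decoded = position_from_phase(phases)

rmse = np.sqrt(np.mean((decoded - true_pos) ** 2))
print(f"decoding error: {rmse:.1f} cm ({100 * rmse / FIELD_SIZE_CM:.0f}% of the field)")
```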
Nature Neuroscience, Jun 1, 1999
Information from the senses must be compressed into the limited range of responses that spiking neurons can generate. For optimal compression, the neuron's response should match the statistics of stimuli encountered in nature. Given a maximum firing rate, a nerve cell should learn to use each available firing rate equally often. Given a set mean firing rate, it should self-organize to respond with high firing rates only to comparatively rare events. Here we derive an unsupervised learning rule that continuously adapts membrane conductances of a Hodgkin-Huxley model neuron to optimize the representation of sensory information in the firing rate. Maximizing information transfer between the stimulus and the cell's firing rate can be interpreted as a non-Hebbian developmental mechanism.
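The optimality criterion here has a compact "histogram equalization" reading: with a ceiling on the firing rate, the most informative input-output curve is proportional to the cumulative distribution of the stimulus, so that every available rate is used equally often. A minimal numerical sketch of that idea (the log-normal stimulus distribution and the value of f_max are assumptions for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed stimulus ensemble and rate ceiling, chosen only for illustration.
f_max = 100.0
stimuli = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)

# With a maximum firing rate, the information-maximizing rate curve is
# f(s) = f_max * CDF(s): every rate bin is then used equally often.
sorted_stim = np.sort(stimuli)

def optimal_rate(s):
    cdf = np.searchsorted(sorted_stim, s, side="right") / sorted_stim.size
    return f_max * cdf

rates = optimal_rate(stimuli)

# Check: the resulting firing rates are (close to) uniformly distributed.
hist, _ = np.histogram(rates, bins=10, range=(0.0, f_max))
print(hist / hist.sum())   # roughly 0.1 in every bin
```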
The Neurobiology of Computation, 1995
Information from the senses must be compressed into the limited range of firing rates generated by spiking nerve cells. Optimal compression uses all firing rates equally often, implying that the nerve cell's response matches the statistics of naturally occurring stimuli. Since changing the voltage-dependent ionic conductances in the cell membrane alters the flow of information, an unsupervised, non-Hebbian, developmental learning rule is derived to adapt the conductances in Hodgkin-Huxley model neurons. By maximizing the rate of information transmission, each firing rate within the model neuron's limited dynamic range is used equally often.
Neural Computation, 1994
We investigate a model for neural activity in a two-dimensional sheet of leaky integrate-and-fire neurons with feedback connectivity consisting of local excitation and surround inhibition. Each neuron receives stochastic input from an external source, independent in space and time. As recently suggested by Koch (1992, 1993), independent stochastic input alone cannot explain the high interspike interval variability exhibited by cortical neurons in behaving monkeys. We show that high variability can be obtained due to the amplification of correlated fluctuations in a recurrent network. Furthermore, the cross-correlation functions have a dual structure, with a sharp peak on top of a much broader hill. This is due to the inhibitory and excitatory feedback connections, which cause "hotspots" of neural activity to form within the network. These localized patterns of excitation appear as clusters or stripes that coalesce, disintegrate, or fluctuate in size while simultaneously moving in a random walk constrained by the interaction with other clusters. The synaptic current impinging upon a single neuron shows large fluctuations at many time scales, leading to a large coefficient of variation (CV) for the interspike interval statistics. The power spectrum associated with single units shows a 1/f decay at small frequencies and is flat at higher frequencies, while the power spectrum of the spiking activity averaged over many cells (equivalent to the local field potential) shows no 1/f decay but a prominent peak around 40 Hz, in agreement with data recorded from cat and monkey cortex (Gray et al. 1990; Eckhorn et al. 1993). Firing rates exhibit self-similarity between 20 and 800 msec, resulting in 1/f-like noise, consistent with the fractal nature of neural spike trains (Teich 1992).
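A highly simplified simulation sketch of the kind of network described here (a small sheet, periodic boundaries, a difference-of-Gaussians feedback kernel, Euler integration; every parameter value is a placeholder chosen for illustration rather than taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch: 2D sheet of leaky integrate-and-fire neurons with local excitatory
# and broader inhibitory feedback, driven by noise that is independent across
# neurons and time.  All parameter values are illustrative placeholders.
N = 32                        # N x N sheet with periodic boundaries
dt, tau = 1.0, 20.0           # time step and membrane time constant (ms)
v_thresh, v_reset = 1.0, 0.0
steps = 2000

# Center-surround (difference-of-Gaussians) feedback kernel on circular offsets
offsets = np.arange(N, dtype=float)
d = np.minimum(offsets, N - offsets)               # wrapped 1D offset distance
dist2 = d[:, None] ** 2 + d[None, :] ** 2          # squared 2D offset distance
kernel = 1.2 * np.exp(-dist2 / (2 * 1.5 ** 2)) - 0.9 * np.exp(-dist2 / (2 * 4.0 ** 2))
kernel_fft = np.fft.rfft2(kernel)

v = rng.uniform(v_reset, v_thresh, size=(N, N))    # membrane potentials
spike_count = np.zeros((N, N))

for _ in range(steps):
    spikes = (v >= v_thresh).astype(float)
    spike_count += spikes
    v = np.where(spikes > 0, v_reset, v)

    # Recurrent drive: circular convolution of the spike pattern with the kernel
    recurrent = np.fft.irfft2(np.fft.rfft2(spikes) * kernel_fft, s=(N, N))
    external = rng.normal(0.05, 0.25, size=(N, N))  # independent stochastic input

    v += dt / tau * (v_reset - v) + recurrent + external

print("mean firing rate:", spike_count.mean() / (steps * dt), "spikes/ms")
```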
Encoding information about continuous variables using noisy computational units is a challenge; nonetheless, asymptotic theory shows that combining multiple periodic scales for coding can be highly precise despite the corrupting influence of noise (Mathis et al., Phys. Rev. Lett. 2012). Indeed, cortex seems to use such stochastic multi-scale periodic "grid codes" to represent position accurately. We show here how these codes can be read out without taking the asymptotic limit; even on short time scales, the precision of neuronal grid codes scales exponentially in the number N of neurons. Does this finding also hold for neurons that are not statistically independent? To assess the extent to which biological grid codes are subject to statistical dependencies, we analyze the noise correlations between pairs of grid code neurons in behaving rodents. We find that if the grids of the two neurons align and have the same length scale, the noise correlations between the neurons can reach 0.8. For increasing mismatches between the grids of the two neurons, the noise correlations fall rapidly. Incorporating such correlations into a population coding model reveals that the correlations lessen the resolution, but the exponential scaling of resolution with N is unaffected.
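A minimal sketch of the noise-correlation measure used for pairs of grid cells: subtract each cell's position-dependent mean response across repeated visits, then correlate the residual spike-count fluctuations. The synthetic counts below (aligned cosine tuning plus a shared noise source) are an illustrative stand-in for the recorded data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic spike counts for two cells with aligned periodic tuning and a
# shared noise source (illustrative placeholder for recorded data).
n_positions, n_visits = 50, 40
mean_rate_a = 5.0 + 4.0 * np.cos(np.linspace(0, 4 * np.pi, n_positions))
mean_rate_b = 5.0 + 4.0 * np.cos(np.linspace(0, 4 * np.pi, n_positions))

shared = rng.normal(0.0, 1.0, size=(n_positions, n_visits))   # shared fluctuations
counts_a = rng.poisson(np.clip(mean_rate_a[:, None] + 2.0 * shared, 0.1, None))
counts_b = rng.poisson(np.clip(mean_rate_b[:, None] + 2.0 * shared, 0.1, None))

# Noise correlation: remove the across-visit mean at each position ("signal"),
# then correlate the residual fluctuations of the two cells.
resid_a = (counts_a - counts_a.mean(axis=1, keepdims=True)).ravel()
resid_b = (counts_b - counts_b.mean(axis=1, keepdims=True)).ravel()
noise_corr = np.corrcoef(resid_a, resid_b)[0, 1]
print(f"noise correlation: {noise_corr:.2f}")
```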
Physical Review Letters, Dec 1, 2009
Starting from a general description of noisy limit cycle oscillators, we derive from the Fokker-Planck equations the linear response of the instantaneous oscillator frequency to a time-varying external force. We consider the time series of zero crossings of the oscillator's phase and compute the mutual information between it and the driving force. A direct link is established between the phase response curve summarizing the oscillator dynamics and the ability of a limit cycle oscillator, such as a heart cell or neuron, to encode information in the timing of peaks in the oscillation.
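A small simulation sketch of the setup the abstract describes: a noisy phase oscillator whose instantaneous frequency is modulated by an external force through its phase response curve (PRC), read out from the times of one phase crossing per cycle. The sinusoidal PRC, drive, and noise strength are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Noisy phase oscillator: dphi/dt = omega + Z(phi) * s(t) + noise
dt = 0.001                      # s
T = 20.0                        # total simulated time (s)
omega = 2 * np.pi * 5.0         # intrinsic frequency: 5 Hz
sigma = 0.5                     # phase noise strength

def prc(phi):                   # assumed phase response curve Z(phi)
    return 1.0 + np.cos(phi)

t = np.arange(0.0, T, dt)
force = 2.0 * np.sin(2 * np.pi * 0.5 * t)     # slow external driving force s(t)

phi = 0.0
crossings = []                  # times at which the phase completes a cycle
for i, ti in enumerate(t):
    phi += dt * (omega + prc(phi) * force[i]) + sigma * np.sqrt(dt) * rng.normal()
    if phi >= 2 * np.pi:
        crossings.append(ti)
        phi -= 2 * np.pi

# Instantaneous frequency estimated from successive cycle-crossing times
crossings = np.array(crossings)
inst_freq = 1.0 / np.diff(crossings)
print("mean cycle frequency (Hz):", inst_freq.mean())
```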
Science Advances, 2015
Mammalian grid cells fire when an animal crosses the points of an imaginary hexagonal grid tessellating the environment. We show how animals can navigate by reading out a simple population vector of grid cell activity across multiple spatial scales, even though neural activity is intrinsically stochastic. This theory of dead reckoning explains why grid cells are organized into discrete modules within which all cells have the same lattice scale and orientation. The lattice scale changes from module to module and should form a geometric progression with a scale ratio of around 3/2 to minimize the risk of making large-scale errors in spatial localization. Such errors should also occur if intermediate-scale modules are silenced, whereas knocking out the module at the smallest scale will only affect spatial precision. For goal-directed navigation, the allocentric grid cell representation can be readily transformed into the egocentric goal coordinates needed for planning movements. The go...
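The coarse-to-fine readout of such a modular code can be sketched in a few lines for the one-dimensional case: each module reports position modulo its lattice scale, scales form a geometric progression with ratio about 3/2, and each finer module refines the estimate within the ambiguity left by the coarser ones. The scale values and phase noise below are illustrative assumptions, not numbers from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# Lattice scales form a geometric progression (coarse to fine); each module
# reports position modulo its scale, as a noisy phase in cycles.
ratio = 1.5
scales = 200.0 / ratio ** np.arange(6)        # cm, illustrative values
phase_noise = 0.03                            # phase noise per module (cycles)

true_position = 137.2                         # cm, within the coarsest period
phases = (true_position / scales + rng.normal(0.0, phase_noise, scales.size)) % 1.0

# Coarse-to-fine readout: the coarsest module sets the first guess; each finer
# module picks the lattice branch closest to the running estimate.
estimate = phases[0] * scales[0]
for lam, ph in zip(scales[1:], phases[1:]):
    k = np.round(estimate / lam - ph)
    estimate = (k + ph) * lam

print(f"true {true_position:.1f} cm, decoded {estimate:.1f} cm")
```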
Frontiers in Computational Neuroscience, 2008
Frontiers in Computational Neuroscience
The Journal of Neuroscience : The Official Journal of the Society for Neuroscience
Despite their simple auditory systems, some insect species recognize certain temporal aspects of acoustic stimuli with an acuity equal to that of vertebrates; however, the underlying neural mechanisms and coding schemes are only partially understood. In this study, we analyze the response characteristics of the peripheral auditory system of grasshoppers with special emphasis on the representation of species-specific communication signals. We use both natural calling songs and artificial random stimuli designed to focus on two low-order statistical properties of the songs: their typical time scales and the distribution of their modulation amplitudes.
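A minimal sketch of how random stimuli with controlled low-order statistics of this kind can be constructed: low-pass filtering white noise sets the modulation time scale, and rank-mapping the result through a target quantile function imposes the desired amplitude distribution. The time scale, duration, and exponential target distribution are assumptions for illustration, not the parameters used in the study:

```python
import numpy as np

rng = np.random.default_rng(6)

dt = 0.001                       # s
duration = 10.0                  # s
tau_corr = 0.02                  # desired modulation time scale: 20 ms (assumed)

n = int(duration / dt)
alpha = dt / tau_corr
white = rng.normal(size=n)

# Exponential low-pass filter (Ornstein-Uhlenbeck style) sets the correlation time
x = np.empty(n)
x[0] = white[0]
for i in range(1, n):
    x[i] = (1 - alpha) * x[i - 1] + np.sqrt(2 * alpha) * white[i]

# Impose the target amplitude distribution by rank-mapping (here: exponential),
# which preserves the temporal ordering of the filtered noise.
ranks = x.argsort().argsort()
quantiles = (ranks + 0.5) / n
amplitude = -np.log(1.0 - quantiles)          # inverse CDF of an Exp(1) target

print("mean amplitude:", amplitude.mean(), "std:", amplitude.std())
```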