Simulation of the neurodynamic system of working memory

Biological Neuronal Networks, Modeling of

In recent decades, since the seminal work of A. L. Hodgkin and A. F. Huxley (1), the study of the dynamical phenomena emerging in a network of biological neurons has been approached by means of mathematical descriptions, computer simulations (2, 3), and neuromorphic electronic hardware implementations (4). Several models have been proposed in the literature, and a large class of them share similar qualitative features.

Associative memory in networks of spiking neurons

Neural Networks, 2001

Here, we develop and investigate a computational model of a network of cortical neurons on the basis of biophysically well constrained and tested two-compartmental neurons developed by Pinsky and Rinzel [Pinsky, P. F., & Rinzel, J. (1994). Intrinsic and network rhythmogenesis in a reduced Traub model for CA3 neurons. Journal of Computational Neuroscience, 1, 39–60]. To study associative memory, we connect a pool of cells by a structured connectivity matrix. The connection weights are shaped by simple Hebbian coincidence learning using a set of spatially sparse patterns. We study the neuronal activity processes following an external stimulation of a stored memory. In two series of simulation experiments, we explore the effect of different classes of external input: tonic and flashed stimulation. With tonic stimulation, the addressed memory is an attractor of the network dynamics. The memory is displayed rhythmically, coded by phase-locked bursts or regular spikes. The participating neurons have rhythmic activity in the gamma-frequency range (30–80 Hz). If the input is switched from one memory to another, the network activity can follow this change within one or two gamma cycles. Unlike similar models in the literature, we studied the range of high memory capacity (on the order of 0.1 bit/synapse), comparable to optimally tuned formal associative networks. We explored the robustness of efficient retrieval, varying the memory load, the excitation/inhibition parameters, and background activity. A stimulation pulse applied to the identical simulation network can push away ongoing network activity and trigger a phase-locked association event within one gamma period. Unlike under tonic stimulation, the memories are not attractors. After one association process, the network activity moves on to other states. By applying pulses addressing different memories in close succession, one can switch through the space of memory patterns.
The readout speed can be increased up to the point where a different pattern is displayed in every gamma cycle. With pulsed stimulation, bursts become relevant for coding; their occurrence can be used to discriminate relevant processes from background activity.
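The Hebbian coincidence rule on spatially sparse binary patterns described above is essentially a Willshaw-type clipped synaptic matrix, which is also the construction behind capacities on the order of 0.1 bit/synapse. A minimal sketch of that storage and one-step threshold retrieval (network size, pattern statistics, and the thresholding rule are illustrative assumptions, not the paper's actual spiking network):

```python
import random

random.seed(0)
N, M, K = 100, 20, 5          # neurons, stored patterns, active units per pattern

# Sparse binary patterns: each pattern activates K of the N units.
patterns = [set(random.sample(range(N), K)) for _ in range(M)]

# Willshaw-style Hebbian coincidence learning: a binary synapse i->j is
# switched on if units i and j are ever coactive in a stored pattern.
W = [[0] * N for _ in range(N)]
for p in patterns:
    for i in p:
        for j in p:
            W[i][j] = 1

def retrieve(cue):
    """One-step threshold retrieval from a partial cue."""
    drive = [sum(W[i][j] for i in cue) for j in range(N)]
    theta = len(cue)               # require input from every cue unit
    return {j for j in range(N) if drive[j] >= theta}

cue = set(list(patterns[0])[:3])   # partial cue: 3 of the 5 active units
print(retrieve(cue) >= patterns[0])  # True: the full pattern is recovered
```

With a low memory load the retrieved set equals the stored pattern; as the load grows, spurious units appear, which is the capacity trade-off the abstract refers to.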

Model for a neural network structure and signal transmission

We present a model of a neural network that is based on the diffusion-limited-aggregation (DLA) structure from fractal physics. A single neuron is one DLA cluster, while a large number of clusters, in an interconnected fashion, make up the neural network. Using simulation techniques, a signal is randomly generated and traced through its transmission inside the neuron and from neuron to neuron through the synapses. The activity of the entire neural network is monitored as a function of time. The characteristics included in the model comprise, among others, the threshold for firing, the excitatory or inhibitory character of the synapse, the synaptic delay, and the refractory period. The system activity results in "noisy" time series that exhibit an oscillatory character. Standard power spectra are evaluated and fractal analyses performed, showing that the system is not chaotic, but the varying parameters can be associated with specific values of fractal dimensions. It is found that the network activity is not linear in the system parameters, e.g., in the number of active synapses. The details of this behavior may have interesting repercussions from the neurological point of view.
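The DLA structure used for each model neuron can be grown by the standard random-walker aggregation procedure from fractal physics. The sketch below is a generic on-lattice DLA toy (lattice size, particle count, and launch geometry are arbitrary choices, not taken from the paper):

```python
import random

random.seed(1)

def grow_dla(n_particles=40, radius=10):
    """Grow a diffusion-limited aggregate on a square lattice.

    Walkers are launched from a ring around the seed and stick when they
    step next to an occupied site (a toy stand-in for the DLA-shaped
    model neuron).
    """
    cluster = {(0, 0)}                      # seed site
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    launch = [(radius, 0), (-radius, 0), (0, radius), (0, -radius)]
    while len(cluster) < n_particles:
        x, y = random.choice(launch)        # launch a new walker
        while True:
            dx, dy = random.choice(moves)
            x, y = x + dx, y + dy
            if abs(x) > 3 * radius or abs(y) > 3 * radius:
                break                       # wandered off: relaunch
            if any((x + mx, y + my) in cluster for mx, my in moves):
                cluster.add((x, y))         # stick next to the aggregate
                break
    return cluster

cluster = grow_dla()
print(len(cluster))  # 40 occupied sites
```

The branched, dendrite-like shape emerges because incoming walkers are far more likely to stick to protruding tips than to reach screened interior sites.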

Associative memory in a simple model of oscillating cortex

Advances in neural information processing systems 2, 1990

A generic model of oscillating cortex, which assumes "minimal" coupling justified by known anatomy, is shown to function as an associative memory, using previously developed theory. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long-range excitatory connections. Using a local Hebb-like learning rule for primary and higher-order synapses at the ends of the long-range connections, the system learns to store the kinds of oscillation amplitude patterns observed in olfactory and visual cortex. This rule is derived from a more general "projection algorithm" for recurrent analog networks, which analytically guarantees content-addressable memory storage of continuous periodic sequences (capacity: N/2 Fourier components for an N-node network) with no "spurious" attractors.
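A common minimal rendering of such oscillatory associative memories is a network of phase oscillators with Hebbian couplings, in which in-phase versus anti-phase locking plays the role of the binary pattern states. The sketch below stores a single ±1 pattern with the outer-product rule and lets Kuramoto-type dynamics retrieve it (network size, time step, and the single-pattern restriction are assumptions for illustration, not the paper's coupled excitatory-inhibitory oscillators):

```python
import math
import random

random.seed(2)
N = 40
xi = [random.choice([-1, 1]) for _ in range(N)]   # one stored +/-1 pattern

# Hebbian couplings: J[i][j] = xi_i * xi_j / N (outer-product rule)
J = [[xi[i] * xi[j] / N for j in range(N)] for i in range(N)]

# Kuramoto-type phase dynamics: units lock in-phase or in anti-phase,
# and the locking pattern encodes the stored memory.
theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]
dt = 0.1
for _ in range(400):
    dtheta = [sum(J[i][j] * math.sin(theta[j] - theta[i]) for j in range(N))
              for i in range(N)]
    theta = [t + dt * d for t, d in zip(theta, dtheta)]

# Read out the pattern from phase differences relative to unit 0
readout = [1 if math.cos(t - theta[0]) > 0 else -1 for t in theta]
overlap = abs(sum(r * x for r, x in zip(readout, xi))) / N
print(overlap)  # close to 1: the phase pattern matches the stored memory
```

The phases relax to a state where units with equal pattern bits oscillate in phase and units with opposite bits in anti-phase, which is the oscillatory analogue of a fixed-point memory.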

Neural network modeling of associative memory: Beyond the Hopfield model

Physica A: Statistical Mechanics and its Applications, 1992

A number of neural network models, in which fixed-point and limit-cycle attractors of the underlying dynamics are used to store and associatively recall information, are described. In the first class of models, a hierarchical structure is used to store an exponentially large number of strongly correlated memories. The second class of models uses limit cycles to store and retrieve individual memories. A neurobiologically plausible network that generates low-amplitude periodic variations of activity, similar to the oscillations observed in electroencephalographic recordings, is also described. Results obtained from analytic and numerical studies of the properties of these networks are discussed.
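The fixed-point attractor mechanism referred to here is the classic Hopfield construction: Hebbian outer-product weights make each stored pattern a stable state that attracts corrupted versions of itself. A minimal sketch (network size, pattern count, and noise level are illustrative):

```python
import random

random.seed(3)
N, P = 64, 3
patterns = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(P)]

# Hebbian outer-product rule with zero self-coupling
W = [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / N
      for j in range(N)] for i in range(N)]

def recall(state, steps=5):
    """Synchronous sign updates until the state settles on an attractor."""
    for _ in range(steps):
        state = [1 if sum(W[i][j] * state[j] for j in range(N)) >= 0 else -1
                 for i in range(N)]
    return state

# Corrupt 10 bits of the first memory, then let the dynamics clean it up
noisy = list(patterns[0])
for i in random.sample(range(N), 10):
    noisy[i] = -noisy[i]
print(recall(noisy) == patterns[0])  # the stored memory is restored
```

At this low memory load (3 patterns in 64 units, well below the ~0.14N capacity limit) retrieval is essentially perfect; the models surveyed in the abstract extend exactly this scheme with hierarchies and limit cycles.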

Pattern retrieval in a three-layer oscillatory network with a context dependent synaptic connectivity

Neural Networks, 2012

We propose a network solution for memory pattern retrieval in an oscillatory network based on a context-dependent Hebbian connectivity. The model is composed of three interacting layers of spiking neurons with excitatory and inhibitory synaptic connections. Information patterns are stored in the memory using a symmetric Hebbian matrix and can be retrieved in response to a definite stimulus pattern. The patterns are encoded as distributions of phases of the oscillatory network units. We include in the network architecture an intermediate layer of excitable (non-oscillatory) interneurons. This layer provides a kind of pre-processing by filtering the in-phase or the anti-phase components of the input pattern. Then, only a part of the Hebbian connections defined by the input (a "context-dependent connectivity") is further used for the memory retrieval. Being supplied with an oscillatory clock signal, the interneurons drive the signal propagation pathways in the feedforward architecture and, hence, reduce the number of effective connections needed for the retrieval. The oscillation phase stability problem for the in-phase and anti-phase locking modes is investigated. Information characteristics and efficiency of the context-dependent retrieval are discussed and compared with traditional oscillatory associative memory models.
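The "context-dependent connectivity" idea — the input selecting which subset of the stored Hebbian connections participates in retrieval — can be sketched as a simple mask over the symmetric Hebbian matrix. Binary patterns and the gating rule below are toy assumptions, not the paper's three-layer spiking architecture:

```python
import random

random.seed(4)
N = 30
patterns = [[random.choice([0, 1]) for _ in range(N)] for _ in range(5)]

# Symmetric Hebbian matrix over binary patterns
W = [[sum(p[i] * p[j] for p in patterns) for j in range(N)] for i in range(N)]

def effective_connections(cue):
    """Context-dependent connectivity: only synapses between units that the
    input flags as active take part in retrieval (a toy rendering of the
    gating performed by the interneuron layer)."""
    active = [i for i in range(N) if cue[i]]
    return [(i, j) for i in active for j in active if i < j and W[i][j] > 0]

full = sum(1 for i in range(N) for j in range(i + 1, N) if W[i][j] > 0)
ctx = len(effective_connections(patterns[0]))
print(ctx <= full)  # True: gating never uses more synapses than the full matrix
```

The point of the construction is exactly this reduction: retrieval runs over the context-selected subgraph rather than the whole stored matrix.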

Model of neural circuit comparing static and adaptive synapses

Prague medical report, 2004

Replacing static synapses with adaptive ones can affect the behaviour of a neuronal network. Several network setups containing synapses modelled by alpha-functions, called here static synapses, are compared with corresponding setups containing more complex, dynamic synapses. The dynamic synapses have four state variables, and their time constants are of different orders of magnitude. The response of the network to modelled stimulations was studied together with the effects of neuronal interconnectivity, axonal delays, and the proportion of excitatory and inhibitory neurons on the network output. The dependency of synaptic strength on synaptic activity was also studied. We found that dynamic synapses enable the network to exhibit a broader spectrum of responses to a given input and make the network more sensitive to changes of network parameters. As a step towards memory modelling, the retention of input sequences in the network with static and dynamic synapses was studied. The network with dynamic s...
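The contrast between the two synapse classes can be sketched directly: the alpha function is the abstract's "static synapse", while a minimal Tsodyks-Markram-style depressing synapse stands in for the adaptive one (the paper's actual dynamic synapse has four state variables; all parameter values here are illustrative):

```python
import math

def alpha_psp(t, tau=2.0, g_max=1.0):
    """Static alpha-function synapse: g(t) = g_max * (t/tau) * exp(1 - t/tau).
    The response to every spike is identical, peaking at t = tau with g_max."""
    if t < 0:
        return 0.0
    return g_max * (t / tau) * math.exp(1.0 - t / tau)

def depressing_synapse(spike_times, tau_rec=100.0, U=0.5):
    """Minimal depressing synapse: each spike consumes a fraction U of the
    resource x, which recovers towards 1 with time constant tau_rec.
    Returns the effective amplitude of each spike in the train."""
    x, last_t, amps = 1.0, None, []
    for t in spike_times:
        if last_t is not None:
            x = 1.0 - (1.0 - x) * math.exp(-(t - last_t) / tau_rec)
        amps.append(U * x)
        x -= U * x
        last_t = t
    return amps

print(round(alpha_psp(2.0), 6))  # 1.0: static response always peaks at g_max
amps = depressing_synapse([0.0, 10.0, 20.0, 30.0])
print(all(b < a for a, b in zip(amps, amps[1:])))  # True: amplitudes depress
```

The static synapse responds identically to every spike, whereas the adaptive one makes the response history-dependent, which is what broadens the network's response repertoire in the study.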

Oscillatory activity in excitable neural systems

Contemporary Physics, 2010

The brain is a complex system and exhibits various subsystems on different spatial and temporal scales. These subsystems are recurrent networks of neurons or populations that interact with each other. Single neurons are microscopic objects and evolve on a different time scale than macroscopic neural populations. To understand the dynamics of the brain, however, it is necessary to understand the dynamics of the brain network on both the microscopic and the macroscopic level, as well as the interaction between the levels. The presented work introduces the major properties of single neurons and their interactions. The physical aspects of some standard mathematical models are discussed in some detail. The work shows that both single neurons and neural populations are excitable in the sense that small differences in an initial short stimulation may yield very different dynamical behavior of the system. To illustrate the power of the neural population model discussed, the work applies the model to explain experimental activity in the delayed feedback system in weakly electric fish and in the electroencephalogram (EEG).
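Excitability in the sense used here — small differences in a brief initial stimulus producing qualitatively different trajectories — is conveniently illustrated with the FitzHugh-Nagumo model, a standard excitable toy system (not the specific neuron or population model of the article; pulse amplitudes and parameters are illustrative):

```python
def fhn_peak(i_pulse, a=0.7, b=0.8, eps=0.08, dt=0.05, steps=2000):
    """FitzHugh-Nagumo neuron driven by a brief current pulse.

    Integrates dv/dt = v - v^3/3 - w + I, dw/dt = eps*(v + a - b*w) with
    forward Euler and returns the peak of the membrane variable v.
    """
    v, w = -1.2, -0.62          # near the resting state
    peak = v
    for n in range(steps):
        I = i_pulse if n * dt < 2.0 else 0.0     # brief stimulation pulse
        dv = v - v ** 3 / 3 - w + I
        dw = eps * (v + a - b * w)
        v, w = v + dt * dv, w + dt * dw
        peak = max(peak, v)
    return peak

sub = fhn_peak(0.2)    # sub-threshold pulse: small, decaying response
supra = fhn_peak(0.8)  # supra-threshold pulse: full spike excursion
print(supra - sub > 1.0)  # True: the stronger pulse triggers a full spike
```

The all-or-none gap between the two responses is the signature of excitability that the article shows for single neurons and, on the macroscopic level, for neural populations.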