Parameter estimation in spiking neural networks: a reverse-engineering approach

Reverse-engineering in spiking neural networks parameters: exact deterministic parameters estimation

2010

Abstract: We consider the deterministic evolution of a time-discretized network with spiking neurons, where synaptic transmission has delays, modeled as a neural network of the generalized integrate-and-fire (gIF) type. The purpose is to study a class of algorithmic methods allowing one to calculate the proper parameters to reproduce exactly a given spike train, generated by a hidden (unknown) neural network. This standard problem is known to be NP-hard when delays are to be calculated. We propose here a reformulation, now ...
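For reference, the discrete-time gIF dynamics this line of work builds on can be written as below. This is a minimal sketch in our own notation (a standard BMS-type form; the paper's exact conventions may differ), with V_i the membrane potential of neuron i, Z_i in {0,1} its spike state, gamma_i a leak factor, theta the firing threshold, I_i an external input, and W_{ijd} the weight from neuron j to neuron i at delay d:

```latex
V_i[t] = \gamma_i \, V_i[t-1] \bigl(1 - Z_i[t-1]\bigr)
       + \sum_{j=1}^{N} \sum_{d=1}^{D} W_{ijd}\, Z_j[t-d] + I_i[t],
\qquad
Z_i[t] = \mathbf{1}\bigl[\, V_i[t] \ge \theta \,\bigr]
```

The reset factor (1 - Z_i[t-1]) and the delayed inputs Z_j[t-d] are all observable from the spike train, so V_i[t] is affine in the unknown weights; this is what turns exact reproduction of a given raster into a linear problem.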

Back-engineering of spiking neural networks parameters

BMC Neuroscience, 2009

We consider the deterministic evolution of a time-discretized spiking network of neurons with delayed connection weights, modeled as a discretized neural network of the generalized integrate-and-fire (gIF) type. The purpose is to study a class of algorithmic methods allowing one to calculate the proper parameters to reproduce exactly a given spike train generated by a hidden (unknown) neural network. This standard problem is known to be NP-hard when delays are to be calculated. We propose here a reformulation, now expressed as a Linear Programming (LP) problem, thus allowing an efficient resolution. This allows us to "back-engineer" a neural network, i.e. to find out, given a set of initial conditions, which parameters (connection weights, in this case) allow it to simulate the network spike dynamics. More precisely, we make explicit the fact that the back-engineering of a spike train is a Linear (L) problem if the membrane potentials are observed and an LP problem if only spike times are observed, with a gIF model. Numerical robustness is discussed. We also explain how it is the use of a generalized IF neuron model, instead of a leaky IF model, that allows us to derive this algorithm. Furthermore, we point out that the L or LP adjustment mechanism is local to each unit and has the same structure as a "Hebbian" rule. Going a step further, this paradigm generalizes easily to the design of input-output spike train transformations. This means that we have a practical method to "program" a spiking network, i.e. to find a set of parameters allowing us to exactly reproduce the network output, given an input. Numerical verifications and illustrations are provided.
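To make the reformulation concrete, here is a hypothetical sketch of the spike-times-only (LP) case in Python, using scipy.optimize.linprog. The function name, the margin trick for the strict "no spike" inequalities, and the feasibility-only objective are our own illustrative choices, not the paper's exact formulation:

```python
# Hypothetical sketch, not the paper's exact formulation.  Once the raster
# Z is observed, the gIF potential is affine in the unknown delayed weights,
# so "spike at t iff V >= theta" becomes a set of linear inequalities:
# one feasibility LP per post-synaptic neuron.
import numpy as np
from scipy.optimize import linprog

def estimate_weights(Z, gamma=0.8, theta=1.0, D=3, margin=1e-3):
    """Z: observed 0/1 spike raster, shape (N, T).  Returns W of shape
    (N, N, D), where W[i, j, d] is the weight from j to i at delay d+1.
    (External input current omitted for brevity.)"""
    N, T = Z.shape
    W = np.zeros((N, N, D))
    for i in range(N):
        A_ub, b_ub = [], []
        coef = np.zeros(N * D)  # V_i[t] = coef . w, w the flattened weights
        for t in range(D, T):
            # observed leak-and-reset, then delayed observed inputs
            coef = gamma * coef * (1.0 - Z[i, t - 1])
            for j in range(N):
                for d in range(D):
                    coef[j * D + d] += Z[j, t - 1 - d]
            if Z[i, t] == 1:   # must reach threshold: coef . w >= theta
                A_ub.append(-coef.copy()); b_ub.append(-theta)
            else:              # must stay below, with a strictness margin
                A_ub.append(coef.copy()); b_ub.append(theta - margin)
        res = linprog(np.zeros(N * D), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] * (N * D), method="highs")
        if res.success:
            W[i] = res.x.reshape(N, D)
    return W
```

Each observed time step contributes one inequality, and each neuron's LP involves only its own incoming weights, which is the sense in which the adjustment is local and Hebbian-like.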

Optimization Methods for Spiking Neurons and Networks

IEEE Transactions on Neural Networks, 2010

Spiking neurons and spiking neural circuits are finding uses in a multitude of tasks such as robotic locomotion control, neuroprosthetics, visual sensory processing, and audition. The desired neural output is achieved through the use of complex neuron models, or by combining multiple simple neurons into a network. In either case, a means for configuring the neuron or neural circuit is required. Manual manipulation of parameters is both time-consuming and non-intuitive due to the nonlinear relationship between parameters and the neuron's output. The complexity rises even further as the neurons are networked, and the systems often become mathematically intractable. In large circuits, the desired behavior and timing of action potential trains may be known while the timing of the individual action potentials is unknown and unimportant, whereas in single-neuron systems the timing of individual action potentials is critical. In this paper, we automate the process of finding parameters. To configure a single neuron, we derive a maximum-likelihood method for fitting a neuron model, specifically the Mihalas-Niebur neuron. Similarly, to configure neural circuits, we show how we use genetic algorithms (GAs) to configure parameters for a network of simple integrate-and-fire-with-adaptation neurons. The GA approach is demonstrated
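As an illustration of the network-level side, the following is a minimal real-valued genetic algorithm of the kind the abstract describes. The operator choices (truncation selection, uniform crossover, Gaussian mutation, elitism) and all names are ours, not the paper's; the fitness function is assumed to simulate the network and score how well its spike trains match the target:

```python
# Minimal real-valued GA sketch for fitting network parameters.
import numpy as np

rng = np.random.default_rng(0)

def evolve(fitness, n_params, pop_size=50, generations=200,
           mutation_scale=0.1, elite=5):
    """Return the best parameter vector found (higher fitness is better)."""
    pop = rng.normal(size=(pop_size, n_params))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        order = np.argsort(scores)[::-1]        # best first
        parents = pop[order[:pop_size // 2]]    # truncation selection
        children = []
        for _ in range(pop_size - elite):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(n_params) < 0.5, a, b)  # crossover
            child += rng.normal(scale=mutation_scale, size=n_params)
            children.append(child)
        pop = np.vstack([pop[order[:elite]], children])         # elitism
    scores = np.array([fitness(p) for p in pop])
    return pop[np.argmax(scores)]

# Hypothetical use, with simulate() and spike_train_distance() supplied by
# the user:  best = evolve(lambda p: -spike_train_distance(simulate(p),
#                          target), n_params=40)
```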

An FPGA-based approach for parameter estimation in spiking neural networks

2012

We present an FPGA-based approach for estimating the delayed synaptic weights of spiking neural networks. Our approach makes explicit use of the fact that reverse engineering of a spiking neural network can be cast as a linear programming problem, whereby the objective function is based on the network spiking activity. The solution is obtained by employing the widely used simplex algorithm. Numerical results on a Xilinx Spartan 3 FPGA board show that the present approach can be used to reproduce a ...

Spiking neural networks modelled as timed automata: with parameter learning

Natural Computing

In this paper we present a novel approach to automatically infer parameters of spiking neural networks. Neurons are modelled as timed automata waiting for inputs on a number of different channels (synapses) for a given amount of time (the accumulation period). When this period is over, the current potential value is computed considering current and past inputs. If this potential overcomes a given threshold, the automaton emits a broadcast signal over its output channel; otherwise it restarts another accumulation period. After each emission, the automaton remains inactive for a fixed refractory period. Spiking neural networks are formalised as sets of automata, one for each neuron, running in parallel and sharing channels according to the network structure. Such a model is formally validated against some crucial properties defined via suitable temporal logic formulae. The model is then exploited to find an assignment of the synaptic weights of neural networks such that they can reproduce a given behaviour. The core of this approach consists in identifying corrective actions that adjust synaptic weights and back-propagating them until the expected behaviour is displayed. A concrete case study is discussed.
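A rough operational reading of one such neuron, written as a plain discrete-time state machine rather than the paper's formal timed automata (which are verified in a model checker), might look as follows; all parameter names and the decay applied to past inputs are illustrative assumptions:

```python
# Sketch of one neuron as an accumulate/fire/refractory state machine.
from enum import Enum

class State(Enum):
    ACCUMULATE = 0
    REFRACTORY = 1

class AutomatonNeuron:
    def __init__(self, weights, threshold, window, refractory, decay=0.5):
        self.weights = weights        # one weight per input channel (synapse)
        self.threshold = threshold
        self.window = window          # accumulation period, in ticks
        self.refractory = refractory  # inactivity after an emission, in ticks
        self.decay = decay            # how much past periods' input counts
        self.state = State.ACCUMULATE
        self.clock = 0
        self.potential = 0.0

    def tick(self, inputs):
        """Advance one tick; `inputs` is a 0/1 vector over input channels.
        Returns True iff the neuron broadcasts a spike on this tick."""
        self.clock += 1
        if self.state is State.REFRACTORY:
            if self.clock >= self.refractory:
                self.state, self.clock = State.ACCUMULATE, 0
            return False
        self.potential += sum(w * x for w, x in zip(self.weights, inputs))
        if self.clock >= self.window:         # accumulation period is over
            self.clock = 0
            if self.potential >= self.threshold:
                self.state, self.potential = State.REFRACTORY, 0.0
                return True                   # broadcast over output channel
            self.potential *= self.decay      # past inputs weigh less next time
        return False
```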

Embedded neural controllers based on spiking neuron models

Pollack Periodica, 2009

This paper demonstrates that input patterns can be encoded in the synaptic weights by local Hebbian delay-learning of spiking neurons (SNs), where, after learning, the firing time of an output neuron reflects the distance of the evaluated pattern to its learned input pattern, thus realizing a kind of RBF behavior. Furthermore, the paper shows that temporal spike-time coding and Hebbian learning are a viable means for unsupervised computation in a network of SNs, as the network is capable of clustering realistic data. Two versions of an SNN, with and without embedded micro-controllers, are then implemented for the aforementioned task.
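The "firing time as distance" idea can be illustrated with a small, self-contained construction (ours, not the paper's exact model): one delayed connection per input and a Gaussian PSP kernel, so the output crosses threshold earliest when the input spike times line up with the learned delays:

```python
# Illustrative sketch: firing time of a delayed-coincidence neuron acts as
# a distance to the stored pattern (RBF-like behaviour).
import numpy as np

def firing_time(input_times, delays, threshold, width=1.0, t_max=50.0, dt=0.1):
    """First time the summed delayed PSPs cross `threshold` (inf if never)."""
    t = np.arange(0.0, t_max, dt)
    arrival = np.asarray(input_times) + np.asarray(delays)  # one spike per input
    v = np.exp(-((t[:, None] - arrival[None, :]) / width) ** 2).sum(axis=1)
    above = np.nonzero(v >= threshold)[0]
    return t[above[0]] if above.size else np.inf

delays = np.array([5.0, 3.0, 1.0])   # learned delays
stored = np.array([0.0, 2.0, 4.0])   # matching pattern: all PSPs arrive at t = 5
print(firing_time(stored, delays, threshold=2.5))                    # fires early
print(firing_time(stored + np.array([0.0, 3.0, 6.0]), delays, 2.5))  # later / never
```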

Designing spiking neural networks

2016 13th International Conference on Modern Problems of Radio Engineering, Telecommunications and Computer Science (TCSET), 2016

Design is the most important part of building complex systems, and this is also true for spiking neural networks. In this paper, the following steps of SNN design are described: coding, selecting a neuron model and learning algorithm, and creating the network architecture. Software and hardware solutions for simulating these networks are also discussed. We propose a range of directions for evolution and future study at every step of the design. Our recommendations are based on a detailed analysis of existing solutions and needs.

Introduction to spiking neural networks: Information processing, learning and applications

Acta neurobiologiae experimentalis, 2011

The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, the neural code is based on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, specifically, on models of spike-based information coding, synaptic plasticity and learning. We also survey real-life applications of spiking models. The paper is meant to be an introduction to spiking neural networks for scientists from various disciplines interested in spike-based neural processing.

Indisputable facts when implementing spiking neuron networks

2009

In this article, our wish is to demystify some aspects of coding with spike timing, through a simple review of well-understood technical facts regarding spike coding. The goal is to help the reader better understand to what extent computing and modelling with spiking neuron networks can be biologically plausible and computationally efficient. In this review, we intentionally restrict ourselves to deterministic dynamics.

An Adaptive Spiking Neural Network with

2011

This paper describes a numerical approach to simulating biologically plausible spiking neural networks. These are time-dependent neural networks with realistic models for the neurons (Hodgkin-Huxley). In addition, the learning is biologically plausible as well, being a Hebbian approach based on spike-timing-dependent plasticity (STDP). To make the approach very general and flexible, neurogenesis and synaptogenesis have been implemented, which allows the code to automatically add or remove neurons (or synapses) as required.
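For concreteness, a minimal pair-based STDP rule of the kind referenced here looks as follows; the exponential window is standard, but the specific amplitudes and time constants are illustrative, since the abstract does not give the paper's values:

```python
# Minimal pair-based STDP rule with the standard exponential window.
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                                  # pre before post: potentiate
        return a_plus * np.exp(-dt / tau_plus)
    return -a_minus * np.exp(dt / tau_minus)     # post before pre: depress
```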