Investigating the computational power of spiking neurons with non-standard behaviors
Related papers
Computing with Spiking Neuron Networks A Review
Spiking Neuron Networks (SNNs) are often referred to as the third generation of neural networks. Strongly inspired by natural computation in the brain and by recent advances in neuroscience, they derive their strength and interest from an accurate modeling of synaptic interactions between neurons that takes into account the timing of spike firing. SNNs exceed the computational power of neural networks made of threshold or sigmoidal units. Based on dynamic event-driven processing, they open up new horizons for developing models with an exponential capacity for memorization and a strong ability for fast adaptation. Today, the main challenge is to discover efficient learning rules that take advantage of the specific features of SNNs while keeping the nice properties (general-purpose, easy-to-use, available simulators, etc.) of traditional connectionist models. This paper presents the history of the "spiking neuron", summarizes the models of neurons and synaptic plasticity most in use today, addresses the computational power of SNNs, and tackles the problem of learning in networks of spiking neurons.
Buletinul Institutului Politehnic din Iaşi, 2022
Spiking neural networks (SNNs) are inspired by natural computation, modelling with high accuracy the synaptic interactions and processes between neurons while focusing on low response time and energy efficiency. This novel paradigm of event-based processing opens new opportunities for discovering applications and developing efficient learning methods that highlight the advantages of SNNs, such as large memory capacity and fast adaptation, while preserving the ease of use and portability of conventional computing architectures. In this paper, we briefly review the developments of the past decades in the field of SNNs. We start with a brief history of the SNN and summarize the most common models of spiking neurons and methods to implement synaptic plasticity. We also classify SNNs according to the implemented learning rules and network topology. We present the computational advantages, liabilities, and applications suitable for SNNs in terms of energy efficiency and response time. In addition, we briefly sweep through the existing platforms and simulation frameworks for SNNs.
Biologically inspired features in spiking neural networks
Neural networks have the power to deal with information that is very hard to process using ordinary approaches, e.g. speech recognition. A recent trend in applying neural networks is to use biologically realistic neuron models. Specifically, neurons are considered which communicate with discrete pulses instead of continuous signals: spiking neurons. In this paper we take a small selection of properties found in biological neurons and investigate their effect on the general computational performance of spiking neural networks (SNNs). Firstly, we investigated the way in which the internal dynamics of the neurons and delayed communication improve the ability to recognize temporal patterns. Secondly, we explored an unsupervised adaptation rule that helps to distribute the work equally over all the neurons in the network, so that all neurons are involved in the task they are supposed to solve. It turned out that these biologically inspired features often improved performance on the tasks investigated.
This article presents a very simple and effective analog spiking neural network simulator, realized with an event-driven method and taking into account a basic biological neuron parameter: the spike latency. Other fundamental biological parameters are also considered, such as subthreshold decay and the refractory period. This model allows the synthesis of neural groups able to carry out some substantial functions. The proposed simulator is applied to elementary structures, for which some properties and interesting applications are discussed, such as the realization of a Spiking Neural Network Classifier.
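The basic ingredients named in this abstract (subthreshold decay toward rest, a firing threshold, and a refractory period) can be sketched with a clock-driven leaky integrate-and-fire update. This is a plain time-stepped loop, not the article's event-driven simulator, and all parameter values are illustrative assumptions:

```python
# Minimal clock-driven sketch of a leaky integrate-and-fire neuron with
# subthreshold decay and a refractory period. Illustrative parameters only;
# this is not the event-driven simulator described in the article.

def simulate_lif(inputs, dt=1.0, tau=20.0, v_thresh=1.0,
                 v_reset=0.0, refractory=5.0):
    """Return spike times for an input-current trace (one value per dt)."""
    v = v_reset
    refrac_until = -1.0
    spikes = []
    for step, i_in in enumerate(inputs):
        t = step * dt
        if t < refrac_until:
            continue  # membrane is clamped during the refractory period
        # Euler step: exponential leak toward rest, plus the input current
        v += dt * (-(v - v_reset) / tau + i_in)
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
            refrac_until = t + refractory
    return spikes

# A constant suprathreshold input produces regularly spaced spikes.
spike_times = simulate_lif([0.3] * 100)
```

With a constant input the neuron charges, fires, resets, and sits out its refractory window, so consecutive spikes are spaced by at least the refractory period.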
The Simulation of Spiking Neural Networks
Technologies and Applications
This chapter is an overview of the simulation of spiking neural networks that relates discrete event simulation to other approaches and includes a case study of recent work. The chapter starts with an introduction to the key components of the brain and sets out three neuron models that are commonly used in simulation work. After explaining discrete event, continuous and hybrid simulation, the performance of each method is evaluated and recent research is discussed. To illustrate the issues surrounding this work, the second half of this chapter presents a case study of the SpikeStream neural simulator that covers the architecture, performance and typical applications of this software along with some recent experiments. The last part of the chapter suggests some future trends for work in this area.
Spiking Neural Networks for Computational Intelligence: An Overview
Big Data and Cognitive Computing
Deep neural networks with rate-based neurons have exhibited tremendous progress in the last decade. However, the same level of progress has not been observed in research on spiking neural networks (SNNs), despite their capability to handle temporal data, their energy efficiency, and their low latency. This could be because the benchmarking techniques for SNNs are based on the methods used for evaluating deep neural networks, which do not provide a clear evaluation of the capabilities of SNNs. In particular, benchmarking SNN approaches with regard to energy efficiency and latency requires realization in suitable hardware, which imposes additional temporal and resource constraints upon ongoing projects. This review aims to provide an overview of the current real-world applications of SNNs and identifies steps to accelerate research involving SNNs in the future.
A Review of Biologically Plausible Neuron Models for Spiking Neural Networks
AIAA Infotech@Aerospace 2010, 2010
In this paper, five mathematical models of single neurons are discussed and compared. The physical meanings, derivations, and differential equations of each model are provided. Since for many applications the spiking rates of neurons are of great importance, we compare the spiking rate patterns under different sustained current inputs. Numerical stability and accuracy are also considered. The computational cost and storage requirements needed to numerically solve each of the models are also discussed.
Simple and complex spiking neurons: perspectives and analysis in a simple STDP scenario
Cornell University - arXiv, 2022
Spiking neural networks (SNNs) are largely inspired by biology and neuroscience and leverage ideas and theories to create fast and efficient learning systems. Spiking neuron models are adopted as core processing units in neuromorphic systems because they enable event-based processing. Among many neuron models, the integrate-and-fire (I&F) models are often adopted, with the simple Leaky I&F (LIF) being the most used. The reason for adopting such models is their efficiency and/or biological plausibility. Nevertheless, the choice of LIF over other neuron models for use in artificial learning systems has not yet been rigorously justified. This work considers various neuron models in the literature and then selects computational neuron models that are single-variable, efficient, and display different types of complexity. From this selection, we make a comparative study of three simple I&F neuron models, namely the LIF, the Quadratic I&F (QIF) and the Exponential I&F (EIF), to understand whether the use of more complex models increases the performance of the system and whether the choice of a neuron model can be directed by the task to be completed. The neuron models are tested within an SNN trained with Spike-Timing Dependent Plasticity (STDP) on a classification task on the N-MNIST and DVS Gestures datasets. Experimental results reveal that the more complex neurons manifest the same ability as simpler ones to achieve high accuracy on a simple dataset (N-MNIST), albeit requiring comparatively more hyper-parameter tuning. However, when the data possess richer spatio-temporal features, the QIF and EIF neuron models steadily achieve better results. This suggests that selecting the model according to the richness of the feature spectrum of the data could improve the whole system's performance. Finally, the code implementing the spiking neurons in the SpykeTorch framework is made publicly available.
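The three models compared in this paper differ essentially in the nonlinearity f(v) driving the subthreshold membrane dynamics, dv/dt = f(v) + I. A minimal sketch of the three variants, with arbitrary illustrative parameters rather than those used in the paper:

```python
import math

# Subthreshold dynamics dv/dt = f(v) + I for the three I&F variants.
# V_REST, V_CRIT, and DELTA_T are illustrative values, not the paper's.
V_REST, V_CRIT, DELTA_T = 0.0, 0.5, 0.2

def f_lif(v):
    """Leaky I&F: linear decay toward the resting potential."""
    return -(v - V_REST)

def f_qif(v):
    """Quadratic I&F: quadratic nonlinearity between rest and a critical voltage."""
    return (v - V_REST) * (v - V_CRIT)

def f_eif(v):
    """Exponential I&F: leak plus an exponential spike-initiation term."""
    return -(v - V_REST) + DELTA_T * math.exp((v - V_CRIT) / DELTA_T)

def euler_step(f, v, i_in, dt=0.01):
    """One forward-Euler step of dv/dt = f(v) + I."""
    return v + dt * (f(v) + i_in)
```

The qualitative difference is visible directly: the LIF always relaxes back to rest without input, while the QIF and EIF have a critical voltage above which the membrane potential escapes upward on its own, mimicking spike initiation.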
Spiking Neural Networks and Their Applications: A Review
Brain Sciences
The past decade has witnessed the great success of deep neural networks in various domains. However, deep neural networks are very resource-intensive in terms of energy consumption, data requirements, and high computational costs. With the recent increasing need for the autonomy of machines in the real world, e.g., self-driving vehicles, drones, and collaborative robots, exploitation of deep neural networks in those applications has been actively investigated. In those applications, energy and computational efficiencies are especially important because of the need for real-time responses and the limited energy supply. A promising solution to these previously infeasible applications has recently been given by biologically plausible spiking neural networks. Spiking neural networks aim to bridge the gap between neuroscience and machine learning, using biologically realistic models of neurons to carry out the computation. Due to their functional similarity to biological neural networks…
Mathematics, 2022
The main organ required by the biological system is the Central Nervous System (CNS), which can influence the other basic organs in the human body. The basic elements of this important organ are neurons, synapses, and glia (such as astrocytes, which make up the highest percentage of glia in the human brain). Investigation, modeling, simulation, and hardware implementation (realization) of different parts of the CNS are important for achieving a comprehensive neuronal system capable of emulating all aspects of the real nervous system. This paper uses a basic neuron model called the Izhikevich neuronal model to achieve a faithful copy of the primary nervous block, capable of regenerating the behaviors of the human brain. The proposed approach can regenerate all aspects of the Izhikevich neuron with a high degree of similarity and performance. The new model is based on Look-Up Table (LUT) modeling of the mathematical neuromorphic systems, which can be realized in a high degree…
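For reference, the Izhikevich model mentioned above is a two-variable system (membrane potential v and recovery variable u). The sketch below uses the standard regular-spiking parameters from Izhikevich's published model; the time step, input current, and spike-count interface are illustrative choices and have nothing to do with the paper's LUT hardware realization:

```python
# Sketch of the Izhikevich neuron model. The parameters a, b, c, d are the
# standard regular-spiking values; dt, the input current, and the
# spike-count interface are illustrative assumptions.

def izhikevich(i_input, steps=1000, dt=0.5,
               a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate `steps` Euler steps under a constant input; return spike count."""
    v, u = -65.0, b * -65.0
    spikes = 0
    for _ in range(steps):
        # dv/dt = 0.04 v^2 + 5 v + 140 - u + I
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_input)
        # du/dt = a (b v - u)
        u += dt * a * (b * v - u)
        if v >= 30.0:  # spike cutoff, then reset
            v, u = c, u + d
            spikes += 1
    return spikes
```

With zero input the model settles at its resting fixed point and stays silent; a sufficiently large constant current removes the fixed point and produces sustained regular spiking.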