On the Dynamics of Hopfield Neural Networks on Unit Quaternions

A broad class of discrete-time hypercomplex-valued Hopfield neural networks

Neural Networks, 2019

In this paper, we address the stability of a broad class of discrete-time hypercomplex-valued Hopfield-type neural networks. To ensure that the neural networks belonging to this class always settle down at a stationary state, we introduce novel hypercomplex number systems referred to as real-part associative hypercomplex number systems. Real-part associative hypercomplex number systems generalize the well-known Cayley-Dickson algebras and real Clifford algebras and include the systems of real numbers, complex numbers, dual numbers, hyperbolic numbers, quaternions, tessarines, and octonions as particular instances. Apart from the novel hypercomplex number systems, we introduce a family of hypercomplex-valued activation functions called B-projection functions. Broadly speaking, a B-projection function projects the activation potential onto the set of all possible states of a hypercomplex-valued neuron. Using the theory presented in this paper, we confirm the stability analysis of several discrete-time hypercomplex-valued Hopfield-type neural networks from the literature. Moreover, we introduce and provide the stability analysis of a general class of Hopfield-type neural networks on Cayley-Dickson algebras.
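
As an illustration of the projection idea, the sketch below implements a B-projection-style activation for the complex-valued multistate case, assuming the set of admissible neuron states is the K-th roots of unity; the function name b_projection and the parameter K are our own choices, and the paper's general definition covers arbitrary hypercomplex number systems.

```python
import numpy as np

def b_projection(h: complex, K: int = 8) -> complex:
    """Project an activation potential h onto the K-th roots of unity,
    i.e. return the admissible state s maximizing Re(conj(s) * h),
    which is the state closest to h."""
    states = np.exp(2j * np.pi * np.arange(K) / K)  # admissible states
    return states[np.argmax((states.conj() * h).real)]

# Example: the potential 0.9 + 1.1j (angle ~ 50.7 degrees) is projected
# onto the nearest 8th root of unity, exp(i * pi / 4).
print(b_projection(0.9 + 1.1j))
```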

Continuous-Valued Octonionic Hopfield Neural Network

Proceeding Series of the Brazilian Society of Computational and Applied Mathematics, 2018

In this paper, we generalize the famous Hopfield neural network to unit octonions. In the proposed model, referred to as the continuous-valued octonionic Hopfield neural network (CV-OHNN), the next state of a neuron is obtained by normalizing its octonionic activation potential to unit length. We show that, like the traditional Hopfield network, a CV-OHNN operating in asynchronous update mode always settles down to an equilibrium state under mild conditions on the octonionic synaptic weights.
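
A minimal sketch of the update rule described above, assuming octonions are stored as length-8 NumPy arrays and multiplied via the Cayley-Dickson construction over quaternions; the function names and the left-multiplication convention for the synaptic weights are our assumptions, not the paper's notation.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_conj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def oct_mult(a, b):
    """Octonion product via the Cayley-Dickson construction:
    (p, q)(r, s) = (p r - s* q, s p + q r*), octonions as 8-arrays."""
    p, q = a[:4], a[4:]
    r, s = b[:4], b[4:]
    return np.concatenate([quat_mult(p, r) - quat_mult(quat_conj(s), q),
                           quat_mult(s, p) + quat_mult(q, quat_conj(r))])

def cvohnn_step(W, x, i):
    """Asynchronously update neuron i: compute its octonionic activation
    potential and normalize it to unit length."""
    a = sum(oct_mult(W[i][j], x[j]) for j in range(len(x)))
    return a / np.linalg.norm(a)
```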

Some Remarks on the Stability of Discrete-Time Complex-Valued Multistate Hopfield Neural Networks

Proceeding Series of the Brazilian Society of Computational and Applied Mathematics, 2018

In this paper, we review three discrete-time complex-valued Hopfield neural networks (CvMHNNs) recently proposed in the literature. Contrary to what has been stated, we provide examples in which the sequences produced by these CvMHNNs fail to converge under the usual conditions on the synaptic weight matrix, namely that the synaptic weight matrix is Hermitian with non-negative diagonal elements. Furthermore, we present one CvMHNN model that always settles down to a stationary state under the usual conditions on the synaptic weights.
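
For concreteness, the usual conditions mentioned above can be checked numerically; this is a minimal sketch, and the paper's point is precisely that these conditions alone do not guarantee convergence for every CvMHNN model.

```python
import numpy as np

def satisfies_usual_conditions(W: np.ndarray, tol: float = 1e-12) -> bool:
    """Check the usual conditions: W is Hermitian (equal to its conjugate
    transpose) with non-negative diagonal elements."""
    hermitian = np.allclose(W, W.conj().T, atol=tol)
    diag_ok = bool(np.all(np.diag(W).real >= -tol))
    return hermitian and diag_ok

W = np.array([[1.0, 2 - 1j],
              [2 + 1j, 0.0]])
print(satisfies_usual_conditions(W))  # True
```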

Configurations of steady states for Hopfield-type neural networks

Applied Mathematics and Computation, 2006

The dependence of the steady states on the external input vector I for continuous-time and discrete-time Hopfield-type neural networks of n neurons is discussed. Conditions for the existence of one or several paths of steady states are derived. It is shown that, under some conditions, for an external input I there may exist at least 2^n exponentially stable steady states (called a configuration of steady states), and their regions of attraction are estimated. This means that there exist 2^n paths of exponentially stable steady states defined on a certain set of input values. Conditions assuring the transfer of a configuration of exponentially stable steady states to another configuration of exponentially stable steady states by successive changes of the external input are obtained. These results may be important for the design and maneuvering of Hopfield-type neural networks used to analyze associative memories.
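
A hypothetical illustration of how 2^n stable steady states can arise (this is not the paper's construction): with dominant self-connections and a small external input, each neuron of a discrete-time tanh network is bistable, so the network has one stable steady state per sign pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
W = 2.0 * np.eye(n)                # dominant self-connections
I = 0.1 * rng.standard_normal(n)   # small external input

patterns = set()
for _ in range(2000):
    x = rng.uniform(-1, 1, n)
    for _ in range(200):           # iterate x <- tanh(W x + I)
        x = np.tanh(W @ x + I)
    patterns.add(tuple(np.sign(x).astype(int)))
print(len(patterns))               # 16 = 2**4 stable configurations
```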

Slow–fast dynamics of tri-neuron Hopfield neural network with two timescales

The slow-fast dynamics of a tri-neuron Hopfield neural network with two timescales is studied in the present paper. On the basis of geometric singular perturbation theory, the transition of the solution trajectory is illuminated, and the existence of a relaxation oscillation in which a rapid movement process alternates with a slow movement process is proved. It is indicated that the characteristic of the relaxation oscillation depends on the structure of the slow manifold. Moreover, the approximate expression of the relaxation oscillation and its period are obtained analytically. Case studies are given to demonstrate the validity of the theoretical results.
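
A simulation sketch of a two-timescale tri-neuron Hopfield system; the weights, timescale ratio, and forward Euler integration below are illustrative assumptions, not the parameters analyzed in the paper.

```python
import numpy as np

# Neurons 1 and 2 evolve on a fast timescale, neuron 3 on a slow one:
#   eps * dx_i/dt = -x_i + sum_j w_ij * tanh(x_j),  i = 1, 2
#         dx_3/dt = -x_3 + sum_j w_3j * tanh(x_j).
eps = 0.01
W = np.array([[ 0.0,  2.0, -1.5],
              [-2.0,  0.0,  1.5],
              [ 1.0, -1.0,  0.0]])
timescales = np.array([eps, eps, 1.0])

x = np.array([0.1, -0.1, 0.2])
dt = 1e-4
trajectory = []
for _ in range(200_000):           # forward Euler integration
    dx = (-x + W @ np.tanh(x)) / timescales
    x = x + dt * dx
    trajectory.append(x.copy())
```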

Generating exponentially stable states for a Hopfield Neural Network

Neurocomputing, 2017

An algorithm that generates an exponential number of stable states for the well-known Hopfield Neural Network (HNN) is introduced in this paper. We show that the number of stable states depends on the dimension and on the number of components of the input pattern supporting noise. Extensive tests verify that the states generated by our algorithm are indeed stable and demonstrate the exponential storage capacity of an HNN. This paper opens the possibility of designing improved HNNs able to achieve exponential storage, and thus to find applicability in complex real-world problems.
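
The generation algorithm itself is not reproduced here, but the stability test it relies on is simple to state: a bipolar state is stable exactly when it is a fixed point of the sign update. A minimal sketch, with Hebbian storage of a single hypothetical pattern:

```python
import numpy as np

def is_stable_state(W: np.ndarray, x: np.ndarray) -> bool:
    """A state x in {-1, +1}^n is a stable state of a discrete Hopfield
    network iff it is a fixed point of the sign update rule."""
    return bool(np.array_equal(np.sign(W @ x), x))

# Hebbian storage of one pattern, then verify it is a fixed point.
p = np.array([1, -1, 1, -1])
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0.0)
print(is_stable_state(W, p))  # True
```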

Self-modeling in Hopfield Neural Networks with Continuous Activation Function

Procedia Computer Science, 2018

Hopfield networks can exhibit many different attractors, of which most are local optima. It has been demonstrated that combining state randomization and Hebbian learning enlarges the basin of attraction of globally optimal attractors. The procedure is called self-modeling, and it has been applied in symmetric Hopfield networks with discrete states and without self-recurrent connections. We are interested in knowing which of these topological constraints can be relaxed, so the self-modeling process is tested in asymmetric Hopfield networks with continuous states and self-recurrent connections. The best results are obtained in networks with modular structure.
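
A minimal sketch of the self-modeling loop as described above, applied to an asymmetric network with continuous states and self-recurrent connections; the learning rate, gain, and network size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
W = rng.standard_normal((n, n))    # asymmetric, self-recurrent weights

def settle(W, x, steps=200, beta=0.5):
    """Relax continuous states x in [-1, 1]^n toward an attractor."""
    for _ in range(steps):
        x = np.tanh(beta * (W @ x))
    return x

# Self-modeling: repeatedly randomize the state, settle to an attractor,
# and reinforce that attractor with a small Hebbian update.
eta = 0.001
for _ in range(100):
    x = settle(W, rng.uniform(-1, 1, n))
    W += eta * np.outer(x, x)      # Hebbian reinforcement of the attractor
```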

Neural Networks for Quaternion-valued Function Approximation

1994

In this paper, a new Multi-Layer Perceptron structure able to deal with quaternion-valued signals is proposed. A learning algorithm for the proposed Quaternion MLP (QMLP) is also derived. Such a neural network makes it possible to interpolate functions of a quaternion variable with a smaller number of connections than the corresponding real-valued MLP.
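
A sketch of one fully connected quaternion layer in the spirit of the QMLP, assuming quaternions are stored as (w, x, y, z) arrays and a split (component-wise) tanh activation; the paper's exact activation and learning algorithm may differ.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of quaternions stored as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qmlp_layer(W, b, x):
    """One fully connected quaternion layer: quaternionic weighted sum
    plus bias, followed by a split (component-wise) tanh activation."""
    out = []
    for i in range(len(W)):
        s = b[i] + sum(quat_mult(W[i][j], x[j]) for j in range(len(x)))
        out.append(np.tanh(s))     # split activation per component
    return out
```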