Flow-invariance and stability analysis for a class of nonlinear systems with slope conditions

Systems with slope restricted nonlinearities and neural networks dynamics

Advances in Computational Intelligence, Lecture Notes in Computer Science 6692/2011, pp. 565-572, 2011

The standard class of additive neural networks, which is quite large, is considered from the point of view of the qualitative theory of differential equations. Connections with the theory of absolute stability are pointed out, and a new class of Liapunov functions is introduced, starting from positiveness theory (the Yakubovich-Kalman-Popov lemma). The results are valid for a broad class of dynamical systems and are tested on several neural network structures. The concluding part mentions prospective research directions, including synchronization and time-delay effects.

Practical Stability of Hopfield-type Neural Networks (Qualitative theory of functional equations and its application to mathematical science)

2001

The asymptotic, global and exponential stability properties of Hopfield-type neural networks have been extensively studied since Hopfield announced his results in the early 1980s (for a review of recent results, see Guan et al. [1]). This reflects the importance of Hopfield-type neural networks as applied to associative memory, pattern recognition and optimization problems. This paper considers a seemingly less important stability concept for neural networks. Historically termed practical stability and first proposed by LaSalle and Lefschetz [2], it offers a very general notion that may indicate any one of the following: asymptotic or global types of stability; total stability or stability under persistent disturbances; instability or boundedness of solutions. It is neither weaker nor stronger than ordinary stability, and it does not imply stability or convergence of trajectories. This may explain the negligible volume of literature devoted so far to the practical stability of neural networks.

Refined qualitative analysis for a class of neural networks

New results of qualitative analysis are presented for a class of neural networks (Hopfield-type), representing a refinement in the interpretation of their behaviour. The main instrument of this analysis is the individual monitoring of the state trajectories by means of time-dependent rectangular sets that are forward invariant with respect to the dynamics of the investigated systems. Particular requirements for the rectangular sets approaching the equilibrium point allow a componentwise exploration of the stability properties, offering additional information with respect to the traditional framework (which expresses a global knowledge, built in terms of norms).

Configurations of steady states for Hopfield-type neural networks

Applied Mathematics and Computation, 2006

The dependence of the steady states on the external input vector I for continuous-time and discrete-time Hopfield-type neural networks of n neurons is discussed. Conditions for the existence of one or several paths of steady states are derived. It is shown that, under some conditions, for an external input I there may exist at least 2^n exponentially stable steady states (called a configuration of steady states), and their regions of attraction are estimated. This means that there exist 2^n paths of exponentially stable steady states defined on a certain set of input values. Conditions assuring the transfer of a configuration of exponentially stable steady states to another such configuration by successive changes of the external input are obtained. These results may be important for the design and maneuvering of Hopfield-type neural networks used to analyze associative memories.
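The coexistence of 2^n exponentially stable steady states is easy to reproduce numerically. Below is a minimal sketch (our own construction, not code from the paper) of a decoupled two-neuron Hopfield-type network in which each neuron's scalar dynamics has two stable equilibria, so the network has 2^2 = 4 exponentially stable steady states:

```python
import numpy as np

def hnn_rhs(x, W, b, gain=4.0):
    # Continuous-time Hopfield-type dynamics: x' = -x + W tanh(gain * x) + b
    return -x + W @ np.tanh(gain * x) + b

def simulate(x0, W, b, h=0.01, steps=5000):
    # Plain explicit Euler integration (sufficient for this illustration)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + h * hnn_rhs(x, W, b)
    return x

# Decoupled 2-neuron network: self-weight 2, no cross-coupling, zero input.
# Each scalar equation x' = -x + 2 tanh(4x) has two stable equilibria
# (near +/-2), so the network has 2^2 = 4 exponentially stable steady states.
W = 2.0 * np.eye(2)
b = np.zeros(2)
corners = [np.array(c, dtype=float) for c in [(1, 1), (1, -1), (-1, 1), (-1, -1)]]
equilibria = [simulate(0.5 * c, W, b) for c in corners]
# Each initial condition converges to a distinct steady state whose signs
# match its quadrant.
```

Initializing in each of the four quadrants lands the trajectory in a different steady state, a toy instance of the configurations discussed above.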

Discussion on: On the Continuous Time-Varying JLQ Problem

New results of qualitative analysis are presented for a class of dynamical systems (including the Hopfield neural networks) whose nonlinearities satisfy certain slope conditions. The main instrument of this analysis is the individual monitoring of the state trajectories by means of time-dependent rectangular sets that are forward invariant with respect to the dynamics of the investigated systems. Particular requirements for the rectangular sets approaching the equilibrium point allow a componentwise exploration of the stability properties, offering additional information with respect to the traditional framework (which expresses a global knowledge, built in terms of norms). Within this context, we are able to point out some important dynamical aspects that remained hidden in other works relying only on the standard tools of stability analysis. The refinement induced by the componentwise point of view is also illustrated by two numerical examples.

Convergence Criteria for a Hopfield-type Neural Network

International Journal of Applied Mathematics and Computer Science 3(1):45-71, 2005

Motivated by recent applications of Lyapunov's method to artificial neural networks, which can be considered dynamical systems for which the convergence of the system trajectories to equilibrium states is a necessity, we re-examine a well-known stability criterion of Krasovskii pertaining to nonlinear autonomous systems. We then consider the components of the same autonomous system with the help of the elements of the Jacobian matrix J(x), thus proposing much simpler convergence criteria via the method of Lyapunov. Finally, we apply our results to artificial neural networks and discuss them with respect to recent ones in the field.
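Krasovskii's criterion can be spot-checked numerically. The sketch below (our own illustration; the vector field f(x) = -x + W tanh(x), the weight matrix W and the sampling range are assumptions) samples the symmetrized Jacobian and checks that its largest eigenvalue stays negative; a negative result over many samples is consistent with, though of course does not prove, the criterion:

```python
import numpy as np

W = np.array([[0.2, -0.3],
              [0.1, 0.25]])  # illustrative weight matrix (assumption)

def jacobian(x):
    # For f(x) = -x + W tanh(x): J(x) = -I + W diag(sech^2(x_j))
    return -np.eye(2) + W @ np.diag(1.0 / np.cosh(x) ** 2)

rng = np.random.default_rng(0)
worst = max(np.linalg.eigvalsh(jacobian(x) + jacobian(x).T).max()
            for x in rng.uniform(-5.0, 5.0, size=(1000, 2)))
# worst < 0 over all samples: J(x) + J(x)^T appears negative definite,
# which is the hypothesis of Krasovskii's criterion.
```

Here ||W|| is small enough that the symmetrized Jacobian is dominated by the -2I term, so the check succeeds for any sampled x.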

Developments in exploring set invariance for Hopfield neural networks

2019 IEEE 13th International Symposium on Applied Computational Intelligence and Informatics (SACI)

The paper considers the nonlinear dynamics of a large class of continuous-time Hopfield neural networks (abbreviated as HNNs). Our research proposes sufficient conditions for testing the existence of contractive invariant sets with general form, defined by p-norms, $ 1\leq p\leq\infty$, which are weighted by rectangular, full column rank, non-negative matrices. These sufficient conditions have algebraic form and use a test matrix built from the HNN coefficients. From the point of view of the mathematical constructions, this test matrix defines the dynamics of a comparison system (with linear form), whose trajectories ensure componentwise upper bounds for the HNN trajectories. These bounds play an intermediary role in proving that any HNN trajectory remains inside a contractive set, once initialized inside that set. Two theorems are stated for covering both the local and the global cases of invariance. The theoretical results are illustrated by numerical examples run in Matlab, which also offer a visual support for the invariance property.
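One common way to build such a test matrix, sketched here under our own assumptions (the coefficients A, W and Lipschitz constants L are illustrative, and the paper's exact construction may differ), is M_ij = -a_i on the diagonal plus |w_ij| L_j; if M is Hurwitz, the linear comparison system z' = M z contracts and its trajectories provide componentwise upper bounds:

```python
import numpy as np

# Illustrative HNN data for x' = -A x + W g(x) + b, with activations g_j
# Lipschitz with constants L_j (all values below are assumptions).
A = np.diag([3.0, 2.5, 4.0])
W = np.array([[0.5, -0.8, 0.3],
              [0.2, 0.4, -0.6],
              [-0.1, 0.7, 0.5]])
L = np.array([1.0, 1.0, 1.0])  # e.g. tanh-like activations

# Test matrix of the linear comparison system z' = M z:
# M_ii = -a_i + |w_ii| L_i,  M_ij = |w_ij| L_j for i != j.
M = -A + np.abs(W) * L  # broadcasting scales column j by L_j

# Hurwitz test: all eigenvalues in the open left half-plane.
hurwitz = bool(np.all(np.linalg.eigvals(M).real < 0))
```

With these values M is row-diagonally dominant with negative diagonal, so the Hurwitz test succeeds by a Gershgorin-disc argument as well.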

On global stability of Hopfield neural networks with discontinuous neuron activations

Proceedings of the 2003 International Symposium on Circuits and Systems (ISCAS '03), 2003

The paper introduces a general class of neural networks where the neuron activations are modeled by discontinuous functions. The neural networks have an additive interconnecting structure and they include as particular cases the Hopfield neural networks (HNNs), and the standard Cellular Neural Networks (CNNs), in the limiting situation where the HNNs and CNNs possess neurons with infinite gain. Conditions are obtained which ensure global convergence toward the unique equilibrium point in finite time, where the convergence time can be easily estimated on the basis of the relevant neural network parameters. These conditions are based on the concept of Lyapunov Diagonally Stable (LDS) neuron interconnection matrices, and are applicable to general non-symmetric neural networks.
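The LDS condition asks for a diagonal P > 0 such that P T + T^T P is negative definite, where T is the neuron interconnection matrix. A minimal sketch (the matrices T and P below are our own illustrative choices) that verifies such a diagonal witness for a small non-symmetric example:

```python
import numpy as np

# Non-symmetric interconnection matrix and a candidate diagonal scaling
# (both chosen by us for illustration).
T = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
P = np.diag([1.0, 1.0])

# T is Lyapunov Diagonally Stable if some diagonal P > 0 makes
# Q = P T + T^T P negative definite; here we verify this particular witness.
Q = P @ T + T.T @ P
is_lds_witness = bool(np.all(np.linalg.eigvalsh(Q) < 0))
```

In general, finding such a P is a small semidefinite feasibility problem; checking a given candidate, as above, needs only a symmetric eigenvalue computation.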

Stability analysis of an unsupervised neural network with feedforward and feedback dynamics

Neurocomputing, 2006

We present a new method of analyzing the dynamics of self-organizing neural networks with different time scales based on the theory of flow invariance. We prove the existence and uniqueness of the equilibrium. A strict Lyapunov function for the flow of a competitive neural system with different time scales is given, and based on it we are able to prove the global asymptotic stability of the equilibrium point.

Exponential Stability of Implicit Euler, Discrete-Time Hopfield Neural Networks

2009

The exponential stability of continuous-time Hopfield neural networks is not preserved when they are implemented on digital computers by means of explicit numerical methods, whereas the implicit (or backward) Euler method preserves this exponential stability under exactly the same sufficient conditions as those previously obtained for the continuous model. The proof is based on the nonlinear measure approach, here extended to discrete-time systems.
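The implicit Euler scheme applied to a Hopfield-type model x' = -A x + W tanh(x) + b requires solving a nonlinear equation at each step. A minimal sketch (our own construction with illustrative coefficients, not the paper's implementation) that solves each implicit step by fixed-point iteration:

```python
import numpy as np

A = np.diag([2.0, 2.0])                 # illustrative coefficients (assumptions)
W = np.array([[0.3, -0.4],
              [0.2, 0.1]])
b = np.array([0.1, -0.2])

def implicit_euler_step(x, h, iters=50):
    # Solve x_next = x + h * (-A x_next + W tanh(x_next) + b) by fixed-point
    # iteration; the stiff linear part is inverted exactly via (I + hA)^{-1}.
    Minv = np.linalg.inv(np.eye(len(x)) + h * A)
    x_next = x.copy()
    for _ in range(iters):
        x_next = Minv @ (x + h * (W @ np.tanh(x_next) + b))
    return x_next

# March toward the unique exponentially stable equilibrium; the implicit
# scheme remains stable even with a fairly large step size h.
x = np.array([1.5, -1.0])
traj = [x]
for _ in range(200):
    x = implicit_euler_step(x, h=0.5)
    traj.append(x)
```

Because ||W|| < min_i a_i here, the continuous model has a unique globally exponentially stable equilibrium, and the backward Euler iterates contract toward it step by step.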