Exponential stabilization of delayed recurrent neural networks: A state estimation based approach

Global exponential stability of recurrent neural networks with pure time-varying delays

This paper presents new theoretical results on the global exponential stability of recurrent neural networks with bounded activation functions and bounded time-varying delays in the presence of strong external stimuli. It is shown that the Cohen-Grossberg neural network is globally exponentially stable if the magnitude of the input vector exceeds a given threshold. As special cases, the Hopfield neural network and the cellular neural network are examined in detail. In addition, it is shown that the criteria herein, if only partially satisfied, can still be used in combination with existing stability conditions. Simulation results are discussed in two illustrative examples.
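A hedged illustration of the claim (a toy scalar system with made-up parameters, not the paper's model): for a delayed network with a bounded activation, a sufficiently strong constant input can force trajectories from very different initial histories to converge exponentially to the same equilibrium.

```python
import math

# Toy scalar delayed network (illustrative parameters, not the paper's model):
#   x'(t) = -a*x(t) + b*tanh(x(t - tau)) + u
# with bounded activation tanh and a strong constant input u.
a, b, tau, u, dt = 2.0, 0.5, 1.0, 5.0, 0.01

def simulate(x0, T=20.0):
    n = int(tau / dt)
    hist = [x0] * (n + 1)              # constant initial history on [-tau, 0]
    for _ in range(int(T / dt)):       # explicit Euler with a delay buffer
        x = hist[-1]
        hist.append(x + dt * (-a * x + b * math.tanh(hist[-(n + 1)]) + u))
        hist = hist[-(n + 1):]
    return hist[-1]

# Trajectories starting far apart merge onto one equilibrium:
print(abs(simulate(10.0) - simulate(-10.0)))   # ~0
```

Here the linear leakage term (rate a) dominates the Lipschitz constant of the delayed feedback (|b|), so the difference between any two trajectories contracts exponentially regardless of the delay.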

New exponentially convergent state estimation method for delayed neural networks

Neurocomputing, 2009

The problem of designing a globally exponentially convergent state estimator for a class of delayed neural networks is investigated in this paper. The time-delay pattern is quite general, including fast time-varying delays. The activation functions are monotone nondecreasing with known lower and upper bounds. A linear Luenberger-type estimator is developed by properly constructing a new Lyapunov–Krasovskii functional.
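The estimator idea can be sketched on a toy scalar system (illustrative parameters and output map, not the paper's construction): a Luenberger-type observer copies the plant dynamics and adds a correction term driven by the output error, and with a large enough injection gain the estimation error decays exponentially despite the delay.

```python
import math

# Plant (toy scalar delayed network) and Luenberger-type observer:
#   x'(t)    = -a*x(t)    + b*tanh(x(t - tau))    + u,   output y = c*x
#   xhat'(t) = -a*xhat(t) + b*tanh(xhat(t - tau)) + u + k*(y - c*xhat)
# The error e = x - xhat obeys a delayed equation that is exponentially
# stable here because a + k*c > |b| (tanh has Lipschitz constant 1).
a, b, c, k, u, tau, dt = 1.0, 0.8, 1.0, 2.0, 0.5, 0.5, 0.01

def estimate(x0, xhat0, T=20.0):
    n = int(tau / dt)
    xs, xh = [x0] * (n + 1), [xhat0] * (n + 1)   # constant initial histories
    for _ in range(int(T / dt)):                 # explicit Euler with delay buffers
        x, xhat = xs[-1], xh[-1]
        y = c * x                                # measured output
        xs.append(x + dt * (-a * x + b * math.tanh(xs[-(n + 1)]) + u))
        xh.append(xhat + dt * (-a * xhat + b * math.tanh(xh[-(n + 1)]) + u
                               + k * (y - c * xhat)))
        xs, xh = xs[-(n + 1):], xh[-(n + 1):]
    return xs[-1], xh[-1]

x, xhat = estimate(3.0, -3.0)
print(abs(x - xhat))   # estimation error decays to ~0
```

The Lyapunov–Krasovskii machinery in the paper is what certifies this convergence rigorously for vector systems and fast time-varying delays; the sketch only shows the structure of the estimator.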

Global asymptotic stability of a general class of recurrent neural networks with time-varying delays

IEEE Transactions on Circuits and Systems I-regular Papers, 2003

This paper investigates the absolute exponential stability of a general class of delayed neural networks, which requires the activation functions to be partially Lipschitz continuous and monotone nondecreasing only, but not necessarily differentiable or bounded. Using a delayed Halanay-type inequality and a Lyapunov function, three new sufficient conditions are derived to ascertain whether the equilibrium points of delayed neural networks with additively diagonally stable interconnection matrices are absolutely exponentially stable. The stability criteria are also suitable for delayed optimization neural networks and delayed cellular neural networks, whose activation functions are often nondifferentiable or unbounded. The results herein answer the question: if a neural network without any delay is absolutely exponentially stable, then under what additional conditions is the corresponding delayed neural network also absolutely exponentially stable?

Delay-Dependent Asymptotical Stabilization Criterion of Recurrent Neural Networks

Applied Mechanics and Materials, 2013

This paper deals with a delay-dependent stability criterion for discrete-time recurrent neural networks with time-varying delays. Based on a quadratic Lyapunov functional approach and a free-weighting matrix approach, linear matrix inequality criteria are derived that guarantee delay-dependent asymptotic stability of these systems. An example illustrates the effectiveness of the proposed criteria.

Simplified approach to the exponential stability of delayed neural networks with time varying delays

Chaos, Solitons & Fractals, 2007

Sufficient conditions in the form of linear matrix inequalities for the exponential stability of the equilibrium point of delayed neural networks with time-varying delays are presented. The conditions turn out to be greatly simplified versions of the exponential stability results previously reported by Yucel and Arik. A distinct feature of the present criteria is that they are independent of the degree of exponential stability, which makes them computationally very attractive.

Delay-dependent exponential stability analysis of delayed neural networks: an LMI approach

Neural Networks, 2002

For neural networks with constant or time-varying delays, the problems of determining exponential stability and estimating the exponential convergence rate are studied in this paper. An approach combining Lyapunov–Krasovskii functionals with linear matrix inequalities is taken to investigate the problems, providing bounds on the interconnection matrix and the activation functions that guarantee the systems' exponential stability. Some criteria for exponential stability, which give information on the delay-dependence property, are derived. The results obtained in this paper provide one more set of easily verified guidelines for determining the exponential stability of delayed neural networks, which are less conservative and less restrictive than the ones reported so far in the literature.
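LMI-based criteria of this kind ultimately reduce to verifying that certain symmetric matrices are negative definite. As a minimal sketch (the matrix below is purely illustrative, not one of the paper's actual LMIs), negative definiteness of a small symmetric matrix can be checked without a solver via Sylvester's criterion: M is negative definite iff all leading principal minors of -M are positive.

```python
def det(m):
    # Recursive Laplace expansion; adequate for the small matrices
    # that appear when checking a toy LMI by hand.
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def is_negative_definite(M):
    # Sylvester's criterion: M < 0 iff every leading principal minor
    # of -M is strictly positive.
    neg = [[-x for x in row] for row in M]
    return all(det([row[:k] for row in neg[:k]]) > 0
               for k in range(1, len(M) + 1))

# Illustrative symmetric matrix of the kind an LMI certifies:
M = [[-2.0, 0.5],
     [0.5, -1.0]]
print(is_negative_definite(M))   # True
```

In practice these feasibility problems are handed to a semidefinite programming solver rather than checked by minors, but the criterion above is what the solver's certificate amounts to for a fixed matrix.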

Global exponential stability and periodicity of recurrent neural networks with time delays

IEEE Transactions on Circuits and Systems I-regular Papers, 2005

In this paper, the global exponential stability and periodicity of a class of recurrent neural networks with time delays are addressed by using the Lyapunov functional method and inequality techniques. The delayed neural network includes the well-known Hopfield neural networks, cellular neural networks, and bidirectional associative memory networks as special cases. New criteria are found to ascertain the global exponential stability and periodicity of recurrent neural networks with time delays, and are shown to differ from and improve upon existing ones.

On the absolute stability for recurrent neural networks with time delays

IEEE International Carpathian Control Conference, ICCC 2012, Slovakia, pp. 97–102, 2012

This paper considers two main aspects of the analysis of the dynamical properties of recurrent neural networks with time delays: absolute stability and the global qualitative behavior of the system. The first aspect refers to the global asymptotic stability of the zero equilibrium, meaning that only a single steady state of the neural network matters (the case of optimizers). The second aspect concerns the global behavior of systems with several equilibria. The difficulties and open problems concerning this second aspect are discussed.

(Corr. to) Delay-dependent exponential stability analysis of delayed neural networks: an LMI approach

Neural Networks, 2003

For bi-directional associative memory (BAM) neural networks (NNs) with different constant or time-varying delays, the problems of determining exponential stability and estimating the exponential convergence rate are investigated in this paper. An approach combining the Lyapunov–Krasovskii functional with linear matrix inequalities (LMIs) is taken to study the problems, providing bounds on the interconnection matrix and the activation functions that guarantee the system's exponential stability. Some criteria for exponential stability, which give information on the delay-dependence property, are derived. The results obtained in this paper provide one more set of easily verified guidelines for determining the exponential stability of delayed BAM (DBAM) neural networks, which are less conservative and less restrictive than the ones reported so far in the literature. Some typical examples are presented to show the application of the criteria obtained in this paper.