Global exponential stability of recurrent neural networks for synthesizing linear feedback control systems via pole assignment
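The title paper concerns synthesizing linear feedback controllers via pole assignment. As background only (not the paper's neural-network synthesis method), the underlying pole-placement computation for a single-input linear system can be sketched with Ackermann's formula; the matrices A, B and the pole locations below are illustrative choices, not taken from the paper:

```python
import numpy as np

def ackermann_gain(A, B, poles):
    """Compute a state-feedback gain K placing eig(A - B K) at `poles`
    via Ackermann's formula (single-input, controllable systems only)."""
    n = A.shape[0]
    # Controllability matrix C = [B, AB, ..., A^{n-1} B]
    C = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
    # Desired characteristic polynomial, evaluated at the matrix A
    coeffs = np.poly(poles)  # [1, a_{n-1}, ..., a_0], highest power first
    phi_A = sum(c * np.linalg.matrix_power(A, n - k)
                for k, c in enumerate(coeffs))
    # K = [0 ... 0 1] C^{-1} phi(A)
    e_n = np.zeros(n)
    e_n[-1] = 1.0
    return e_n @ np.linalg.solve(C, phi_A)

# Double integrator: place closed-loop poles at -1 and -2
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
K = ackermann_gain(A, B, [-1.0, -2.0])
closed_loop = A - B @ K.reshape(1, -1)
print(np.sort(np.linalg.eigvals(closed_loop).real))  # -> [-2. -1.]
```

For this example the formula yields K = [2, 3], and the closed-loop matrix indeed has its eigenvalues at the requested locations.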
Related papers
Global exponential stability of recurrent neural networks with pure time-varying delays
This paper presents new theoretical results on the global exponential stability of recurrent neural networks with bounded activation functions and bounded time-varying delays in the presence of strong external stimuli. It is shown that the Cohen-Grossberg neural network is globally exponentially stable if the magnitude of the input vector exceeds a derived threshold. As special cases, the Hopfield neural network and the cellular neural network are examined in detail. In addition, it is shown that the criteria herein, if only partially satisfied, can still be used in combination with existing stability conditions. Simulation results are also discussed in two illustrative examples.
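As a rough numerical companion to results of this kind, one can simulate a Hopfield-type network with a constant delay and a strong constant input and observe exponential convergence to a single equilibrium. The model, weight scale, delay, and input below are illustrative assumptions (the small weight norm makes the delayed map a contraction), not the system analyzed in the paper:

```python
import numpy as np

def simulate_delayed_hopfield(W, u, tau=1.0, dt=0.01, T=60.0, x0=None):
    """Euler simulation of x'(t) = -x(t) + W tanh(x(t - tau)) + u
    with a constant initial history equal to x0."""
    n = W.shape[0]
    d = int(round(tau / dt))        # delay expressed in time steps
    steps = int(round(T / dt))
    x = np.zeros((steps + 1, n))
    x[0] = np.ones(n) if x0 is None else x0
    for k in range(steps):
        delayed = x[max(k - d, 0)]  # constant history before t = 0
        x[k + 1] = x[k] + dt * (-x[k] + W @ np.tanh(delayed) + u)
    return x

rng = np.random.default_rng(0)
W = 0.15 * rng.standard_normal((4, 4))   # small norm -> contraction
u = np.array([2.0, -2.0, 1.5, -1.5])     # strong constant external input
traj = simulate_delayed_hopfield(W, u)
# Late-time increments shrink, consistent with exponential convergence
print(np.linalg.norm(traj[-1] - traj[-2]))
```

Because tanh is 1-Lipschitz and the spectral norm of W is below one here, the trajectory converges to the unique equilibrium regardless of the delay, so the final Euler increments are orders of magnitude smaller than the first.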
Global asymptotic stability of a general class of recurrent neural networks with time-varying delays
IEEE Transactions on Circuits and Systems I-regular Papers, 2003
This paper investigates the absolute exponential stability of a general class of delayed neural networks, which requires the activation functions only to be partially Lipschitz continuous and monotone nondecreasing, not necessarily differentiable or bounded. Three new sufficient conditions are derived to ascertain whether the equilibrium points of delayed neural networks with additively diagonally stable interconnection matrices are absolutely exponentially stable, by using a delay Halanay-type inequality and a Lyapunov function. The stability criteria are also suitable for delayed optimization neural networks and delayed cellular neural networks, whose activation functions are often nondifferentiable or unbounded. The results herein answer the question: if a neural network without any delay is absolutely exponentially stable, then under what additional conditions is the delayed neural network also absolutely exponentially stable?
Global Exponential Stability of Hopfield-Type Neural Networks with Time Delays
2005
The paper is concerned with the improvement of sufficient conditions for the exponential stability of Hopfield-type neural networks displaying interaction delays. The results are based on a method obtained in our previous work that combines an idea suggested by Malkin for studying the absolute stability of a nonlinear system via its linearisation and a procedure proposed by Kharitonov for the construction of an "exact" Liapunov-Krasovskii functional used in the analysis of uncertain linear time-delay systems. Since the Liapunov function method gives only sufficient conditions for stability, improving these criteria is clearly necessary. The resulting less conservative conditions are suitable for the implementation of recurrent neural networks.
Exponential stabilization of delayed recurrent neural networks: A state estimation based approach
2013
This paper is concerned with the stabilization problem of delayed recurrent neural networks. As the states of the neurons are usually difficult to measure fully, a state-estimation-based approach is presented. First, a sufficient condition is derived under which the augmented system under consideration is globally exponentially stable. Then, by employing a decoupling technique, the gain matrices of the controller and the state estimator are obtained by solving some linear matrix inequalities. Finally, a delayed neural network with chaotic behavior is used to demonstrate the applicability of the developed result.
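The paper above obtains its controller and estimator gains from linear matrix inequalities. A minimal related computation (a simple special case, not the paper's method) is the classical Lyapunov-equation certificate for exponential stability of a linear system, which can be sketched in plain NumPy via the Kronecker-product vectorization of A^T P + P A = -Q; the test matrix A below is an illustrative choice:

```python
import numpy as np

def lyapunov_certificate(A, Q=None):
    """Solve A^T P + P A = -Q for P. A symmetric positive definite P
    certifies global exponential stability of x'(t) = A x(t)."""
    n = A.shape[0]
    Q = np.eye(n) if Q is None else Q
    # Row-major vectorization: vec(A^T P + P A) = M vec(P)
    M = np.kron(A.T, np.eye(n)) + np.kron(np.eye(n), A.T)
    P = np.linalg.solve(M, -Q.flatten()).reshape(n, n)
    return 0.5 * (P + P.T)  # symmetrize against round-off

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # eigenvalues -1 and -2
P = lyapunov_certificate(A)
print(np.linalg.eigvalsh(P))  # both eigenvalues positive: stable
```

For this A with Q = I the exact solution is P = [[1.25, 0.25], [0.25, 0.25]], whose eigenvalues are both positive, confirming exponential stability; an indefinite P would indicate instability.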
Global exponential stability and periodicity of recurrent neural networks with time delays
IEEE Transactions on Circuits and Systems I-regular Papers, 2005
In this paper, the global exponential stability and periodicity of a class of recurrent neural networks with time delays are addressed by using the Lyapunov functional method and inequality techniques. The delayed neural network includes the well-known Hopfield neural networks, cellular neural networks, and bidirectional associative memory networks as its special cases. New criteria are found to ascertain the global exponential stability and periodicity of the recurrent neural networks with time delays, and are also shown to be different from, and improve upon, existing ones.
1995
Sufficient conditions for global asymptotic stability of discrete-time multilayer recurrent neural networks are derived in this paper. Both the autonomous and the non-autonomous case are treated. Multilayer recurrent neural networks are interpreted as so-called NLq systems, which are nonlinear systems consisting of an alternating sequence of linear and static nonlinear operators that satisfy a sector condition (q 'layers'). It turns out that many problems arising in recurrent neural networks and in system and control theory can be interpreted as NLq systems, such as multilayer Hopfield nets, locally recurrent globally feedforward networks, generalized cellular neural networks, neural state-space control systems, the Lur'e problem, linear fractional transformations with a real diagonal uncertainty block, digital filters with overflow characteristics, etc. In this paper we discuss applications of the theorems to designing neural state-space control systems (the emulator approach). Narendra's dynamic backpropagation procedure is modified in order to assess closed-loop stability. The new theory also makes it possible to consider reference inputs belonging to the class of functions l2 instead of specific reference inputs.
On the absolute stability for recurrent neural networks with time delays
IEEE International Carpathian Control Conference, ICCC 2012, Slovakia, pp. 97–102, 2012
The paper considers two main aspects of the analysis of the dynamical properties of recurrent neural networks with time delays: absolute stability and the global qualitative behavior of the system. The first aspect refers to the global asymptotic stability of the zero equilibrium, meaning that only a single steady state of the neural network matters (the case of optimizers). The second concerns the global behavior of systems with several equilibria. The difficulties and the open problems concerning this second aspect are discussed.
International Journal of Biomathematics, 2014
This paper focuses on the existence, uniqueness and global robust stability of the equilibrium point for complex-valued recurrent neural networks with multiple time delays and parameter uncertainties, with respect to two activation functions. Two sufficient conditions for robust stability of the considered neural networks are presented, established as two new time-independent relationships between the network parameters of the neural system. Finally, three illustrative examples are given to demonstrate the theoretical results. Time delays, which arise in signal processing and transmission, may create bad dynamical behaviors in the networks, for example oscillation, instability and bifurcation [9-11, 16, 23]. Hence, it is necessary to study the dynamical behavior of delayed neural networks, and a great deal of significant results have been reported in the open literature.
Neurocomputing, 2007
In this paper, we essentially drop the requirement of a Lipschitz condition on the activation functions. Using only physical parameters of the neural networks, some new criteria concerning global exponential stability for a class of generalized neural networks with time-varying delays are obtained. The neural network model considered includes the delayed Hopfield neural networks, bidirectional associative memory networks, and delayed cellular neural networks as its special cases. Since these new criteria require neither the activation functions to be differentiable, bounded or monotone nondecreasing, nor the connection weight matrices to be symmetric, nor the delay functions τij(t) to be differentiable, our results are milder and more general than previously known criteria. Four illustrative examples are given to demonstrate the effectiveness of the obtained results.