IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS—II: EXPRESS BRIEFS, VOL. 52, NO. 3, MARCH 2005
Global Asymptotic Stability and Global Exponential Stability of Neural Networks With Unbounded Time-Varying Delays

Zhigang Zeng, Jun Wang, Senior Member, IEEE, and Xiaoxin Liao

Manuscript received August 15, 2002; revised June 6, 2004. This work was supported by the Hong Kong Research Grants Council under Grant CUHK4165/03E and by the Natural Science Foundation of China under Grant 60405002. This paper was recommended by Associate Editor A. Kuh. Z. Zeng is with the School of Automation, Wuhan University of Technology, Wuhan 430070, China, and also with the Department of Automation, University of Science and Technology of China, Hefei 230026, China (e-mail: [email protected]). J. Wang is with the Department of Automation and Computer-Aided Engineering, The Chinese University of Hong Kong, Shatin, New Territories, Hong Kong (e-mail: [email protected]). X. Liao is with the Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China. Digital Object Identifier 10.1109/TCSII.2004.842047
Abstract—This brief studies the global asymptotic stability and the global exponential stability of neural networks with unbounded time-varying delays and with bounded and Lipschitz continuous activation functions. Several sufficient conditions for the global exponential stability and global asymptotic stability of such neural networks are derived. The new results given in this brief extend the existing relevant stability results in the literature to cover more general neural networks.

Index Terms—Global asymptotic stability, global exponential stability, neural networks, unbounded time-varying delays (UDNN).
I. INTRODUCTION
Consider a general class of continuous-time recurrent neural networks with unbounded time-varying delays (UDNNs) described by the following model:
$$\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(x_j(t - \tau_{ij}(t))) + u_i, \quad i = 1, \ldots, n \tag{1}$$

where $a_{ij}$ and $b_{ij}$ are connection weights related, respectively, to the neurons without and with delays, $d_i > 0$ is the self-feedback connection weight, $u_i$ is an input, $\tau_{ij}(t) \ge 0$ is a continuous time delay, and $f_j$ and $g_j$ are activation functions related, respectively, to the neurons without and with delays. In this paper, we assume that for $i, j = 1, \ldots, n$:

1) $f_j: \mathbb{R} \to \mathbb{R}$ and $g_j: \mathbb{R} \to \mathbb{R}$ are bounded functions;

2) $f_j$ and $g_j$ are Lipschitz continuous; i.e., there exist constants $\mu_j > 0$ and $\nu_j > 0$ such that for any $s_1, s_2 \in \mathbb{R}$, $|f_j(s_1) - f_j(s_2)| \le \mu_j |s_1 - s_2|$ and $|g_j(s_1) - g_j(s_2)| \le \nu_j |s_1 - s_2|$.
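To make the model concrete, here is a minimal numerical sketch of UDNN (1); it is not taken from the brief, and the two-neuron weights, inputs, tanh activations, and proportional delay $\tau(t) = t/2$ are all illustrative assumptions chosen to satisfy assumptions 1) and 2).

```python
import numpy as np

# Minimal Euler simulation of UDNN (1) with the unbounded proportional
# delay tau(t) = t/2; all parameter values are illustrative assumptions.
n = 2
D = np.array([2.0, 2.0])                  # self-feedback weights d_i > 0
A = np.array([[0.3, -0.2], [0.1, 0.4]])   # nondelayed connection weights a_ij
B = np.array([[0.2, 0.1], [-0.1, 0.3]])   # delayed connection weights b_ij
u = np.array([0.5, -0.5])                 # constant inputs u_i
f = g = np.tanh                           # bounded, Lipschitz activations

h, T = 0.001, 20.0
steps = int(T / h)
x = np.zeros((steps + 1, n))
x[0] = [1.0, -1.0]                        # initial state

for k in range(steps):
    t = k * h
    kd = int((t / 2) / h)                 # index of delayed time t - tau(t) = t/2
    x[k + 1] = x[k] + h * (-D * x[k] + A @ f(x[k]) + B @ g(x[kd]) + u)

print("state at T:", x[-1])
```

Because $t - \tau(t) = t/2 \ge 0$ here, the delayed state always lies inside the stored trajectory, so no prehistory beyond the initial point is needed in this particular sketch.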
Obviously, the sigmoid activation function in the Hopfield neural network, the linear saturation activation function in the cellular neural network, and the radial basis function (RBF) in an RBF network satisfy the above assumptions 1) and 2). It is well known that the equilibrium points of UDNN (1) exist by the Schauder fixed-point theorem and assumptions 1) and 2). Let $x^* = (x_1^*, \ldots, x_n^*)^T$ be an equilibrium point of UDNN (1) and $z(t) = x(t) - x^*$; then UDNN (1) can be rewritten as

$$\dot{z}_i(t) = -d_i z_i(t) + \sum_{j=1}^{n} a_{ij} \bar{f}_j(z_j(t)) + \sum_{j=1}^{n} b_{ij} \bar{g}_j(z_j(t - \tau_{ij}(t))), \quad i = 1, \ldots, n \tag{2}$$

where $\bar{f}_j(z_j) = f_j(z_j + x_j^*) - f_j(x_j^*)$ and $\bar{g}_j(z_j) = g_j(z_j + x_j^*) - g_j(x_j^*)$.
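As an aside on computing such an equilibrium numerically, the following hypothetical fixed-point iteration uses the update map $x \mapsto D^{-1}(A f(x) + B g(x) + u)$; this map and the reuse of the sample parameters above are assumptions rather than a method from the brief, and the iteration converges here because the map is a contraction for those values.

```python
import numpy as np

# Hypothetical fixed-point iteration for an equilibrium x* of UDNN (1):
# an equilibrium satisfies d_i x_i = sum_j a_ij f_j(x_j) + b_ij g_j(x_j) + u_i.
def equilibrium(D, A, B, u, f, g, iters=1000):
    x = np.zeros(len(u))
    for _ in range(iters):
        x = (A @ f(x) + B @ g(x) + u) / D  # D holds the vector (d_1, ..., d_n)
    return x

D = np.array([2.0, 2.0])
A = np.array([[0.3, -0.2], [0.1, 0.4]])
B = np.array([[0.2, 0.1], [-0.1, 0.3]])
u = np.array([0.5, -0.5])
x_star = equilibrium(D, A, B, u, np.tanh, np.tanh)
```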
In recent years, the global stability of various neural networks has been investigated extensively. In the stability analysis of neural networks, the qualitative properties primarily concerned are the uniqueness, global stability, robust stability, and absolute stability of their equilibria. In [8] and [11], nice results are given for the stability of neural networks without time delays. The case of constant time delay is studied in [2] and [9]. In addition, the case of bounded time-varying delay is studied in [3] and [13]. The global asymptotic stability of cellular neural networks with time delay is discussed in [5], [7], [9], and [10]. As models of human brains, neural networks have a memory function; i.e., the state at the present time is related to the state in the past. Time delay provides information about this history. When the time delay is a fraction of time (e.g., $\tau(t) = \kappa t$ with $0 < \kappa < 1$), the time delay is unbounded as well as time-varying. As in many other dynamical systems, it is well known that delays may result in instability. So, to study delayed neural networks (DNNs), one must address the problem of how to remove this destabilizing effect. To date, most research on DNNs has been restricted to the simple case of constant delays. In DNNs, it is clear that a constant delay is only a special case. In most situations, delays are variable and, in fact, unbounded; that is, the entire history affects the present. Such delay terms, more suitable for practical neural networks, are called unbounded delays. Moreover, in some practical applications and hardware implementations of neural networks, the inevitable time delay may be unbounded. The global stability of UDNNs has seldom been investigated [12]. Therefore, the stability analysis of UDNNs is necessary and rewarding.
In this paper, we consider a general neural network model. Several sufficient conditions for the global asymptotic stability and the global exponential stability of UDNN (1) are obtained. These results are new and different from the existing ones.
II. PRELIMINARIES

In this section, we introduce relevant notations and definitions to facilitate the presentation of the main results in the ensuing sections.

Throughout this paper, we denote $E$ as the identity matrix. Denote $\|x\|_p$ as the vector $p$-norm of the vector $x = (x_1, \ldots, x_n)^T$, which satisfies $\|x\|_p = (\sum_{i=1}^{n} |x_i|^p)^{1/p}$; $\|x\|_\infty = \max_{1 \le i \le n} |x_i|$ is the vector infinity norm. Denote $\|A\|_p$ as the $p$-norm of the matrix $A$ induced by the vector $p$-norm. Denote $C$ as the set of continuous functions. For any bounded $\varphi \in C$, the initial condition of the neural network model (2) is assumed to be

$$z_i(s) = \varphi_i(s), \quad s \in (-\infty, 0], \quad i = 1, \ldots, n. \tag{3}$$

Definition: UDNN (1) is said to be globally asymptotically stable (GAS) if it is locally stable in the sense of Lyapunov and globally attractive. In addition, UDNN (1) is said to be globally exponentially stable (GES) if there exist constants $\varepsilon > 0$ and $M > 0$ such that the solutions of (2) with any initial condition (3) satisfy

$$\|z(t)\| \le M \sup_{s \le 0} \|\varphi(s)\|\, e^{-\varepsilon t}, \quad t \ge 0$$

and $\varepsilon$ is said to be the rate of exponential convergence of UDNN (2).
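As an illustration of this definition (not from the brief), the decay rate of a simulated trajectory can be estimated by a log-linear fit of $\|z(t)\|$; the helper below is hypothetical, reuses the trajectory x and step h from the sketch in Section I, and approximates the equilibrium by the final simulated state.

```python
import numpy as np

# Hypothetical estimator of the exponential convergence rate epsilon:
# fit log ||x(t) - x*|| ~ log M - epsilon * t over the simulated run.
def estimate_rate(x, x_star, h):
    err = np.linalg.norm(x - x_star, axis=1)
    t = np.arange(len(err)) * h
    mask = err > 1e-10                    # avoid log(0) once converged
    slope, _ = np.polyfit(t[mask], np.log(err[mask]), 1)
    return -slope                         # positive if the error decays

# usage with the earlier sketch: eps = estimate_rate(x[:-1], x[-1], h)
```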
III. MAIN RESULTS

Theorem 1: If there exist constants $\gamma_i > 0$, $i = 1, \ldots, n$, such that, for $t \ge 0$,

(4)

then UDNN (2) is GAS. If there exist constants such that the corresponding strengthened condition holds for $t \ge 0$, then UDNN (2) is GES.

Proof: Let the weighted comparison variables be defined from $z(t)$ and the constants $\gamma_i$. Then

(5)

where $D^+$ denotes the upper-right Dini-derivative operator. Take a bound such that the claimed estimate holds for $t \le 0$; we claim that it holds for all $t > 0$ as well. Otherwise, by continuity, there exist a time instant and an index such that

(6)

(7)

and, for all earlier times,

(8)
It follows from (6) and (8) that the estimate at that instant can be bounded; but then, following (5) and by (4), the bound is strict, which contradicts (7). Hence the claimed estimate holds for all $t \ge 0$, and the conclusion holds.

Remark 1: When UDNN (2) does not satisfy (4), it may be unstable; an unbounded function can be exhibited as a solution of a UDNN whose coefficients violate (4), and hence such a UDNN is unstable.

Remark 2: When the activation function is a monotone increasing function, the conclusion of Theorem 1 is still valid with the corresponding Lipschitz constant replaced accordingly. Using Theorem 1, UDNN (2) can be strengthened to be globally exponentially stable under the same conditions as in [13, Th. 1]. In fact, if the associated comparison matrix is a nonsingular $M$-matrix, then there exist constants $\gamma_i > 0$ such that (4) holds; if the delays $\tau_{ij}(t)$ are bounded, then, by taking $\varepsilon > 0$ sufficiently small, the strengthened form of (4) holds for $t \ge 0$, and thus Theorem 1 implies that UDNN (2) is GES. Moreover, the conditions of the theorems in [4] and [10] imply that the comparison matrix is a nonsingular $M$-matrix. Since a matrix with nonpositive off-diagonal entries is Lyapunov diagonally stable if and only if it is a nonsingular $M$-matrix [1], the results in Theorem 4 of [6] and Theorem B in [7] can also be strengthened.
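As a side note on checking the $M$-matrix property numerically, here is a hypothetical sketch: the comparison matrix $Q = D - |A|\,\mathrm{diag}(\mu) - |B|\,\mathrm{diag}(\nu)$ is an assumed form (the exact matrix used in the remark is not reproduced here), and the sample values reuse the earlier illustrative parameters.

```python
import numpy as np

def is_nonsingular_M_matrix(Q):
    # a Z-matrix (nonpositive off-diagonal entries) is a nonsingular
    # M-matrix iff all of its eigenvalues have positive real parts
    off = Q - np.diag(np.diag(Q))
    if (off > 1e-12).any():
        return False                      # not even a Z-matrix
    return bool((np.linalg.eigvals(Q).real > 0).all())

D = np.diag([2.0, 2.0])
A = np.array([[0.3, -0.2], [0.1, 0.4]])
B = np.array([[0.2, 0.1], [-0.1, 0.3]])
mu = nu = np.ones(2)                      # Lipschitz constants of tanh
Q = D - np.abs(A) @ np.diag(mu) - np.abs(B) @ np.diag(nu)
print(is_nonsingular_M_matrix(Q))         # True for these sample values
```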
In the following discussions, let the delays satisfy the standing assumptions for all $t \ge 0$. Using proof techniques different from that of Theorem 1, we have the following five theorems and two corollaries, which provide an effective method for the stability analysis of neural networks.

Theorem 2: Let the auxiliary function be defined by

(9)

where the constant involved is any positive constant. If the corresponding condition holds, then UDNN (2) is GAS. If there exist constants such that

(10)

then UDNN (2) is GES.

Proof: Let the comparison function be defined from $z(t)$; then, for $t \ge 0$,

(11)
When its argument is nonnegative, define the auxiliary function in the stated way; otherwise, define it by its value at zero. It follows that the auxiliary function is a monotone increasing function and that, for $t \ge 0$,

(12)

By (11), the estimate propagates for $t \ge 0$, and hence the bound holds on every bounded interval. So, for $t \ge 0$,
using (12), the required exponential estimate follows, and then UDNN (2) is GES. The proof is complete.

Remark 3: In view of (9) and (10), the derived inequality is a necessary condition for the global exponential stability of the UDNN. When the comparison matrix is a nonsingular $M$-matrix, the constants in (10) can be chosen explicitly.

Remark 4: The stability condition (10) depends on a specific choice of constant. For particular delays, the left-hand side of (10) may be rewritten in an explicit form without the integral; in that case, the resulting constant is an estimated exponential convergence rate of UDNN (2).

Letting the constants in Theorem 2 take specific values, we immediately have the following testable stability condition.

Corollary 1: If there exist constants such that the corresponding explicit condition holds, then UDNN (2) is GES.

Using a Lyapunov function different from that in Theorem 2, we have the following four theorems.

Theorem 3: For any $t \ge 0$, let the auxiliary function be defined accordingly. If the corresponding condition holds, then UDNN (2) is GAS. If there exist constants such that the strengthened condition holds, then UDNN (2) is GES.

Proof: With the comparison function so defined, the argument is similar to the proof of Theorem 2, and we derive the result immediately.

Letting the constants in Theorem 3 take specific values, we immediately have the following testable stability condition.

Corollary 2: If there exist constants such that the corresponding explicit condition holds, then UDNN (2) is GES.

Remark 5: Using Corollaries 1 and 2, it is very convenient to estimate the rate of exponential convergence of UDNN (2).

In the following discussions, we always denote certain constants as, respectively, the maximal eigenvalues of the associated matrices; note that if $\lambda$ is an eigenvalue of one of these matrices, then a corresponding value is also an eigenvalue of the associated matrix, so the conditions below are checkable by eigenvalue computations.

Theorem 4: If the corresponding eigenvalue condition holds, then UDNN (2) is GAS. If there exist constants such that the strengthened condition holds, then UDNN (2) is GES.

Proof: Defining the Lyapunov function accordingly, the remaining proof is similar to the last part of Theorem 2.

Remark 6: When the activation function is a monotone increasing function, the conclusions of Theorems 2–4 are still valid with the corresponding Lipschitz constant replaced accordingly.

Theorem 5: If the corresponding condition holds, then UDNN (2) is GAS. If there exist constants such that the strengthened condition holds, then UDNN (2) is GES.

Proof: Defining the Lyapunov function accordingly, the remaining proof is similar to the last part of Theorem 2.
Theorem 6: For any $t \ge 0$, define the auxiliary quantities and the function as above, where the associated bound holds for any positive integer. If the corresponding condition holds, then UDNN (2) is GAS. If there exist constants such that the strengthened condition holds, then UDNN (2) is GES.

Proof: The comparison function so defined is continuous. Take constants such that, obviously, the required bound holds for any $t \ge 0$. Since the condition of the theorem holds, the remaining proof is similar to the last part of Theorem 2.

Remark 7: When the delay is constant, the result of Theorem 6 is identical to that of [9, Th. 1]. However, the situation of time-varying delay has not been discussed in [9]. Moreover, Theorem 6 provides the estimated rate of exponential convergence. Two methods are used above for the stability analysis: the one for Theorem 1 is based on a Halanay-like inequality, and the one for Theorems 2–6 is based on a Gronwall–Bellman-like inequality.
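For reference, the classical bounded-delay prototypes of these two inequalities are recalled below; these are standard background facts, not reproduced from the brief, which uses generalizations suited to unbounded delays.

$$D^+ v(t) \le -a\,v(t) + b \sup_{t-\tau \le s \le t} v(s), \quad a > b > 0 \;\Longrightarrow\; v(t) \le \Big( \sup_{-\tau \le s \le 0} v(s) \Big) e^{-\gamma t}, \quad t \ge 0$$

where $\gamma > 0$ is the unique root of $\gamma = a - b e^{\gamma \tau}$ (Halanay), and

$$v(t) \le c + \int_0^t h(s)\, v(s)\, ds, \quad h \ge 0 \;\Longrightarrow\; v(t) \le c \exp\Big( \int_0^t h(s)\, ds \Big)$$

(Gronwall–Bellman).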
IV. ILLUSTRATIVE EXAMPLES

In this section, we give three numerical examples to illustrate the nonoverlapping differences of the theorems.

Example 1: Consider the UDNN

(13)

whose coefficients and delays satisfy the conditions of Theorem 1 for any $t \ge 0$. According to Theorem 1, UDNN (13) is GAS, but Theorems 2–6 cannot be used to ascertain the stability of (13).

Example 2: Consider the UDNN

(14)

Since the conditions of Theorem 2 hold, it follows from Theorem 2 that UDNN (14) is GES. Furthermore, it is shown that the exponential convergence rate is at least equal to 1. However, the stability of (14) cannot be determined by Theorem 1 or Corollary 2. In addition, by taking suitable constants, Theorem 3 also shows that (14) is GES.

Example 3: Consider a two-neuron UDNN

(15)

where the activation function of each neuron ($i = 1, 2$) is any continuous function which satisfies assumption 2), and the delay is assumed to be the same continuous function as in Example 2. Since the conditions of Theorem 4 obviously hold, it follows from Theorem 4 that UDNN (15) is GES; furthermore, the exponential convergence rate is estimated. But Theorems 1–3 and 5 cannot be used to ascertain the stability of (15).

V. CONCLUDING REMARKS

Stability analysis of neural networks is an important topic in their applications to associative memory and optimization. It is well known that time delays are often unavoidable in hardware implementations and are sometimes desirable in practical applications of neural networks. In most situations, a time delay is variable and may extend over all the past. In this brief, we address the global stability of a large class of neural networks with unbounded time-varying delays and with bounded, globally Lipschitz continuous activation functions. Several sufficient conditions for global asymptotic stability and global exponential stability are presented. These conditions offer testable criteria for ascertaining the global stability of this general class of neural networks. Several examples are also given to illustrate the use of the proposed results in comparison with some existing results.

REFERENCES
[1] M. Forti and A. Tesi, “New conditions for global stability of neural networks with application to linear and quadratic programming problems,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 42, no. 7, pp. 354–366, Jul. 1995.
[2] P. van den Driessche and X. Zou, "Global attractivity in delayed Hopfield neural network models," SIAM J. Appl. Math., vol. 58, no. 6, pp. 1878–1890, 1998.
[3] C. Hou and J. Qian, "Stability analysis for neural dynamics with time-varying delays," IEEE Trans. Neural Netw., vol. 9, no. 1, pp. 221–223, Feb. 1998.
[4] J. Cao and D. Zhou, "Stability analysis of delayed cellular neural networks," Neural Netw., vol. 11, pp. 1601–1605, 1998.
[5] T. L. Liao and F. C. Wang, "Global stability for cellular neural networks with time delay," IEEE Trans. Neural Netw., vol. 11, no. 6, pp. 1481–1484, Dec. 2000.
[6] S. Arik, "Global asymptotic stability of a class of dynamical neural networks," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 47, no. 4, pp. 568–571, Apr. 2000.
[7] S. Arik and V. Tavsanoglu, "On the global asymptotic stability of delayed cellular neural networks," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 47, no. 4, pp. 571–574, Apr. 2000.
[8] X. Liang and J. Wang, "Absolute exponential stability of neural networks with a general class of activation functions," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 47, no. 8, pp. 1258–1263, Aug. 2000.
[9] J. Cao and Q. Li, "On the exponential stability and periodic solutions of delayed cellular neural networks," J. Math. Anal. Appl., vol. 252, no. 1, pp. 50–64, 2000.
[10] J. Cao, "A set of stability criteria for delayed cellular neural networks," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 4, pp. 494–498, Apr. 2001.
[11] X. Liang and J. Wang, "An additive diagonal-stability condition for absolute exponential stability of a general class of neural networks," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 11, pp. 1308–1317, Nov. 2001.
[12] Y. Zhang, P. A. Heng, and K. S. Leung, "Convergence analysis of cellular neural networks with unbounded delay," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 5, pp. 680–687, May 2001.
[13] Z. G. Zeng, J. Wang, and X. X. Liao, "Global exponential stability of a general class of recurrent neural networks with time-varying delays," IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 50, no. 10, pp. 1353–1358, Oct. 2003.