IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 20, NO. 5, MAY 2009

Almost Sure Exponential Stability of Recurrent Neural Networks With Markovian Switching

Yi Shen and Jun Wang, Fellow, IEEE

Abstract—This paper presents new stability results for recurrent neural networks with Markovian switching. First, algebraic criteria for the almost sure exponential stability of recurrent neural networks with Markovian switching and without time delays are derived. The results show that the almost sure exponential stability of such a neural network does not require the stability of the neural network at every individual parametric configuration. Next, both delay-dependent and delay-independent criteria for the almost sure exponential stability of recurrent neural networks with time-varying delays and Markovian-switching parameters are derived by means of a generalized stochastic Halanay inequality. The results herein include existing ones for recurrent neural networks without Markovian switching as special cases. Finally, simulation results in three numerical examples are discussed to illustrate the theoretical results.

Index Terms—Almost sure exponential stability, Halanay inequality, Markov chain, recurrent neural networks, time-varying delay.

I. INTRODUCTION

Stability of recurrent neural networks is necessary for most successful neural network applications. Since the resurgence of neural network research in the 1980s, numerous studies on the stability analysis of various neural networks have been reported with intensifying interest in recent years [1]–[39]. In the implementation or application of neural networks, it is not uncommon for the parameters of neural networks (e.g., connection weights and biases) to change abruptly due to unexpected failures or designed switching [40]. In such a case, neural networks can be represented by a switching model, which can be regarded as a set of parametric configurations switching from one to another according to a given rule. Recently, the stability of several switching neural networks with time delays was analyzed [26], [31], [38]. The recurrent neural network models in [26] and [31] are required to be stable

Manuscript received May 09, 2007; revised December 27, 2007 and June 11, 2008; accepted November 07, 2008. Current version published May 01, 2009. The work was supported by the Research Grants Council of the Hong Kong Special Administrative Region, China under Project CUHK417608E and the Natural Science Foundation of China under Grants 60574025, 60740430664, and 60874031. Y. Shen is with the Department of Control Science and Engineering and the Key Laboratory of Ministry of Education for Image Processing and Intelligent Control, Huazhong University of Science and Technology, Wuhan, Hubei 430074, China (e-mail: [email protected]). J. Wang is with the Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Shatin, New Territories, Hong Kong (e-mail: [email protected]). Digital Object Identifier 10.1109/TNN.2009.2015085

at all parametric configurations in order to guarantee the stability of the models, where the switching in these models is considered to be arbitrary. Moreover, in the proofs, a common Lyapunov function or Lyapunov functional is employed in [26] and [31], which leads to some stability criteria that are not easy to check. In [38], robust exponential stability of neural networks with Markovian switching is investigated in the mean square sense.

In this paper, new results on the almost sure stability of a general class of recurrent neural networks with Markovian switching are presented. When time delays are absent in the recurrent neural network model with Markovian switching, we will prove that its almost sure exponential stability does not require the stability of every single neural network in it, which shows the rich stability properties of recurrent neural networks with Markovian switching. Moreover, we will discuss the stability of recurrent neural networks with Markovian switching and time delays. Different from [26] and [31], where only a Lyapunov function or Lyapunov functional is applied, a generalized stochastic Halanay inequality is established and adopted in addition to the Lyapunov function or Lyapunov functional. The stability criteria are divided into two categories in terms of time-delay dependence and time-delay independence. Because the states of recurrent neural networks with Markovian switching are stochastic processes, the discussion of their almost sure exponential stability is more complex than the discussion of the stability of recurrent neural networks without Markovian switching, and the method is different from that in [26] and [31]. It is worth mentioning that the almost sure exponential stability criteria for recurrent neural networks with Markovian switching include the stability criteria for recurrent neural networks without Markovian switching as special cases [5]–[11], [13]–[17], [32], [37]. Moreover, numerical simulation results substantiate the effectiveness of these results.

The remainder of this paper is organized as follows. Section II describes some preliminaries. The main results are stated in Sections III and IV. Several illustrative examples are given in Section V. Finally, concluding remarks are made in Section VI.

II. PRELIMINARIES

Throughout this paper, unless otherwise specified, we let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (i.e., it is right continuous and F_0 contains all P-null sets). Let τ > 0 and let C([−τ, 0]; R^n) denote the family of continuous functions φ from [−τ, 0] to R^n with the norm ‖φ‖ = sup_{−τ≤s≤0} |φ(s)|, where |·| is the Euclidean norm in R^n. Let δ(t) be a continuous function which will stand for the time delay



of the system discussed in this paper. Let A be a matrix; its transpose is denoted by A^T and its operator norm is denoted by ‖A‖ (without any confusion with the norm above). Let Q be a symmetric matrix; λ_max(Q) and λ_min(Q) denote its largest and smallest eigenvalues, respectively. Let r(t), t ≥ 0, be a right-continuous Markov chain on the probability space taking values in a finite state space S = {1, 2, ..., N} with generator Γ = (γ_ij)_{N×N} given by

P{r(t + Δ) = j | r(t) = i} = γ_ij Δ + o(Δ) if i ≠ j, and 1 + γ_ii Δ + o(Δ) if i = j

where Δ > 0.


i.e., the origin is an equilibrium point of (4). Hence, the stability of the equilibrium point of (1) is equivalent to the stability of the equilibrium point at the origin in the state space of (4). In addition, the activation function f in (4) is also monotone nondecreasing and Lipschitz continuous, like g in (2). If B(i) = 0 for every i ∈ S in (4), then

dy(t)/dt = −D(r(t))y(t) + A(r(t))f(y(t))   (5)

. Here

Here γ_ij ≥ 0 is the transition rate from i to j if i ≠ j, while γ_ii = −Σ_{j≠i} γ_ij. It is known that almost every sample path of r(t) is a right-continuous step function with a finite number of simple jumps in any finite subinterval of [0, ∞) [41]. As a standing assumption, we assume in this paper that the Markov chain is irreducible [41]. Under this assumption, the Markov chain has a unique stationary (probability) distribution π = (π_1, π_2, ..., π_N), which can be determined by solving the homogeneous linear equation πΓ = 0 subject to Σ_{j=1}^{N} π_j = 1 and π_j > 0.
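For illustration, the stationary distribution can be computed numerically from a given generator. The following Python sketch does this for a two-state generator; the numerical entries are placeholders chosen for illustration only and are not taken from this paper.

```python
import numpy as np

def stationary_distribution(gamma: np.ndarray) -> np.ndarray:
    """Solve pi @ Gamma = 0 with sum(pi) = 1 for an irreducible generator Gamma."""
    n = gamma.shape[0]
    # Stack the transposed generator with the normalization constraint and solve
    # the over-determined system in the least-squares sense.
    a = np.vstack([gamma.T, np.ones((1, n))])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(a, b, rcond=None)
    return pi

# Placeholder two-state generator (rows sum to zero), not the paper's values.
gamma = np.array([[-1.0, 1.0],
                  [ 2.0, -2.0]])
print(stationary_distribution(gamma))  # -> approximately [2/3, 1/3]
```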

Let us consider a general recurrent neural network model with time-varying delay and Markovian switching

dx(t)/dt = −D(r(t))x(t) + A(r(t))g(x(t)) + B(r(t))g(x(t − δ(t))) + I(r(t))   (1)

with initial value x(s) = ξ(s) on −τ ≤ s ≤ 0, where x(t) = (x_1(t), ..., x_n(t))^T is the state of the neurons, D(r(t)) is a self-feedback connection weight matrix, A(r(t)) and B(r(t)) are, respectively, the connection weight matrices without delay and with delay, δ(t) is the time-varying delay that satisfies 0 ≤ δ(t) ≤ τ, where τ is a constant, I(r(t)) is an external input (bias) vector to the neurons, and g(·) = (g_1(·), ..., g_n(·))^T is a vector-valued activation function, which satisfies the following assumption. For each j = 1, ..., n, there exists l_j > 0 such that

Definition 1: Neural network model (4) is said to be almost surely exponentially stable if for any initial value ξ (where ξ belongs to the family of F_0-measurable bounded C([−τ, 0]; R^n)-valued random variables), there exist λ > 0 and M > 0 such that |y(t; ξ)| ≤ M e^{−λ t} ‖ξ‖ holds almost surely for all t ≥ 0; i.e., the Lyapunov exponent lim sup_{t→∞} (1/t) ln |y(t; ξ)| < 0 almost surely, where y(t; ξ) is the state of the model in (4). Neural network model (4) is said to be mean square exponentially stable if for any such ξ there exist λ > 0 and M > 0 such that E|y(t; ξ)|^2 ≤ M e^{−λ t} ‖ξ‖^2 holds for all t ≥ 0; i.e., the Lyapunov exponent lim sup_{t→∞} (1/t) ln E|y(t; ξ)|^2 < 0, where E is the mathematical expectation. From these definitions and [41, Th. 7.24], the mean square exponential stability of (4) implies its almost sure exponential stability, but not vice versa.
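To make the definition of the Lyapunov exponent concrete, the following Python sketch simulates a scalar neuron of the form of (5) that switches between one unstable and one stable configuration and reports ln|x(T)|/T along one sample path. All parameter values (self-feedback gains, activation weights, and the generator) are illustrative placeholders, not the values used later in the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder per-mode parameters for the scalar no-delay model
#   dx/dt = -d_i * x + a_i * tanh(x):
# mode 0 is unstable near the origin, mode 1 is stable.
d = np.array([1.0, 6.0])
a = np.array([2.0, 1.0])
gamma = np.array([[-3.0, 3.0],
                  [ 1.0, -1.0]])   # generator of the two-state chain (placeholder)

dt, T = 1e-3, 100.0
x, mode = 1.0, 0
for k in range(1, int(T / dt) + 1):
    if rng.random() < -gamma[mode, mode] * dt:   # jump of the Markov chain
        mode = 1 - mode
    x += dt * (-d[mode] * x + a[mode] * np.tanh(x))

print("estimated Lyapunov exponent ln|x(T)|/T =", np.log(abs(x) + 1e-300) / T)
```

A negative printed value indicates exponential decay of this sample path, in the sense of Definition 1.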

III. STABILITY CONDITIONS IN THE ABSENCE OF TIME DELAYS

Theorem 1: For any initial value ξ, the state y(t; ξ) of (5) satisfies

(6) where

is the stationary distribution of Markov chain

(7)

0 ≤ (g_j(u) − g_j(v))/(u − v) ≤ l_j for all u, v ∈ R, u ≠ v.   (2)

Denote L = diag(l_1, ..., l_n). According to [41], for any initial value ξ, (1) has a unique globally continuous state x(t; ξ). Equation (1) can be regarded as the following N subsystems:

(8)

where is a positive–definite matrix and is a positive diagonal matrix. In addition, if there exist a positive–definite matrix and a positive diagonal matrix such that

dx(t)/dt = −D(i)x(t) + A(i)g(x(t)) + B(i)g(x(t − δ(t))) + I(i),  i ∈ S   (3)

switching from one to another according to the movement of the Markov chain r(t). Assume that (1) has an equilibrium point x* = (x_1*, ..., x_n*)^T. Let y(t) = x(t) − x*. Then, (1) can be rewritten as

dy(t)/dt = −D(r(t))y(t) + A(r(t))f(y(t)) + B(r(t))f(y(t − δ(t)))   (4)

where f(y) = g(y + x*) − g(x*), so that f(0) = 0;

(9)

then the recurrent neural network model without any time delay (5) is almost surely exponentially stable with its exponential convergence rate being at least .

Proof: Let us denote the state as y(t) for simplicity. If the initial value is zero, then the state remains zero; hence, (6) obviously holds. If the initial value is nonzero, then


according to the results in [41], in the following, assume

almost surely. Thus, . Let

being negative definite and the Schur complement [41], we have

(13)

(10) Then (11)

If , then (13) holds; i.e., is negative definite, by choosing , where is an identity matrix and is a sufficiently small positive constant.

Remark 2: In [5], it is proved that if is Lyapunov diagonally stable, then

From the generalized Itô formula [41] and (5), (8), (7), (10), and (11)

is globally asymptotically stable. Hence, Theorem 1 shows that if there exists a positive diagonal matrix such that is negative definite [i.e., all subsystems of (5) are globally exponentially stable], then (5) is still almost surely exponentially stable under any Markovian switching. As such, Theorem 1 includes and extends the results in [5].

Remark 3: Theorem 1 shows that if , then a neural network is almost surely exponentially stable. From Remarks 1 and 2, we know that if , then the neural network at each individual parametric configuration is stable. When only some (i.e., the neural network at some parametric configurations is stable) and the others have (i.e., the neural network at the other configurations is unstable), may still hold (i.e., the neural network is almost surely exponentially stable), where is the stationary distribution of the Markov chain.
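Remarks 1 and 2 revolve around Lyapunov diagonal stability, i.e., the existence of a positive diagonal matrix P such that PW + W^T P is negative definite for a given matrix W. The following sketch tests this property by a crude random search over diagonal entries; it is only an illustrative check (an LMI solver would be the systematic tool), and the test matrix is a placeholder rather than one of the matrices appearing in (7)–(9).

```python
import numpy as np

def is_lyapunov_diagonally_stable(w: np.ndarray, trials: int = 20000, seed: int = 0):
    """Search for a positive diagonal P with P @ W + W.T @ P negative definite."""
    rng = np.random.default_rng(seed)
    n = w.shape[0]
    for _ in range(trials):
        p = np.diag(rng.uniform(0.01, 10.0, size=n))
        m = p @ w + w.T @ p              # symmetric by construction
        if np.max(np.linalg.eigvalsh(m)) < 0.0:
            return True, np.diag(p)
    return False, None

# Placeholder test matrix (illustrative values only).
w = np.array([[-2.0, 0.5],
              [ 0.3, -1.5]])
found, p_diag = is_lyapunov_diagonally_stable(w)
print(found, p_diag)
```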

IV. STABILITY CONDITIONS IN THE PRESENCE OF TIME-VARYING DELAYS

Lemma 1: Assume that is a real-valued -adapted process (for every , is -measurable), is a right-continuous Markov chain taking values in , and

(14)

(12) By using the ergodic theory of the Markov chain [41]

If there exist one of the following conditions holds:

such that

(15) Hence, from (11) and (12), (6) holds. Remark 1: If

versa. In fact, if

then is negative definite; i.e., is Lyapunov diagonally stable, and vice then by

(16)


then for any


Hence, it is just needed to prove (17) (18)

where

denotes the Dini derivative and . If (18) does not hold, then by the right-continuity of , there exist such that (19) (20) (21) On the other hand, from the definition of and (20)

Proof: First, we prove let

Then, from (14),

. In fact, for any fixed

. From

(15), (19), and

and

and

there exists a unique such that . From the definition of . If then for any small enough there exists such that and . Without loss of generality, let . Then (22) From

the definition of

and

(23)

This is a contradiction. In order to prove (17), it suffices to prove

Let

From , by substituting (23) into (22), we have , which contradicts (21). Hence, (18) holds. Letting , (17) holds. It can be proved similarly that if condition (16) is satisfied, then (17) still holds. Remark 4: In (15), if and are constants, and is a deterministic function, then (15) can be rewritten as


Here, the condition (14) becomes . Hence, Lemma 1 degenerates into the classical Halanay inequality [42], which is a powerful tool in the stability analysis of deterministic systems with delays; that is, Lemma 1 is a generalization of the classical Halanay inequality.
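As a numerical sanity check on the classical Halanay inequality that Lemma 1 generalizes, the following sketch integrates the scalar delay equation y'(t) = −a y(t) + b sup_{t−τ≤s≤t} y(s) with a > b > 0 and compares the observed decay with the rate λ obtained from the classical transcendental relation λ = a − b e^{λτ}. The constants are illustrative, and this deterministic scalar equation only stands in for the inequality; it is not the stochastic setting of Lemma 1.

```python
import numpy as np

a, b, tau = 2.0, 1.0, 0.5   # requires a > b > 0 (illustrative constants)

# Decay rate from the classical Halanay relation lam = a - b*exp(lam*tau),
# obtained by bisection on (0, a).
lo, hi = 0.0, a
for _ in range(100):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if mid + b * np.exp(mid * tau) - a < 0 else (lo, mid)
lam = 0.5 * (lo + hi)

# Forward-Euler integration of y'(t) = -a*y(t) + b*max_{[t-tau, t]} y(s), y = 1 on [-tau, 0].
dt = 1e-3
lag = int(tau / dt)
hist = [1.0] * (lag + 1)          # sliding window holding y on [t - tau, t]
y, T = 1.0, 10.0
for _ in range(int(T / dt)):
    y = y + dt * (-a * y + b * max(hist))
    hist.append(y)
    hist.pop(0)

print("predicted decay rate lam  =", round(lam, 4))
print("observed  -ln y(T)/T      =", round(-np.log(y) / T, 4))  # should approach lam
```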

To establish a more general result, we need more notation. Let denote the family of all nonnegative functions on which are continuously twice differentiable in and once in . If , define an operator from to by

and (27) (28)

Proof: For

let (29)

From (28) and (29)

where

(30) From (4), (27), (29), and the definition of

Theorem 2: If

[41]

, there exist positive–definite matrices and positive diagonal matrices such that

where

is independent of , is dependent on , and

(24)

then for any initial value ξ, the state y(t; ξ) of (4) satisfies

(25)

i.e., (4) is almost surely exponentially stable with its exponential convergence rate being at least , where is the unique positive solution of the transcendental equation (26)

(31)


From (4) and the Hölder inequality

when

let and . Substituting (33)–(35) into (32) yields

(36) By the generalized Itô formula [41], substituting (36) into (31) yields

(37) Substituting (30) into (37) yields

(32)

Hence

where

(38)

if then the Hölder inequality

. Choose

. By using again

From the generalized Itô formula [41] and (38)

(39)

(33)

From (24), (39), and Lemma 1 (40)

Similarly

(34)

(35)

where is the unique positive solution of (26). From (30) and (40), (4) is mean square exponentially stable. According to [41, Th. 7.24], (25) holds. Hence, (4) is almost surely exponentially stable.

Remark 5: In Theorem 2, if and are independent of , then . Hence, the positive definiteness of is equivalent to being positive definite; i.e., is Lyapunov diagonally stable.
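Remark 5 asks for a single positive diagonal matrix that works for every parametric configuration simultaneously. The following sketch searches for such a common diagonal matrix for a small set of placeholder mode matrices; it merely illustrates the flavor of the requirement, since the exact matrices involved in Theorem 2 (through (24) and (27)–(28)) are not reproduced here.

```python
import numpy as np

def common_diagonal_lyapunov(mats, trials=50000, seed=1):
    """Random search for one positive diagonal P with P@W + W.T@P negative definite
    for every W in mats (one W per parametric configuration)."""
    rng = np.random.default_rng(seed)
    n = mats[0].shape[0]
    for _ in range(trials):
        p = np.diag(rng.uniform(0.01, 10.0, size=n))
        if all(np.max(np.linalg.eigvalsh(p @ w + w.T @ p)) < 0.0 for w in mats):
            return np.diag(p)
    return None

# Placeholder mode matrices (illustrative values only).
mats = [np.array([[-3.0, 1.0], [0.5, -2.0]]),
        np.array([[-2.5, -0.8], [1.2, -3.0]])]
print(common_diagonal_lyapunov(mats))
```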


Theorem 3: If for all

, there exist

matrices

Theorem 2 has shown that if there exists a common positive diagonal matrix such that , i.e., all subsystems

are exponentially stable (see Theorem 1), then when the delays are sufficiently small, for any Markovian switching, (4) is almost surely exponentially stable. The upper bound of the delays is given by (24). As such, Theorem 2 is a generalization of the main result in [5].

Remark 6: In [38], only the case of in (4) is considered, with the following results. If for any there exist matrices , and such that

satisfying the coupled LMIs, shown in (41) at the bottom of the page, where are entries readily inferred by symmetry

then (4) is exponentially stable in the mean square sense. However, suppose . Then, the above condition does not hold for the resulting neural networks. In contrast, from Theorem 2, . According to Theit is easy to obtain that orem 2, (4) is almost surely exponentially stable provided that is sufficiently small. We first introduce a lemma, which is essential for the proof of the ensuing Theorem 3. Lemma 2 [43], [44] (Moon et al.’s inequality): Assume that , and are defined on the interval . Then, for any matrices , and , the following inequality holds: and

(42) where where

is defined in Theorem 2. If (43)

(41)


then for any initial value of (4) satisfies

the state


Thus

(44)

i.e., (4) is almost surely exponentially stable with its exponential convergence rate being at least , where is the unique positive solution of the transcendental equation

Since

we obtain . For the same reason, it has

, so we have .

Let

(45) and

(46) So, we have (53) From (4) and (53), we have (47) (48) (49) (50)

(54) For

let (55)

(51)

where

(52) Proof: Let us first prove that to be the smallest of

Let

. Choose

be the corresponding eigenvector of . Then

for , i.e.,

, i.e.,

By the generalized Itô formula [41]

Moreover

(56)


where

From (54), we have (59) According to the generalized Itô formula [41]

(57) According to Lemma 2, we have

(60)

(58) (61) where By reversing the order of integration, we have

are constant matrices, and Therefore, by (56)–(58)

satisfying (42).

(62) while by (4), we then have

(63)



According to

we can get

(70) From (41), (46), (47), (59), (60), and (68)–(70), we have (64)

For the same reason, we have

(65)

(71)

(66) Analogously to (63), from (48)–(51) and (55), it is easy to see

Then, by (47) and (63)–(66), we have

(67)

(72) From the generalized Itô formula [41], (71), and (72)

From (61), (62), and (67), we have

(68) It is clear that

and

(73) When

Thus, for any diagonal free-weighting matrices

by reversing the order of integration

and

we have

(69)

(74)


From (45), when

substituting (74) into (73) yields

then for any initial value of (4) satisfies

the state

(75) where . From (72) and (75)

i.e., (4) is almost surely exponentially stable with its exponential convergence rate being at least , where is the unique positive solution of the transcendental equation

and From this and [41, Th. 7.24], (44) holds; i.e., (4) is almost surely exponentially stable.

Remark 7: In Theorem 3, via linear matrix inequalities, delay-dependent sufficient conditions for the almost sure exponential stability of recurrent neural networks with time-varying delays and Markovian-switching parameters are derived by using a descriptor model transformation of the system and by applying Moon et al.'s inequality and the free-weighting-matrix approach. It should be pointed out that the term in (55) is inspired by the results in [45]. In Theorem 3, if we set , where is a sufficiently small scalar, then we can deduce the following corollary.

Corollary 1: If there exist positive–definite matrices and diagonal matrices

Proof: For

let

, such that

where

From the Lyapunov functional above and an argument similar to that in the proof of Theorem 3, Corollary 1 holds.

Remark 8: If there is no parameter switching in system (4), then Corollary 1 is the same as the main result in [37, Th. 1]. Assume (where is sufficiently small) in Corollary 1; then we have the following corollary.

Corollary 2: If there exist positive–definite matrices and positive diagonal matrices such that (76) where

is defined in Theorem 2. If is defined in Theorem 2. If


then for any initial value of (4) satisfies

the state


delays, similar to the M-matrix criteria in delayed recurrent neural networks without parametric switching. Theorem 4: If there exist positive diagonal matrices such that (79)

i.e., (4) is almost surely exponentially stable with its exponential convergence rate being at least , where is the unique positive solution of the transcendental equation

the state

then for any initial value of (4) satisfies

(80)

and

i.e., (4) is almost surely exponentially stable with its exponential where convergence rate being at least

(81) and

Proof: For

(82)

let

From the Lyapunov functional above and an argument similar to that in the proof of Theorem 3, Corollary 2 holds.

Remark 9: In (76), if are independent of , then the positive definiteness of is equivalent to

(83) Proof: Define

(77) From (4), (84), and the definition of where

It is easy to prove that if (77) holds, then the recurrent neural network with time-varying delay and without switching (78) is exponentially stable [17]. Here, and the condition holds. Corollary 2 shows that if there exist , which are independent of , such that (76) holds, i.e., (78) is exponentially stable, then for any Markovian switching, system (4) is still almost surely exponentially stable. In addition, Corollary 2 is a generalization of the main result in [17]. Singh showed that the results in [17] generalize the results in [7], [10], [11], and [13]. Hence, the results in [7], [10], [11], [13], and [17] are special cases of Corollary 2. The criteria in Theorems 2 and 3 depend on the time delays. The stability condition in the ensuing Theorem 4 does not depend on time

[41]

(84)


From (82), (83), and the Hölder inequality

It implies

(86) Equation (86) is equivalent to the fact that is an M-matrix, where

Hence, if (4) does not have parametric switching, then . In this case, condition (86) becomes the requirement that is an M-matrix. This coincides with the main results in [6], [8], [9], [14]–[16], and [32]. Hence, Theorem 4 generalizes the results in [6], [8], [9], [14]–[16], and [32].

Theorem 5: If

From the generalized Itô formula [41] and the above inequality

(87) then for any initial value of (4) satisfies

the state

(88) i.e., (4) is almost surely exponentially stable with its exponential where convergence rate being at least

From the above inequality and Lemma 1, for any

From the definition (84) of

Hence

(85)

i.e., (4) is mean square exponentially stable. According to [41, Th. 7.24] and (85), (4) is almost surely exponentially stable; i.e., (80) holds.

Remark 10: In (79), if is independent of , then, by denoting , the condition (79) is equivalent to

Proof: Define . By using Lemma 1 and an argument similar to that in the proof of Theorem 4, Theorem 5 holds.

Remark 11: Although Theorem 5 does not have the generality of Theorem 4, it can be conveniently applied since its criterion is simple.
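Remark 10 reduces the delay-independent condition (79) to the requirement that a certain matrix be a nonsingular M-matrix. The following sketch checks the M-matrix property of a candidate matrix using two standard equivalent tests (nonpositive off-diagonal entries with eigenvalues of positive real part, and a nonnegative inverse); the candidate is a placeholder, since the matrix constructed in (86) is not reproduced here.

```python
import numpy as np

def is_nonsingular_m_matrix(m: np.ndarray, tol: float = 1e-10) -> bool:
    """Check: off-diagonal entries <= 0, all eigenvalues have positive real part,
    and (as an equivalent cross-check) the inverse is entrywise nonnegative."""
    off_diag_ok = np.all(m - np.diag(np.diag(m)) <= tol)
    eig_ok = np.all(np.linalg.eigvals(m).real > tol)
    inv_ok = np.all(np.linalg.inv(m) >= -tol) if eig_ok else False
    return bool(off_diag_ok and eig_ok and inv_ok)

# Placeholder candidate matrix (illustrative values only).
m = np.array([[ 3.0, -1.0],
              [-0.5,  2.0]])
print(is_nonsingular_m_matrix(m))   # -> True for this candidate
```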


Fig. 1. Transient behavior of ln|x(t)|/t in Example 1.


Fig. 2. Transient behavior of ln|x(t)|/t with δ = 0.2 in Example 2.

V. ILLUSTRATIVE EXAMPLES

Example 1: Consider a single neuron with Markovian switching

(89)

where the generator of the Markov chain is . Hence, system (89) can be regarded as switching between the following two configurations:


(90)

(91)

switching from one to the other according to the movement of the Markov chain. In (8), let ; then . Hence, . According to Theorem 1, (89) is almost surely exponentially stable. However, (90) is unstable while (91) is stable. This shows that it is not necessary for all configurations to be stable for the single neuron with Markovian switching to be stable. Simulation results of ln|x(t)|/t of (89) are depicted in Fig. 1 with random initial values. It is easy to see that . Thus, (89) is almost surely exponentially stable.

Example 2: Consider a two-neuron recurrent neural network with time-varying delays and Markovian-switching parameters

(92)

where the generator of the Markov chain and parameters are . According to Theorem 2, . The upper bound of the delay is 1.0185. If , then according to Theorem 2, the exponential convergence rate is at least . According to Corollary 2, . The upper bound of the delay is 0.2167. If , then according to Corollary 2, the exponential convergence rate is at least . The simulation results of ln|x(t)|/t of (92) are depicted in Fig. 2 with ten random initial values. It is easy to see that . Thus, (92) is almost surely exponentially stable.
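In the spirit of Examples 2 and 3, the following Python sketch integrates a two-neuron network with a bounded time-varying delay and two Markovian-switching configurations by the Euler method and reports ln‖x(T)‖/T. All weight matrices, the delay profile, and the generator entries are placeholders; they are not the parameter values printed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder two-mode parameters for the shifted model (4) with g = tanh:
#   dx/dt = -D_i x + A_i g(x) + B_i g(x(t - delta(t)))
D = [np.diag([4.0, 5.0]), np.diag([3.5, 4.5])]
A = [np.array([[0.4, -0.2], [0.1, 0.3]]), np.array([[-0.3, 0.2], [0.2, -0.1]])]
B = [np.array([[0.2, 0.1], [-0.1, 0.2]]), np.array([[0.1, -0.2], [0.2, 0.1]])]
gamma = np.array([[-2.0, 2.0], [3.0, -3.0]])   # Markov chain generator (placeholder)
tau = 0.2                                      # upper bound of the delay

dt, T = 1e-3, 60.0
lag = int(tau / dt)
x = np.array([1.0, -0.5])
buf = [x.copy() for _ in range(lag + 1)]       # history of x on [t - tau, t]
mode = 0

for k in range(1, int(T / dt) + 1):
    t = k * dt
    if rng.random() < -gamma[mode, mode] * dt:  # jump of the two-state chain
        mode = 1 - mode
    delta = 0.5 * tau * (1.0 + np.sin(t))       # time-varying delay in [0, tau]
    x_del = buf[-1 - int(delta / dt)]           # delayed state x(t - delta(t))
    x = x + dt * (-D[mode] @ x + A[mode] @ np.tanh(x) + B[mode] @ np.tanh(x_del))
    buf.append(x.copy())
    buf.pop(0)

print("ln||x(T)||/T =", np.log(np.linalg.norm(x) + 1e-300) / T)
```

A negative printed value is consistent with almost sure exponential decay of the simulated sample path.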


Example 3: Consider another two-neuron recurrent neural network with time-varying delays and Markovian-switching parameters

(93)

where the generator of the Markov chain and parameters are . If the delay is 0.2, then according to Theorem 5, the exponential convergence rate is at least . If the delay is 5, then according to Theorem 5, the exponential convergence rate is at least . The simulation results of ln|x(t)|/t of (93) are depicted in Figs. 3 and 4 with random initial values. It is easy to see that . Thus, (93) is almost surely exponentially stable.

VI. CONCLUDING REMARKS

Fig. 3. Transient behavior of ln|x(t)|/t with δ = 0.2 in Example 3.

Several almost sure exponential stability criteria for Markovian-switching recurrent neural networks with and without time delays are presented. The results show that the almost sure exponential stability criteria in the absence of time delays do not require the stability of the neural network at every parametric configuration. The results also show that if each configuration of the neural network with Markovian switching and without time delays is exponentially stable, then the neural network with Markovian switching and time delays is almost surely exponentially stable if the delays are sufficiently small, where an upper bound on the allowable time delays is also given to ensure almost sure exponential stability. In addition, the results show that neural networks with Markovian switching and time delays are almost surely exponentially stable if every configuration is exponentially stable. Furthermore, delay-independent criteria, similar to the M-matrix criteria without Markovian switching, are given.

REFERENCES

Fig. 4. Transient behavior of ln|x(t)|/t with δ = 5 in Example 3.


[1] J. J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” Proc. Nat. Acad. Sci. USA, vol. 79, pp. 2554–2558, 1982. [2] M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Trans. Syst. Man. Cybern., vol. SMC-13, no. 5, pp. 815–821, May. 1983. [3] J. J. Hopfield, “Neurons with graded response have collective computational properties like those of two-state neurons,” Proc. Nat. Acad. Sci. USA, vol. 81, pp. 3088–3092, 1984. [4] L. O. Chua and L. Yang, “Cellular neural networks: Theory,” IEEE Trans. Circuits Syst., vol. CAS-35, no. 10, pp. 1257–1272, Oct. 1988. [5] M. Forti and A. Tesi, “New conditions for global stability of neural networks with application to linear and quadratic programming problems,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 42, no. 7, pp. 354–366, Jul. 1995. [6] S. Arik and V. Tavsanoglu, “Equilibrium analysis of delayed CNN,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 45, no. 2, pp. 168–171, Feb. 1998. [7] T. L. Liao and F. C. Wang, “Global stability for cellular neural networks with time delay,” IEEE Trans. Neural Netw., vol. 11, no. 6, pp. 1481–1484, Nov. 2000. [8] T. P. Chen, “Global exponential stability of delayed Hopfield neural networks,” Neural Netw., vol. 14, pp. 977–980, 2001. [9] C. Feng and R. Plamondon, “On the stability analysis of delayed neural networks systems,” Neural Netw., vol. 14, pp. 1181–1188, 2001. [10] J. Cao, “Global stability conditions for delayed CNNs,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 48, no. 11, pp. 1330–1333, Nov. 2001. [11] S. Arik, “An improved global stability result for delayed cellular neural networks,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 49, no. 8, pp. 1211–1214, Aug. 2002. [12] S. Hu and J. Wang, “Global stability of a class of continuous-time recurrent neural networks,” IEEE Trans. Circuits Syst. I, Fundam. Theory Appl., vol. 49, no. 9, pp. 1334–1347, Sep. 2002. [13] S. Arik, “An analysis of global asymptotic stability of delayed cellular neural networks,” IEEE Trans. Neural Netw., vol. 13, no. 5, pp. 1239–1242, Sep. 2002.


[14] J. D. Cao and J. Wang, “Global asymptotic stability of a general class of recurrent neural networks with time-varying delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 50, no. 1, pp. 34–44, Jan. 2003. [15] X. X. Liao and J. Wang, “Algebraic criteria for global exponential stability of cellular neural networks with multiple time delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 50, no. 2, pp. 268–275, Feb. 2003. [16] Z. G. Zeng, J. Wang, and X. X. Liao, “Global exponential stability of a general class of recurrent neural networks with time-varying delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 50, no. 10, pp. 1353–1358, Oct. 2003. [17] V. Singh, “A generalized LMI-based approach to the global asymptotic stability of delayed cellular neural networks,” IEEE Trans. Neural Netw., vol. 15, no. 1, pp. 223–225, Jan. 2004. [18] H. J. Jiang and Z. D. Teng, “Boundedness and stability for nonautonomous bidirectional associative neural networks with delay,” IEEE Trans. Circuits Syst. II, Exp. Briefs, vol. 51, no. 4, pp. 174–180, Apr. 2004. [19] T. P. Chen and L. B. Rong, “Robust global exponential stability of Cohen-Grossberg neural networks with time delays,” IEEE Trans. Neural Netw., vol. 15, no. 1, pp. 203–206, Jan. 2004. [20] H. D. Qi and L. Q. Qi, “Deriving sufficient conditions for global asymptotic stability of delayed neural networks via nonsmooth,” IEEE Trans. Neural Netw., vol. 15, no. 1, pp. 99–109, Jan. 2004. [21] J. D. Cao and J. Wang, “Global asymptotic and robust stability of recurrent neural networks with time delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 52, no. 2, pp. 417–426, Feb. 2005. [22] S. Arik, “Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays,” IEEE Trans. Neural Netw., vol. 16, no. 3, pp. 580–586, May 2005. [23] W. L. Lu and T. P. Chen, “Dynamical behaviors of Cohen-Grossberg neural networks with discontinuous activation functions,” Neural Netw., vol. 18, no. 3, pp. 231–242, 2005. [24] M. Forti, P. Nistri, and D. Papini, “Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain,” IEEE Trans. Neural Netw., vol. 16, no. 6, pp. 1449–1463, Nov. 2005. [25] J. G. Peng, Z. B. Xu, H. Qiao, and B. Zhang, “A critical analysis on global convergence of Hopfield-type neural networks,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 52, no. 4, pp. 804–814, Apr. 2005. [26] H. Huang, Y. Z. Qu, and H. X. Li, “Robust stability analysis of switched Hopfield neural networks with time-varying delay under uncertainty,” Phys. Lett. A, vol. 345, no. 4–6, pp. 345–354, 2005. [27] Z. G. Zeng and J. Wang, “Complete stability of cellular neural networks with time-varying delays,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 53, no. 4, pp. 944–955, Apr. 2006. [28] L. Wang and Z. Xu, “Sufficient and necessary conditions for global exponential stability of discrete-time recurrent neural networks,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 53, no. 6, pp. 1373–1380, Jun. 2006. [29] C. G. Li and X. F. Liao, “Robust stability and robust periodicity of delayed recurrent neural networks with noise disturbance,” IEEE Trans. Circuits Syst. I, Reg. Papers, vol. 53, no. 10, pp. 2265–2273, Oct. 2006. [30] S. Xu, J. Lam, and D. W. C. Ho, “A new LMI condition for delaydependent asymptotic stability of delayed Hopfield neural networks,” IEEE Trans. Circuits Syst. II, Exp. Briefs, vol. 53, no. 3, pp. 230–234, Mar. 2006. [31] K. Yuan, J. D. Cao, and H. X. 
Li, “Robust stability of switched Cohen-Grossberg neural networks with mixed time-varying delays,” IEEE Trans. Syst. Man. Cybern., vol. 36, no. 6, pp. 1356–1363, Jun. 2006. [32] P. Z. Liu and Q. L. Han, “On stability of recurrent neural networks: An approach from Volterra integro-differential equations,” IEEE Trans. Neural Netw., vol. 17, no. 1, pp. 264–267, Jan. 2006. [33] Z. D. Wang, Y. R. Liu, M. Z. Li, and X. H. Liu, “Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays,” IEEE Trans. Neural Netw., vol. 17, no. 3, pp. 814–820, May 2006. [34] J. D. Cao, K. Yuan, and H. K. Li, “Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays,” IEEE Trans. Neural Netw., vol. 17, no. 6, pp. 1646–1651, Nov. 2006. [35] Z. G. Zeng and J. Wang, “Global exponential stability of recurrent neural networks with time-varying delays in the presence of strong external stimuli,” Neural Netw., vol. 19, no. 10, pp. 1528–1537, 2006. [36] Y. He, G. Liu, and D. Rees, “New delay-dependent stability criteria for neural networks with time-varying delay,” IEEE Trans. Neural Netw., vol. 18, no. 1, pp. 310–314, Jan. 2007. [37] Y. He, W. Wu, and J. - H. She, “An improved global asymptotic stability criterion for delayed cellular neural networks,” IEEE Trans. Neural Netw., vol. 17, no. 1, pp. 250–252, Jan. 2006.


[38] H. Huang, D. W. C. Ho, and Y. Z. Qu, “Robust stability of stochastic delayed additive neural networks with Markovian switching,” Neural Netw., vol. 20, pp. 799–809, 2007. [39] Y. Shen and J. Wang, “Noise-induced stabilization of the recurrent neural networks with mixed time-varying delays and Markovian-switching parameters,” IEEE Trans. Neural Netw., vol. 18, no. 6, pp. 1857–1862, Nov. 2007. [40] D. Liberzon, Switching in Systems and Control. Boston, MA: Birkhauser, 2003. [41] X. Mao and C. Yuan, Stochastic Differential Equations with Markovian Switching. London, U.K.: Imperial College Press, 2006. [42] A. Halanay, Differential Equations, Stability, Oscillation, Timelags. New York: Academic, 1996. [43] Y. S. Moon, P. Park, W. H. Kwon, and Y. S. Lee, “Delay-dependent robust stabilization of uncertain state-delayed systems,” Int. J. Control, vol. 74, no. 14, pp. 1447–1455, 2001. [44] W. H. Chen, J. X. Xu, and Z. H. Guan, “Guaranteed cost control for uncertain Markovian jump systems with mode-dependent time-delays,” IEEE Trans. Autom. Control, vol. 48, no. 12, pp. 2270–2277, Dec. 2003. [45] Y. He, Q. - G. Wang, L. Xie, and C. Lin, “Further improvement of freeweighting matrices technique for systems with time-varying delay,” IEEE Trans. Autom. Control, vol. 52, no. 2, pp. 293–298, Feb. 2007.

Yi Shen received the Ph.D. degree from Huazhong University of Science and Technology, Wuhan, China, in 1998.
He was a Postdoctoral Research Fellow at the Huazhong University of Science and Technology during 1999–2001. He joined the Department of Control Science and Engineering, Huazhong University of Science and Technology, became an Associate Professor in 2001, and was promoted to Professor in 2005. From 2006 to 2007, he was a Postdoctoral Research Fellow at the Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Hong Kong. He has authored over 50 research papers. His main research interests lie in the fields of stochastic control and neural networks.

Jun Wang (S’89–M’90–SM’93–F’07) received the B.S. degree in electrical engineering and the M.S. degree in systems engineering from Dalian University of Technology, Dalian, China, in 1982 and 1985, respectively, and the Ph.D. degree in systems engineering from Case Western Reserve University, Cleveland, OH, in 1991.
Currently, he is a Professor at the Department of Mechanical and Automation Engineering, Chinese University of Hong Kong, Shatin, Hong Kong. In addition, he has been holding a Changjiang Chair Professorship in computer science and engineering at Shanghai Jiao Tong University on a part-time basis since 2008. Earlier, he held various academic positions at Dalian University of Technology, Case Western Reserve University, and University of North Dakota. He also held various short-term visiting positions at USAF Armstrong Laboratory (1995), RIKEN Brain Science Institute (2001), Universite catholique de Louvain (2001), Chinese Academy of Sciences (2002), and Huazhong University of Science and Technology (2006–2007). His current research interests include neural networks and their applications.
Dr. Wang has been an Associate Editor of the IEEE TRANSACTIONS ON NEURAL NETWORKS since 1999 and the IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS—PART B: CYBERNETICS since 2003. He also served as an Associate Editor of the IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS—PART C: APPLICATIONS AND REVIEWS (2002–2005), a member of the Editorial Advisory Board of the International Journal of Neural Systems (2006–2008), and a guest editor of special issues of the European Journal of Operational Research (1996), the International Journal of Neural Systems (2007), and Neurocomputing (2008). He was an organizer of several international conferences, serving as the General Chair of the 13th International Conference on Neural Information Processing (2006) and of the 2008 IEEE World Congress on Computational Intelligence held in Hong Kong. He served as the President of the Asia Pacific Neural Network Assembly in 2006.