Stability of Nonnegative Periodic Solutions of High-Ordered Neural Networks

Lili Wang¹ and Tianping Chen²

¹ Department of Applied Mathematics, Shanghai University of Finance and Economics, Shanghai, P.R. China
² School of Computer Sciences, Key Laboratory of Nonlinear Mathematics Science, School of Mathematical Sciences, Fudan University, Shanghai, P.R. China
[email protected], [email protected]

Abstract. In this paper, a class of high-ordered neural networks is investigated. By rigorous analysis, a set of sufficient conditions ensuring the existence of a nonnegative periodic solution and its $R_+^n$-asymptotical stability is established. The results obtained can also be applied to first-ordered neural networks.

Keywords: High-ordered neural networks, nonnegative periodic solutions, global stability, asymptotical stability.
1 Introduction

Recently, neural networks with high-ordered interactions have attracted considerable attention due to the fact that they have a stronger approximation property, faster convergence rate, greater storage capacity, and higher fault tolerance than first-ordered neural networks. It is of great importance, in both theory and applications, to study the dynamics underlying these systems.

Consider a class of high-ordered Cohen-Grossberg neural networks described by
$$
\frac{du_i(t)}{dt} = -a_i(u_i(t))\Big[ b_i(u_i(t)) + \sum_{j=1}^{n} c_{ij}(t)f_j(u_j(t)) + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n} d_{ij_1j_2}(t)g_{j_1}(u_{j_1}(t-\tau_{ij_1}))g_{j_2}(u_{j_2}(t-\tau_{ij_2})) + I_i(t)\Big], \quad i=1,\dots,n, \tag{1}
$$
where $u_i(t)$ represents the state of the $i$th unit at time $t$, $a_i(u)>0$ is the amplification function, $c_{ij}(t)$ and $d_{ij_1j_2}(t)$ are the first-order and high-order connection weights, respectively, $\tau_{ij}\ge 0$ stands for the transmission delay with $\tau=\max_{i,j}\tau_{ij}$, $I_i(t)$ denotes the external input at time $t$, and $f_j$, $g_j$ are the activation functions, $i,j=1,\dots,n$. The initial condition is
$$
u_i(s) = \phi_i(s) \quad \text{for } s\in[-\tau,0], \tag{2}
$$
where $\phi_i\in C[-\tau,0]$, $i=1,\dots,n$.
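For readers who wish to experiment with the model, the following is a minimal simulation sketch of system (1) using a forward-Euler discretization with a history buffer for the delays. All concrete choices ($a_i(u)=u$, $b_i(u)=u$, tanh and sigmoidal activations, the weights, delays, and input) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal forward-Euler sketch of system (1). All concrete choices below
# (a_i(u)=u, b_i(u)=u, f_j=tanh, sigmoidal g_j, weights, delays, input)
# are illustrative assumptions, not taken from the paper.
n, dt, T = 2, 0.01, 40.0
tau = np.array([[0.5, 0.3], [0.2, 0.4]])            # delays tau_{ij} >= 0
a = lambda u: u                                      # a_i(0)=0, a_i(u)>0 for u>0, divergent integral near 0
b = lambda u: u                                      # gives gamma_i = 1 in Assumption 2
f = lambda u: np.tanh(u)                             # Lipschitz activations f_j
g = lambda u: 1.0 / (1.0 + np.exp(-u))               # bounded activations g_j (M_j = 1)
c = lambda t: 0.1 * np.array([[np.sin(2 * np.pi * t), 0.2],
                              [0.1, np.cos(2 * np.pi * t)]])     # period omega = 1
d = lambda t: 0.05 * np.ones((n, n, n))              # high-order weights d_{i j1 j2}(t)
I = lambda t: (-1.0 + 0.3 * np.sin(2 * np.pi * t)) * np.ones(n)  # external input I_i(t)

steps = int(T / dt)
hist = int(np.ceil(tau.max() / dt))                  # history length covering the largest delay
u = 0.5 * np.ones((steps + hist, n))                 # positive initial history phi_i(s) > 0

for k in range(hist, steps + hist - 1):
    t = (k - hist) * dt
    # delayed states u_j(t - tau_{ij}), one row per unit i
    u_del = np.array([[u[k - int(round(tau[i, j] / dt)), j] for j in range(n)]
                      for i in range(n)])
    for i in range(n):
        high = np.sum(d(t)[i] * np.outer(g(u_del[i]), g(u_del[i])))
        rhs = b(u[k, i]) + c(t)[i] @ f(u[k]) + high + I(t)[i]
        u[k + 1, i] = u[k, i] - dt * a(u[k, i]) * rhs   # Euler step of (1)
```

For these sample values, the sufficient conditions derived in Section 3 can be checked to hold, so the simulated states are expected to remain positive and approach a periodic oscillation, in line with Lemma 1 and Theorem 1 below.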
In the competition models of biological species, $u_i(t)$, which represents the amount of the $i$th species, must be nonnegative. In [2], [4]-[6], theoretical analyses of the stability of positive solutions of neural networks were provided. The dynamics of high-ordered neural networks were also studied in [7]-[9]. In this paper, we are concerned with the Cohen-Grossberg neural networks (1) with high-ordered connections, and investigate the existence and stability of nonnegative periodic solutions.
2 Preliminaries

First of all, we present some assumptions and definitions required throughout the paper.

Assumption 1. The amplification function $a_i(\rho)$ is continuous with $a_i(0)=0$ and $a_i(\rho)>0$ when $\rho>0$. Moreover, for any $\epsilon>0$, it holds that $\int_0^{\epsilon}\frac{d\rho}{a_i(\rho)}=+\infty$, $i=1,2,\dots,n$.

Assumption 2. $c_{ij}(t)$, $d_{ij_1j_2}(t)$, $I_i(t)$ are continuous and periodic functions with period $\omega$. $b_i(x)$ is continuous and satisfies $\frac{b_i(x)-b_i(y)}{x-y}\ge\gamma_i>0$, $i=1,2,\dots,n$.

Assumption 3. There exist positive constants $F_j>0$, $G_j>0$, $M_j>0$ such that $|f_j(x)-f_j(y)|\le F_j|x-y|$, $|g_j(x)-g_j(y)|\le G_j|x-y|$, and $|g_j(x)|\le M_j$ for any $x,y\in R$, $j=1,2,\dots,n$.

Definition 1. ($\{\xi,\infty\}$-norm) $\|u(t)\|_{\{\xi,\infty\}}=\max_{i=1,\dots,n}\xi_i^{-1}|u_i(t)|$, where $\xi_i>0$, $i=1,\dots,n$.

Definition 2. A nonnegative periodic solution $u^*(t)$ of the system (1) is said to be $R_+^n$-asymptotically stable if for any trajectory $u(t)$ of (1) with initial condition $\phi_i(s)>0$, $s\in[-\tau,0]$, $i=1,2,\dots,n$, it holds that $\lim_{t\to+\infty}[u(t)-u^*(t)]=0$.
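For concreteness, the weighted maximum norm of Definition 1 can be computed as in the following minimal sketch; the weight vector used in the example is an arbitrary assumption.

```python
import numpy as np

def xi_inf_norm(u, xi):
    """{xi, infinity}-norm of Definition 1: max_i xi_i^{-1} |u_i|."""
    u, xi = np.asarray(u, float), np.asarray(xi, float)
    return np.max(np.abs(u) / xi)

# Illustrative values only; xi_1, ..., xi_n are any positive weights.
print(xi_inf_norm([0.5, -2.0, 1.0], [1.0, 4.0, 0.5]))   # -> 2.0
```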
3 Main Results

Lemma 1. Suppose that Assumption 1 is satisfied. If the initial condition satisfies $\phi_i(s)>0$, then the corresponding trajectory $u(t)=[u_1(t),u_2(t),\dots,u_n(t)]^T$ satisfies $u_i(t)>0$ for $t>0$.

Proof: Suppose, on the contrary, that $u_{i_0}(t_0)=0$ for some $t_0>0$ and some index $i_0$; without loss of generality, let $t_0$ be the first such time, so that $u_{i_0}(t)>0$ on $[0,t_0)$. Then, dividing (1) by $a_{i_0}(u_{i_0}(t))$ and integrating over $[0,t_0]$, by Assumption 1 we have
$$
\int_0^{t_0}\Big[ b_{i_0}(u_{i_0}(t)) + \sum_{j=1}^{n} c_{i_0j}(t)f_j(u_j(t)) + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n} d_{i_0j_1j_2}(t)g_{j_1}(u_{j_1}(t-\tau_{i_0j_1}))g_{j_2}(u_{j_2}(t-\tau_{i_0j_2})) + I_{i_0}(t)\Big]dt
= -\int_0^{t_0}\frac{\dot u_{i_0}(t)}{a_{i_0}(u_{i_0}(t))}\,dt
= \int_0^{\phi_{i_0}(0)}\frac{d\rho}{a_{i_0}(\rho)} = +\infty. \tag{3}
$$
Because of the continuity of $u_i(t)$, the left side of (3) is finite, which contradicts the infinity of the right side. Hence $u_i(t)\neq 0$ for all $t>0$, which implies that $u_i(t)>0$ for all $t>0$, $i=1,2,\dots,n$. Lemma 1 is proved.
Lemma 2. Suppose that Assumptions 1-3 are satisfied. If there are positive constants $\xi_1,\dots,\xi_n$ such that
$$
-\gamma_i\xi_i + \sum_{j=1}^{n}|c_{ij}(t)|\xi_jF_j + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}|d_{ij_1j_2}(t)|(\xi_{j_1}G_{j_1}M_{j_2} + \xi_{j_2}G_{j_2}M_{j_1}) < 0, \tag{4}
$$
for all $0\le t<\omega$, $i=1,\dots,n$, then any solution $u(t)$ of the high-ordered Cohen-Grossberg neural networks (1) is bounded.

Proof: By (4), together with the continuity and periodicity of the coefficients, we can find a small positive constant $\eta$ such that
$$
-\gamma_i\xi_i + \sum_{j=1}^{n}|c_{ij}(t)|\xi_jF_j + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}|d_{ij_1j_2}(t)|(\xi_{j_1}G_{j_1}M_{j_2} + \xi_{j_2}G_{j_2}M_{j_1}) < -\eta < 0,
$$
for all $t>0$. Define $M(t)=\sup_{t-\tau\le s\le t}\|u(s)\|_{\{\xi,\infty\}}$. For any fixed $t_0$, there are two possibilities:
(i) $\|u(t_0)\|_{\{\xi,\infty\}} < M(t_0)$. In this case, there exists $\delta>0$ such that $\|u(t)\|_{\{\xi,\infty\}} < M(t_0)$ for $t\in(t_0,t_0+\delta)$; thus $M(t)=M(t_0)$ for $t\in(t_0,t_0+\delta)$.

(ii) $\|u(t_0)\|_{\{\xi,\infty\}} = M(t_0)$. In this case, let $i_{t_0}$ be an index such that $\xi_{i_{t_0}}^{-1}|u_{i_{t_0}}(t_0)| = \|u(t_0)\|_{\{\xi,\infty\}}$. Notice that
$$
|f_j(x)| \le F_j|x| + |f_j(0)|, \qquad |g_{j_1}(x)g_{j_2}(y)| \le G_{j_1}M_{j_2}|x| + G_{j_2}M_{j_1}|y| + |g_{j_1}(0)g_{j_2}(0)|.
$$
Denote
$$
\tilde C = \max_i\Big[|b_i(0)| + \sum_{j=1}^{n}c^*_{ij}|f_j(0)| + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}d^*_{ij_1j_2}|g_{j_1}(0)g_{j_2}(0)| + I^*_i\Big], \tag{5}
$$
where $c^*_{ij}=\sup_t|c_{ij}(t)|$, $d^*_{ij_1j_2}=\sup_t|d_{ij_1j_2}(t)|$, $I^*_i=\sup_t|I_i(t)|$.
Then we have
$$
\begin{aligned}
\frac{d|u_{i_{t_0}}(t)|}{dt}\Big|_{t=t_0}
={}& -\mathrm{sign}(u_{i_{t_0}}(t_0))\,a_{i_{t_0}}(u_{i_{t_0}}(t_0))\Big[b_{i_{t_0}}(u_{i_{t_0}}(t_0)) + \sum_{j=1}^{n}c_{i_{t_0}j}(t_0)f_j(u_j(t_0))\\
&+ \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}d_{i_{t_0}j_1j_2}(t_0)g_{j_1}(u_{j_1}(t_0-\tau_{i_{t_0}j_1}))g_{j_2}(u_{j_2}(t_0-\tau_{i_{t_0}j_2})) + I_{i_{t_0}}(t_0)\Big]\\
\le{}& a_{i_{t_0}}(u_{i_{t_0}}(t_0))\Big[-\gamma_{i_{t_0}}|u_{i_{t_0}}(t_0)| + \sum_{j=1}^{n}|c_{i_{t_0}j}(t_0)|F_j|u_j(t_0)|\\
&+ \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}|d_{i_{t_0}j_1j_2}(t_0)|\big(G_{j_1}M_{j_2}|u_{j_1}(t_0-\tau_{i_{t_0}j_1})| + G_{j_2}M_{j_1}|u_{j_2}(t_0-\tau_{i_{t_0}j_2})|\big) + \tilde C\Big]\\
\le{}& a_{i_{t_0}}(u_{i_{t_0}}(t_0))\Big[\Big(-\gamma_{i_{t_0}}\xi_{i_{t_0}} + \sum_{j=1}^{n}|c_{i_{t_0}j}(t_0)|\xi_jF_j\\
&+ \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}|d_{i_{t_0}j_1j_2}(t_0)|(\xi_{j_1}G_{j_1}M_{j_2} + \xi_{j_2}G_{j_2}M_{j_1})\Big)M(t_0) + \tilde C\Big]\\
<{}& a_{i_{t_0}}(u_{i_{t_0}}(t_0))\big[-\eta M(t_0) + \tilde C\big].
\end{aligned}
\tag{6}
$$
If $M(t_0)\ge \tilde C/\eta$, then $\frac{d|u_{i_{t_0}}(t)|}{dt}\big|_{t=t_0}\le 0$, so $M(t)$ cannot increase near $t_0$; that is, there exists $\delta_1>0$ such that $M(t)\le M(t_0)$ for $t\in(t_0,t_0+\delta_1)$. On the other hand, if $M(t_0)<\tilde C/\eta$, then there exists $\delta_2>0$ such that $M(t)<\tilde C/\eta$ for $t\in(t_0,t_0+\delta_2)$. Let $\delta=\min\{\delta_1,\delta_2\}$; then $M(t)\le\max\{M(t_0),\tilde C/\eta\}$ holds for every $t\in(t_0,t_0+\delta)$. In summary, $\|u(t)\|_{\{\xi,\infty\}}\le M(t)\le\max\{M(0),\tilde C/\eta\}$. Lemma 2 is proved.

Theorem 1. Suppose that Assumptions 1-3 are satisfied. If there are positive constants $\xi_1,\dots,\xi_n$, $\zeta_1,\dots,\zeta_n$ such that, for every $0\le t<\omega$, condition (4) holds and
$$
-\gamma_i\zeta_i + \sum_{j=1}^{n}|c_{ji}(t)|\zeta_jF_i + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}(\zeta_{j_1}d^*_{j_1ij_2}M_{j_2} + \zeta_{j_2}d^*_{j_2j_1i}M_{j_1})G_i < 0, \tag{7}
$$
for $i=1,\dots,n$, then the system (1) has a nonnegative periodic solution with period $\omega$, which is $R_+^n$-asymptotically stable.

Proof: In fact, for any positive solution $u(t)$ of system (1), let $x_i(t)=u_i(t+\omega)-u_i(t)$ and $y_i(t)=\int_{u_i(t)}^{u_i(t+\omega)}\frac{d\rho}{a_i(\rho)}$. It is obvious that $\mathrm{sign}(x_i(t))=\mathrm{sign}(y_i(t))$. Then, by direct calculation, we have
$$
\begin{aligned}
\frac{d|y_i(t)|}{dt} ={}& \mathrm{sign}(y_i(t))\Big[-b_i(u_i(t+\omega)) + b_i(u_i(t)) - \sum_{j=1}^{n}c_{ij}(t+\omega)f_j(u_j(t+\omega)) + \sum_{j=1}^{n}c_{ij}(t)f_j(u_j(t))\\
&- \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}d_{ij_1j_2}(t+\omega)g_{j_1}(u_{j_1}(t+\omega-\tau_{ij_1}))g_{j_2}(u_{j_2}(t+\omega-\tau_{ij_2}))\\
&+ \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}d_{ij_1j_2}(t)g_{j_1}(u_{j_1}(t-\tau_{ij_1}))g_{j_2}(u_{j_2}(t-\tau_{ij_2})) - I_i(t+\omega) + I_i(t)\Big]\\
\le{}& -\gamma_i|x_i(t)| + \sum_{j=1}^{n}|c_{ij}(t)|F_j|x_j(t)| + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}|d_{ij_1j_2}(t)|\big(G_{j_1}M_{j_2}|x_{j_1}(t-\tau_{ij_1})| + G_{j_2}M_{j_1}|x_{j_2}(t-\tau_{ij_2})|\big),
\end{aligned}
$$
where the last inequality uses the $\omega$-periodicity of $c_{ij}$, $d_{ij_1j_2}$, $I_i$ in Assumption 2 and the Lipschitz and boundedness conditions in Assumption 3.
Define
$$
V(t) = \sum_{i=1}^{n}\zeta_i|y_i(t)| + \sum_{i=1}^{n}\sum_{j_1=1}^{n}\sum_{j_2=1}^{n}\zeta_i d^*_{ij_1j_2}\Big[G_{j_1}M_{j_2}\int_{t-\tau_{ij_1}}^{t}|x_{j_1}(s)|\,ds + G_{j_2}M_{j_1}\int_{t-\tau_{ij_2}}^{t}|x_{j_2}(s)|\,ds\Big]. \tag{8}
$$
Differentiating it along the trajectory $u(t)$, we get
$$
\begin{aligned}
\frac{dV(t)}{dt} \le{}& \sum_{i=1}^{n}\zeta_i\Big[-\gamma_i|x_i(t)| + \sum_{j=1}^{n}|c_{ij}(t)|F_j|x_j(t)| + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}|d_{ij_1j_2}(t)|\big(G_{j_1}M_{j_2}|x_{j_1}(t-\tau_{ij_1})| + G_{j_2}M_{j_1}|x_{j_2}(t-\tau_{ij_2})|\big)\Big]\\
&+ \sum_{i=1}^{n}\sum_{j_1=1}^{n}\sum_{j_2=1}^{n}\zeta_i d^*_{ij_1j_2}\Big[G_{j_1}M_{j_2}|x_{j_1}(t)| + G_{j_2}M_{j_1}|x_{j_2}(t)| - G_{j_1}M_{j_2}|x_{j_1}(t-\tau_{ij_1})| - G_{j_2}M_{j_1}|x_{j_2}(t-\tau_{ij_2})|\Big]\\
\le{}& \sum_{i=1}^{n}(-\gamma_i\zeta_i)|x_i(t)| + \sum_{i=1}^{n}\sum_{j=1}^{n}|c_{ij}(t)|\zeta_iF_j|x_j(t)| + \sum_{i=1}^{n}\sum_{j_1=1}^{n}\sum_{j_2=1}^{n}\zeta_i d^*_{ij_1j_2}\big(G_{j_1}M_{j_2}|x_{j_1}(t)| + G_{j_2}M_{j_1}|x_{j_2}(t)|\big)\\
={}& \sum_{i=1}^{n}\Big[-\gamma_i\zeta_i + \sum_{j=1}^{n}|c_{ji}(t)|\zeta_jF_i + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}(\zeta_{j_1}d^*_{j_1ij_2}M_{j_2} + \zeta_{j_2}d^*_{j_2j_1i}M_{j_1})G_i\Big]|x_i(t)|\\
\le{}& -\lambda\sum_{i=1}^{n}|x_i(t)| = -\lambda\|x(t)\|_1,
\end{aligned}
\tag{9}
$$
where $\lambda>0$ is defined by
$$
-\lambda = \max_i\sup_t\Big[-\gamma_i\zeta_i + \sum_{j=1}^{n}|c_{ji}(t)|\zeta_jF_i + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}(\zeta_{j_1}d^*_{j_1ij_2}M_{j_2} + \zeta_{j_2}d^*_{j_2j_1i}M_{j_1})G_i\Big] < 0.
$$
Integrating both sides of (9), we have
$$
\int_0^{\infty}\|x(t)\|_1\,dt \le \frac{1}{\lambda}V(0) < +\infty, \tag{10}
$$
that is,
$$
\sum_{n=1}^{\infty}\int_0^{\omega}\|u(t+n\omega) - u(t+(n-1)\omega)\|_1\,dt \le \frac{1}{\lambda}V(0) < +\infty. \tag{11}
$$
By the Cauchy convergence principle, $u(t+n\omega)$ converges in $L^1[0,\omega]$ as $n\to+\infty$. On the other hand, from Lemma 2 we know that $u_i(t)$ is bounded, so that $a_i(u_i(t))$ is bounded and $u(t)$ is correspondingly uniformly continuous. Then the sequence $u(t+n\omega)$ is uniformly bounded and equicontinuous. By the Arzelà-Ascoli theorem, there exists a subsequence $u(t+n_k\omega)$ converging on any compact set of $R$. Denote the limit as $u^*(t)$.
It is easy to see that $u^*(t)$ is also the limit of $u(t+n\omega)$ in $L^1[0,\omega]$, i.e.,
$$
\lim_{n\to+\infty}\int_0^{\omega}\|u(t+n\omega) - u^*(t)\|_1\,dt = 0.
$$
Then we have that $u(t+n\omega)$ converges to $u^*(t)$ uniformly on $[0,\omega]$. Similarly, $u(t+n\omega)$ converges to $u^*(t)$ uniformly on any compact set of $R$.

Now we prove that $u^*(t)$ is a nonnegative periodic solution of (1). Clearly, $u^*(t)$ is nonnegative. Moreover,
$$
u^*(t+\omega) = \lim_{n\to+\infty}u(t+(n+1)\omega) = \lim_{n\to+\infty}u(t+n\omega) = u^*(t),
$$
so that $u^*(t)$ is periodic with period $\omega$. Then, replacing $u(t)$ with $u(t+n_k\omega)$ in system (1) and letting $k\to\infty$, we get
$$
\frac{du^*_i(t)}{dt} = -a_i(u^*_i(t))\Big[b_i(u^*_i(t)) + \sum_{j=1}^{n}c_{ij}(t)f_j(u^*_j(t)) + \sum_{j_1=1}^{n}\sum_{j_2=1}^{n}d_{ij_1j_2}(t)g_{j_1}(u^*_{j_1}(t-\tau_{ij_1}))g_{j_2}(u^*_{j_2}(t-\tau_{ij_2})) + I_i(t)\Big]. \tag{12}
$$
Hence $u^*(t)$ is a solution of system (1). Let $t = t_1 + n\omega$, where $t_1\in[0,\omega]$. Then $u(t) - u^*(t) = u(t_1+n\omega) - u^*(t_1)$, and the uniform convergence of $\{u(t+n\omega)\}$ on $[0,\omega]$ leads to
$$
\lim_{t\to+\infty}\|u(t) - u^*(t)\| = 0. \tag{13}
$$
Finally, we prove that any positive solution $v(t)$ of system (1) converges to $u^*(t)$. In fact, redefine $x_i(t) = v_i(t) - u_i(t)$ and $y_i(t) = \int_{u_i(t)}^{v_i(t)}\frac{d\rho}{a_i(\rho)}$. Using the same method as above, it is easy to get that $\lim_{t\to+\infty}\|v(t)-u(t)\| = 0$. Combining this with (13), we obtain $\lim_{t\to+\infty}\|v(t)-u^*(t)\| = 0$. Theorem 1 is proved.
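To make the hypotheses of Theorem 1 concrete, the following is a minimal sketch of how conditions (4) and (7) could be verified numerically from supremum bounds on the periodic coefficients. The function name, the chosen weights $\xi_i$, $\zeta_i$, and all parameter values are illustrative assumptions, not part of the paper.

```python
import numpy as np

def check_conditions(gamma, F, G, M, xi, zeta, c_sup, d_sup):
    """Worst-case (sup over t) check of conditions (4) and (7).

    gamma, F, G, M, xi, zeta: length-n positive vectors (user-supplied bounds);
    c_sup[i, j]      >= sup_t |c_ij(t)|,
    d_sup[i, j1, j2] >= sup_t |d_ij1j2(t)|.
    Returns (cond4_ok, cond7_ok): True means the condition holds with these bounds.
    """
    n = len(gamma)
    cond4 = np.empty(n)
    cond7 = np.empty(n)
    for i in range(n):
        # condition (4): xi-weighted row test for unit i
        high4 = sum(d_sup[i, j1, j2] * (xi[j1] * G[j1] * M[j2] + xi[j2] * G[j2] * M[j1])
                    for j1 in range(n) for j2 in range(n))
        cond4[i] = -gamma[i] * xi[i] + np.sum(c_sup[i, :] * xi * F) + high4
        # condition (7): zeta-weighted column test for unit i
        high7 = sum((zeta[j1] * d_sup[j1, i, j2] * M[j2]
                     + zeta[j2] * d_sup[j2, j1, i] * M[j1]) * G[i]
                    for j1 in range(n) for j2 in range(n))
        cond7[i] = -gamma[i] * zeta[i] + np.sum(c_sup[:, i] * zeta) * F[i] + high7
    return bool(np.all(cond4 < 0)), bool(np.all(cond7 < 0))

# Illustrative parameter bounds only (not from the paper).
n = 2
ok4, ok7 = check_conditions(
    gamma=np.array([2.0, 2.0]), F=np.ones(n), G=np.ones(n), M=np.ones(n),
    xi=np.ones(n), zeta=np.ones(n),
    c_sup=0.2 * np.ones((n, n)), d_sup=0.1 * np.ones((n, n, n)))
print(ok4, ok7)   # -> True True for these sample bounds
```

Because the supremum bounds $c^*_{ij}$ and $d^*_{ij_1j_2}$ replace the time-varying coefficients, this test is conservative: if it passes, conditions (4) and (7) hold for every $t$.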
4 Conclusions

In this paper, the high-ordered Cohen-Grossberg neural networks are addressed. Under some mild conditions, the existence of a nonnegative periodic solution and its $R_+^n$-asymptotical stability are presented.

Acknowledgments. This work is jointly supported by the National Natural Sciences Foundation of China under Grant No. 61101203, Shanghai Young College Teachers Program scd11011, and Shanghai Municipal International Visiting Scholar Project for College Teachers.
References

1. Cohen, M.A., Grossberg, S.: Absolute Stability of Global Pattern Formation and Parallel Memory Storage by Competitive Neural Networks. IEEE Trans. Syst. Man Cybern. 13, 815-821 (1983)
2. Lin, W., Chen, T.: Positive Periodic Solutions of Delayed Periodic Lotka-Volterra Systems. Physics Letters A 334(4), 273-287 (2005)
3. Chen, T., Wang, L., Ren, C.: Existence and Global Stability Analysis of Almost Periodic Solutions for Cohen-Grossberg Neural Networks. In: Wang, J., Yi, Z., Zurada, J.M., Lu, B.-L., Yin, H. (eds.) ISNN 2006. LNCS, vol. 3971, pp. 204-210. Springer, Heidelberg (2006)
4. Chen, T., Bai, Y.: Stability of Cohen-Grossberg Neural Networks with Nonnegative Periodic Solutions. In: Proceedings of the International Joint Conference on Neural Networks, Orlando, Florida, USA, August 12-17 (2007)
5. Lu, W., Chen, T.: $R_+^n$-global Stability of a Cohen-Grossberg Neural Network System with Nonnegative Equilibria. Neural Networks 20(6), 714-722 (2007)
6. Fan, Q., Shao, J.: Positive Almost Periodic Solutions for Shunting Inhibitory Cellular Neural Networks with Time-Varying and Continuously Distributed Delays. Communications in Nonlinear Science and Numerical Simulation 15(6), 1655-1663 (2010)
7. Wang, L.: Existence and Global Attractivity of Almost Periodic Solutions for Delayed High-Ordered Neural Networks. Neurocomputing 73, 802-808 (2010)
8. Wang, Y., Lin, P., Wang, L.: Exponential Stability of Reaction-Diffusion High-Order Markovian Jump Hopfield Neural Networks with Time-Varying Delays. Nonlinear Analysis: Real World Applications 13(3), 1353-1361 (2012)
9. Li, D.: Dynamical Analysis for High-Order Delayed Hopfield Neural Networks with Impulses. Abstract and Applied Analysis (2012), doi:10.1155/2012/825643