Neurocomputing 142 (2014) 326–334
Asymptotic almost automorphic solutions of impulsive neural network with almost automorphic coefficients

Syed Abbas (a,*), Lakshman Mahto (a), Mokhtar Hafayed (b), Adel M. Alimi (c)

a School of Basic Sciences, Indian Institute of Technology Mandi, Mandi, H.P. 175001, India
b Laboratory of Applied Mathematics, Mohamed Khider University, PO Box 145, Biskra 07000, Algeria
c Department of Electrical and Computer Engineering, University of Sfax, Sfax, Tunisia
Article history: Received 24 October 2013; received in revised form 16 February 2014; accepted 7 April 2014; available online 24 May 2014. Communicated by N. Ozcan.

Abstract
In this paper, the existence and asymptotic stability of asymptotically almost automorphic solutions of impulsive neural networks with delay are discussed. The results are established using various fixed point theorems and a Lyapunov-like functional. As far as we know, this is the first paper to discuss such solutions for impulsive neural networks. At the end, we give a few examples to illustrate our theoretical findings. The numerical simulation results show the asymptotically almost automorphic behaviour of the solution. © 2014 Elsevier B.V. All rights reserved.
Keywords: Neural networks; Impulsive condition; Fixed point method; Almost automorphic functions; Lyapunov-like functional
1. Introduction

Since the introduction of almost periodic functions by Bohr [8], there have been various important generalizations of this concept. One important generalization is the concept of asymptotically almost automorphic functions, introduced in the literature by N'Guérékata [18]. One of the most important and natural questions in the field of differential equations is whether, if the forcing function possesses a special characteristic, the solution possesses the same characteristic. This concept is no exception, and many mathematicians have applied it in the field of differential equations; for more details one can see [10,11,15,28] and the references therein. Many phenomena exhibit regular behaviour that is close to, but not exactly, periodicity. These kinds of phenomena can be modelled by considering more general notions such as almost periodic, almost automorphic, pseudo-almost automorphic and asymptotically almost automorphic functions. Recurrent neural networks (RNNs) create an internal state of the network which allows them to exhibit dynamic temporal behaviour. The most widely used RNNs are Hopfield neural networks and cellular neural networks, and there are many applications of RNNs in the fields of signal processing, pattern recognition, optimization and
* Corresponding author.
E-mail address: [email protected] (S. Abbas).
http://dx.doi.org/10.1016/j.neucom.2014.04.028
associative memories. There are many qualitative results on neural networks; some of them are [4,12,14,20] and the references therein. It is very important to study the mathematical properties of RNNs, as this gives long term predictions of their behaviour. The mathematical topics of interest are the nature of the solutions, stability, periodicity, almost periodicity, etc. It has been shown that RNNs are universal approximators of dynamical systems (see [21]). Asymptotically almost automorphic functions are more general than almost periodic and almost automorphic functions; hence they cover a wider class of functions. If the observed output does not show periodic, almost periodic or almost automorphic behaviour, then one can check whether its behaviour is asymptotically almost automorphic. On the other hand, impulsive differential equations involve differential equations on continuous time intervals as well as difference equations on a discrete set of times. They provide a realistic framework for modelling systems which undergo abrupt changes such as shocks, earthquakes, and harvesting. There are a few published monographs and papers on impulsive differential equations [6,7,17]. Impulses are sudden interruptions in a system; in the neural case, one can say that they are abrupt changes in the neural state. The effect will depend on the intensity of the change. In signal processing, faulty elements in the corresponding artificial network may produce sudden changes in the state voltages and thereby affect the normal transient behaviour in processing signals or information. For more details one can see [20] and the references therein. Neural networks have been
studied extensively, but the mathematical modelling of dynamical systems with impulses is a very recent area of research [1–3,21,22]. Impulsive neural networks have been extensively studied by Stamova [26]. There are many works on almost periodicity of impulsive neural networks (for example [22–27] and the references therein). One cannot avoid time delay while working with many real phenomena. In neural networks, time delay refers to delays in the information processing of neurons due to various reasons; it can be caused, for instance, by the finite switching speed of amplifier circuits. While it is natural to introduce time delay, at the same time it makes the dynamics more complex: the system may lose its stability and show almost periodic, almost automorphic, pseudo-almost automorphic or asymptotically almost automorphic motion, bifurcation and chaotic behaviour. These kinds of solutions are more general and cover a bigger class of dynamics. Motivated by Stamova [22], in this work we study the stability and existence of asymptotically almost automorphic solutions of the following impulsive differential equations:
\[
\begin{aligned}
\frac{dx_i(t)}{dt} &= \sum_{j=1}^{n} a_{ij}(t)x_j(t) + \sum_{j=1}^{n} \alpha_{ij}(t)f_j(x_j(t)) + \sum_{j=1}^{n} \beta_{ij}(t)f_j(x_j(t-\alpha)) + \gamma_i(t), \quad t \neq \tau_k,\ \alpha > 0,\\
\Delta(x(\tau_k)) &= A_k x(\tau_k) + I_k(x(\tau_k)) + \gamma_k, \quad x(\tau_k - 0) = x(\tau_k),\ k \in \mathbb{Z},\\
x(\tau_k + 0) &= x(\tau_k) + \Delta x(\tau_k), \quad t \in \mathbb{R},\\
x(t) &= \Psi_0(t), \quad t \in [-\alpha, 0],
\end{aligned}
\tag{1.1}
\]
where $a_{ij}, \alpha_{ij}, \beta_{ij}, f_j, \gamma_i \in C(\mathbb{R},\mathbb{R})$ for $i,j = 1,2,\dots,n$. Also $A_k \in \mathbb{R}^{n\times n}$, $I_k(x) \in C(\Omega, \mathbb{R}^n)$ and $\gamma_k \in \mathbb{R}^n$, where $\Omega$ is a domain in $\mathbb{R}^n$. The symbol $C(X,Y)$ denotes the set of all continuous functions from $X$ to $Y$.

The organization of the paper is as follows: in Section 2, we give some basic definitions and results. In Section 3, we establish the existence of asymptotically almost automorphic solutions of Eq. (1.1). At the end, in Section 4, we give examples with numerical simulations to illustrate our analytical findings.

2. Preliminaries

The symbol $\mathbb{R}^n$ denotes the $n$-dimensional space with norm $|x| = \max\{|x_i| : i = 1,2,\dots,n\}$. We denote by $PC(J, \mathbb{R}^n)$ the space of piecewise continuous functions from $J \subset \mathbb{R}$ to $\mathbb{R}^n$ with points of discontinuity of the first kind $\tau_k$, at which the functions are left continuous. For the reader's convenience, we define the following classes of spaces:

$PC_0(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n) : \lim_{t\to\infty} |\phi(t,x)| = 0$ uniformly in $x \in \mathbb{R}^n\}$;
$AA(\mathbb{R}, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R}, \mathbb{R}^n) : \phi$ is an almost automorphic function$\}$;
$AA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) : \phi$ is an almost automorphic function$\}$;
$AAA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) : \phi$ is an asymptotically almost automorphic function$\}$;
$AAS(\mathbb{Z}, \mathbb{R}) = \{\phi : \mathbb{Z} \to \mathbb{R}$ is an almost automorphic sequence$\}$;
$AAAS(\mathbb{Z}, \mathbb{R}) = \{\phi : \mathbb{Z} \to \mathbb{R}$ is an asymptotically almost automorphic sequence$\}$.

The definition of an almost automorphic operator has been given by N'Guérékata and Pankov [19]. Now we state the following definitions in the framework of impulsive systems, inspired from [5,9,16,26].

Definition 2.1. A bounded piecewise continuous function $f \in PC(\mathbb{R}, \mathbb{R}^n)$ is called almost automorphic if
(i) the sequence of impulsive moments $\{t_k\}$ is an almost automorphic sequence,
(ii) for every real sequence $(s_n)$, there exists a subsequence $(s_{n_k})$ such that $g(t) = \lim_{n\to\infty} f(t + s_{n_k})$ is well defined for each $t \in \mathbb{R}$ and $\lim_{n\to\infty} g(t - s_{n_k}) = f(t)$ for each $t \in \mathbb{R}$.
Denote by $AA(\mathbb{R}, \mathbb{R}^n)$ the set of all such functions.

Definition 2.2. A bounded piecewise continuous function $f \in PC(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n)$ is called almost automorphic in $t$ uniformly for $x$ in compact subsets of $\mathbb{R}^n$ if
(i) the sequence of impulsive moments $\{t_k\}$ is an almost automorphic sequence,
(ii) for every compact subset $K$ of $\mathbb{R}^n$ and every real sequence $(s_n)$, there exists a subsequence $(s_{n_k})$ such that $g(t,x) = \lim_{n\to\infty} f(t + s_{n_k}, x)$ is well defined for each $t \in \mathbb{R}$, $x \in K$ and $\lim_{n\to\infty} g(t - s_{n_k}, x) = f(t,x)$ for each $t \in \mathbb{R}$, $x \in K$.
Denote by $AA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n)$ the set of all such functions.

Definition 2.3. A piecewise continuous function $f \in PC(\mathbb{R}^+, \mathbb{R}^n)$ is called asymptotically almost automorphic if and only if it can be written as $f = f_1 + f_2$, where $f_1 \in AA(\mathbb{R}^+, \mathbb{R}^n)$ and $f_2 \in PC_0(\mathbb{R}^+, \mathbb{R}^n)$. The space of these kinds of functions is denoted by $AAA(\mathbb{R}^+, \mathbb{R}^n)$.

Definition 2.4. A piecewise continuous function $f \in PC(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n)$ is called asymptotically almost automorphic if and only if it can be written as $f = f_1 + f_2$, where $f_1 \in AA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n)$ and $f_2 \in PC_0(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n)$. This class of functions is denoted by $AAA(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n)$.

We state a lemma, inspired by the paper of Liang et al. [15], about the composition result.

Lemma 2.5. Let $f(t,x) = g(t,x) + \phi(t,x)$ be an asymptotically almost automorphic function with $g(t,x) \in AA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n)$ and $\phi(t,x) \in PC_0(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n)$, and let $f(t,x)$ be uniformly continuous on any bounded subset $\Omega \subset \mathbb{R}^n$ uniformly in $t$. Then for $u(\cdot) \in AAA(\mathbb{R}, \mathbb{R}^n)$, the function $f(\cdot, u(\cdot)) \in AAA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n)$.

The proof of the above lemma is similar to the proof of Lemma 2.5 of [11].

Definition 2.6. A bounded sequence $x : \mathbb{Z}^+ \to \mathbb{R}$ is called almost automorphic if for every real sequence $(s_n)$, there exists a subsequence $(s_{n_k})$ such that $y(m) = \lim_{n\to\infty} x(m + s_{n_k})$ is well defined for each $m \in \mathbb{Z}^+$ and $\lim_{n\to\infty} y(m - s_{n_k}) = x(m)$ for each $m \in \mathbb{Z}^+$. We denote by $AAS(\mathbb{Z}^+, \mathbb{R})$ the set of all such sequences.
Definition 2.7. A bounded sequence $z : \mathbb{Z}^+ \to \mathbb{R}^+$ is called asymptotically almost automorphic if and only if it can be written as $z = z_1 + z_2$, where $z_1 \in AAS(\mathbb{Z}^+, \mathbb{R})$ and $z_2$ is a null sequence. The space of these kinds of sequences is denoted by $AAAS(\mathbb{Z}^+, \mathbb{R})$.
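A classical concrete instance of Definition 2.6 (our illustration, not taken from the paper) is the Levitan/Veech-type sequence $x(m) = \operatorname{signum}(\cos 2\pi m\theta)$ with $\theta$ irrational, which is almost automorphic but not almost periodic; the signum coefficients appearing in the examples of Section 4 are of the same type. For the golden ratio $\theta$, the Fibonacci numbers $F_n$ satisfy $F_n\theta \to$ integer, so along the shift subsequence $s_n = F_n$ one has $x(m + s_n) \to x(m)$ pointwise, as the definition requires. A small Python sketch checks this numerically:

```python
import math

# Levitan/Veech-type almost automorphic sequence: x(m) = signum(cos(2*pi*m*theta)).
theta = (1.0 + math.sqrt(5.0)) / 2.0            # golden ratio (irrational)
sgn = lambda u: 1.0 if u >= 0.0 else -1.0
x = lambda m: sgn(math.cos(2.0 * math.pi * m * theta))

# Fibonacci shift: F_24 * theta lies within ~1e-5 of the integer F_25, so
# shifting the argument by F_24 barely rotates the angle 2*pi*m*theta.
shift = 46368                                    # F_24
mismatches = sum(
    1
    for m in range(200)
    # skip the finitely many m where cos(...) is numerically close to zero
    if abs(math.cos(2.0 * math.pi * m * theta)) > 1e-3 and x(m + shift) != x(m)
)
```

Here `mismatches` comes out as 0: the shifted sequence agrees with the original wherever the cosine is bounded away from zero, illustrating the pointwise limits in Definition 2.6.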
3. Existence and stability

Consider the following linear system corresponding to the system (1.1):
\[
\frac{dx(t)}{dt} = A(t)x(t), \quad t \neq \tau_k; \qquad \Delta x(\tau_k) = A_k x(\tau_k), \quad k \in \mathbb{Z},\ t \in \mathbb{R},
\tag{3.1}
\]
where $A(t) = (a_{ij}(t))$, $i,j = 1,2,\dots,n$. In order to prove our results, we need the following assumptions:

(H1) The function $A(t)$ is continuous and asymptotically almost automorphic.
(H2) $\det(I + A_k) \neq 0$ and the sequences $A_k$ and $\tau_k$ are asymptotically almost automorphic.

If $U_k(t,s)$ is the Cauchy matrix associated with the system $dx(t)/dt = A(t)x(t)$, $\tau_{k-1} \le t \le \tau_k$, then the Cauchy matrix of the system (3.1) is of the form
\[
U(t,s) = \begin{cases}
U_k(t,s), & \tau_{k-1} \le s \le t \le \tau_k,\\
U_{k+1}(t,\tau_k+0)(I+A_k)U_k(\tau_k,s), & \tau_{k-1} < s \le \tau_k < t < \tau_{k+1},\\
U_{k+1}(t,\tau_k+0)(I+A_k)U_k(\tau_k,\tau_{k-1}+0)\cdots(I+A_i)U_i(\tau_i,s), & \tau_{i-1} < s \le \tau_i < \tau_k < t < \tau_{k+1}.
\end{cases}
\]
For the above Cauchy matrix, the solution of the system can be written as $x(t;t_0,x_0) = U(t,t_0)x_0$, where $x_0$ is the initial condition at the initial point $t_0$. Let us further assume the following:

(H3) The Cauchy matrix $U(t,s)$ satisfies: there exist positive constants $K$ and $\delta$ such that $|U(t,s)| \le K e^{-\delta(t-s)}$, and $|U(t+t_{n_k}, s+t_{n_k}) - U(t,s)| \le M\epsilon e^{-(\delta/2)(t-s)}$ for any $\epsilon > 0$ and some positive constant $M$.
(H4) The functions $\alpha_{ij}, \beta_{ij}$ are almost automorphic and satisfy $-\infty < \alpha_{ij*} \le \alpha_{ij}(t) \le \alpha^*_{ij} < \infty$ and $-\infty < \beta_{ij*} \le \beta_{ij}(t) \le \beta^*_{ij} < \infty$.
(H5) Each function $f_j$ is asymptotically almost automorphic with $0 < \sup_{t\in\mathbb{R}} f_j(t) < \infty$ and satisfies $|f_j(t) - f_j(s)| \le L_j|t-s|$ for $j = 1,2,\dots,n$.
(H6) Each function $\gamma_i$ is asymptotically almost automorphic and satisfies $-\infty < \gamma_{i*} \le \gamma_i(t) \le \gamma^*_i < \infty$.
(H7) The sequence $I_k$ is asymptotically almost automorphic and there exists a positive constant $L$ such that $|I_k(x) - I_k(y)| \le L|x-y|$ for $k \in \mathbb{Z}$ and $x,y \in \Omega \subset \mathbb{R}^n$.

Now we are in a position to prove the main results of this paper.

Theorem 3.1. Assume the boundedness condition on $f_j$ from (H5), and write $B(t) = (\alpha_{ij}(t))$, $C(t) = (\beta_{ij}(t))$ and $M$ for a bound of $f$. Then there exists a unique equilibrium solution of Eq. (1.1) in $B(r) = \{x \in \Omega : \|x\| \le r = \|(A(\cdot))^{-1}\|((\|B\| + \|C\|)M + \|\gamma\|)\}$, provided
\[
\|(A(\cdot))^{-1}\|(\|B\| + \|C\|)L < 1.
\]

Proof. Define an operator $T : B(r) \to \mathbb{R}^n$ by
\[
T(x) = -(A(t))^{-1}\big((B(t) + C(t))f(x) + \gamma(t)\big).
\]
It is not difficult to see that $T$ is a self-mapping and a contraction on $B(r)$. Hence, by the Banach contraction principle, it has a unique fixed point in $B(r)$. □

Lemma 3.2. Under the properties of the Cauchy matrix $U(t,s)$, the impulsive differential equation (1.1) is equivalent to the integral equation
\[
x(t) = \int_{-\infty}^{t} U(t,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds + \sum_{\tau_k < t} U(t,t_k)\big(I_k(x(\tau_k)) + \gamma_k\big),
\tag{3.2}
\]
where $F = (F_1, F_2, \dots, F_n)^T$ and
\[
F_i(t,x) = \sum_{j=1}^{n} \alpha_{ij}(t)f_j(x_j(t)) + \sum_{j=1}^{n} \beta_{ij}(t)f_j(x_j(t-\alpha)).
\]

Proof. For $t \in [0,\tau_1]$, we claim that the following function is a solution of system (1.1):
\[
x(t) = \int_{-\infty}^{t} U(t,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds.
\]
Differentiating both sides with respect to $t$, we get
\[
\frac{dx(t)}{dt} = \int_{-\infty}^{t} \frac{\partial U(t,s)}{\partial t}\big(F(s,x(s)) + \gamma(s)\big)\,ds + F(t,x(t)) + \gamma(t), \quad x(0) = \psi_0(0),
\]
that is,
\[
\frac{dx(t)}{dt} = A(t)x(t) + F(t,x(t)) + \gamma(t), \quad x(0) = \psi_0(0).
\]
For $t \in (\tau_1, \tau_2]$, define
\[
x(t) = \int_{-\infty}^{t} U(t,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds + U(t,\tau_1)\big(I_1(x(\tau_1)) + \gamma_1\big)
= U(t,\tau_1)x(\tau_1^+) + \int_{\tau_1}^{t} U(t,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds,
\]
where
\[
x(\tau_1^+) = I_1(x(\tau_1)) + \gamma_1 + \int_{-\infty}^{\tau_1} U(\tau_1,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds.
\]
Differentiating both sides of the last relation with respect to $t$, we obtain
\[
\frac{dx(t)}{dt} = \frac{\partial U(t,\tau_1)}{\partial t}x(\tau_1^+) + \int_{\tau_1}^{t} \frac{\partial U(t,s)}{\partial t}\big(F(s,x(s)) + \gamma(s)\big)\,ds + F(t,x(t)) + \gamma(t)
= A(t)x(t) + F(t,x(t)) + \gamma(t),
\]
together with the jump condition $\Delta x(\tau_1) = A_1 x(\tau_1) + I_1(x(\tau_1)) + \gamma_1$. For $t \in (\tau_k, \tau_{k+1}]$, define analogously
\[
x(t) = \int_{-\infty}^{t} U(t,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds + U(t,\tau_k)\big(I_k(x(\tau_k)) + \gamma_k\big)
= U(t,\tau_k)x(\tau_k^+) + \int_{\tau_k}^{t} U(t,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds,
\]
with
\[
x(\tau_k^+) = I_k(x(\tau_k)) + \gamma_k + \int_{-\infty}^{\tau_k} U(\tau_k,s)\big(F(s,x(s)) + \gamma(s)\big)\,ds.
\tag{3.3}
\]
Differentiating gives, exactly as above,
\[
\frac{dx(t)}{dt} = A(t)x(t) + F(t,x(t)) + \gamma(t), \quad \Delta x(\tau_k) = A_k x(\tau_k) + I_k(x(\tau_k)) + \gamma_k.
\]
Similarly, the result holds on any interval $(\tau_l, \tau_{l+1}]$. □
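Hypothesis (H3) requires an exponential estimate for the Cauchy matrix of (3.1). For a scalar system the Cauchy matrix is explicit, and the estimate can be checked directly. The sketch below is our illustration; the coefficient $a = -2$, the jump factor $0.7 = 1 + A_k$ with $A_k = -0.3$, and impulse moments at the integers are assumed values, not taken from the paper:

```python
import math

def U(t, s, a=-2.0, jump=0.7):
    """Scalar Cauchy 'matrix' of x' = a*x with x -> jump*x at each integer time.

    floor(t) - floor(s) counts the impulse moments lying in (s, t].
    """
    m = math.floor(t) - math.floor(s)
    return math.exp(a * (t - s)) * jump ** m

def check_H3(K=1.0, delta=2.0):
    """Check |U(t, s)| <= K * exp(-delta * (t - s)) on a grid of pairs s < t."""
    for i in range(200):
        s = 0.05 * i
        for j in range(1, 100):
            t = s + 0.1 * j
            if abs(U(t, s)) > K * math.exp(-delta * (t - s)) + 1e-12:
                return False
    return True
```

Since $|1 + A_k| = 0.7 < 1$ here, the impulses only reinforce the decay of the flow, so the bound of (H3) holds with $K = 1$ and $\delta = 2$, and `check_H3()` returns `True`.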
Theorem 3.3. Under the hypotheses (H1)–(H7), there exists a unique asymptotically almost automorphic solution of Eq. (1.1) provided
\[
r := \frac{KL^{*}}{\delta}\max_i\sum_{j=1}^{n}(\alpha^*_{ij} + \beta^*_{ij}) + \frac{KL}{1-e^{-\delta}} < 1,
\]
where $L^{*} = \max\{L_i : i = 1,2,\dots,n\}$.

Proof. Define the operator
\[
\Lambda\phi(t) = \int_{-\infty}^{t} U(t,s)\big(F(s,\phi(s)) + \gamma(s)\big)\,ds + \sum_{\tau_k < t} U(t,t_k)\big(I_k(\phi(\tau_k)) + \gamma_k\big),
\]
where $F = (F_1, F_2, \dots, F_n)^T$ and
\[
F_i(t,x) = \sum_{j=1}^{n} \alpha_{ij}(t)f_j(x_j(t)) + \sum_{j=1}^{n} \beta_{ij}(t)f_j(x_j(t-\alpha)).
\]
Let us denote by $B \subset PC(\mathbb{R},\mathbb{R}^n)$ the set of all asymptotically almost automorphic functions satisfying $\|\phi\| \le K_1$, where $\|\phi\| = \sup_{t\in\mathbb{R}}|\phi(t)|$ and $K_1 = KC_0(1/\delta + 1/(1-e^{-\delta}))$, with $C_0$ a common bound for $|\gamma_i(t)|$ and $|\gamma_k|$.

It is not difficult to see that $\Lambda\phi$ is asymptotically almost automorphic whenever $\phi$ is. Indeed, $F_i \in AAA(\mathbb{R}\times\mathbb{R}^n, \mathbb{R})$, since $\alpha_{ij}, \beta_{ij} \in AA(\mathbb{R},\mathbb{R})$ and $f_j \in AAA(\mathbb{R}^n,\mathbb{R})$. Denoting $u(\cdot) = F(\cdot,\phi(\cdot)) + \gamma(\cdot)$, it follows from the composition theorem and a result of N'Guérékata [18] that
\[
\Lambda_1\phi = \int_{-\infty}^{t} U(t,s)\big(F(s,\phi(s)) + \gamma(s)\big)\,ds \in AAA(\mathbb{R},\mathbb{R}^n).
\]
Further, since $I_k \in AAA(\mathbb{R}^n,\mathbb{R}^n)$, we may decompose $I_k(\phi(\tau_k)) = I_{k1}(\phi(\tau_k)) + I_{k2}(\phi(\tau_k))$. Thus
\[
\sum_{\tau_k < t + t_{n_k}} U(t + t_{n_k}, t_k)\big(I_{k1}(\phi(\tau_k)) + \gamma_k\big) = \sum_{\tau_k < t} U(t + t_{n_k}, t_k + t_{n_k})\big(I_{k1}(\phi(\tau_k + t_{n_k})) + \gamma_k\big) \to \sum_{\tau_k < t} U(t,t_k)\big(I^{*}_{k1}(\phi(\tau_k)) + \gamma_k\big),
\]
and similarly
\[
\sum_{\tau_k < t - t_{n_k}} U(t - t_{n_k}, t_k)\big(I^{*}_{k1}(\phi(\tau_k)) + \gamma_k\big) = \sum_{\tau_k < t} U(t - t_{n_k}, t_k - t_{n_k})\big(I^{*}_{k1}(\phi(\tau_k - t_{n_k})) + \gamma_k\big) \to \sum_{\tau_k < t} U(t,t_k)\big(I_{k1}(\phi(\tau_k)) + \gamma_k\big).
\tag{3.4}
\]
Also
\[
\lim_{t\to\infty} \sum_{\tau_k < t} |U(t,t_k)|\,|I_{k2}(\phi(\tau_k))| = \lim_{t\to\infty} |I_{k2}| \sum_{\tau_k < t} |U(t,t_k)| = 0,
\]
as $\sum_{\tau_k < t}|U(t,t_k)| < \infty$. Thus $\Lambda\phi \in AAA(\mathbb{R},\mathbb{R}^n)$.

Let us denote
\[
B^{*} = \Big\{\phi \in B : \|\phi - \phi_0\| \le \frac{rK_1}{1-r}\Big\}, \quad \text{where } \phi_0(t) = \int_{-\infty}^{t} U(t,s)\gamma(s)\,ds + \sum_{\tau_k < t} U(t,t_k)\gamma_k.
\]
Let us first calculate the norm of $\phi_0$:
\[
\|\phi_0\| = \sup_{t\in\mathbb{R}} \max_i \Big\{\int_{-\infty}^{t}|U(t,s)|\,|\gamma_i(s)|\,ds + \sum_{\tau_k < t}|U(t,t_k)|\,|\gamma_k|\Big\}
\le \sup_{t\in\mathbb{R}} \max_i \Big\{\int_{-\infty}^{t} Ke^{-\delta(t-s)}|\gamma_i(s)|\,ds + \sum_{\tau_k < t} Ke^{-\delta(t-\tau_k)}|\gamma_k|\Big\}
\le KC_0\Big(\frac{1}{\delta} + \frac{1}{1-e^{-\delta}}\Big) = K_1.
\tag{3.5}
\]
Hence for any $\phi \in B^{*}$, we get
\[
\|\phi\| \le \|\phi - \phi_0\| + \|\phi_0\| \le \frac{rK_1}{1-r} + K_1 = \frac{K_1}{1-r}.
\]
Our next aim is to prove that $\Lambda$ maps the set $B^{*}$ into $B^{*}$. Observe that
\[
\|\Lambda\phi - \phi_0\| \le \sup_{t\in\mathbb{R}} \max_i \Big\{\int_{-\infty}^{t}|U(t,s)|\,|F_i(s,\phi(s))|\,ds + \sum_{\tau_k < t}|U(t,t_k)|\,|I_k(\phi(\tau_k))|\Big\}
\]
\[
\le \sup_{t\in\mathbb{R}} \max_i \Big\{\int_{-\infty}^{t} Ke^{-\delta(t-s)}\Big[\sum_{j=1}^{n}\alpha^*_{ij}\big(L^{*}|\phi_j(s)| + |f_j(0)|\big) + \sum_{j=1}^{n}\beta^*_{ij}\big(L^{*}|\phi_j(s-\alpha)| + |f_j(0)|\big)\Big]\,ds + \sum_{\tau_k < t} Ke^{-\delta(t-\tau_k)}\big(L|\phi(\tau_k)| + |I_k(0)|\big)\Big\}.
\]
In order to have zero as an equilibrium solution of the system (1.1), we assume that $f_j(0) = I_k(0) = 0$, and thus
\[
\|\Lambda\phi - \phi_0\| \le \Big(\frac{KL^{*}}{\delta}\max_i\sum_{j=1}^{n}(\alpha^*_{ij} + \beta^*_{ij}) + \frac{KL}{1-e^{-\delta}}\Big)\|\phi\| = r\|\phi\| \le \frac{rK_1}{1-r}.
\tag{3.6}
\]
Thus we conclude that $\Lambda\phi \in B^{*}$.
Now our aim is to prove that $\Lambda$ is a contraction. For any $\phi_1, \phi_2 \in B^{*}$, we obtain
\[
\|\Lambda\phi_1 - \Lambda\phi_2\| \le \sup_{t\in\mathbb{R}} \max_i \Big\{\int_{-\infty}^{t}|U(t,s)|\,|F_i(s,\phi_1(s)) - F_i(s,\phi_2(s))|\,ds + \sum_{\tau_k < t}|U(t,t_k)|\,|I_k(\phi_1(\tau_k)) - I_k(\phi_2(\tau_k))|\Big\}
\]
\[
\le \sup_{t\in\mathbb{R}} \max_i \Big\{\int_{-\infty}^{t} Ke^{-\delta(t-s)}\Big[\sum_{j=1}^{n}\alpha^*_{ij}L^{*}|\phi_{1j}(s) - \phi_{2j}(s)| + \sum_{j=1}^{n}\beta^*_{ij}L^{*}|\phi_{1j}(s-\alpha) - \phi_{2j}(s-\alpha)|\Big]\,ds + \sum_{\tau_k < t} Ke^{-\delta(t-\tau_k)}L|\phi_1(\tau_k) - \phi_2(\tau_k)|\Big\}
\]
\[
\le \Big(\frac{KL^{*}}{\delta}\max_i\sum_{j=1}^{n}(\alpha^*_{ij} + \beta^*_{ij}) + \frac{KL}{1-e^{-\delta}}\Big)\|\phi_1 - \phi_2\| = r\|\phi_1 - \phi_2\|.
\tag{3.7}
\]
From our assumption we have $r < 1$, and hence the mapping $\Lambda$ is a contraction. By the Banach contraction principle, we conclude that there exists a unique asymptotically almost automorphic solution of (1.1). □

Our next theorem is about the exponential stability of the system (1.1).

Theorem 3.4. Under the hypotheses (H1)–(H7), the solution of the system (1.1) is asymptotically stable provided $1 + KL < e$ and
\[
\delta - KL^{*}\max_i\sum_{j=1}^{n}(\alpha^*_{ij} + \beta^*_{ij}) - N\ln(1+KL) > 0,
\]
where $N$ is the maximal number of impulsive moments in any interval of unit length.

Proof. The proof is similar to the proof of Assertion 2 of [22], but for the reader's convenience we give the details here. For any two solutions $x(t)$ and $y(t)$ of system (1.1) with values $x_0$ and $y_0$ at $t_0$, let us denote $V(t) = x(t) - y(t)$. Then
\[
|V(t)| = |x(t)-y(t)| \le |U(t,t_0)||x_0-y_0| + \max_i \int_{t_0}^{t} |U(t,s)|\,|F_i(s,x(s)) - F_i(s,y(s))|\,ds + \sum_{t_0<\tau_k<t} |U(t,t_k)|\,|I_k(x(\tau_k)) - I_k(y(\tau_k))|
\]
\[
\le Ke^{-\delta(t-t_0)}|x_0-y_0| + \max_i \int_{t_0}^{t} Ke^{-\delta(t-s)}L^{*}\sum_{j=1}^{n}(\alpha^*_{ij}+\beta^*_{ij})|x_j(s)-y_j(s)|\,ds + \sum_{t_0<\tau_k<t} Ke^{-\delta(t-\tau_k)}L|x(\tau_k)-y(\tau_k)|.
\]
Let us choose $X(t) = |x(t)-y(t)|e^{\delta t}$; then
\[
X(t) \le KX(t_0) + \max_i \int_{t_0}^{t} KL^{*}\sum_{j=1}^{n}(\alpha^*_{ij}+\beta^*_{ij})X(s)\,ds + \sum_{t_0<\tau_k<t} KLX(\tau_k).
\]
Now, by the Gronwall–Bellman lemma for impulsive inequalities [13], we get
\[
X(t) \le K(1+KL)^{i(t_0,t)}X(t_0)\,e^{KL^{*}\max_i\sum_{j=1}^{n}(\alpha^*_{ij}+\beta^*_{ij})(t-t_0)},
\]
where $i(t_0,t)$ is the number of impulsive moments $\tau_k$ in $(t_0,t)$. Since $i(t_0,t) \le N(t-t_0)$, this further implies that
\[
|x(t)-y(t)| \le K|x_0-y_0|\,\exp\Big(-\Big(\delta - KL^{*}\max_i\sum_{j=1}^{n}(\alpha^*_{ij}+\beta^*_{ij}) - N\ln(1+KL)\Big)(t-t_0)\Big).
\]
Hence the proof is complete. □

By shifting the equilibrium point $x^{*}$ to the origin via the transformation
\[
y_i(t) = x_i(t) - x^{*}_i, \quad f_i(y_i(t)) = f_i(y_i(t) + x^{*}_i) - f_i(x^{*}_i), \quad I_k(y(\tau_k)) = I_k(y(\tau_k) + x^{*}) - I_k(x^{*}),
\]
the system (1.1) can be converted into
\[
\begin{aligned}
\frac{dy_i(t)}{dt} &= \sum_{j=1}^{n} a_{ij}(t)y_j(t) + \sum_{j=1}^{n} \alpha_{ij}(t)f_j(y_j(t)) + \sum_{j=1}^{n} \beta_{ij}(t)f_j(y_j(t-\alpha)), \quad t \neq \tau_k,\ \alpha > 0,\\
\Delta(y(\tau_k)) &= A_k y(\tau_k) + I_k(y(\tau_k)), \quad y(\tau_k - 0) = y(\tau_k),\ k \in \mathbb{Z},\\
y(\tau_k + 0) &= y(\tau_k) + \Delta y(\tau_k), \quad t \in \mathbb{R},\\
y(t) &= \Psi_0(t) - x^{*}, \quad t \in [-\alpha, 0].
\end{aligned}
\tag{3.8}
\]
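The smallness conditions in Theorems 3.3 and 3.4 are straightforward to evaluate numerically for a given parameter set. The sketch below uses hypothetical values of $K$, $\delta$, $L^{*}$, $L$ and the bounds $\alpha^*_{ij}$, $\beta^*_{ij}$, chosen purely for illustration and not taken from the examples of Section 4:

```python
import math

# Hypothetical data for a 2-neuron network: K, delta from (H3), the Lipschitz
# constants L_star (activations) and L_imp (impulse operators), the bounds from
# (H4), and at most N impulse moments per unit time.
K, delta = 1.0, 2.0
L_star, L_imp = 0.5, 0.5
alpha_star = [[0.2, 0.2], [0.2, 0.2]]
beta_star = [[0.2, 0.2], [0.2, 0.2]]
N = 1

# max_i sum_j (alpha*_ij + beta*_ij), the row quantity appearing in both theorems
row = max(sum(a + b for a, b in zip(ra, rb))
          for ra, rb in zip(alpha_star, beta_star))

r = K * L_star / delta * row + K * L_imp / (1.0 - math.exp(-delta))  # Theorem 3.3
decay = delta - K * L_star * row - N * math.log(1.0 + K * L_imp)     # Theorem 3.4
```

For these values $r \approx 0.78 < 1$ and the decay exponent is about $1.19 > 0$, so both the existence condition of Theorem 3.3 and the stability condition of Theorem 3.4 hold.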
Theorem 3.5. The equilibrium solution of the system (3.8) is stable if there exist $\epsilon = 1/L > 0$ and $P(\cdot) \in SM^{+}[0,\infty)$ such that the Riccati differential inequality
\[
\dot{P}(t) + 2P(t)A(t) + \epsilon\big(P(t)(B^{T}(t)B(t) + C^{T}(t)C(t))P(t)\big) + 2I \le 0
\]
holds, where $SM^{+}[0,\infty)$ denotes the set of all semi-positive definite symmetric matrices on $[0,\infty)$.
Proof. Consider $\epsilon > 0$ and $P(\cdot) \in SM^{+}[0,\infty)$ as a solution of the Riccati inequality, and define the following Lyapunov–Krasovskii functional:
\[
V(t) = \langle P(t)y(t), y(t)\rangle + \int_{t-\alpha}^{t} |y(s)|^2\,ds.
\]
Taking the derivative of the above functional along the solution of (3.8), we obtain
\[
\dot{V}(t) = \langle \dot{P}(t)y(t), y(t)\rangle + 2\langle P(t)\dot{y}(t), y(t)\rangle + |y(t)|^2 - |y(t-\alpha)|^2
\]
\[
= \langle \dot{P}(t)y(t), y(t)\rangle + 2\langle P(t)A(t)y(t), y(t)\rangle + 2\langle P(t)B(t)f(y(t)), y(t)\rangle + 2\langle P(t)C(t)f(y(t-\alpha)), y(t)\rangle + |y(t)|^2 - |y(t-\alpha)|^2.
\]
Estimating the two cross terms by Young's inequality and using the Lipschitz bound $|f(y)| \le L|y|$ together with $\epsilon = 1/L$, this yields
\[
\dot{V}(t) \le \big\langle\big(\dot{P}(t) + 2P(t)A(t) + \epsilon P(t)B^{T}(t)B(t)P(t) + \epsilon P(t)C^{T}(t)C(t)P(t) + (\epsilon^{-1}L + 1)I\big)y(t), y(t)\big\rangle \le 0.
\]
Hence, using Lyapunov stability theory for dynamical systems, the equilibrium solution of the system (3.8) is stable. □

4. Examples

The classical model of a Hopfield neural network with impulses is the following:
\[
\begin{aligned}
\frac{dx_i(t)}{dt} &= a_i(t)x_i(t) + \sum_{j=1}^{n} \alpha_{ij}f_j(x_j(t)) + \sum_{j=1}^{n} \beta_{ij}f_j(x_j(t-\alpha)) + \gamma_i(t), \quad t \neq \tau_k,\ \alpha > 0,\\
\Delta(x(\tau_k)) &= A_k x(\tau_k) + I_k(x(\tau_k)) + \gamma_k, \quad x(\tau_k - 0) = x(\tau_k),\ k \in \mathbb{Z},\\
x(\tau_k + 0) &= x(\tau_k) + \Delta x(\tau_k), \quad t \in \mathbb{R},\\
x(t) &= \phi_0(t), \quad t \in [-\alpha, 0],
\end{aligned}
\tag{4.1}
\]
where $a_i, f_j, \gamma_i \in C(\mathbb{R},\mathbb{R})$ and $\alpha_{ij}, \beta_{ij} \in \mathbb{R}$ for $i,j = 1,2,\dots,n$. Also $A_k \in \mathbb{R}^{n\times n}$, $I_k(x) \in C(\Omega, \mathbb{R}^n)$ and $\gamma_k \in \mathbb{R}^n$, where $\Omega$ is a domain in $\mathbb{R}^n$. In this case our matrix $A(t)$ is a diagonal matrix with diagonal entries $a_1(t), \dots, a_n(t)$. We assume that $a_i(t)$ is asymptotically almost automorphic and $a_i(t) \le -1$ for every $i = 1,2,\dots,n$. One can easily verify the hypotheses (H1) and (H2) for this case. Let us assume the hypothesis (H3). Now, if all the conditions of Theorem 3.3 are satisfied, then there exists an asymptotically almost automorphic solution of (4.1).

Here we simulate the behaviour of the solution of Eq. (4.1) for several sets of parameters; the simulation results can be seen in the corresponding figures. One can easily see that the behaviour of the solution is asymptotically almost automorphic when the parameters are asymptotically almost automorphic. Fig. 1 corresponds to case 1, Fig. 2 to case 2, Fig. 3 to case 3 and Fig. 4 to case 4.

Fig. 1. Asymptotically almost automorphic solution of (4.1) for case 1 ($x_1$ and $x_2$ against time $t$).
Fig. 2. Asymptotically almost automorphic solution of (4.1) for case 2.
Fig. 3. Asymptotically almost automorphic solution of (4.1) for case 3.
Fig. 4. Asymptotically almost automorphic solution of (4.1) for case 4.

In all four cases the remaining data are the same:
\[
A_k = \begin{pmatrix} 0.3 & 0 \\ 0 & 0.3 \end{pmatrix}, \quad I_k(x) = 0.9|x|, \quad \gamma_k = \begin{pmatrix} 0.25 \\ 0.25 \end{pmatrix}, \quad \beta_{12} = 0.2,
\]
\[
\gamma_1(t) = 2\sin\sqrt{2}\,t + e^{-t}, \quad \gamma_2(t) = 2\cos\sqrt{3}\,t + \frac{1}{1+t},
\]
with initial condition $x_1(s) = 1 = x_2(s)$ for $s \in [-0.1, 0]$; here $\theta$ denotes a fixed irrational number. The cases differ in the choice of $a_1$, $a_2$ and $\beta_{21}$:

Case 1: $a_1(t) = \operatorname{signum}(\cos 2\pi t\theta)$, $a_2(t) = -4 + \cos(t)$, $\beta_{21}(t) = \operatorname{signum}(\cos 2\pi t\theta)$.
Case 2: $a_1(t) = \operatorname{signum}(2\pi t\theta)$, $a_2(t) = \operatorname{signum}(2\pi t\theta)$, $\beta_{21} = 0.2$.
Case 3: $a_1(t) = \operatorname{signum}(2\pi t\theta)$, $a_2(t) = \operatorname{signum}(2\pi t\theta)$, $\beta_{21}(t) = \operatorname{signum}(\cos 2\pi t\theta)$.
Case 4: $a_1(t) = \operatorname{signum}(\cos 2\pi t\theta)$, $a_2(t) = \operatorname{signum}(\cos 2\pi t\theta)$, $\beta_{21} = 0.2$.
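A minimal Euler time-stepping sketch can reproduce the qualitative picture of simulations of (4.1). The parameters below are only loosely based on Case 1: the common decay rate $a_i(t) = -4 + \cos t$, the contractive impulse map, the delay $\alpha = 0.1$ and the activation $f_j = \tanh$ are our assumptions, chosen so that the sketch is provably bounded; this is not the authors' exact simulation setup.

```python
import math

def simulate(T=50.0, dt=0.01, delay=0.1):
    """Euler scheme for a 2-neuron impulsive delayed network in the form (4.1)."""
    theta = (math.sqrt(5.0) - 1.0) / 2.0              # irrational frequency
    sgn = lambda u: 1.0 if u >= 0.0 else -1.0
    d = int(round(delay / dt))                        # delay measured in steps
    hist = [[1.0, 1.0] for _ in range(d + 1)]         # x(s) = (1, 1) on [-delay, 0]
    for k in range(int(round(T / dt))):
        t = k * dt
        x, xd = hist[-1], hist[-1 - d]                # current and delayed states
        a = -4.0 + math.cos(t)                        # assumed decay rate a_i(t)
        dx1 = (a * x[0] + 0.2 * math.tanh(xd[1])
               + 2.0 * math.sin(math.sqrt(2.0) * t) + math.exp(-t))
        dx2 = (a * x[1] + sgn(math.cos(2.0 * math.pi * theta * t)) * math.tanh(xd[0])
               + 2.0 * math.cos(math.sqrt(3.0) * t) + 1.0 / (1.0 + t))
        new = [x[0] + dt * dx1, x[1] + dt * dx2]
        # impulse at integer times tau_k: assumed contractive jump data
        # x -> 0.7*x + 0.1*|x| + 0.25 componentwise
        tn = t + dt
        if tn > 0.5 and abs(tn - round(tn)) < dt / 2.0:
            new = [0.7 * xi + 0.1 * abs(xi) + 0.25 for xi in new]
        hist.append(new)
    return hist
```

Between impulses the flow contracts at rate at least 3 while all forcing terms are bounded, and each impulse maps $x_i$ to at most $0.8|x_i| + 0.25$, so the trajectory stays bounded, in line with the asymptotically almost automorphic behaviour seen in Figs. 1–4.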
5. Discussion

It is well known that periodicity is a very well studied behaviour of many physical and natural systems. In the recent past, many mathematicians and scientists have argued that more general classes of functions are better suited to explain complicated processes whose behaviour is nearly, but not exactly, periodic. This kind of behaviour of a physical system is called almost periodic, and almost periodic functions are well suited to explain such phenomena. Asymptotically almost automorphic functions cover a still larger class of functions, so more complicated behaviour can be expressed in terms of them. This class already contains the classes of almost periodic, almost automorphic and asymptotically almost periodic functions, and hence it is more general. One natural question in neural network theory is what the nature of the output will be when all the parameters are asymptotically almost automorphic. In this work, we answered this question under certain conditions. The asymptotic stability of the solution is also established under certain conditions on the parameters. The numerical simulations support our claims.
Acknowledgement

We are thankful to the anonymous reviewers for their constructive comments and suggestions, which helped us to improve the manuscript. The work of the first author is partially supported by NBHM Grant "SRIC/IITMANDI/2013/NBHM/SYA/45/02".

References

[1] S. Abbas, Y. Xia, Existence and attractivity of k-almost automorphic sequence solution of a model of cellular neural networks with delay, Acta Math. Sci. 33 (1) (2013) 290–302.
[2] S. Abbas, Existence and attractivity of k-pseudo almost automorphic sequence solution of a model of bidirectional neural networks, Acta Appl. Math. 119 (2012) 57–74.
[3] S. Abbas, Pseudo almost periodic sequence solutions of discrete time cellular neural networks, Nonlinear Anal. Model. Control 14 (3) (2009) 283–301.
[4] B. Ammar, F. Cherif, A.M. Alimi, Existence and uniqueness of pseudo almost-periodic solutions of recurrent neural networks with time-varying coefficients and mixed delays, IEEE Trans. Neural Netw. Learn. Syst. 23 (1) (2012) 109–118.
[5] D. Araya, R. Castro, C. Lizama, Almost automorphic solutions of difference equations, Adv. Differ. Equ. (2009), Art. ID 591380, 15 pp.
[6] D.D. Bainov, P.S. Simeonov, Systems with Impulsive Effects, Horwood, Chichester, 1989.
[7] D.D. Bainov, P.S. Simeonov, Impulsive Differential Equations: Periodic Solutions and its Applications, Longman Scientific and Technical Group, England, 1993.
[8] H. Bohr, Almost Periodic Functions, Chelsea, reprint, 1947 (AMS Chelsea Publishing, 1947, ISBN-13: 978-0828400275).
[9] A. Chávez, S. Castillo, M. Pinto, Discontinuous almost automorphic functions and almost automorphic solutions of differential equations with piecewise constant argument, arXiv preprint, 2013.
[10] T. Diagana, E. Hernández, J.C. Santos, Existence of asymptotically almost automorphic solutions to some abstract partial neutral integro-differential equations, Nonlinear Anal. 71 (2009) 248–257.
[11] H.S. Ding, T.J. Xiao, J. Liang, Asymptotically almost automorphic solutions for some integrodifferential equations with nonlocal initial conditions, J. Math. Anal. Appl. 338 (2008) 141–151.
[12] Q. Dong, K. Matsui, X. Huang, Existence and stability of periodic solutions for Hopfield neural network equations with periodic input, Nonlinear Anal.: Theory Methods Appl. 49 (4) (2002) 471–479.
[13] A.M. Samoilenko, N.A. Perestyuk, Differential Equations with Impulse Effects, Viska Skola, Kiev, 1987 (in Russian).
[14] Z. Gui, W. Ge, X. Yang, Periodic oscillation for a Hopfield neural network with neutral delays, Phys. Lett. A 364 (3–4) (2007) 267–273.
[15] J. Liang, J. Zhang, T.J. Xiao, Composition of pseudo almost automorphic and asymptotically almost automorphic functions, J. Math. Anal. Appl. 340 (2008) 1493–1499.
[16] J. Liu, C. Zhang, Composition of piecewise pseudo almost periodic functions and applications to abstract impulsive differential equations, Adv. Differ. Equ. 2013 (2013) 11, 21 pp.
[17] L. Mahto, S. Abbas, A. Favini, Analysis of Caputo impulsive fractional order differential equations with applications, Int. J. Differ. Equ. 2013 (2013) 1–11.
[18] G.M. N'Guérékata, Topics in Almost Automorphy, Springer, New York, 2005.
[19] G.M. N'Guérékata, A. Pankov, Integral operators in spaces of bounded, almost periodic and almost automorphic functions, Differ. Integral Equ. 21 (11–12) (2008) 1155–1176.
[20] M. Sannay, Exponential stability in Hopfield-type neural networks with impulses, Chaos Solitons Fractals 32 (2) (2007) 456–467.
[21] A.M. Schäfer, H.G. Zimmermann, Recurrent neural networks are universal approximators, in: Artificial Neural Networks, Lecture Notes in Computer Science, vol. 4131, Springer-Verlag, New York, 2006, pp. 632–640.
[22] G.T. Stamova, Impulsive cellular neural networks and almost periodicity, Proc. Jpn. Acad. Ser. A 80 (10) (2005) 198–203.
[23] I.M. Stamova, G.T. Stamov, Impulsive control on global asymptotic stability for a class of impulsive bidirectional associative memory neural networks with distributed delays, Math. Comput. Model. 53 (5–6) (2011) 824–831.
[24] S. Ahmad, I.M. Stamova, Global exponential stability for impulsive cellular neural networks with time-varying delays, Nonlinear Anal. 69 (3) (2008) 786–795.
[25] I.M. Stamova, R. Ilarionov, On global exponential stability for impulsive cellular neural networks with time-varying delays, Comput. Math. Appl. 59 (11) (2010) 3508–3515.
[26] G.T. Stamova, Almost Periodic Solutions of Impulsive Differential Equations, Lecture Notes in Mathematics, vol. 2047, Springer, Heidelberg, 2012, xx+217 pp. ISBN: 978-3-642-27545-6.
[27] G.T. Stamova, I.M. Stamova, J.O. Alzabut, Existence of almost periodic solutions for strongly stable nonlinear impulsive differential-difference equations, Nonlinear Anal. Hybrid Syst. 6 (2012) 818–823.
[28] J.Q. Zhao, Y.K. Chang, G.M. N'Guérékata, Existence of asymptotically almost automorphic solutions to nonlinear delay integral equations, Dyn. Syst. Appl. 21 (2012) 339–350.

Syed Abbas received his M.Sc. and Ph.D. degrees from the Indian Institute of Technology Kanpur, India, in 2004 and 2009 respectively. He has been working as an assistant professor at the School of Basic Sciences, Indian Institute of Technology Mandi, India, since August 2010. He worked as a research associate at the University of Fribourg, Switzerland, during July–September 2009 and as a postdoctoral fellow at the University of Bologna, Italy, during June–August 2010 and May–July 2011. From November 2012 to December 2012, he was a visiting guest scientist at the Department of Mathematics, TU Dresden, Germany. His main research interests are abstract/delay/fractional differential equations, almost periodic solutions, neural networks, and ecological modelling.

Lakshman Mahto received his M.Sc. degree in Mathematics from the Department of Mathematics, Ranchi University, Ranchi, Jharkhand, India, in 2009. He is currently pursuing his Ph.D. degree in Mathematics at the Indian Institute of Technology Mandi, India.
His current research interests include stability theory of neural networks and fractional differential equations.
Mokhtar Hafayed was born in Biskra, Algeria. He is currently working as an assistant professor in the Laboratory of Applied Mathematics, Biskra University, Algeria. His main research interests are differential equations, stochastic control, the maximum principle, forward–backward stochastic differential equations and mathematical finance.
Adel M. Alimi was born in Sfax, Tunisia, in 1966. He graduated in Electrical Engineering in 1990, obtained a Ph.D. and then an HDR both in Electrical & Computer Engineering in 1995 and 2000 respectively. He is now a professor in Electrical & Computer Engineering at the University of Sfax. His research interest includes applications of intelligent methods (neural networks, fuzzy logic, evolutionary algorithms) to pattern recognition, robotic systems, vision systems, and industrial processes. He focuses his research on intelligent pattern recognition, learning, analysis and intelligent control of large scale complex systems. He is an associate editor and a member of the editorial board of many international scientific journals (e.g. “Pattern Recognition Letters”, “Neurocomputing”, “Neural Processing Letters”, “International Journal of Image and Graphics”, “Neural Computing and Applications”, “International Journal of Robotics and Automation”, “International Journal of Systems Science”, etc.). He is an IEEE senior member and a member of IAPR, INNS and PRS. He is the 2009–2010
IEEE Tunisia Section Treasurer, the 2009–2010 IEEE Computational Intelligence Society Tunisia Chapter Chair, the 2011 IEEE Sfax Subsection, the 2010–2011 IEEE Computer Society Tunisia Chair, the 2011 IEEE Systems, Man, and Cybernetics Tunisia Chapter, the SMCS corresponding member of the IEEE Committee on Earth Observation, and the IEEE Counselor of the ENIS Student Branch.