Neural Networks 53 (2014) 127–133


Further results on robustness analysis of global exponential stability of recurrent neural networks with time delays and random disturbances

Weiwei Luo a, Kai Zhong a, Song Zhu a,∗, Yi Shen b

a College of Sciences, China University of Mining and Technology, Xuzhou, 221116, China
b School of Automation, Huazhong University of Science and Technology, Wuhan, 430074, China

∗ Corresponding author. E-mail addresses: [email protected] (W. Luo), [email protected] (S. Zhu), [email protected] (Y. Shen).

Article history: Received 30 December 2013; received in revised form 8 February 2014; accepted 16 February 2014.

Keywords: Recurrent neural networks; Global exponential stability; Time delays; Random disturbances; Adjustable parameters

Abstract: In this paper, further results on the robustness of global exponential stability of recurrent neural networks (RNNs) subject to time delays and random disturbances are provided. Novel exponential stability criteria for the RNNs are derived, and upper bounds on the time delay and noise intensity are characterized by solving transcendental equations containing adjustable parameters. Through the selection of the adjustable parameters, the upper bounds are improved, and the results generalize and improve upon those of recent works. In addition, numerical examples are given to show the effectiveness of the obtained results.

1. Introduction

Recurrent neural networks (RNNs) are nonlinear dynamic systems with some resemblance to biological neural networks in the brain; they include the well-known Hopfield neural networks and cellular neural networks as special cases. In recent decades, many RNNs have been developed and extensively applied in areas such as associative memories, image processing, pattern recognition, classification and prediction, signal processing, robotics, and control. The stability of a recurrent neural network is necessary for most successful applications, and it depends mainly on the network's parametric configuration.

In biological neural systems, signal transmission via synapses is usually a noisy process influenced by random fluctuations from the release of neurotransmitters and other disturbances (Haykin, 1994). Moreover, in implementations of RNNs, external random disturbances and time delays in signal transmission are common and can hardly be avoided. It is known that random disturbances and time delays in the neuron activations may result in oscillation or instability of RNNs (Gopalsamy & Leung, 1996; Pham, Pakdaman, & Virbert, 1998). The stability analysis of delayed RNNs (DRNNs) and stochastic RNNs (SRNNs) with external random disturbances has been widely investigated in recent years (see, e.g., Arik, 2002, 2004; Cao, Yuan, & Li, 2006; Chua & Yang, 1988; Faydasicok & Arik, 2013; He, Wu, & She, 2006; Hu, Gao, & Zheng, 2008; Liao, Chen, & Sanchez, 2002; Liao & Wang, 2003; Liu, Wang, & Liu, 2006; Wang, Liu, Li, & Liu, 2006; Xu, Lam, & Ho, 2006; Yuan, Cao, & Li, 2006; Zeng & Wang, 2006a, 2006b; Zhang & Jin, 2000; Zhang, Wang, & Liu, 2009; Zhu, Shen, & Chen, 2010a, 2010b; and the references cited therein).

It is well known that noise and time delays can destabilize a stable RNN if they exceed certain limits, and the loss of stability depends on the intensity of the noise and the size of the delays (Mao, 2007). For a stable RNN, if the noise intensity is low and the time delay is small, the perturbed RNN may still be stable. Therefore, it is interesting to determine how much time delay and random disturbance a stable RNN can withstand without losing its global exponential stability. Although various stability properties of RNNs have been analyzed extensively in recent years using Lyapunov and linear matrix inequality (LMI) methods (Chen & Lu, 2008; Huang, Ho, & Lam, 2005; Zhang & Wang, 2008; Zheng, Shan, Zhang, & Wang, 2013), the robustness of the global stability of RNNs is rarely investigated directly by estimating upper bounds on the noise level and time delays.

Motivated by the above discussion, our purpose in this paper is to quantify the admissible parameter uncertainty level for stable RNNs. In contrast to the conventional Lyapunov stability theory and LMI methods, we investigate robust global exponential stability directly from the coefficients of RNNs that satisfy a global exponential stability condition. We further characterize the robustness of RNNs with time delays and additive noise by deriving upper bounds on the delays and noise for global exponential stability. Novel exponential stability criteria are derived, and the upper bounds on the time delay and noise intensity are estimated by solving transcendental equations containing adjustable parameters; by choosing the adjustable parameters, we generalize and improve previous results. Moreover, we prove that, for any globally exponentially stable RNN, if the additive noise and time delays are smaller than the upper bounds derived herein, then the perturbed RNN remains globally exponentially stable.

2. Problem formulation

Throughout this paper, unless otherwise specified, Rⁿ and R^{n×m} denote, respectively, the n-dimensional Euclidean space and the set of n × m real matrices. Let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (i.e., the filtration contains all P-null sets and is right continuous), and let ω(t) be a scalar Brownian motion defined on this probability space. If A is a matrix, its operator norm is denoted by ‖A‖ = sup{|Ax| : |x| = 1}, where |·| is the Euclidean norm. Denote by L²_{F₀}([−τ̄, 0]; Rⁿ) the family of all F₀-measurable C([−τ̄, 0]; Rⁿ)-valued random variables ψ = {ψ(θ) : −τ̄ ≤ θ ≤ 0} such that sup_{−τ̄≤θ≤0} E|ψ(θ)|² < ∞, where E{·} stands for the mathematical expectation operator with respect to the given probability measure P.

Consider an RNN model
$$\dot z(t) = -Az(t) + Bg(z(t)) + u, \qquad z(t_0) = z_0, \tag{1}$$
where z(t) = (z₁(t), ..., z_n(t))ᵀ ∈ Rⁿ is the state vector of the neurons, t₀ ∈ R₊ and z₀ ∈ Rⁿ are the initial values, A = diag{a₁, ..., a_n} ∈ R^{n×n} is the self-feedback connection weight matrix, B = (b_{kl})_{n×n} ∈ R^{n×n} is the connection weight matrix, u is the neuron external input (bias), and g(z(t)) = (g₁(z₁(t)), ..., g_n(z_n(t)))ᵀ ∈ Rⁿ is a vector-valued activation function satisfying the global Lipschitz condition
$$|g(u) - g(v)| \le k|u - v|, \qquad \forall u, v \in \mathbb{R}^n, \tag{2}$$
where k is a known constant.

In addition, we assume that RNN (1) has an equilibrium point z* = (z₁*, z₂*, ..., z_n*)ᵀ ∈ Rⁿ. Let x(t) = z(t) − z* and f(x(t)) = g(x(t) + z*) − g(z*); then (1) can be rewritten as
$$\dot x(t) = -Ax(t) + Bf(x(t)), \qquad x(t_0) = x_0, \tag{3}$$
where x₀ = z₀ − z*; i.e., the origin is an equilibrium point of (3). Hence the stability of the equilibrium point z* of (1) is equivalent to the stability of the origin in the state space of (3). The function f in (3) satisfies the following Lipschitz condition with f(0) = 0.

Assumption 1. The activation function f(·) satisfies the Lipschitz condition
$$|f(u) - f(v)| \le k|u - v|, \qquad \forall u, v \in \mathbb{R}^n, \quad f(0) = 0, \tag{4}$$
where k is a known constant.

Based on Assumption 1, RNN (3) has a unique state x(t; t₀, x₀) on t ≥ t₀ for any initial value (t₀, x₀), and the origin is the equilibrium point of RNN (3) since f(0) = 0. We now define the global exponential stability of the state of RNN (3).

Definition 1. The state of RNN (3) is globally exponentially stable if, for any t₀, x₀, there exist α > 0 and β > 0 such that
$$|x(t; t_0, x_0)| \le \alpha |x(t_0)| \exp(-\beta(t - t_0)), \qquad \forall t \ge t_0, \tag{5}$$
where x(t; t₀, x₀) is the state of the model in (3).

Numerous criteria for ascertaining the global exponential stability of RNN (3) have been developed; see, e.g., Chen (2001), Shen and Wang (2007, 2008, 2012), Zeng, Wang, and Liao (2005), Zhu and Shen (2013) and Zhu et al. (2010a, 2010b).
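To make the setting concrete, the following minimal Python sketch (not part of the original paper) integrates RNN (3) with forward Euler and records the decay of |x(t)| predicted by Definition 1. The matrices A and B and the activation f = tanh (global Lipschitz constant k = 1) are illustrative assumptions, not data from the paper.

```python
# A minimal sketch: forward-Euler integration of RNN (3) to observe the
# exponential decay |x(t)| <= alpha*|x0|*exp(-beta*(t - t0)) of Definition 1.
# A, B and f = tanh are illustrative assumptions.
import numpy as np

A = np.diag([1.0, 1.0])                   # assumed self-feedback matrix
B = np.array([[-2.0, 2.0], [2.0, -2.0]])  # assumed connection weights
f = np.tanh                               # |f(u) - f(v)| <= |u - v|, k = 1

def simulate(x0, T=10.0, h=1e-3):
    """Integrate x' = -A x + B f(x) with step h; return |x(t)| samples."""
    x, norms = np.array(x0, dtype=float), []
    for _ in range(int(T / h)):
        x = x + h * (-A @ x + B @ f(x))
        norms.append(np.linalg.norm(x))
    return np.array(norms)

norms = simulate([0.3, -0.1])
print("final |x(T)| =", norms[-1])        # near 0 if (3) is indeed GES
```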

3. Main results

First, we consider the noise-perturbed stochastic RNN (SRNN) described by the Itô stochastic equation
$$dy(t) = [-Ay(t) + Bf(y(t))]\,dt + \sigma y(t)\,d\omega(t), \qquad y(t_0) = x_0, \tag{6}$$
with initial data y(t₀) = x₀ ∈ Rⁿ, where A, B, f are the same as in (3), σ is the noise intensity, and ω(t) is a scalar Brownian motion defined on the probability space (Ω, F, {F_t}_{t≥0}, P). Under Assumption 1, SRNN (6) has a unique state y(t; t₀, x₀) on t ≥ t₀ for any initial value (t₀, x₀), and the origin y = 0 is the equilibrium point. Now the question is: given a globally exponentially stable RNN (3), how large a noise intensity can it withstand without losing stability? We will characterize how much stochastic perturbation SRNN (6) can bear while remaining globally exponentially stable.
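Before formalizing this, a Monte Carlo view of (6) can be obtained with the Euler-Maruyama scheme. The sketch below (same illustrative A, B, f as above; the helper name em_paths and the test value of σ are ours, not derived bounds) estimates E|y(t)|² by averaging sample paths under a scalar Brownian motion.

```python
# A Monte Carlo sketch of SRNN (6) via Euler-Maruyama: averaging |y(T)|^2
# over sample paths gives an empirical view of mean square stability.
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([1.0, 1.0])                   # same illustrative system as above
B = np.array([[-2.0, 2.0], [2.0, -2.0]])
f = np.tanh

def em_paths(y0, sigma, T=10.0, h=1e-3, n_paths=200):
    """Estimate E|y(T)|^2 for dy = [-A y + B f(y)]dt + sigma*y dw."""
    y = np.tile(np.array(y0, dtype=float), (n_paths, 1))
    for _ in range(int(T / h)):
        dw = rng.normal(0.0, np.sqrt(h), size=(n_paths, 1))  # scalar noise
        y = y + h * (-(y @ A.T) + f(y) @ B.T) + sigma * y * dw
    return float(np.mean(np.sum(y**2, axis=1)))

print("E|y(T)|^2 ≈", em_paths([0.3, -0.1], sigma=1e-4))
```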

For the SRNN model (6), we give the following definition of global exponential stability.

Definition 2 (Mao, 2007). SRNN (6) is said to be almost surely globally exponentially stable if, for any t₀ ∈ R₊ and x₀ ∈ Rⁿ, there exist α > 0 and β > 0 such that |y(t; t₀, x₀)| ≤ α|y(t₀)| exp(−β(t−t₀)) holds almost surely for all t ≥ t₀; i.e., the Lyapunov exponent lim sup_{t→∞}(ln|y(t; t₀, x₀)|/t) < 0 almost surely, where y(t; t₀, x₀) is the state of SRNN (6). SRNN (6) is said to be mean square globally exponentially stable if, for any t₀ ∈ R₊ and x₀ ∈ Rⁿ, there exist α > 0 and β > 0 such that E|y(t; t₀, x₀)|² ≤ α|y(t₀)|² exp(−β(t−t₀)) for all t ≥ t₀; i.e., the Lyapunov exponent lim sup_{t→∞}(ln(E|y(t; t₀, x₀)|²)/t) < 0, where y(t; t₀, x₀) is the state of SRNN (6).

By this definition, almost sure and mean square global exponential stability do not, in general, imply each other (Mao, 2007). However, if Assumption 1 holds, we have the following lemma (Mao, 2007, Theorem 4.2, p. 128).

Lemma 1. Let Assumption 1 hold. Then mean square global exponential stability of SRNN (6) implies almost sure global exponential stability of SRNN (6).

Theorem 1. Let Assumption 1 hold and let RNN (3) be globally exponentially stable. The noise-perturbed SRNN (6) is mean square globally exponentially stable, and also almost surely globally exponentially stable, if |σ| < σ̄, where σ̄ is the unique positive solution of the transcendental equation
$$\frac{2\sigma^2\alpha^2}{(1-\varepsilon)\beta}\exp\Big\{4\Delta\Big[\frac{2\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{\sigma^2}{1-\varepsilon}\Big]\Big\} + 2\alpha^2\exp(-2\beta\Delta) = 1, \tag{7}$$
in which ε ∈ (0, 1) is an adjustable parameter and ∆ > ln(2α²)/(2β) > 0.
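Since the left-hand side of (7) is strictly increasing in σ for fixed ε, the bound σ̄ can be computed numerically. The following sketch is not the authors' code; it brackets the unique positive root and locates it with scipy's brentq. The demo parameter values are taken from Example 1 below (α = 0.8, β = 0.5, ∆ = 0.3, ‖A‖ = 1, ‖B‖ = 4, k = 1); any other globally exponentially stable RNN (3) can be substituted.

```python
# A sketch (not the authors' code) for the noise bound sigma_bar of Theorem 1.
import numpy as np
from scipy.optimize import brentq

def lhs7(sigma, alpha, beta, Delta, eps, nA, nB, k):
    """Left-hand side of the transcendental equation (7)."""
    g = 4 * Delta * (2 * Delta / eps * (nA**2 + nB**2 * k**2)
                     + sigma**2 / (1 - eps))
    return (2 * sigma**2 * alpha**2 / ((1 - eps) * beta) * np.exp(g)
            + 2 * alpha**2 * np.exp(-2 * beta * Delta))

def sigma_bar(alpha, beta, Delta, eps, nA, nB, k):
    assert Delta > np.log(2 * alpha**2) / (2 * beta)  # required by Theorem 1
    f = lambda s: lhs7(s, alpha, beta, Delta, eps, nA, nB, k) - 1.0
    hi = 1.0
    while f(hi) < 0:                  # expand until the root is bracketed
        hi *= 2.0
    return brentq(f, 0.0, hi)

print(sigma_bar(alpha=0.8, beta=0.5, Delta=0.3, eps=0.5, nA=1.0, nB=4.0, k=1.0))
```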

Proof. For simplicity, we denote x(t; t₀, x₀) and y(t; t₀, x₀) by x(t) and y(t), respectively. From (3) and (6), we have
$$x(t) - y(t) = \int_{t_0}^{t} [-A(x(s)-y(s)) + B(f(x(s))-f(y(s)))]\,ds - \int_{t_0}^{t} \sigma y(s)\,d\omega(s).$$

So, when t ≤ t₀ + 2∆, based on Assumption 1, the Hölder inequality, and the global exponential stability of (3), we have

$$
\begin{aligned}
E|x(t)-y(t)|^2
&\le \frac{1}{\varepsilon}\,E\Big|\int_{t_0}^{t}[-A(x(s)-y(s))+B(f(x(s))-f(y(s)))]\,ds\Big|^2
 + \frac{1}{1-\varepsilon}\,E\Big|\int_{t_0}^{t}\sigma y(s)\,d\omega(s)\Big|^2\\
&\le \frac{2\Delta}{\varepsilon}\int_{t_0}^{t}E|-A(x(s)-y(s))+B(f(x(s))-f(y(s)))|^2\,ds
 + \frac{\sigma^2}{1-\varepsilon}\int_{t_0}^{t}E|y(s)-x(s)+x(s)|^2\,ds\\
&\le \Big[\frac{4\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{2\sigma^2}{1-\varepsilon}\Big]\int_{t_0}^{t}E|x(s)-y(s)|^2\,ds
 + \frac{2\sigma^2}{1-\varepsilon}\int_{t_0}^{t}\alpha^2|x_0|^2\exp(-2\beta(s-t_0))\,ds\\
&\le \Big[\frac{4\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{2\sigma^2}{1-\varepsilon}\Big]\int_{t_0}^{t}E|x(s)-y(s)|^2\,ds
 + \frac{\sigma^2\alpha^2|x_0|^2}{(1-\varepsilon)\beta}.
\end{aligned}
$$

When t₀ + ∆ ≤ t ≤ t₀ + 2∆, from the Gronwall inequality (Mao, 2007), we have

$$
\begin{aligned}
E|x(t)-y(t)|^2
&\le \frac{\sigma^2\alpha^2|x_0|^2}{(1-\varepsilon)\beta}\exp\Big\{\Big[\frac{4\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{2\sigma^2}{1-\varepsilon}\Big](t-t_0)\Big\}\\
&\le \frac{\sigma^2\alpha^2}{(1-\varepsilon)\beta}\exp\Big\{4\Delta\Big[\frac{2\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{\sigma^2}{1-\varepsilon}\Big]\Big\}\cdot\sup_{t_0\le t\le t_0+\Delta}E|y(t)|^2.
\end{aligned}\tag{8}
$$

So, when t₀ + ∆ ≤ t ≤ t₀ + 2∆, from (8) and the global exponential stability of (3),

$$
\begin{aligned}
E|y(t)|^2 &\le 2E|x(t)-y(t)|^2+2E|x(t)|^2\\
&\le \frac{2\sigma^2\alpha^2}{(1-\varepsilon)\beta}\exp\Big\{4\Delta\Big[\frac{2\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{\sigma^2}{1-\varepsilon}\Big]\Big\}\sup_{t_0\le t\le t_0+\Delta}E|y(t)|^2 + 2\alpha^2|x(t_0)|^2\exp(-2\beta(t-t_0))\\
&\le c_1\sup_{t_0\le t\le t_0+\Delta}E|y(t)|^2,
\end{aligned}\tag{9}
$$

where

$$
c_1=\frac{2\sigma^2\alpha^2}{(1-\varepsilon)\beta}\exp\Big\{4\Delta\Big[\frac{2\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{\sigma^2}{1-\varepsilon}\Big]\Big\}+2\alpha^2\exp(-2\beta\Delta).\tag{10}
$$

From (7), when |σ| < σ̄, we have c₁ < 1. To obtain the least conservative bound, we set ∂c₁/∂ε = 0. From (10),

$$
\frac{\partial c_1}{\partial\varepsilon}=\frac{2\sigma^2\alpha^2}{\beta}\Big[\frac{1}{(1-\varepsilon)^2}+\frac{4\Delta\sigma^2}{(1-\varepsilon)^3}-\frac{8\Delta^2}{\varepsilon^2(1-\varepsilon)}(\|A\|^2+\|B\|^2k^2)\Big]\exp\Big\{4\Delta\Big[\frac{2\Delta}{\varepsilon}(\|A\|^2+\|B\|^2k^2)+\frac{\sigma^2}{1-\varepsilon}\Big]\Big\},
$$

and setting ∂c₁/∂ε = 0 yields

$$
\varepsilon^3+\big[8\Delta^2(\|A\|^2+\|B\|^2k^2)-4\Delta\sigma^2-1\big]\varepsilon^2-16\Delta^2(\|A\|^2+\|B\|^2k^2)\varepsilon+8\Delta^2(\|A\|^2+\|B\|^2k^2)=0.\tag{11}
$$

Therefore, (11) has a real solution ε₀ ∈ (0, 1); substituting ε = ε₀ into c₁, one can verify that c₁ is strictly monotone in σ, so Eq. (7) has a unique positive solution σ̄, which attains its maximum σ_max at ε = ε₀.

Let γ = −ln c₁/∆. Then γ > 0, and from (9) we have

$$
\sup_{t_0+\Delta\le t\le t_0+2\Delta}E|y(t)|^2\le\exp(-\gamma\Delta)\sup_{t_0\le t\le t_0+\Delta}E|y(t)|^2.\tag{12}
$$

Then, for any positive integer m = 1, 2, ..., from the existence and uniqueness of the state of SRNN (6), when t ≥ t₀ + (m−1)∆ we have

$$
y(t;t_0,x_0)=y\big(t;\,t_0+(m-1)\Delta,\,y(t_0+(m-1)\Delta;t_0,x_0)\big).\tag{13}
$$

From (12) and (13),

$$
\begin{aligned}
\sup_{t_0+m\Delta\le t\le t_0+(m+1)\Delta}E|y(t;t_0,x_0)|^2
&=\sup_{t_0+(m-1)\Delta+\Delta\le t\le t_0+(m-1)\Delta+2\Delta}E\big|y\big(t;\,t_0+(m-1)\Delta,\,y(t_0+(m-1)\Delta;t_0,x_0)\big)\big|^2\\
&\le\exp(-\gamma\Delta)\sup_{t_0+(m-1)\Delta\le t\le t_0+m\Delta}E|y(t;t_0,x_0)|^2\\
&\le\cdots\le\exp(-\gamma m\Delta)\sup_{t_0\le t\le t_0+\Delta}E|y(t;t_0,x_0)|^2=c\exp(-\gamma m\Delta),
\end{aligned}
$$

where c = sup_{t₀≤t≤t₀+∆} E|y(t; t₀, x₀)|². So, for any t > t₀ + ∆, there exists a positive integer m such that t₀ + m∆ ≤ t ≤ t₀ + (m+1)∆, and

$$
E|y(t;t_0,x_0)|^2\le c\exp(-\gamma t+\gamma t_0+\gamma\Delta)=(c\exp(\gamma\Delta))\exp(-\gamma(t-t_0)).\tag{14}
$$

The same bound also holds when t₀ ≤ t ≤ t₀ + ∆. So SRNN (6) is mean square globally exponentially stable and, according to Lemma 1, also almost surely globally exponentially stable. □
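Rather than solving the stationarity condition (11) in closed form, the adjustable parameter can be tuned numerically. The minimal sketch below performs a grid search over ε, reusing the sigma_bar helper defined after Theorem 1 (the grid limits are chosen to avoid overflow of the exponential for extreme ε; parameters are the Example 1 placeholders as before).

```python
# Grid search over the adjustable parameter eps, reusing sigma_bar above.
import numpy as np

eps_grid = np.linspace(0.20, 0.99, 80)
bounds = [sigma_bar(alpha=0.8, beta=0.5, Delta=0.3, eps=e,
                    nA=1.0, nB=4.0, k=1.0) for e in eps_grid]
i = int(np.argmax(bounds))
print("eps* ≈ %.3f, sigma_bar ≈ %.3e" % (eps_grid[i], bounds[i]))
```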


Corollary 1 (Shen & Wang, 2012). When ε = 1/2, noise-induced SRNN (6) is mean square globally exponentially stable and also almost surely globally exponentially stable, if |σ | < σ¯ , is a unique positive solution of the transcendental equation 4σ 2 α 2

β



exp 8∆[2∆(∥A∥2 + ∥B∥2 k2 ) + σ 2 ]



+ 2α 2 exp(−2β ∆) = 1

2c3 exp(2∆c2 ) + 2α 2 exp(−2β(∆ − τ¯ )) = 1, (15)

where ∆ > ln(2α 2 )/(2β) > 0.

In Theorem 1, we derived the conditions of global exponential stability of RNNs in the presence of noise impact stability of recurrent neural networks. Next we will consider the impact of delay on the global stability of stochastic RNNs. For the stochastic DRNNs (SDRNNs) dy(t ) = [−Ay(t ) + Bf (y(t )) + Df (y(t − τ (t )))]dt + σ y(t )dω(t ), t > t0 , t0 − τ¯ ≤ t ≤ t0 ,

(16)

where A, B, f are as the same in Section 2, D ∈ Rn×n is the delayed weight matrix of (13), τ (t ) is a delay, which satisfies τ (t ) : [t0 , +∞) → [0, τ¯ ], τ ′ (t ) ≤ µ < 1, ψ = {ψ(s) : −τ¯ ≤ s ≤ 0} ∈ C ([−τ¯ , 0], Rn ), the activation function f satisfies Assumption 1. So SDRNN (16) has a unique state for any initial value t0 , and the origin point of ψ is the equilibrium point. In the case of no time delay (i.e., τ = 0), the resulting SRNN becomes: dx(t ) = [−Ax(t ) + (B + D)f (x(t ))]dt + σ x(t )dω(t ), x(t0 ) = ψ(0) ∈ Rn ,

(17)

where f satisfies Assumption 1. So SRNN (17) also has a unique state for any initial value t0 , ψ(0) is the equilibrium point. Now, the question is, given a globally exponentially stable SRNN without delay (17), how much the impact of time delay on the stability of SRNN? We will characterize how much the delay that SDRNN (16) can bear to still be globally exponentially stable. For SDRNN (16), we give the following definition of global exponential stability. Definition 3 (Mao, 2007). SDRNN (16) is said to be almost surely globally exponentially stable if for any t0 ∈ R+ , ψ ∈ L2F0 ([−τ¯ , 0]; Rn ), there exist α > 0 and β > 0 such that ∀t ≥ t0 , |y(t ; t0 , ψ)| ≤ α∥ψ∥ exp(−β(t − t0 )) hold almost surely, i.e., the Lyapunov exponent lim supt →∞ (ln |y(t ; t0 , ψ)|/t ) < 0 almost surely, where y(t ; t0 , ψ) is the state of SDRNN (16). SDRNN (16) is said to be mean square globally exponentially stable if for any t0 ∈ R+ , ψ ∈ L2F0 ([−τ¯ , 0]; Rn ), there exist α > 0 and β > 0 such that ∀t ≥ t0 , E |y(t ; t0 , ψ)|2 ≤ α∥ψ∥ exp(−β(t − t0 )) hold, i.e., the Lyapunov exponent lim supt →∞ (ln(E |y(t ; t0 , ψ)|2 )/t ) < 0, where y(t ; t0 , ψ) is the state of SDRNN (16).

If Assumption 1 holds, we have the following lemma (Theorem 6.2, P175 Mao, 2007). Lemma 2. Let Assumption 1 hold. Then the global exponential stability in the sense of mean square of SDRNN (16) implies the almost surely exponential stability of SDRNN (16).

(18)

where

σ2 (∥A∥2 + ∥B∥2 k2 + 2∥D∥2 k2 ) + λ 1−λ     24∆ ∥D ∥2 k 2 2 2 2 2 2 2 2 + ∥D∥ k 6τ¯ ∥A∥ + ∥B∥ k + + 2τ¯ σ λ 1−µ    24∆ τ¯ 12∆ 6τ¯ 3 c3 = ∥D∥2 k2 τ¯ + + ∥D∥2 k2 λ 1−µ λ 1−µ   2 α · ∥D∥2 k2 + 6τ¯ 2 ∥A∥2 + ∥B∥2 k2 β   ∥D∥2 k2 + + 2τ¯ σ 2 1−µ c2 =

Remark 1. From the proof of Theorem 1, Theorem 1 generalizes that when RNNs are globally exponentially stable, noise-induced SRNNs can be mean square global exponential stability and almost surely globally exponentially stable provided that the noise intensity is smaller than the derived upper bound by using the transcendental equation containing an adjustable parameter.

y(t ) = ψ(t − t0 ) ∈ L2F0 ([−τ¯ , 0]; Rn ),

Theorem 2. Let Assumption 1 hold and SRNN (17) be globally exponentially stable. SDRNN (16) is mean square globally exponentially stable and also almost surely globally exponentially stable, if τ¯ <  τ,  τ is a unique positive solution of the transcendental equation

6∆
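As with Theorem 1, the delay bound can be computed numerically: c₂ and c₃ are evaluated from their definitions and the root of (18) is bracketed on (0, ∆). The sketch below is an illustration under stated assumptions, not the authors' code; the helper names are ours, and the demo values are taken from Example 2 below at the arbitrary choice λ = 0.5.

```python
# A sketch for the delay bound tau_tilde of Theorem 2 using scipy's brentq.
import numpy as np
from scipy.optimize import brentq

def lhs18(tau, lam, sigma, mu, Delta, alpha, beta, nA, nB, nD, k):
    K = nA**2 + nB**2 * k**2         # ||A||^2 + ||B||^2 k^2
    Dk = nD**2 * k**2                # ||D||^2 k^2
    c2 = (6 * Delta / lam * (K + 2 * Dk) + sigma**2 / (1 - lam)
          + 24 * Delta / lam * Dk
            * (6 * tau**2 * (K + Dk / (1 - mu)) + 2 * tau * sigma**2))
    c3 = (24 * Delta / lam * Dk * (tau + tau / (1 - mu))
          + 12 * Delta / lam * Dk
            * (6 * tau**3 / (1 - mu) * Dk
               + alpha**2 / beta
                 * (6 * tau**2 * (K + Dk / (1 - mu)) + 2 * tau * sigma**2)))
    return (2 * c3 * np.exp(2 * Delta * c2)
            + 2 * alpha**2 * np.exp(-2 * beta * (Delta - tau)))

def tau_tilde(**p):
    f = lambda t: lhs18(t, **p) - 1.0
    return brentq(f, 1e-12, p["Delta"] - 1e-12)

print(tau_tilde(lam=0.5, sigma=0.02788, mu=0.0, Delta=0.15,
                alpha=1.0, beta=3.0, nA=3.1, nB=0.05, nD=0.05, k=1.0))
```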

Proof. Fix t₀ and ψ = {ψ(s): −τ̄ ≤ s ≤ 0}. For simplicity, we write x(t; t₀, ψ(0)) and y(t; t₀, ψ) as x(t) and y(t), respectively. From (16) and (17), we have

$$
x(t)-y(t)=\int_{t_0}^{t}\big[-A(x(s)-y(s))+B(f(x(s))-f(y(s)))+D(f(x(s))-f(y(s-\tau(s))))\big]\,ds-\int_{t_0}^{t}\sigma(x(s)-y(s))\,d\omega(s).
$$

When t ≤ t₀ + 2∆, from Assumption 1 and the Hölder inequality (Mao, 2007), we have

$$
\begin{aligned}
E|x(t)-y(t)|^2
&\le\frac{1}{\lambda}E\Big|\int_{t_0}^{t}\big[-A(x(s)-y(s))+B(f(x(s))-f(y(s)))+D(f(x(s))-f(y(s-\tau(s))))\big]\,ds\Big|^2\\
&\quad+\frac{1}{1-\lambda}E\Big|\int_{t_0}^{t}\sigma(x(s)-y(s))\,d\omega(s)\Big|^2\\
&\le\frac{6\Delta}{\lambda}(\|A\|^2+\|B\|^2k^2)\int_{t_0}^{t}E|x(s)-y(s)|^2\,ds+\frac{6\Delta}{\lambda}\|D\|^2\int_{t_0}^{t}E|f(x(s))-f(y(s-\tau(s)))|^2\,ds\\
&\quad+\frac{\sigma^2}{1-\lambda}\int_{t_0}^{t}E|x(s)-y(s)|^2\,ds\\
&\le\Big[\frac{6\Delta}{\lambda}(\|A\|^2+\|B\|^2k^2+2\|D\|^2k^2)+\frac{\sigma^2}{1-\lambda}\Big]\int_{t_0}^{t}E|x(s)-y(s)|^2\,ds+\frac{12\Delta}{\lambda}\|D\|^2k^2\int_{t_0}^{t}E|y(s)-y(s-\tau(s))|^2\,ds.
\end{aligned}\tag{19}
$$

In addition, when t ≥ t₀ + τ̄, from (16) and Assumption 1,

$$
\int_{t_0+\bar\tau}^{t}E|y(s)-y(s-\tau(s))|^2\,ds\le\int_{t_0+\bar\tau}^{t}\Big(\int_{s-\bar\tau}^{s}\big\{[6\bar\tau(\|A\|^2+\|B\|^2k^2)+2\sigma^2]E|y(r)|^2+6\bar\tau\|D\|^2k^2E|y(r-\tau(r))|^2\big\}\,dr\Big)ds.\tag{20}
$$

By reversing the order of integration, we have

$$
\begin{aligned}
\int_{t_0+\bar\tau}^{t}\!ds\int_{s-\bar\tau}^{s}\big[6\bar\tau(\|A\|^2+\|B\|^2k^2)+2\sigma^2\big]E|y(r)|^2\,dr
&=\int_{t_0}^{t}\!dr\int_{\max(t_0+\bar\tau,r)}^{\min(r+\bar\tau,t)}\big[6\bar\tau(\|A\|^2+\|B\|^2k^2)+2\sigma^2\big]E|y(r)|^2\,ds\\
&\le\big[6\bar\tau(\|A\|^2+\|B\|^2k^2)+2\sigma^2\big]\,\bar\tau\int_{t_0}^{t}E|y(r)|^2\,dr.
\end{aligned}\tag{21}
$$

For the same reason, using τ′(t) ≤ µ < 1 for the change of variable u = r − τ(r), we have

$$
\begin{aligned}
\int_{t_0+\bar\tau}^{t}\!ds\int_{s-\bar\tau}^{s}6\bar\tau\|D\|^2k^2E|y(r-\tau(r))|^2\,dr
&\le 6\bar\tau^2\|D\|^2k^2\int_{t_0}^{t}E|y(r-\tau(r))|^2\,dr\\
&\le\frac{6\bar\tau^3}{1-\mu}\|D\|^2k^2\sup_{t_0-\bar\tau\le s\le t_0}E|y(s)|^2+\frac{6\bar\tau^2}{1-\mu}\|D\|^2k^2\int_{t_0}^{t}E|y(u)|^2\,du.
\end{aligned}\tag{22}
$$

So, when t ≥ t₀ + τ̄, by substituting (21) and (22) into (20), and bounding the remaining integral over [t₀, t₀+τ̄] directly, we have

$$
\begin{aligned}
\int_{t_0}^{t}E|y(s)-y(s-\tau(s))|^2\,ds
&\le 2\bar\tau\Big(1+\frac{1}{1-\mu}\Big)\sup_{t_0-\bar\tau\le s\le t_0+\bar\tau}E|y(s)|^2+\frac{6\bar\tau^3}{1-\mu}\|D\|^2k^2\sup_{t_0-\bar\tau\le s\le t_0}E|y(s)|^2\\
&\quad+\Big[6\bar\tau^2\Big(\|A\|^2+\|B\|^2k^2+\frac{\|D\|^2k^2}{1-\mu}\Big)+2\bar\tau\sigma^2\Big]\int_{t_0}^{t}E|y(s)|^2\,ds.
\end{aligned}\tag{23}
$$

Substituting (23) into (19) and using E|y(s)|² ≤ 2E|y(s)−x(s)|² + 2E|x(s)|² together with the global exponential stability of (17), when t ≥ t₀ + τ̄ we obtain

$$
E|x(t)-y(t)|^2\le c_2\int_{t_0}^{t}E|x(s)-y(s)|^2\,ds+c_3\sup_{t_0-\bar\tau\le s\le t_0+\bar\tau}E|y(s)|^2,\tag{24}
$$

with c₂ and c₃ as defined in Theorem 2. When t₀ + τ̄ ≤ t ≤ t₀ + 2∆, applying the Gronwall inequality (Mao, 2007), we obtain

$$
E|x(t)-y(t)|^2\le c_3\exp(2\Delta c_2)\sup_{t_0-\bar\tau\le s\le t_0+\bar\tau}E|y(s)|^2.\tag{25}
$$

Therefore,

$$
E|y(t)|^2\le 2E|x(t)-y(t)|^2+2E|x(t)|^2\le\big[2c_3\exp(2\Delta c_2)+2\alpha^2\exp(-2\beta(t-t_0))\big]\sup_{t_0-\bar\tau\le s\le t_0+\bar\tau}E|y(s)|^2.\tag{26}
$$

Thus, when t₀ − τ̄ + ∆ ≤ t ≤ t₀ − τ̄ + 2∆,

$$
E|y(t)|^2\le\big[2c_3\exp(2\Delta c_2)+2\alpha^2\exp(-2\beta(\Delta-\bar\tau))\big]\sup_{t_0-\bar\tau\le s\le t_0+\bar\tau}E|y(s)|^2\tag{27}
$$

$$
=:\hat c\,\sup_{t_0-\bar\tau\le s\le t_0+\bar\tau}E|y(s)|^2,\tag{28}
$$

where ĉ = 2c₃ exp(2∆c₂) + 2α² exp(−2β(∆−τ̄)). From (18), when τ̄ < τ̃, we have ĉ < 1. To optimize the adjustable parameter, we set ∂ĉ/∂λ = 0, which, with c̄ = ĉ − 2α² exp(−2β(∆−τ̄)) = 2c₃ exp(2∆c₂), is equivalent to

$$
\frac{\partial\ln\bar c}{\partial\lambda}=\frac{\partial\ln c_3}{\partial\lambda}+2\Delta\frac{\partial c_2}{\partial\lambda}=\Big(\frac{\partial c_3}{\partial\lambda}+2\Delta c_3\frac{\partial c_2}{\partial\lambda}\Big)\Big/c_3=0.\tag{29}
$$


So, from (29), we have

$$
\lambda^3+(c_4-2\Delta\sigma^2-2)\lambda^2+(1-2c_4)\lambda+c_4=0,\tag{30}
$$

where

$$
c_4=12\Delta^2(\|A\|^2+\|B\|^2k^2+2\|D\|^2k^2)+48\Delta^2\|D\|^2k^2\Big[6\bar\tau^2\Big(\|A\|^2+\|B\|^2k^2+\frac{\|D\|^2k^2}{1-\mu}\Big)+2\bar\tau\sigma^2\Big].
$$

Therefore, from (30), there exists a unique τ̄ such that τ̄ = τ_max when λ ∈ (0, 1). Selecting γ = −ln ĉ/∆, we conclude that

$$
\sup_{t_0-\bar\tau+\Delta\le t\le t_0-\bar\tau+2\Delta}E|y(t;t_0,\psi)|^2\le\exp(-\gamma\Delta)\sup_{t_0-\bar\tau\le t\le t_0-\bar\tau+\Delta}E|y(t;t_0,\psi)|^2.\tag{31}
$$

The rest of the proof can be completed along the same lines as the proof of Theorem 1. □

Remark 2. Theorem 2 shows that when the delay-free SRNN (17) is globally exponentially stable, the delayed SDRNN (16) remains mean square globally exponentially stable and almost surely globally exponentially stable, provided the delays are smaller than the derived upper bound.

Remark 3. Theorems 1 and 2 generalize and improve the corresponding results of recent works. As their proofs show, the upper bounds on the admissible uncertainty intensity are derived via careful inequalities and can be estimated by solving transcendental equations. Since transcendental equations can be solved with software such as MATLAB or Mathematica, the conditions in these theorems are easy to verify.

4. Numerical examples

In this section, we give two numerical examples to illustrate the new results.

Example 1. Consider the two-state RNN

$$
\frac{dx_1(t)}{dt}=-x_1(t)-2f(x_1(t))+2f(x_2(t)),\qquad
\frac{dx_2(t)}{dt}=-x_2(t)+2f(x_1(t))-2f(x_2(t)),\tag{32}
$$

whose parameters are

$$
A=\begin{pmatrix}1&0\\0&1\end{pmatrix},\qquad B=\begin{pmatrix}-2&2\\2&-2\end{pmatrix},\qquad f(x_j)=\sin(x_j)\ (j=1,2),\qquad x(0)=[0.3,\,-0.1]^{T}.
$$

According to Theorem 1 in Liao and Wang (2003), RNN (32) is globally exponentially stable with α = 0.8 and β = 0.5. Fig. 1 shows the transient states of RNN (32).

Fig. 1. The transient state of RNN (32).

In the presence of random disturbances, the RNN becomes

$$
\begin{aligned}
dy_1(t)&=[-y_1(t)-2f(y_1(t))+2f(y_2(t))]\,dt+\sigma y_1(t)\,d\omega(t),\\
dy_2(t)&=[-y_2(t)+2f(y_1(t))-2f(y_2(t))]\,dt+\sigma y_2(t)\,d\omega(t),
\end{aligned}\tag{33}
$$

where σ is the noise intensity and ω(t) is a scalar Brownian motion defined on the probability space. According to Theorem 1, let ∆ = 0.3 > ln(1.28) = 0.2469. Eq. (7) then becomes

$$
\frac{2.56\sigma^2}{1-\varepsilon}\exp\Big\{1.2\Big[\frac{10.2}{\varepsilon}+\frac{\sigma^2}{1-\varepsilon}\Big]\Big\}+1.28\exp(-0.3)=1,\tag{34}
$$

from which the curve of solutions in (ε, σ) can be obtained. Fig. 2 shows the stability region for (ε, σ) in (33); maximizing over ε gives σ̃ = 5.216 × 10⁻⁵.

Fig. 2. The stability region with (ε, σ) in SRNN (33).

Fig. 3 depicts the transient states of SRNN (33) with σ = 5.2 × 10⁻⁵; since |σ| < σ̃, SRNN (33) is mean square globally exponentially stable and also almost surely globally exponentially stable.

Fig. 3. The transient state of SRNN (33) with σ = 5.2 × 10⁻⁵ in Example 1.

Meanwhile, Theorem 1 in Shen and Wang (2012) yields only σ̃ = 4.859 × 10⁻⁷. Comparing the two results, ours improves the previous one.
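As a numerical cross-check (a sketch, not part of the paper), Euler-Maruyama sample paths of SRNN (33) with σ = 5.2 × 10⁻⁵, just below σ̃, show the expected mean square decay:

```python
# A self-contained Monte Carlo cross-check of Example 1: the sample average
# of |y(T)|^2 for SRNN (33) with f = sin should decay toward zero.
import numpy as np

rng = np.random.default_rng(1)
B = np.array([[-2.0, 2.0], [2.0, -2.0]])    # A is the 2x2 identity here
sigma, h = 5.2e-5, 1e-3
y = np.tile([0.3, -0.1], (200, 1))          # 200 sample paths
for _ in range(int(10.0 / h)):
    dw = rng.normal(0.0, np.sqrt(h), (200, 1))
    y = y + h * (-y + np.sin(y) @ B.T) + sigma * y * dw
print("E|y(10)|^2 ≈", np.mean(np.sum(y**2, axis=1)))
```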

∆ = 0.3 > ln(1.28) = 0.2469, by solving Eq. (7)    2.56σ 2 10.2 σ2 exp 1.2 + + 1.28 exp(−0.3) = 1, (34) 1−ε ε 1−ε we can obtain its curve for (ε, σ ). Fig. 1 shows the transient states of RNNs (32). Fig. 2 shows the stability region for (ε, σ ) in (33). We can obtain σ˜ = 5.216 × 10−5 . Fig. 3 depicts the transient states of SRNN (33) with σ = 5.2 × 10−5 . It shows that the SRNN (33) is mean square globally exponentially stable and also almost surely globally exponentially stable, as the parameter |σ | < σ˜ . Meanwhile, we can obtain its solution σ˜ = 4.859 × 10−7 according to Theorem 1 in Shen and Wang (2012) easily. Comparing the results above, we improved the previous result. Example 2. Consider a single neuron dx(t ) = [−3.1x(t ) + 0.1f (x(t ))]dt + 0.02788x(t )dω(t ),

(35)

where f (x) = (exp(x) − exp(−x))/(exp(x) + exp(−x)), σ is the intensity of noise, ω(t ) is a scalar Brownian motion defined on the probability space. Hence, according to Theorem 4.4 in Mao (2007, P130), the neuron state is globally exponentially stable with α = 1, β = 3. In the presence of time delays, the model becomes: dy(t ) = [−3.1y(t ) + 0.05f (y(t )) + 0.05f (y(t − τ (t )))]dt

+ 0.02788y(t )dω(t ),

(36)


where τ(t) is the time-varying delay. According to Theorem 2, let µ = 0 and ∆ = 0.15. Eq. (18) then becomes

$$
2\Big[\frac{0.018}{\lambda}\bar\tau+\frac{0.0045}{\lambda}\Big(0.015\bar\tau^3+19.23\bar\tau^2+\frac{0.0015545888}{3}\bar\tau\Big)\Big]
\exp\Big\{0.3\Big[\frac{8.65575}{\lambda}+\frac{0.0007772944}{1-\lambda}+\frac{0.009}{\lambda}\big(57.69\bar\tau^2+0.0015545888\bar\tau\big)\Big]\Big\}
+2\exp(-6(0.15-\bar\tau))=1.
$$
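A sketch for this computation, assuming the tau_tilde helper defined after Theorem 2: the delay bound is maximized over the adjustable parameter λ on a grid, and the printed value should be of the order of the bound reported below.

```python
# Maximize the Theorem 2 delay bound over lambda for Example 2's parameters,
# reusing the tau_tilde sketch defined earlier.
import numpy as np

lams = np.linspace(0.05, 0.95, 91)
taus = [tau_tilde(lam=l, sigma=0.02788, mu=0.0, Delta=0.15, alpha=1.0,
                  beta=3.0, nA=3.1, nB=0.05, nD=0.05, k=1.0) for l in lams]
j = int(np.argmax(taus))
print("lambda* ≈ %.2f, tau_tilde ≈ %.4f" % (lams[j], taus[j]))
```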

We obtain τ̃ = 0.0306, whereas Example 3 of Shen and Wang (2012) gives τ̃ = 0.005124. Comparing the two results, our conclusion clearly generalizes and improves the former one.

5. Conclusion

In this paper, we derived improved results on the robust global exponential stability of recurrent neural networks with time delays and additive noise by estimating upper bounds on the noise level and time delays. These improved upper bounds can be computed by solving transcendental equations containing adjustable parameters. The results herein provide a theoretical basis for the design and application of recurrent neural networks in the presence of time delays and random disturbances. Further investigations will aim at improving the upper bounds to allow larger stability margins against parameter uncertainty; compared with other approaches to stability analysis, such as Lyapunov theory or linear matrix inequality methods, we will also seek to relax the inequalities used in the proofs so as to reduce the conservatism of the results.

Acknowledgments

The authors would like to thank the Editor and the anonymous reviewers for their insightful comments and valuable suggestions, which have helped us in finalizing the paper. This work was supported by the Key Program of the National Natural Science Foundation of China (Grant No. 61134012), the National Natural Science Foundation of China (Grant Nos. 61203055 and 11271146), and the Fundamental Research Funds for the Central Universities (2012QNA48).

References

Arik, S. (2002). An analysis of global asymptotic stability of delayed cellular neural networks. IEEE Transactions on Neural Networks, 13, 1239–1242.
Arik, S. (2004). An analysis of exponential stability of delayed neural networks with time varying delays. Neural Networks, 17, 1027–1031.
Cao, J., Yuan, K., & Li, H. (2006). Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays. IEEE Transactions on Neural Networks, 17, 1646–1651.
Chen, T. (2001). Global exponential stability of delayed Hopfield neural networks. Neural Networks, 14, 977–980.
Chen, W., & Lu, X. (2008). Mean square exponential stability of uncertain stochastic delayed neural networks. Physics Letters A, 372, 1061–1069.
Chua, L. O., & Yang, L. (1988). Cellular neural networks: theory. IEEE Transactions on Circuits and Systems, 35, 1257–1272.
Faydasicok, O., & Arik, S. (2013). A new upper bound for the norm of interval matrices with application to robust stability analysis of delayed neural networks. Neural Networks, 44, 64–71.
Gopalsamy, K., & Leung, I. (1996). Delay induced periodicity in a neural netlet of excitation and inhibition. Physica D, 89, 395–426.
Haykin, S. (1994). Neural networks. New York: Prentice Hall.
He, Y., Wu, W., & She, J. (2006). An improved global asymptotic stability criterion for delayed cellular neural networks. IEEE Transactions on Neural Networks, 17, 250–252.
Hu, L., Gao, H., & Zheng, W. (2008). Novel stability of cellular neural networks with interval time varying delay. Neural Networks, 21, 1263–1458.
Huang, H., Ho, D. W. C., & Lam, J. (2005). Stochastic stability analysis of fuzzy Hopfield neural networks with time-varying delays. IEEE Transactions on Circuits and Systems II: Express Briefs, 52, 251–255.
Liao, X., Chen, G., & Sanchez, E. N. (2002). Delay dependent exponential stability of delayed neural networks: an LMI approach. Neural Networks, 15, 855–866.
Liao, X., & Wang, J. (2003). Algebraic criteria for global exponential stability of cellular neural networks with multiple time delays. IEEE Transactions on Circuits and Systems I: Regular Papers, 50, 268–275.
Liu, Y., Wang, Z., & Liu, X. (2006). Global exponential stability of generalized recurrent neural networks with discrete and distributed delays. Neural Networks, 19, 667–675.
Mao, X. (2007). Stochastic differential equations and applications (second ed.). Chichester: Harwood.
Pham, J., Pakdaman, K., & Virbert, J. (1998). Noise-induced coherent oscillations in randomly connected neural networks. Physical Review E, 58, 3610–3622.
Shen, Y., & Wang, J. (2007). Noise-induced stabilization of the recurrent neural networks with mixed time varying delays and Markovian-switching parameters. IEEE Transactions on Neural Networks, 18, 1857–1862.
Shen, Y., & Wang, J. (2008). An improved algebraic criterion for global exponential stability of recurrent neural networks with time-varying delays. IEEE Transactions on Neural Networks, 19, 528–531.
Shen, Y., & Wang, J. (2012). Robustness analysis of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances. IEEE Transactions on Neural Networks and Learning Systems, 23, 87–96.
Wang, Z., Liu, Y., Li, M., & Liu, X. (2006). Stability analysis for stochastic Cohen–Grossberg neural networks with mixed time delays. IEEE Transactions on Neural Networks, 17, 814–820.
Xu, S., Lam, J., & Ho, D. W. C. (2006). A new LMI condition for delay dependent asymptotic stability of delayed Hopfield neural networks. IEEE Transactions on Circuits and Systems II: Express Briefs, 53, 230–234.
Yuan, K., Cao, J., & Li, H. (2006). Robust stability of switched Cohen–Grossberg neural networks with mixed time-varying delays. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 36, 1356–1363.
Zeng, Z., & Wang, J. (2006a). Complete stability of cellular neural networks with time-varying delays. IEEE Transactions on Circuits and Systems I: Regular Papers, 53, 944–955.
Zeng, Z., & Wang, J. (2006b). Global exponential stability of recurrent neural networks with time-varying delays in the presence of strong external stimuli. Neural Networks, 19, 1528–1537.
Zeng, Z., Wang, J., & Liao, X. (2005). Global asymptotic stability and global exponential stability of neural networks with unbounded time-varying delays. IEEE Transactions on Circuits and Systems II: Express Briefs, 52, 168–173.
Zhang, H., & Wang, Y. (2008). Stability analysis of Markovian jumping stochastic Cohen–Grossberg neural networks with mixed time delays. IEEE Transactions on Neural Networks, 19, 366–370.
Zhang, H., Wang, Z., & Liu, D. (2009). Global asymptotic stability and robust stability of a class of Cohen–Grossberg neural networks with mixed delays. IEEE Transactions on Circuits and Systems I: Regular Papers, 56, 616–629.
Zhang, J., & Jin, X. (2000). Global stability analysis in delayed Hopfield neural networks models. Neural Networks, 13, 745–753.
Zheng, C., Shan, Q., Zhang, H., & Wang, Z. (2013). On stabilization of stochastic Cohen–Grossberg neural networks with mode-dependent mixed time-delays and Markovian switching. IEEE Transactions on Neural Networks and Learning Systems, 24(5), 800–811.
Zhu, S., & Shen, Y. (2013). Robustness analysis for connection weight matrices of global exponential stability of stochastic recurrent neural networks. Neural Networks, 38, 17–22.
Zhu, S., Shen, Y., & Chen, G. (2010a). Exponential passivity of neural networks with time-varying delay and uncertainty. Physics Letters A, 375, 136–142.
Zhu, S., Shen, Y., & Chen, G. (2010b). Noise suppress or express exponential growth for hybrid Hopfield neural networks. Physics Letters A, 374, 2035–2043.
