Cognitive Neurodynamics. 2008 Sep 24;2(4):363–370. doi: 10.1007/s11571-008-9058-9

New stability criteria for uncertain neural networks with interval time-varying delays

Haixia Wu, Wei Feng, Xinyuan Liang
PMCID: PMC2585622  PMID: 19003461

Abstract

This paper is concerned with stability analysis for neural networks with interval time-varying delays and parameter uncertainties. An approach combining the Lyapunov-Krasovskii functional with differential inequality and linear matrix inequality techniques is taken to investigate this problem. By constructing a new Lyapunov-Krasovskii functional and introducing some free weighting matrices, some less conservative delay-derivative-dependent and delay-derivative-independent stability criteria are established in terms of linear matrix inequalities. The new criteria are applicable to both fast and slow time-varying delays. Three numerical examples show that the proposed criteria are effective and improve on some existing results in the literature.

Keywords: Stability, Neural networks, Interval time-varying delays, Parameter uncertainties, Linear matrix inequality

Introduction

In the past decades, neural networks have received a great deal of interest due to their extensive applications in image processing, quadratic optimization, fixed-point computation, pattern recognition, associative memory and other areas (Borkar and Soumyanatha 1997; Chua and Yang 1988; Cichocki and Unbehauen 1993; Michel and Liu 2002; Chen and Fang 2000). These applications strongly depend on the dynamic behavior of the network. However, in neural processing and signal transmission, significant time delays may occur, and they are a source of instability and poor performance. Therefore, the stability of delayed neural networks has received considerable attention, and a large body of literature is available (Arik 2000, 2002; Cao et al. 2005, 2006; Li et al. 2006, 2007; Liao et al. 2002; Liao et al. 2005; Wang 2007).

Liao et al. (2002) derived some sufficient conditions for the asymptotic stability of neural networks with constant or time-varying delays, employing the Lyapunov-Krasovskii stability theory for functional differential equations and the linear matrix inequality (LMI) approach. Cao et al. (2006) considered the issue of global asymptotic stability for recurrent neural networks with mixed time-varying delays and derived several new sufficient conditions for checking it. Li et al. (2006) investigated the asymptotic and exponential stability of cellular neural networks with single and multiple delays, combining a Lyapunov-Krasovskii functional, a parameterized first-order model transformation and a linearization of the model under consideration. As a result, several novel delay-dependent and delay-independent asymptotic/exponential stability criteria for delayed cellular neural networks were obtained. In particular, the general delayed cellular neural networks were recast as a class of non-autonomous linear systems under an appropriate assumption on the activation functions.

On the other hand, the connection weights of the neurons depend on certain resistance and capacitance values that include uncertainties (modeling errors). When modeling neural networks, such parameter uncertainties (also called variations or fluctuations) should be taken into account. In recent years, stability analysis of neural networks in the presence of parameter uncertainties has begun to attract research attention (Park 2007; Singh 2004; Zhang et al. 2005).

Recently, a special type of time delay arising in practical engineering systems, the interval time-varying delay, has been identified and investigated (Qiu et al. 2007; He et al. 2007). An interval time-varying delay is a time delay that varies in an interval whose lower bound is not restricted to be 0. Qiu et al. (2007) investigated the robust stability of uncertain neural networks with interval time-varying delays: the delay is assumed to be time-varying and to belong to a given interval, which means that the lower and upper bounds of the delay are available. Based on the Lyapunov-Krasovskii functional approach, new delay-dependent stability criteria were presented in terms of linear matrix inequalities (LMIs). It is worth noting that the stability criteria in Qiu et al. (2007) leave much room for improvement: a significant source of conservativeness that could be further reduced lies in the calculation of the time-derivative of the Lyapunov-Krasovskii functional. To the best of our knowledge, very few papers investigate the stability problem of neural networks with interval time-varying delays, which remains open and challenging; it is therefore of great significance to consider it. He et al. (2007) studied the stability problem for neural networks with time-varying interval delay and obtained some less conservative stability criteria by considering the relationship between the time-varying delay and its lower and upper bounds when calculating the upper bound of the derivative of the Lyapunov functional.

In this paper, we deal with the robust stability problem for neural networks with interval time-varying delays and parameter uncertainties by choosing an appropriate Lyapunov functional. Some delay-derivative-dependent and delay-derivative-independent stability criteria are derived based on the new Lyapunov functional and on consideration of the range of the time delay. The resulting criteria are applicable to both fast and slow time-varying delays. Finally, three numerical examples are given to demonstrate the effectiveness and the merit of the proposed method.

Notations

The notations used throughout the paper are fairly standard. The superscript "T" stands for matrix transposition; R^n denotes the n-dimensional Euclidean space; the notation P > 0 means that P is real symmetric and positive definite; I and 0 represent the identity matrix and the zero matrix, respectively. In symmetric block matrices or long matrix expressions, we use an asterisk (*) to represent a term that is induced by symmetry. Matrices, if their dimensions are not explicitly stated, are assumed to be compatible for algebraic operations.

Problem formulation

Consider the following neural networks with interval time-varying delays and parameter uncertainties:

ẋ(t) = −(C + ΔC)x(t) + (A + ΔA)f(x(t)) + (B + ΔB)f(x(t − τ(t))) + J    (1)

where x(t) = [x1(t), …, xn(t)]^T ∈ R^n is the neural state vector, f(x(t)) = [f1(x1(t)), …, fn(xn(t))]^T denotes the bounded neuron activation function, and J is a constant input vector. C, A and B are the interconnection matrices representing the weight coefficients of the neurons.
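To make the model concrete, the dynamics can be simulated directly. The sketch below integrates a delayed network of the nominal form ẋ(t) = −Cx(t) + Af(x(t)) + Bf(x(t − τ)) + J with a simple Euler scheme and a history buffer; the matrices C, A, B, the tanh activation and the constant delay are illustrative assumptions, not the paper's example data.

```python
import numpy as np

# Hypothetical parameters (for illustration only; not the paper's example data).
C = np.diag([2.0, 2.0])                    # positive self-feedback matrix
A = np.array([[0.1, -0.2], [0.3, 0.1]])    # instantaneous connection weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])    # delayed connection weights
J = np.zeros(2)                            # constant input
tau, dt, T = 0.5, 0.001, 20.0              # constant delay, step size, horizon

d = int(tau / dt)                          # delay measured in steps
n_steps = int(T / dt)
x = np.zeros((n_steps + 1, 2))
x[0] = [1.0, -0.5]                         # initial condition (constant pre-history)

f = np.tanh                                # bounded activation with f(0) = 0
for k in range(n_steps):
    x_del = x[max(k - d, 0)]               # x(t - tau), constant pre-history
    dx = -C @ x[k] + A @ f(x[k]) + B @ f(x_del) + J
    x[k + 1] = x[k] + dt * dx

print(np.linalg.norm(x[-1]))               # near 0: trajectory settles at the origin
```

With these strongly damped parameters the trajectory converges to the equilibrium, illustrating the asymptotic stability that the criteria below certify without simulation.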

Assumption 1 The time-varying delay τ(t) satisfies

0 ≤ h1 ≤ τ(t) ≤ h2,    (2)
τ̇(t) ≤ μ,    (3)

where h1, h2 and μ are constants.

Remark 1 Obviously, when h1 = h2 and μ = 0, τ(t) denotes a constant delay; when h1 = 0, Assumption 1 reduces to 0 ≤ τ(t) ≤ h2, which is the case investigated in almost all the reported literature.

Assumption 2 The parameter uncertainties ΔC, ΔA and ΔB are of the form:

[ΔC  ΔA  ΔB] = H F(t) [E1  E2  E3]    (4)

in which H, E1, E2 and E3 are known constant matrices with appropriate dimensions. The uncertain matrix F(t) satisfies

F^T(t) F(t) ≤ I.    (5)

In addition, the activation function f(·) is bounded and satisfies

0 ≤ (f_j(ξ1) − f_j(ξ2)) / (ξ1 − ξ2) ≤ σ_j,  ξ1 ≠ ξ2,  j = 1, 2, …, n.    (6)
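As a quick sanity check (a sketch, assuming the usual sector-type condition), tanh is a representative bounded activation whose difference quotients lie in the sector [0, 1]:

```python
import numpy as np

# tanh as a representative bounded activation: its difference quotients
# satisfy 0 <= (f(a) - f(b)) / (a - b) <= 1 for all a != b.
rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = rng.normal(size=1000)
mask = np.abs(a - b) > 1e-6          # avoid ill-conditioned quotients
slope = (np.tanh(a[mask]) - np.tanh(b[mask])) / (a[mask] - b[mask])
print(slope.min(), slope.max())      # both within [0, 1] up to floating error
```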

In the following, we shift the equilibrium point x* of system (1) to the origin by the transformation z(t) = x(t) − x*, which puts system (1) into the following form:

ż(t) = −(C + ΔC)z(t) + (A + ΔA)g(z(t)) + (B + ΔB)g(z(t − τ(t)))    (7)

where z(t) is the state vector of the transformed system, with

g(z(·)) = f(z(·) + x*) − f(x*),

and g(0) = 0.

Note that the functions g_j(·) here satisfy condition (H), which is equivalent to

g_j(z_j)(g_j(z_j) − σ_j z_j) ≤ 0,  j = 1, 2, …, n.    (8)

Now, we give the following lemmas that are useful in deriving our LMI-based stability criteria.

Lemma 1 (Schur complement) Given constant symmetric matrices S1, S2 and S3, where S1 = S1^T and 0 < S2 = S2^T, then S1 + S3^T S2^{-1} S3 < 0 if and only if

[ S1    S3^T ]
[ S3    −S2  ]  < 0.    (9)
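A small numerical illustration of the Schur-complement equivalence (with random illustrative data, not from the paper), checking that the scalar-form condition and the block-matrix condition agree in both a feasible and an infeasible case:

```python
import numpy as np

def is_neg_def(M):
    """True iff the symmetric matrix M is negative definite."""
    return bool(np.all(np.linalg.eigvalsh(M) < 0))

# Random instance: S2 > 0, S3 arbitrary, and S1 chosen so that
# S1 + S3^T S2^{-1} S3 = -I < 0 (a feasible case).
rng = np.random.default_rng(1)
n = 4
G = rng.normal(size=(n, n))
S2 = G @ G.T + n * np.eye(n)                      # symmetric positive definite
S3 = rng.normal(size=(n, n))
S1 = -(S3.T @ np.linalg.solve(S2, S3)) - np.eye(n)

lhs = is_neg_def(S1 + S3.T @ np.linalg.solve(S2, S3))
rhs = is_neg_def(np.block([[S1, S3.T], [S3, -S2]]))

# An infeasible case: the condition fails on both sides simultaneously.
S1_bad = np.eye(n)
lhs_bad = is_neg_def(S1_bad + S3.T @ np.linalg.solve(S2, S3))
rhs_bad = is_neg_def(np.block([[S1_bad, S3.T], [S3, -S2]]))

print(lhs, rhs, lhs_bad, rhs_bad)  # True True False False
```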

Lemma 2 For any x, y ∈ R^n and a positive scalar ε > 0, the following inequality holds:

2 x^T y ≤ ε x^T x + ε^{-1} y^T y.    (10)
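The display for Lemma 2 was not preserved in the source; a common bounding inequality of this type is 2xᵀy ≤ εxᵀx + ε⁻¹yᵀy, which follows from ‖√ε·x − y/√ε‖² ≥ 0 and can be checked numerically:

```python
import numpy as np

# Sample random vectors and scalars and record the gap
# eps*x'x + (1/eps)*y'y - 2*x'y, which is always nonnegative.
rng = np.random.default_rng(2)
gaps = []
for _ in range(200):
    x, y = rng.normal(size=5), rng.normal(size=5)
    eps = rng.uniform(0.1, 10.0)
    gaps.append(eps * (x @ x) + (1.0 / eps) * (y @ y) - 2.0 * (x @ y))
print(min(gaps))  # nonnegative; equality holds only when y = eps * x
```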

Main results

In order to discuss the robust stability of system (1), which has the parameter uncertainties (4) and (5), we first consider the case in which the system matrices are fixed, i.e., ΔC = ΔA = ΔB = 0. For this case, the following theorem holds.

Theorem 1 For given scalars h1, h2 and μ, the neural network (7) is asymptotically stable if there exist positive definite matrices and free weighting matrices of appropriate dimensions such that the following LMI holds:

[LMI (11) and its block matrices: display not preserved in the source.]

Proof The Lyapunov functional of system (7) is defined by:

[The displayed terms of the Lyapunov-Krasovskii functional V(t): display not preserved in the source.]

where the weighting matrices in V(t) are to be determined. From the Leibniz–Newton formula, the following equations are true for any free weighting matrices with appropriate dimensions:

[Eqs. (12)–(14): display not preserved in the source.]

Note that, for any positive definite diagonal matrices, it follows from Eq. 8 that

[Eq. (15): display not preserved in the source.]

Calculating the derivative of V(t) along the solutions of system (7) gives

[Eqs. (16)–(19): display not preserved in the source.]

Combining Eqs. (15)–(19) and adding the left-hand sides of Eqs. (12)–(14) into the derivative of V(t) gives

[Eq. (20) and its matrices: display not preserved in the source.]

Since the corresponding matrices are positive definite, the last three terms in Eq. 20 are all negative. So, if

[Display not preserved in the source.]

which is equivalent to Eq. 11 by the Schur complement, then V̇(t) ≤ −ε‖z(t)‖² for a sufficiently small ε > 0, which ensures the asymptotic stability of system (7); see, e.g., Hale and Verduyn Lunel (1993). The proof is completed. □

Remark 2 It is worth noting that the results in Qiu et al. (2007) are only applicable to systems with fast time-varying delays. In fact, in many cases the derivative of the time-varying delay is known and may be small, so the results in Qiu et al. (2007) may be of limited use. In our Theorem 1, μ may take any value or be unknown. Therefore, Theorem 1 is applicable to both fast and slow time-varying delays.

In fact, Theorem 1 gives a criterion for system (1) with τ(t) satisfying (2) and (3). In many cases, information about the delay derivative is unavailable. For this circumstance, a rate-independent criterion for a delay satisfying only (2) is derived as follows, by setting the delay-derivative-dependent matrix in Theorem 1 to zero.

Corollary 1 For given scalars h1 and h2, the neural network (7) is asymptotically stable if there exist positive definite matrices and free weighting matrices of appropriate dimensions such that the following LMI holds:

[LMI (21) and its block matrices: display not preserved in the source.]

The following result provides a feasible robust stability criterion for systems with admissible uncertainties.

Theorem 2 For given scalars h1, h2 and μ, the neural network (7) is robustly asymptotically stable if there exist positive definite matrices, free weighting matrices of appropriate dimensions and three positive scalars such that the following LMI holds:

[LMI (22) and its block matrices: display not preserved in the source.]

Proof By Lemma 1, the system is robustly asymptotically stable if the following inequality holds:

[Eq. (23) and its matrix: display not preserved in the source.]

By Lemma 2, Eq. (23) holds if the following inequality is satisfied:

[Eq. (24) and its matrices: display not preserved in the source.]

Then, by Lemma 1, the inequality given in Eq. 24 is equivalent to the LMI in Eq. 22. Thus, if the LMI given in Eq. 22 holds, system (1) is robustly asymptotically stable. This completes the proof.

By setting the delay-derivative-dependent matrix to zero, Corollary 2 is established from Theorem 2. □

Corollary 2 For given scalars h1 and h2, the neural network (7) is robustly asymptotically stable if there exist positive definite matrices, free weighting matrices of appropriate dimensions and three positive scalars such that the following LMI holds:

[LMI (25) and its block matrices: display not preserved in the source.]

and other parameters are defined in Theorem 2.

Numerical examples

In this section, three examples are given to show the effectiveness and the reduced conservatism of our theoretical results.

Example 1 Consider the following neural networks with interval time-varying delays:

[Example 1 system matrices: display not preserved in the source.]

The calculation results obtained by Theorem 1 in this paper for different cases are listed in Table 1.

Table 1.

Allowable upper bound of the delay for five given parameter settings (the symbolic row and column labels were images in the source and are not preserved):

11.0183   11.0727   11.4128   11.7128   11.9128

Remark 3 From Example 1, we can see that our results hold for fast time-varying delays, and it is easy to verify that our theoretical result is also applicable to slow time-varying delays.

In the following, the example from Yang et al. (2006) and Qiu et al. (2007) is used to demonstrate that our result improves on some existing results in the literature.

Example 2 Consider the following neural networks with interval time-varying delays:

[Eq. (26), the system data of Example 2: display not preserved in the source.]

Using the Matlab LMI Control Toolbox, by our Corollary 1 we find that system (26) is asymptotically stable; a part of the solution of LMI (21) is given as follows:

[Solution matrices of LMI (21): display not preserved in the source.]
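The LMI feasibility computations above rely on semidefinite programming. As a minimal, numpy-only illustration of the underlying Lyapunov-certificate idea (not the authors' LMI, whose data are not preserved here), one can certify a delay-free linear system ẋ = Ax by solving the Lyapunov equation AᵀP + PA = −Q for P > 0:

```python
import numpy as np

A = np.array([[-2.0, 0.5], [-0.3, -1.5]])   # a Hurwitz test matrix (assumed data)
Q = np.eye(2)
n = A.shape[0]
# Column-major vec identity: vec(A^T P + P A) = (I (x) A^T + A^T (x) I) vec(P)
K = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(K, -Q.flatten(order="F")).reshape((n, n), order="F")
P = (P + P.T) / 2.0                          # symmetrize against round-off
residual = np.linalg.norm(A.T @ P + P @ A + Q)
eigs = np.linalg.eigvalsh(P)
print(residual, eigs)                        # residual ~ 0, eigenvalues positive
```

A positive definite P with AᵀP + PA < 0 is exactly the kind of certificate an LMI solver searches for; the criteria in this paper extend the idea to delayed and uncertain dynamics.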

Remark 4 It should be pointed out that, for the delay bounds considered here, the conditions in Qiu et al. (2007) are not feasible, but using Corollary 1 of this paper we can still verify that system (26) is asymptotically stable. Therefore, our method is to some degree less conservative than that in Qiu et al. (2007).

Remark 5 For another choice of the delay bounds, the neural network (26) is again found to be asymptotically stable; the corresponding solution of LMI (21) is omitted here. The same neural network was considered in Yang et al. (2006) and Qiu et al. (2007), and the maximal admissible time delays for stability obtained there are clearly smaller than the one obtained by our Corollary 1.

Further, another example is given to illustrate the effectiveness of Corollary 2.

Example 3 Consider the following neural networks with interval time-varying delays and parameter uncertainties:

[The system data of Example 3: display not preserved in the source.] Under different upper bounds of the time delay, Table 2 lists the maximum allowable delay bounds. It is seen from Table 2 that the results obtained by our method are less conservative than those obtained by Qiu et al. (2007).

Table 2.

Allowable upper bound of the delay for four given parameter settings (the symbolic column labels were images in the source and are not preserved):

Qiu et al. (2007)   2.082   2.182   2.582   2.882
Corollary 2         3.883   3.905   4.139   4.333

Therefore, we can say that the results in this paper are more effective and less conservative than those in Yang et al. (2006) and Qiu et al. (2007).

Conclusion

The stability problem for neural networks with interval time-varying delays and parameter uncertainties has been considered. Based on the Lyapunov-Krasovskii functional approach, some delay-dependent stability criteria are derived by introducing free weighting matrices, which reduce the conservatism of the obtained criteria. The resulting new stability criteria, given in terms of LMIs, are applicable to both fast and slow time-varying delays. Numerical examples are given to show the effectiveness of the method.

Acknowledgments

The work described in this paper was supported by grants from the National Natural Science Foundation of China (Nos. 60574024 and 60703035) and the Scientific Research Fund of the Chongqing Municipal Education Commission (No. kj081501).

References

1. Arik S (2000) Stability analysis of delayed neural networks. IEEE Trans Circuits Syst I 47(7):1089–1092. doi:10.1109/81.855465
2. Arik S (2002) An analysis of global asymptotic stability of delayed cellular neural networks. IEEE Trans Neural Netw 13(5):1239–1242. doi:10.1109/TNN.2002.1031957
3. Borkar V, Soumyanatha K (1997) An analog scheme for fixed point computation—Part I: Theory. IEEE Trans Circuits Syst I 44(4):351–355. doi:10.1109/81.563625
4. Cao J, Wang J (2005) Global exponential stability and periodicity of recurrent neural networks with time delays. IEEE Trans Circuits Syst I 52(5):920–931. doi:10.1109/TCSI.2005.846211
5. Cao J, Yuan K, Li H (2006) Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays. IEEE Trans Neural Netw 17(6):1646–1651. doi:10.1109/TNN.2006.881488
6. Chen Y, Fang S (2000) Neurocomputing with time delay analysis for solving convex quadratic programming problems. IEEE Trans Neural Netw 11(1):230–240. doi:10.1109/72.822526
7. Chua L, Yang L (1988) Cellular neural networks: applications. IEEE Trans Circuits Syst 35(10):1273–1290. doi:10.1109/31.7601
8. Cichocki A, Unbehauen R (1993) Neural networks for optimization and signal processing. Wiley, New York
9. Hale J, Verduyn Lunel S (1993) Introduction to functional differential equations. Springer, New York
10. He Y, Liu G, Rees D, Wu M (2007) Stability analysis for neural networks with time-varying interval delay. IEEE Trans Neural Netw 18(6):1850–1854. doi:10.1109/TNN.2007.903147
11. Li C, Chen J, Huang T (2007) A new criterion for global robust stability of interval neural networks with discrete time delays. Chaos Solitons Fractals 31(3):561–570. doi:10.1016/j.chaos.2005.10.031
12. Li C, Liao X, Wong K (2006) Delay-dependent and delay-independent stability criteria for cellular neural networks with delays. Int J Bifurcat Chaos 16(11):3323–3340. doi:10.1142/S0218127406016811
13. Liao X, Chen G, Sanchez E (2002) LMI-based approach for asymptotically stability analysis of delayed neural networks. IEEE Trans Circuits Syst I 49(7):1033–1039. doi:10.1109/TCSI.2002.800842
14. Liao T, Yan J, Cheng C, Hwang C (2005) Global exponential stability condition of a class of neural networks with time-varying delays. Phys Lett A 339(3–5):333–342. doi:10.1016/j.physleta.2005.03.034
15. Michel A, Liu D (2002) Qualitative analysis and synthesis of recurrent neural networks. Marcel Dekker, New York
16. Park J (2007) An analysis of global robust stability of uncertain cellular neural networks with discrete and distributed delays. Chaos Solitons Fractals 32(2):800–807. doi:10.1016/j.chaos.2005.11.106
17. Qiu J, Yang H, Zhang J, Gao Z (2007) New robust stability criteria for uncertain neural networks with interval time-varying delays. Chaos Solitons Fractals. doi:10.1016/j.chaos.2007.01.087
18. Singh V (2004) Robust stability of cellular neural networks with delay: linear matrix inequality approach. IEE Proc Contr Theory Appl 151(1):125–129. doi:10.1049/ip-cta:20040091
19. Wang L (2007) Interactions between neural networks: a mechanism for tuning chaos and oscillations. Cogn Neurodynamics 1(2):185–188. doi:10.1007/s11571-006-9004-7
20. Yang H, Chu T, Zhang C (2006) Exponential stability of neural networks with variable delays via LMI approach. Chaos Solitons Fractals 30(1):133–139. doi:10.1016/j.chaos.2005.08.134
21. Zhang H, Li C, Liao X (2005) A note on the robust stability of neural networks with time delay. Chaos Solitons Fractals 25(2):357–360. doi:10.1016/j.chaos.2004.11.017
