Abstract
This paper is concerned with the stability analysis of neural networks with interval time-varying delays and parameter uncertainties. An approach combining the Lyapunov-Krasovskii functional with differential inequality and linear matrix inequality techniques is taken to investigate this problem. By constructing a new Lyapunov-Krasovskii functional and introducing some free weighting matrices, some less conservative delay-derivative-dependent and delay-derivative-independent stability criteria are established in terms of linear matrix inequalities. The new criteria are applicable to both fast and slow time-varying delays. Three numerical examples show that the proposed criteria are effective and improve upon some existing results in the literature.
Keywords: Stability, Neural networks, Interval time-varying delays, Parameter uncertainties, Linear matrix inequality
Introduction
In the past decades, neural networks have received a great deal of interest due to their extensive applications in image processing, quadratic optimization, fixed-point computation, pattern recognition, associative memory, and other areas (Borkar and Soumyanatha 1997; Chua and Yang 1988; Cichocki and Unbehauen 1993; Michel and Liu 2002; Chen and Fang 2000). These applications depend strongly on the dynamic behavior of the network. However, in neural processing and signal transmission, significant time delays may occur and act as a source of instability and poor performance. Therefore, the stability of delayed neural networks has received considerable attention, and a large body of literature is available (Arik 2000, 2002; Cao et al. 2005, 2006; Li et al. 2006, 2007; Liao et al. 2002; Liao et al. 2005; Wang 2007).
Liao et al. (2002) derived some sufficient conditions for the asymptotic stability of neural networks with constant or time-varying delays, employing the Lyapunov-Krasovskii stability theory for functional differential equations and the linear matrix inequality (LMI) approach. Cao et al. (2006) considered the issue of global asymptotic stability for recurrent neural networks with mixed time-varying delays and derived several new sufficient conditions for checking it. Li et al. (2006) investigated the asymptotic and exponential stability of cellular neural networks with single and multiple delays. A method combining the Lyapunov-Krasovskii functional, a parameterized first-order model transformation, and the linearization of the model under consideration was adopted to study these issues. As a result, several novel delay-dependent and delay-independent asymptotic/exponential stability criteria for delayed cellular neural networks were obtained. In particular, general delayed cellular neural networks were transformed into a class of non-autonomous linear systems under an appropriate assumption on the activation functions.
On the other hand, the connection weights of the neurons depend on certain resistance and capacitance values that include uncertainties (modeling errors). When modeling neural networks, these parameter uncertainties (also called variations or fluctuations) should be taken into account. In recent years, stability analysis for neural networks in the presence of parameter uncertainties has attracted initial research attention (Park 2007; Singh 2004; Zhang et al. 2005).
Recently, a special type of time delay in practical engineering systems, i.e., interval time-varying delay, has been identified and investigated (Qiu et al. 2007; He et al. 2007). An interval time-varying delay is a time delay that varies within an interval whose lower bound is not restricted to be 0. Qiu et al. (2007) investigated the problem of robust stability of uncertain neural networks with interval time-varying delays. The delay is assumed to be time-varying and to belong to a given interval, which means that the lower and upper bounds of the interval time-varying delay are available. Based on the Lyapunov-Krasovskii functional approach, new delay-dependent stability criteria are presented in terms of linear matrix inequalities (LMIs). It is worth noting that the stability criteria in Qiu et al. (2007) leave much room for improvement: a significant source of conservativeness that could be further reduced lies in the calculation of the time-derivative of the Lyapunov-Krasovskii functional. To the best of our knowledge, very few papers investigate the stability problem of neural networks with interval time-varying delays, which remains open and challenging; it is therefore of great significance to consider the stability of such networks. He et al. (2007) studied the stability problem for neural networks with time-varying interval delay and obtained some less conservative stability criteria by considering the relationship between the time-varying delay and its lower and upper bounds when calculating the upper bound of the derivative of the Lyapunov functional.
In this paper, we deal with the robust stability problem for neural networks with interval time-varying delays and parameter uncertainties by choosing an appropriate Lyapunov functional. Some delay-derivative-dependent and delay-derivative-independent stability criteria are derived based on the new Lyapunov functional and consideration of the delay range. The resulting criteria are applicable to both fast and slow time-varying delays. Finally, three numerical examples are given to demonstrate the effectiveness and merit of the proposed method.
Notations
The notation used throughout the paper is fairly standard. The superscript "T" stands for matrix transposition; ℝⁿ denotes the n-dimensional Euclidean space; the notation P > 0 means that P is real symmetric and positive definite; I and 0 represent the identity matrix and the zero matrix, respectively. In symmetric block matrices or long matrix expressions, an asterisk (*) represents a term that is induced by symmetry. Matrices, if their dimensions are not explicitly stated, are assumed to be compatible for algebraic operations.
Problem formulation
Consider the following neural networks with interval time-varying delays and parameter uncertainties:
[Equation (1): image not recovered]
where
is the neural state vector,
denotes the bounded neuron activation function with 
is a constant input vector.
, 
are the interconnection matrices representing the weight coefficients of the neurons.
Assumption 1 The time-varying delay
satisfies
[Equations (2) and (3): images not recovered]
where
are constants.
Remark 1 Obviously, when
and
then
denotes a constant delay; in the case when
it implies that
which is investigated in almost all the reported literature.
Assumption 2 The parameter uncertainties
are of the form:
[Equation (4): image not recovered]
in which
are known constant matrices with appropriate dimensions. The uncertain matrix
satisfies
[Equation (5): image not recovered]
In addition, the activation function
is bounded and satisfies
[Equation (6): image not recovered]
In the following, we always shift the equilibrium point
to the origin by transformation
which puts system (1) into the following form:
[Equation (7): image not recovered]
where
is the state vector of the transformed system, with
[expression: image not recovered]
and 
Note that functions
here satisfy condition (H), which is equivalent to
[Equation (8): image not recovered]
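Condition (H) and its shifted form (8) are sector-type bounds on the activation functions. As a concrete illustration (not the paper's notation), tanh, a typical bounded activation, has all of its difference quotients inside the sector [0, 1]; the sketch below checks this numerically on a few sample pairs.

```python
import math

def sector_ratio(f, a, b):
    """Difference quotient (f(a) - f(b)) / (a - b), for a != b."""
    return (f(a) - f(b)) / (a - b)

# By the mean value theorem each difference quotient of tanh equals
# tanh'(c) for some c, and tanh' takes values in (0, 1], so every
# ratio lies in the sector [0, 1].
pairs = [(-2.0, 1.5), (0.3, -0.7), (4.0, 3.9), (-5.0, 5.0)]
ratios = [sector_ratio(math.tanh, a, b) for a, b in pairs]
print(all(0.0 <= r <= 1.0 for r in ratios))  # prints True
```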
Now, we give the following lemmas, which are useful in deriving our LMI-based stability criteria.
Lemma 1 [Schur complement] Given constant symmetric matrices
where
and
then
if and only if
[Equation (9): image not recovered]
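Lemma 1 is the standard Schur complement. In the simplest scalar-block case it says that the symmetric matrix [[a, b], [b, c]] with c > 0 is positive definite if and only if the Schur complement a − b²/c is positive. A minimal sketch verifying the equivalence on a few sample values (scalar illustration only, not the paper's block matrices):

```python
def posdef_2x2(a, b, c):
    """Sylvester's criterion for the symmetric matrix [[a, b], [b, c]]."""
    return a > 0 and a * c - b * b > 0

def schur_test(a, b, c):
    """Scalar Schur complement: with c > 0, positive definiteness of
    [[a, b], [b, c]] is equivalent to a - b^2 / c > 0."""
    return c > 0 and a - b * b / c > 0

# The two tests agree on each sample, as Lemma 1 asserts.
samples = [(2.0, 1.0, 1.0), (1.0, 2.0, 1.0), (5.0, -3.0, 2.0), (0.5, 0.4, 0.5)]
print([posdef_2x2(*s) == schur_test(*s) for s in samples])  # prints [True, True, True, True]
```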
Lemma 2 For any
and a positive scalar
, the following inequality:
[Equation (10): image not recovered]
holds.
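Lemma 2 is the standard completion-of-squares bound, whose scalar form is 2xy ≤ εx² + y²/ε for any ε > 0. A quick numerical check of this scalar form (illustration only; the lemma itself is stated for matrices):

```python
def bound_gap(x, y, eps):
    """Scalar gap eps*x^2 + y^2/eps - 2*x*y, which equals
    (sqrt(eps)*x - y/sqrt(eps))^2 and so is never negative."""
    return eps * x * x + y * y / eps - 2.0 * x * y

samples = [(1.0, 2.0, 0.5), (-3.0, 4.0, 2.0), (0.7, 0.7, 1.0), (5.0, -1.0, 0.1)]
print(all(bound_gap(x, y, e) >= 0.0 for x, y, e in samples))  # prints True
```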
Main results
In order to discuss robust stability of system (1), which has parametric uncertainties (4) and (5), first, we consider the case in which the matrices
and
are fixed, i.e.,
and
. For this case, the following theorem holds.
Theorem 1 For given scalars
and
, the neural network (7) is asymptotically stable, if there exist positive definite matrices
, 




such that the following LMI holds:
[Equation (11): image not recovered]
where
[matrix definitions: images not recovered]
and
[matrix definitions: images not recovered]
Proof The Lyapunov functional of system (7) is defined by:
[Lyapunov functional terms: images not recovered]
where
and 
are to be determined. From the Leibniz–Newton formula, the following equations are true for any matrices
and
with appropriate dimensions,
[Equations (12)–(14): images not recovered]
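The zero identities (12)–(14) used with the free weighting matrices are instances of the Leibniz–Newton formula. In generic notation (assumed here, since the original displays are not recoverable) the formula reads

```latex
x(t) - x\bigl(t - h(t)\bigr) - \int_{t-h(t)}^{t} \dot{x}(s)\,\mathrm{d}s = 0,
```

so multiplying it by any free weighting matrix and an augmented state vector, e.g. forming \(2\zeta^{\top}(t)\,N\,[\,x(t) - x(t-h(t)) - \int_{t-h(t)}^{t}\dot{x}(s)\,\mathrm{d}s\,] = 0\), produces a term that can be added to the derivative of the Lyapunov functional without changing its value, which is what gives the extra slack variables in the LMI.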
Note that, for any diagonal matrices
and
it follows from Eq. 8 that
[Equation (15): image not recovered]
Calculating the derivative of
along the solutions of system (7) yields:
[Equations (16)–(19): images not recovered]
Combining Eqs. (15–19) and adding the left side of Eqs. (12–14) into the derivative of 
[Equation (20): image not recovered]
where
[matrix definitions: images not recovered]
Since
then the last three terms in Eq. 20 are all negative. So, if
[matrix inequality: image not recovered]
which is equivalent to Eq. 11 by Schur complements, then
for a sufficiently small
and
which ensures the asymptotic stability of system (7), see e.g. Hale and Verduyn Lunel (1993). The proof is completed. □
Remark 2 It is worth noting that the results in Qiu et al. (2007) are only applicable to systems with fast time-varying delay. In fact, in many cases, the derivative of time-varying delays is known and may be small. Thus, the results in Qiu et al. (2007) may have limited use. In our Theorem 1,
can be any value or unknown. Therefore, Theorem 1 is applicable to both fast and slow time-varying delays.
In fact, Theorem 1 gives a criterion for system (1) with
satisfying (2) and (3). In many cases, information on the derivative of the delay is unknown. For this circumstance, a rate-independent criterion for a delay satisfying only (2) is derived as follows by choosing
in Theorem 1.
Corollary 1 For given scalars
, the neural network (7) is asymptotically stable, if there exist positive definite matrices





and
such that the following LMI holds:
[Equation (21): image not recovered]
where
[matrix definitions: images not recovered]
and
[matrix definitions: images not recovered]
The following result provides the feasible robust stability criterion for systems with the admissible uncertainty.
Theorem 2 For given scalars
and
, the neural network (7) is robustly asymptotically stable, if there exist positive definite matrices

and three positive scalars
such that the following LMI holds:
[Equation (22): image not recovered]
where
[matrix definitions: images not recovered]
and
[matrix definitions: images not recovered]
Proof By Lemma 1, the system is robustly asymptotically stable if the following inequality holds:
[Equation (23): image not recovered]
where
[matrix definition: image not recovered]
By Lemma 2, Eq. (23) holds if the following inequality is satisfied:
[Equation (24): image not recovered]
where 
[matrix definitions: images not recovered]
Then, by Lemma 1, the inequality given in Eq. 24 is equivalent to the LMI in Eq. 22. Thus, if the LMI given in Eq. 22 holds, system (1) is robustly asymptotically stable. This completes the proof.
By setting
, Corollary 2 is established from Theorem 2. □
Corollary 2 For given scalars
, the neural network (7) is robustly asymptotically stable, if there exist positive definite matrices





and three positive scalars
, such that the following LMI holds:
[Equation (25): image not recovered]
where
[matrix definitions: images not recovered]
and other parameters are defined in Theorem 2.
Numerical examples
In this section, three examples are given to show the effectiveness and reduced conservatism of our theoretical results.
Example 1 Consider the following neural networks with interval time-varying delays:
[system data: image not recovered]
The calculation results obtained by Theorem 1 in this paper for different cases are listed in Table 1.
Table 1.
Allowable upper bound of
with given 
| [column labels not recovered] | | | | | |
|---|---|---|---|---|---|
| [row label not recovered] | 11.0183 | 11.0727 | 11.4128 | 11.7128 | 11.9128 |
Remark 3 From Example 1, we can see that our results hold for fast time-varying delays. It is also easy to verify that our theoretical result is applicable to slow time-varying delays.
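To complement Example 1's qualitative conclusion, a delayed network of the class studied here, ẋ(t) = −Cx(t) + Af(x(t)) + Bf(x(t − h(t))) with f = tanh, can be simulated directly. Since the example's system data did not survive extraction, all matrices and bounds below are assumptions chosen purely for illustration, not Example 1's actual values.

```python
import math

# Forward-Euler simulation of x'(t) = -C x(t) + A f(x(t)) + B f(x(t - h(t)))
# with f = tanh and an interval time-varying delay h(t) in [h1, h2].
# All numerical data is illustrative (Example 1's matrices are not reproduced).
C = [[2.0, 0.0], [0.0, 2.0]]
A = [[0.3, -0.2], [0.1, 0.2]]
B = [[0.2, 0.1], [-0.1, 0.3]]
h1, h2 = 0.1, 0.5          # lower/upper delay bounds
dt, steps = 0.001, 20000   # step size and number of Euler steps (20 s)

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

# Constant initial history on [-h2, 0].
hist = [[1.0, -0.8]] * (int(h2 / dt) + 1)
for k in range(steps):
    t = k * dt
    h = h1 + (h2 - h1) * (1.0 + math.sin(t)) / 2.0   # delay stays in [h1, h2]
    x = hist[-1]                                     # current state x(t)
    xd = hist[-1 - int(h / dt)]                      # delayed state x(t - h(t))
    Cx = matvec(C, x)
    Af = matvec(A, [math.tanh(v) for v in x])
    Bf = matvec(B, [math.tanh(v) for v in xd])
    hist.append([x[i] + dt * (-Cx[i] + Af[i] + Bf[i]) for i in range(2)])

final_norm = max(abs(v) for v in hist[-1])
print(final_norm)  # close to 0: the trajectory converges to the equilibrium
```

With these assumed matrices the self-feedback C dominates the interconnection weights, so the simulated state decays to the origin, consistent with the kind of asymptotic stability the criteria certify.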
In the following, the example in Yang et al. (2006) and Qiu et al. (2007) is used to demonstrate that our result is better than some existing results in the literature.
Example 2 Consider the following neural networks with interval time-varying delays:
[Equation (26) and system data: images not recovered]
Using the Matlab LMI Control Toolbox and our Corollary 1, we find that system (26) is asymptotically stable; part of the solution of LMI (21) is given as follows:
[solution matrices: images not recovered]
Remark 4 It should be pointed out that, if we let
the conditions in Qiu et al. (2007) are not feasible, but using Corollary 1 in this paper, we can find that system (26) is still asymptotically stable. Therefore, our method is to some degree less conservative than that in Qiu et al. (2007).
Remark 5 If we let
we can also find that the neural network (26) is asymptotically stable. Here, the solution of the LMI (21) is omitted. The same neural network has been considered in Yang et al. (2006) and Qiu et al. (2007), where the maximal admissible time delay for stability was obtained to be
and
, respectively. By Corollary 1, the maximal admissible time delay for stability is
. It is clear that
.
Further, another example is given to illustrate the effectiveness of Corollary 2.
Example 3 Consider the following neural networks with interval time-varying delays and parameter uncertainties:

[system data: images not recovered]
For different values of the delay bounds, Table 2 lists the maximum allowable delay bounds obtained. It is seen from Table 2 that the results obtained from our method are less conservative than those obtained from Qiu et al. (2007).
Table 2.
Allowable upper bound of
with given 
| Method | [column labels not recovered] | | | |
|---|---|---|---|---|
| Qiu et al. (2007) | 2.082 | 2.182 | 2.582 | 2.882 |
| Corollary 2 | 3.883 | 3.905 | 4.139 | 4.333 |
Therefore, we can say that the results in this paper are more effective and less conservative than those in Yang et al. (2006) and Qiu et al. (2007).
Conclusion
The stability problem for neural networks with interval time-varying delays and parameter uncertainties has been considered. Based on the Lyapunov-Krasovskii functional approach, some delay-dependent stability criteria are derived by introducing free weighting matrices, which reduce the conservatism of the obtained criteria. As a result, the new stability criteria in terms of LMIs are applicable to both fast and slow time-varying delays. Numerical examples are given to show the effectiveness of the method.
Acknowledgments
The work described in this paper was supported by grants from the National Natural Science Foundation of China (Nos. 60574024, 60703035) and the Scientific Research Fund of the Chongqing Municipal Education Commission (No. kj081501).
References
- Arik S (2000) Stability analysis of delayed neural networks. IEEE Trans Circuits Syst I 47(7):1089–1092. doi:10.1109/81.855465 [DOI]
- Arik S (2002) An analysis of global asymptotic stability of delayed cellular neural networks. IEEE Trans Neural Netw 13(5):1239–1242. doi:10.1109/TNN.2002.1031957 [DOI] [PubMed]
- Borkar V, Soumyanatha K (1997) An analog scheme for fixed point computation—Part I: Theory. IEEE Trans Circuits Syst I 44(4):351–355. doi:10.1109/81.563625 [DOI]
- Cao J, Wang J (2005) Global exponential stability and periodicity of recurrent neural networks with time delays. IEEE Trans. Circuits Syst I 52(5):920–931. doi:10.1109/TCSI.2005.846211 [DOI]
- Cao J, Yuan K, Li H (2006) Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays. IEEE Trans Neural Netw 17(6):1646–1651. doi:10.1109/TNN.2006.881488 [DOI] [PubMed]
- Chen Y, Fang S (2000) Neurocomputing with time delay analysis for solving convex quadratic programming problems. IEEE Trans Neural Netw 11(1):230–240. doi:10.1109/72.822526 [DOI] [PubMed]
- Chua L, Yang L (1988) Cellular neural networks: applications. IEEE Trans Circ Syst 35(10):1273–1290. doi:10.1109/31.7601 [DOI]
- Cichocki A, Unbehauen R (1993) Neural networks for optimization and signal processing. Wiley, New York
- Hale J, Verduyn Lunel S (1993) Introduction to functional differential equations. Springer, New York
- He Y, Liu G, Rees D, Wu M (2007) Stability analysis for neural networks with time-varying interval delay. IEEE Trans Neural Netw 18(6):1850–1854. doi:10.1109/TNN.2007.903147 [DOI] [PubMed]
- Li C, Chen J, Huang T (2007) A new criterion for global robust stability of interval neural networks with discrete time delays. Chaos Solitons Fractals 31(3):561–570. doi:10.1016/j.chaos.2005.10.031 [DOI]
- Li C, Liao X, Wong K (2006) Delay-dependent and delay-independent stability criteria for cellular neural networks with delays. Int J Bifurcat Chaos 16(11):3323–3340. doi:10.1142/S0218127406016811 [DOI]
- Liao X, Chen G, Sanchez E (2002) LMI-based approach for asymptotically stability analysis of delayed neural networks. IEEE Trans Circuits Syst I 49(7):1033–1039. doi:10.1109/TCSI.2002.800842 [DOI]
- Liao T, Yan J, Cheng C, Hwang C (2005) Global exponential stability condition of a class of neural networks with time-varying delays. Phys Lett A 339(3–5):333–342. doi:10.1016/j.physleta.2005.03.034 [DOI]
- Michel A, Liu D (2002) Qualitative analysis and synthesis of recurrent neural networks. Marcel Dekker, New York
- Park J (2007) An analysis of global robust stability of uncertain cellular neural networks with discrete and distributed delays. Chaos Solitons Fractals 32(2):800–807. doi:10.1016/j.chaos.2005.11.106 [DOI]
- Qiu J, Yang H, Zhang J, Gao Z (2007) New robust stability criteria for uncertain neural networks with interval time-varying delays. Chaos Solitons Fractals. doi:10.1016/j.chaos.2007.01.087
- Singh V (2004) Robust stability of cellular neural networks with delay: linear matrix inequality approach. IEE Proc Contr Theory Appl 151(1):125–129. doi:10.1049/ip-cta:20040091 [DOI]
- Wang L (2007) Interactions between neural networks: a mechanism for tuning chaos and oscillations. Cogn Neurodynamics 1(2):185–188. doi:10.1007/s11571-006-9004-7 [DOI] [PMC free article] [PubMed]
- Yang H, Chu T, Zhang C (2006) Exponential stability of neural networks with variable delays via LMI approach. Chaos Solitons Fractals 30(1):133–139. doi:10.1016/j.chaos.2005.08.134 [DOI]
- Zhang H, Li C, Liao X (2005) A note on the robust stability of neural networks with time delay. Chaos Solitons Fractals 25(2):357–360. doi:10.1016/j.chaos.2004.11.017 [DOI]