Abstract
This paper addresses enhanced results on robust finite-time passivity for uncertain discrete-time Markovian jumping BAM delayed neural networks with leakage delay. By constructing a proper Lyapunov–Krasovskii functional candidate and applying the reciprocally convex combination method and the linear matrix inequality (LMI) technique, we derive several sufficient conditions for verifying the passivity of discrete-time BAM neural networks. Further, some sufficient conditions for finite-time boundedness and passivity in the presence of uncertainties are proposed by employing zero inequalities. Finally, the enlargement of the feasible region of the proposed criteria is shown via numerical examples with simulation to illustrate the applicability and usefulness of the proposed method.
Keywords: LMIs, Markovian jumping systems, Leakage delay, Bidirectional associative memory, Discrete time neural networks, Passivity and stability analysis
Introduction and problem statement with preliminaries
Recurrent neural networks (RNNs) have attracted growing research interest in recent years. This family includes various architectures such as bidirectional associative memory (BAM) neural networks, Hopfield neural networks, cellular neural networks, Cohen–Grossberg neural networks, and neural and social networks, which have received great attention due to their wide applications in classification, signal and image processing, parallel computing, associative memories, optimization, cryptography, and so on. The bidirectional associative memory (BAM) neural network model was initially coined by Kosko, see [1, 2]. This network is a special class of RNNs with the ability to store bipolar vector pairs. It is composed of neurons arranged in two layers, the X-layer and the Y-layer; the neurons in one layer are fully interconnected to the neurons in the other layer. BAM neural networks are designed in such a way that, for a given external input, they exhibit only one equilibrium point, which is globally asymptotically or exponentially stable. Hence, considerable efforts have been made in the study of the stability of neural networks, and, as a result, a large number of sufficient conditions have been proposed to guarantee the global asymptotic or exponential stability of the addressed networks.
Furthermore, the existence of time delays in a network can result in poor performance, instability, or chaos. Time delays can be classified into two types: discrete and distributed delays. Here, we take both types of delay into account while modeling our network system, because axons of significant length introduce both kinds of transmission delay. It is therefore worthwhile to inspect the dynamical behavior of neural systems with both time delays, see, for instance, [3–11].
In [12], Shu et al. considered BAM neural networks with discrete and distributed time delays and obtained some sufficient conditions ensuring global asymptotic stability. Time delays in the leakage term also have a great impact on the dynamic behavior of neural networks. However, so far there have been only a few works on neural networks with time delay in the leakage term, see, for instance, [13–17].
Further, the stability performance of the state variables under leakage time delays was discussed by Lakshmanan et al. in [18]. While modeling a real nervous system, stochastic noises and parameter uncertainties are inevitable and should be taken into account. In the real nervous system, synaptic transmission is a noisy process caused by random fluctuations in neurotransmitter release, and the connection weights of the neurons depend on variations in resistance and capacitance values. Therefore, it is of practical significance to investigate stochastic disturbances in the stability of time-delayed neural networks with parameter uncertainties, see [19–22] and the references cited therein. Moreover, impulsive effects are likely to exist in a wide variety of evolutionary processes, changing the states abruptly at certain moments of time [23–28].
Jumps between the parameter modes can be described by a finite-state Markov process. Recently, the researchers in [29, 30] investigated Markovian jumps in BAM neural networks and, exploiting the stochastic LKF approach, derived new sufficient conditions for global exponential stability in the mean square.
The BAM-type NNs with Markovian jumping parameters and leakage terms were described by Wang et al. in [31]. In [32], a robust stability problem was studied and some delay-dependent conditions were derived for neutral-type NNs with time-varying delays. The authors in [33–35] developed conditions for the stability analysis of neural networks using an integral inequality approach. Criteria for the stability of neural networks with time-varying delays were checked in [36–38]. It should be noted that all the results reported in the literature above are concerned only with Markovian jumping SNNs with Lipschitz-type neuron activation functions. Up to now, very little attention has been paid to the problem of the global exponential stability of Markovian jumping SBAMNNs with non-Lipschitz activation functions, which frequently appear in realistic neural networks. This situation motivates our present problem, i.e., α-inverse Hölder activation functions.
The main objective of this paper is to study the delay-dependent exponential stability problem for a class of Markovian jumping uncertain BAM neural networks with mixed time delays, leakage delays, and α-inverse Hölder activation functions under stochastic noise perturbation.
To the best of the authors' knowledge, so far no result on the global exponential stability of Markovian jumping stochastic impulsive uncertain BAM neural networks with leakage delays, mixed time delays, and α-inverse Hölder activation functions has been available in the existing literature, which motivates our derivation of the following BAM neural networks:
| 1 |
where and denote the states at time t; , , and , , denote the neuron activation functions, , are positive diagonal matrices; , , , are the neural self inhibitions; , are the connection weight matrices; , are the discretely delayed connection weight matrices; and , are the distributively delayed connection weight matrices; and are the external inputs; and are the discrete time-varying delays which are bounded with , , and , , respectively; and are constant delays. The leakage delays , are constants; and denote the stochastic disturbances and are n-dimensional Brownian motions defined on a complete probability space with a filtration satisfying the usual conditions (i.e., it is right-continuous and contains all -null sets) and , ; , , are some continuous functions. The impulsive time satisfies , (i.e., ) and .
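Although the display of system (1) is not reproduced above, models of this type in the cited literature usually take the following form. The sketch below is our hedged reconstruction, and all symbol names (the leakage matrix C, weight matrices W₁–W₃, delays δ₁, τ₁(t), σ₁, and impulsive maps J_k) are representative rather than the authors' exact notation:

$$
\begin{aligned}
dx(t) &= \Big[-C\,x(t-\delta_1) + W_1 f\big(y(t)\big) + W_2 g\big(y(t-\tau_1(t))\big) + W_3 \int_{t-\sigma_1}^{t} h\big(y(s)\big)\,ds + I\Big]\,dt \\
&\quad{}+ \rho\big(t, x(t), y(t-\tau_1(t))\big)\,d\omega(t), \quad t \neq t_k, \\
\Delta x(t_k) &= x(t_k) - x(t_k^-) = J_k\big(x(t_k^-)\big), \quad k \in \mathbb{Z}^{+},
\end{aligned}
$$

together with a symmetric equation for the second layer y(t), driven by x(t) with leakage delay δ₂, discrete delay τ₂(t), and distributed delay σ₂.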
The main contributions of this research work are highlighted as follows:
- ∗
Uncertain parameters, Markovian jumping, stochastic noises, and leakage delays are taken into account in the stability analysis of designing BAM neural networks with mixed time delays.
- ∗
By fabricating suitable LKF, the global exponential stability of addressed neural networks is checked via some less conserved stability conditions.
- ∗
For novelty, some uncertain parameters are initially handled in Lyapunov–Krasovskii functional which ensures the sufficient conditions for global exponential stability of designed neural networks.
- ∗
In our proposed BAM neural networks, by considering both time delay terms, the allowable upper bounds of the discrete time-varying delays are larger than those in some existing literature, see Table 1 of Example 4.1. This shows that the approach developed in this paper is novel and less conservative than some available results.
Table 1.
Maximum allowable upper bounds of discrete time delays
Suppose that the initial condition of the stochastic BAM neural networks (1) has the form for and for , where and are continuous functions, and . Throughout this section, we assume that the activation functions , , , , , ; , satisfy the following assumptions:
Assumption 1
are monotonic increasing continuous functions.
- For any , there exist respective scalars , and , correlated with , and , such that
Assumption 2
, and , are continuous and satisfy
, and , . Denote , and , respectively.
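The displayed condition in Assumption 2 is not reproduced; a plausible reading, consistent with Remark 1.2 below (where the positive scalars generate positive definite diagonal matrices E, Ẽ, K, K̃), is a sector-type bound of the form (the symbol names e_j, k_j are assumptions):

$$ 0 < e_j \le \frac{g_j(s_1) - g_j(s_2)}{s_1 - s_2} \le k_j \quad \text{for all } s_1 \neq s_2,\ j = 1, \dots, n. $$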
Remark 1.1
In [39], the function used in Assumption 1 is said to be an α-inverse Hölder activation function, which is a non-Lipschitz function. This activation function plays an important role in the stability issues of neural networks, and a great number of such functions arise in engineering mathematics, for example, and are 1-inverse Hölder functions, and is a 3-inverse Hölder function.
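As a worked illustration, recall that in [39] a continuous monotonically increasing function f is (roughly) α-inverse Hölder if around every point u there exist scalars ε_u, δ_u > 0 with |f(s) − f(u)| ≥ δ_u|s − u|^α whenever |s − u| ≤ ε_u; this paraphrase is our reading of the definition. Under it, f(s) = s³ is 3-inverse Hölder: at u = 0,

$$ |f(s) - f(0)| = |s|^{3} = \delta_0\,|s - 0|^{3} \quad \text{with } \delta_0 = 1, $$

while at any u ≠ 0 the local bound also holds with some δ_u > 0, because f′(u) = 3u² > 0 and |s − u|³ ≤ |s − u| for |s − u| ≤ 1.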
Remark 1.2
From Assumption 2, we can get that , and , are positive scalars. So E, Ẽ and K, K̃ are both positive definite diagonal matrices. The relations among the different activation functions , (which are α-inverse Hölder activation functions) , and , are implicitly established in Theorem 3.2. Such relations, however, have not been provided by any of the authors in the reported literature.
In order to guarantee the global exponential stability of system (1), we assume that the system tends to its equilibrium point and the stochastic noise contribution vanishes, i.e.,
Assumption 3
; .
For such deterministic BAM neural networks, we have the following system of equations:
| 2 |
Thus system (1) admits one equilibrium point under Assumption 3. In this regard, let and , then system (1) can be rewritten in the following form:
| 3 |
where
Apparently, , is also an α-inverse Hölder function, and , .
Let $\{r(t), t \ge 0\}$ be a right-continuous Markov chain in a complete probability space taking values in a finite state space $S = \{1, 2, \dots, N\}$ with generator $\Gamma = (\gamma_{ij})_{N \times N}$ given by

$$ P\{r(t+\Delta) = j \mid r(t) = i\} = \begin{cases} \gamma_{ij}\Delta + o(\Delta), & i \neq j, \\ 1 + \gamma_{ii}\Delta + o(\Delta), & i = j, \end{cases} $$

where $\Delta > 0$ and $\lim_{\Delta \to 0} o(\Delta)/\Delta = 0$. Here $\gamma_{ij} \ge 0$ is the transition probability rate from $i$ to $j$ if $i \neq j$, while $\gamma_{ii} = -\sum_{j \neq i} \gamma_{ij}$.
In this paper, we consider the following BAM neural networks with stochastic noise disturbance, leakage, mixed time delays, and Markovian jump parameters, which is actually a modification of system (3):
| 4 |
where , , , , , , , , , , , have the same meanings as those in (3), and are noise intensity function vectors, and for a fixed system mode, , , , , , , , , , and are known constant matrices with appropriate dimensions.
For convenience, each possible value of and is denoted by i and j, respectively, in the sequel. Then we have , , , , , , , , , , where , , , , , , , , , for any .
Assume that and are locally Lipschitz continuous and satisfy the following assumption.
Assumption 4
for all and , , ,where , , , , , and are known positive definite matrices with appropriate dimensions.
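The inequality displayed in Assumption 4 is not reproduced. A plausible reconstruction, following the quadratic trace bound that is standard for noise intensity functions in this literature (the matrix names Σ₁–Σ₆ and the argument grouping are representative assumptions, matching the six known matrices mentioned above), is:

$$ \operatorname{trace}\big[\sigma^{T}(t,u,v,w)\,\sigma(t,u,v,w)\big] \le u^{T}\Sigma_1 u + v^{T}\Sigma_2 v + w^{T}\Sigma_3 w, $$

with an analogous bound on the second layer's intensity σ̃ in terms of Σ₄, Σ₅, and Σ₆.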
Consider a general stochastic system , with the initial value , where and is the Markov chain. Let denote the family of all nonnegative functions V on that are twice continuously differentiable in x and once differentiable in t. For any , define by
where
By the generalized Itô formula, one can see that
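Since these displays are not reproduced, we note the standard forms they take for a system dx(t) = f(x(t), t, r(t)) dt + g(x(t), t, r(t)) dω(t) with generator Γ = (γ_ij); this reconstruction is our assumption:

$$ \mathcal{L}V(x,t,i) = V_t(x,t,i) + V_x(x,t,i)\,f(x,t,i) + \tfrac{1}{2}\operatorname{trace}\big[g^{T}(x,t,i)\,V_{xx}(x,t,i)\,g(x,t,i)\big] + \sum_{j=1}^{N}\gamma_{ij}\,V(x,t,j), $$

and the generalized Itô formula then gives

$$ \mathbb{E}\,V\big(x(t),t,r(t)\big) = \mathbb{E}\,V(x_0,0,r_0) + \mathbb{E}\int_0^t \mathcal{L}V\big(x(s),s,r(s)\big)\,ds. $$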
Let and denote the state trajectory from the initial data on in and on in . Clearly, system (4) admits a trivial solution and corresponding to the initial data and , respectively. For simplicity, we write and .
Definition 1.3
The equilibrium point of neural networks (4) is said to be globally exponentially stable in the mean square if, for any , , there exist positive constants η, , , and correlated with ξ and ξ̃ such that, when , the following inequality holds:
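The inequality of Definition 1.3 is not reproduced; the standard mean-square exponential bound it presumably expresses (the decay-rate symbol λ is assumed) is

$$ \mathbb{E}\Big(\|x(t;\xi)\|^{2} + \|y(t;\tilde\xi)\|^{2}\Big) \le \eta\, e^{-\lambda t} \quad \text{for all } t \ge T. $$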
Definition 1.4
For the stochastic Lyapunov–Krasovskii functional of system (4), the weak infinitesimal generator of the random process, acting from to , is defined by
Lemma 1.5
([39])
If is an α-inverse Holder function, then for any , one has
Lemma 1.6
([39])
If is an α-inverse Holder function and , then there exist constants and such that , . Moreover, , .
Lemma 1.7
([21])
For any real matrix , scalars a and b with , vector function such that the following integrals are well defined, we have
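The display of Lemma 1.7 is not reproduced. Lemmas of this shape cited from [21] are typically the Jensen-type integral inequality, which under the stated hypotheses (M = Mᵀ > 0, a < b, ω integrable) reads as follows; this identification is our assumption, and Lemma 1.10 below is presumably the analogous variant on an interval of length r:

$$ \Big(\int_a^b \omega(s)\,ds\Big)^{T} M \Big(\int_a^b \omega(s)\,ds\Big) \le (b-a)\int_a^b \omega^{T}(s)\,M\,\omega(s)\,ds. $$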
Lemma 1.8
([39])
Let , and G is a positive definite matrix, then
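The display of Lemma 1.8 is not reproduced; for x, y ∈ ℝⁿ and positive definite G, the standard completing-the-square bound it presumably states is

$$ 2x^{T}y \le x^{T}G\,x + y^{T}G^{-1}y. $$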
Lemma 1.9
([21])
Given constant symmetric matrices , , and with appropriate dimensions, where and , if and only if
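Lemma 1.9 is recognizably the Schur complement lemma; in a standard formulation (our reconstruction, with the block names Σ₁, Σ₂, Σ₃ assumed), for Σ₁ = Σ₁ᵀ and Σ₂ = Σ₂ᵀ > 0,

$$ \Sigma_1 + \Sigma_3^{T}\Sigma_2^{-1}\Sigma_3 < 0 \iff \begin{pmatrix} \Sigma_1 & \Sigma_3^{T} \\ \Sigma_3 & -\Sigma_2 \end{pmatrix} < 0. $$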
Lemma 1.10
([21])
For any constant matrix , , scalar , vector function , such that the integrations concerned are well defined, then
Lemma 1.11
([33])
For given matrices D, E, and F with and scalar , the following inequality holds:
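The display of Lemma 1.11 is not reproduced; the standard bound used to eliminate norm-bounded uncertainties (our reconstruction) states that, for FᵀF ≤ I and any scalar ε > 0,

$$ DFE + E^{T}F^{T}D^{T} \le \varepsilon\, DD^{T} + \varepsilon^{-1} E^{T}E. $$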
Remark 1.12
Lakshmanan et al. in [18] analyzed time-delayed BAM neural networks, establishing stability performance when a leakage delay occurs. In [12], the authors discussed asymptotic stability for BAM neural networks with mixed time delays and uncertain parameters; moreover, comparisons of the maximum allowable upper bounds of discrete time-varying delays were listed. Lou and Cui in [29] discussed exponential stability conditions for time-delayed BAM NNs with Markovian jump parameters. Further, stochastic effects on neural networks and stability criteria in the exponential sense were discussed by Huang and Li in [40] with the aid of Lyapunov–Krasovskii functionals. In all the above-mentioned references, the stability problem for BAM neural networks was considered only with leakage delays, or mixed time delays, or stochastic effects, or Markovian jump parameters, or parameter uncertainties; all these factors have not been taken into account together, and exponential stability under all these delays at once has not been investigated. Addressing all of these aspects together is the challenging advance of this research work.
Global exponential stability for deterministic systems
Theorem 2.1
Under Assumptions 1 and 2, the neural network system (4) is globally exponentially stable in the mean square if, for given (), there exist positive definite matrices S, T, S̃, T̃, , , , , , , , and , (), positive definite diagonal matrices P, Q, and positive scalars and () such that the following LMIs are satisfied:
| 5 |
| 6 |
| 7 |
| 8 |
| 9 |
| 10 |
where
Proof
Let us construct the following Lyapunov–Krasovskii functional candidate:
where
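The functional itself is not reproduced; for leakage-delay models of this kind, a typical candidate (a sketch under our assumptions; the exact grouping and weights are the matrices declared in the theorem statement) contains terms such as

$$ V_1 = \Big[x(t) - C_i\!\int_{t-\delta_1}^{t}\! x(s)\,ds\Big]^{T} P_i \Big[x(t) - C_i\!\int_{t-\delta_1}^{t}\! x(s)\,ds\Big], \qquad V_2 = \int_{t-\tau_1(t)}^{t} x^{T}(s)\,S\,x(s)\,ds, $$

plus symmetric terms in y(t) and double-integral terms that handle the distributed delays.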
By Assumption 4, (5) and (6), we obtain
It is easy to prove that system (4) is equivalent to the following form:
By utilizing Lemmas 1.6 and 1.10, from (4) and Definition 1.4, one has
| 11 |
| 12 |
| 13 |
| 14 |
| 15 |
| 16 |
| 17 |
| 18 |
By combining Eqs. (11)–(18), we can obtain that
| 19 |
where
and
Let and . From conditions (9) and (10), it is easy to see that and . This fact together with (19) gives
| 20 |
Then, for , by some simple calculations, one gets
Therefore , , which implies that , . Using mathematical induction, we have that, for all and ,
Since , it follows from Dynkin's formula that
Hence it follows from the definition of , the generalized Itô formula, and (20) that
| 21 |
By (21), we can get that and . Furthermore,
| 22 |
For and , by Lemma 1.6 there exist constants , , and , such that , , , and , , .
By (22), there exists a scalar , when , , , where and , , where . Hence when , one gets
| 23 |
where , and , . By (23), we get
where
| 24 |
Let
| 25 |
where
Let . It follows from (23), (24), and (25) that
Therefore
| 26 |
By Definition 1.3 and (26), we see that the equilibrium point of neural networks (4) is globally exponentially stable in the mean square sense. □
Remark 2.2
To the best of our knowledge, global exponential stability criteria for impulsive SBAMNNs with Markovian jump parameters, mixed time delays, leakage delays, and α-inverse Hölder activation functions have not been discussed in the existing literature. Hence this paper reports a new idea and some sufficient conditions for the global exponential stability of such neural networks, which generalize and improve the outcomes in [9, 11, 21, 37, 38].
Remark 2.3
The criteria given in Theorem 2.1 are dependent on the time delay. It is well known that the delay-dependent criteria are less conservative than the delay-independent criteria, particularly when the delay is small. Based on Theorem 2.1, the following result can be obtained easily.
Remark 2.4
If there are no stochastic disturbances in system (4), then the neural networks are simplified to
| 27 |
Global exponential stability of uncertain system
Now consider the following BAM neural networks with stochastic noise disturbance, Markovian jump parameters, leakage and mixed time delays, which are in the uncertainty case system:
| 28 |
Assumption 5
The perturbed uncertain matrices , , , , , , , and are time-varying functions satisfying: , , , , , , and , where M, , , , , , , , and are given constant matrices, respectively. () (where ) are unknown real time-varying matrices which have the following structure: , , , and , . We define the set as , where , invertible for and , , for and p, k̃, .
Also , , , , , , , , , , ΔS, ΔT, ΔS̃, ΔT̃, , , , , , and are positive definite diagonal matrices that are defined as follows: , and , where Ě, , , , , , , , , , , , , , , , , , , , and are positive diagonal matrices (i.e., , , where ()) and the remaining terms are defined in a similar way, which characterizes how the deterministic uncertain parameter in Σ enters the nominal matrices , , (), (), S, S̃, T, T̃, , , , , , and . The matrix Σ with real entries, which may be time-varying, is unknown and satisfies .
Remark 3.1
Overall, the stability of time-delayed neural networks depends fully on the Lyapunov–Krasovskii functional and LMI concepts. In particular, depending on the neural network, different types of LKF are chosen or handled to establish system stability. Up to now, no one has considered uncertain parameters in the Lyapunov–Krasovskii functional terms. This gap is filled for the first time in this work, and this kind of approach also gives more advanced and less conservative stability results.
Theorem 3.2
Under Assumptions 1, 2, and 5, the neural network system (28) is globally robustly exponentially stable in the mean square if, for given (), there exist positive definite matrices S, T, S̃, T̃, , , , , , , , and , (), positive definite diagonal matrices ΔS, ΔT, , ΔS̃, ΔT̃, , , , , , , , , , P, Q and positive scalars and () such that the following LMIs are satisfied:
| 29 |
| 30 |
| 31 |
| 32 |
| 33 |
| 34 |
where
where
The remaining values of , are the same as in Theorem 2.1, and ∗ means the symmetric terms.
Proof
The matrices , , , , , , S, S̃, T, T̃, , , , , , and in the Lyapunov–Krasovskii functional of Theorem 2.1 are replaced by , , , , , , , , , , , , , , , and , respectively.
Hence, by applying the same procedure as in Theorem 2.1, using Assumption 5 and Lemmas 1.8, 1.9, 1.10, and 1.11, and putting , we have from (28) and Definition 1.4 (weak infinitesimal operator ) that
where and are given in Theorem 2.1. The remaining proof of this theorem is similar to the procedure of Theorem 2.1, and we get that the uncertain neural network (28) is globally robustly exponentially stable in the mean square sense. □
Numerical examples
In this section, we provide two numerical examples with their simulations to demonstrate the effectiveness of our results.
Example 4.1
Consider the second-order stochastic impulsive BAM neural networks (4) with , ; , are second-order Brownian motions, and , denote right-continuous Markov chains taking values in with generator
The associated parameters of neural networks (4) take the values as follows:
Taking
The following activation functions are used in neural network system (4):
It is easy to obtain that, for any with , there exists a scalar such that
Therefore and , , are 1-inverse Hölder functions. In addition, for any , it is easy to check that
In a similar way, we get the same inequalities for and also (). That means the activation functions , , , , , () satisfy Assumptions 1 and 2.
Then, by Theorem 2.1, solving the LMIs using the Matlab LMI control toolbox, one can obtain the following feasible solutions:
Figure 1 shows the time response of the state variables , , , with and without stochastic noises, and Fig. 2 depicts the time response of the Markovian jumps , . By solving LMIs (5)–(10), we get feasible solutions. The discrete time delay upper bounds of and obtained for neural networks (4), given in Table 1, are the largest among those compared. This shows that the contributions of this research work are more effective and less conservative than some existing results. Therefore, by Theorem 2.1, we can conclude that neural networks (4) are globally exponentially stable in the mean square for the maximum allowable upper bounds .
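The paper solves the LMIs with the Matlab LMI control toolbox. As a hedged illustration of the same feasibility-checking workflow in open-source tooling, the sketch below tests a toy Lyapunov-type LMI (find P > 0 with AᵀP + PA < 0) with CVXPY; the matrix A and the tolerance are illustrative and not taken from the example above.

```python
import numpy as np
import cvxpy as cp

# Illustrative stable system matrix (not from the paper's example).
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
n = A.shape[0]

# Decision variable: symmetric matrix P.
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # small margin to enforce strict definiteness numerically

# LMI constraints: P > 0 and A^T P + P A < 0 (Lyapunov inequality).
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]

# Pure feasibility problem: any objective works, so minimize 0.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

print("status:", prob.status)  # 'optimal' indicates the LMIs are feasible
if prob.status == cp.OPTIMAL:
    print("feasible P =\n", P.value)
```

The LMIs (5)–(10) of Theorem 2.1 would be encoded analogously, with one semidefinite constraint per matrix inequality.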
Figure 1.
The state response , , , of (1) with stochastic disturbances and without stochastic disturbances
Figure 2.
The state responses and denote Markovian jump in system (4)
Example 4.2
Consider the second-order uncertain stochastic impulsive BAM neural networks (28) with , ; , are second-order Brownian motions, and , denote right-continuous Markov chains taking values in with generator
The associated parameters of neural networks (28) are as follows:
Taking
, , . The following activation functions are used in neural network system (28):
Therefore, by Theorem 3.2 of this paper, the uncertain delayed stochastic impulsive BAM neural networks (28) under consideration are globally robustly exponentially stable in the mean square.
Conclusions
In this paper, we have treated the problem of global exponential stability analysis for BAM neural networks with leakage delay terms. By employing Lyapunov stability theory and the LMI framework, we have obtained new sufficient conditions that guarantee the global exponential stability of stochastic impulsive uncertain BAMNNs with two kinds of time-varying delays and leakage delays. The advantage of this paper is that different types of uncertain parameters were introduced into the Lyapunov–Krasovskii functionals, and the exponential stability behavior was studied. Additionally, two numerical examples have been provided to reveal the usefulness of our deterministic and uncertain results. To the best of our knowledge, there are no results on the exponential stability analysis of inertial-type BAM neural networks with both time-varying delays using a Wirtinger-based inequality, which might be our future research work.
Acknowledgements
This work was jointly supported by the National Natural Science Foundation of China under Grant No. 61573096, the Jiangsu Provincial Key Laboratory of Networked Collective Intelligence under Grant No. BM2017002, the Rajiv Gandhi National Fellowship of the University Grants Commission, New Delhi, under Grant No. F1-17.1/2016-17/RGNF-2015-17-SC-TAM-21509, and the Thailand Research Fund under Grant No. RSA5980019.
Authors’ contributions
All authors contributed equally and significantly in writing this article. All authors read and approved the final manuscript.
Competing interests
The authors declare that they have no competing interests.
Footnotes
Notations
is the set of real numbers; is the n-dimensional Euclidean space; denotes the set of all real matrices; is the set of all positive integers. For any matrix A, is the transpose of A, is the inverse of A; ∗ means the symmetric terms in a symmetric matrix. A positive definite matrix A is denoted by , a negative definite A is denoted by , denotes the minimum eigenvalue of a real symmetric matrix; is the maximum eigenvalue of a real symmetric matrix; denotes the identity matrix; , are column vectors; , ; , denote the derivatives of and , respectively; is the family of all nonnegative functions on which are continuously twice differentiable in u and once differentiable in t; is a complete probability space, where is the sample space, is the σ-algebra of subsets of the sample space, is the probability measure on and denotes the filtration; denotes the family of all -measurable -valued random variables such that , where stands for the mathematical expectation operator with respect to the given probability measure .
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Kosko B. Neural Networks and Fuzzy Systems—A Dynamical System Approach to Machine Intelligence. Englewood Cliffs: Prentice Hall; 1992. [Google Scholar]
- 2.Kosko B. Adaptive bidirectional associative memories. Appl. Opt. 1987;26(23):4947–4960. doi: 10.1364/AO.26.004947. [DOI] [PubMed] [Google Scholar]
- 3.Feng Z., Zheng W. Improved stability condition for Takagi-Sugeno fuzzy systems with time-varying delay. IEEE Trans. Cybern. 2017;47(3):661–670. doi: 10.1109/TCYB.2016.2523544. [DOI] [PubMed] [Google Scholar]
- 4.Joya G., Atencia M.A., Sandoval F. Hopfield neural networks for optimization: study of the different dynamics. Neurocomputing. 2002;43:219–237. doi: 10.1016/S0925-2312(01)00337-X. [DOI] [Google Scholar]
- 5.Li R., Cao J., Alsaedi A., Alsaadi F. Exponential and fixed-time synchronization of Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms. Appl. Math. Comput. 2017;313:37–51. doi: 10.1016/j.cam.2016.10.002. [DOI] [Google Scholar]
- 6.Li R., Cao J., Alsaedi A., Alsaadi F. Stability analysis of fractional-order delayed neural networks. Nonlinear Anal., Model. Control. 2017;22(4):505–520. doi: 10.15388/NA.2017.4.6. [DOI] [Google Scholar]
- 7.Nie X., Cao J. Stability analysis for the generalized Cohen–Grossberg neural networks with inverse Lipschitz neuron activations. Comput. Math. Appl. 2009;57:1522–1538. doi: 10.1016/j.camwa.2009.01.003. [DOI] [Google Scholar]
- 8.Tu Z., Cao J., Alsaedi A., Alsaadi F.E., Hayat T. Global Lagrange stability of complex-valued neural networks of neutral type with time-varying delays. Complexity. 2016;21:438–450. doi: 10.1002/cplx.21823. [DOI] [Google Scholar]
- 9.Zhang H., Wang Z., Lin D. Global asymptotic stability and robust stability of a class of Cohen-Grossberg neural networks with mixed delays. IEEE Trans. Circuits Syst. I. 2009;56:616–629. doi: 10.1109/TCSI.2008.2002556. [DOI] [Google Scholar]
- 10.Zhu Q., Cao J. Robust exponential stability of Markovian jump impulsive stochastic Cohen–Grossberg neural networks with mixed time delays. IEEE Trans. Neural Netw. 2010;21:1314–1325. doi: 10.1109/TNN.2010.2054108. [DOI] [PubMed] [Google Scholar]
- 11.Zhang X.M., Han Q.L., Seuret A., Gouaisbaut F. An improved reciprocally convex inequality and an augmented Lyapunov–Krasovskii functional for stability of linear systems with time-varying delay. Automatica. 2017;84:221–226. doi: 10.1016/j.automatica.2017.04.048. [DOI] [Google Scholar]
- 12.Shu H., Wang Z., Lu Z. Global asymptotic stability of uncertain stochastic bi-directional associative memory networks with discrete and distributed delays. Math. Comput. Simul. 2009;80:490–505. doi: 10.1016/j.matcom.2008.07.007. [DOI] [Google Scholar]
- 13.Balasundaram K., Raja R., Zhu Q., Chandrasekaran S., Zhou H. New global asymptotic stability of discrete-time recurrent neural networks with multiple time-varying delays in the leakage term and impulsive effects. Neurocomputing. 2016;214:420–429. doi: 10.1016/j.neucom.2016.06.040. [DOI] [Google Scholar]
- 14.Li R., Cao J. Stability analysis of reaction-diffusion uncertain memristive neural networks with time-varying delays and leakage term. Appl. Math. Comput. 2016;278:54–69. [Google Scholar]
- 15.Senthilraj S., Raja R., Zhu Q., Samidurai R., Yao Z. Exponential passivity analysis of stochastic neural networks with leakage, distributed delays and Markovian jumping parameters. Neurocomputing. 2016;175:401–410. doi: 10.1016/j.neucom.2015.10.072. [DOI] [Google Scholar]
- 16.Senthilraj S., Raja R., Zhu Q., Samidurai R., Yao Z. Delay-interval-dependent passivity analysis of stochastic neural networks with Markovian jumping parameters and time delay in the leakage term. Nonlinear Anal. Hybrid Syst. 2016;22:262–275. doi: 10.1016/j.nahs.2016.05.002. [DOI] [Google Scholar]
- 17.Li X., Fu X. Effect of leakage time-varying delay on stability of nonlinear differential systems. J. Franklin Inst. 2013;350:1335–1344. doi: 10.1016/j.jfranklin.2012.04.007. [DOI] [Google Scholar]
- 18.Lakshmanan S., Park J.H., Lee T.H., Jung H.Y., Rakkiyappan R. Stability criteria for BAM neural networks with leakage delays and probabilistic time-varying delays. Appl. Math. Comput. 2013;219:9408–9423. [Google Scholar]
- 19.Liao X., Mao X. Exponential stability and instability of stochastic neural networks. Stoch. Anal. Appl. 1996;14:165–185. doi: 10.1080/07362999608809432. [DOI] [Google Scholar]
- 20.Su W., Chen Y. Global robust exponential stability analysis for stochastic interval neural networks with time-varying delays. Commun. Nonlinear Sci. Numer. Simul. 2009;14:2293–2300. doi: 10.1016/j.cnsns.2008.05.001. [DOI] [Google Scholar]
- 21.Zhang H., Wang Y. Stability analysis of Markovian jumping stochastic Cohen-Grossberg neural networks with mixed time delays. IEEE Trans. Neural Netw. 2008;19:366–370. doi: 10.1109/TNN.2007.910738. [DOI] [PubMed] [Google Scholar]
- 22.Zhu Q., Cao J. Exponential stability of stochastic neural networks with both Markovian jump parameters and mixed time delays. IEEE Trans. Syst. Man Cybern. 2011;41:341–353. doi: 10.1109/TSMCB.2010.2053354. [DOI] [PubMed] [Google Scholar]
- 23.Bao H., Cao J. Stochastic global exponential stability for neutral-type impulsive neural networks with mixed time-delays and Markovian jumping parameters. Commun. Nonlinear Sci. Numer. Simul. 2011;16:3786–3791. doi: 10.1016/j.cnsns.2010.12.027. [DOI] [Google Scholar]
- 24.Li X., Song S. Stabilization of delay systems: delay-dependent impulsive control. IEEE Trans. Autom. Control. 2017;62(1):406–411. doi: 10.1109/TAC.2016.2530041. [DOI] [Google Scholar]
- 25.Li X., Wu J. Stability of nonlinear differential systems with state-dependent delayed impulses. Automatica. 2016;64:63–69. doi: 10.1016/j.automatica.2015.10.002. [DOI] [PubMed] [Google Scholar]
- 26.Li X., Bohner M., Wang C. Impulsive differential equations: periodic solutions and applications. Automatica. 2015;52:173–178. doi: 10.1016/j.automatica.2014.11.009. [DOI] [Google Scholar]
- 27.Stamova I., Stamov T., Li X. Global exponential stability of a class of impulsive cellular neural networks with supremums. Int. J. Adapt. Control Signal Process. 2014;28:1227–1239. doi: 10.1002/acs.2440. [DOI] [Google Scholar]
- 28.Pan L., Cao J. Exponential stability of stochastic functional differential equations with Markovian switching and delayed impulses via Razumikhin method. Adv. Differ. Equ. 2012;2012:61. doi: 10.1186/1687-1847-2012-61. [DOI] [Google Scholar]
- 29.Lou X., Cui B. Stochastic exponential stability for Markovian jumping BAM neural networks with time-varying delays. IEEE Trans. Syst. Man Cybern. 2007;37:713–719. doi: 10.1109/TSMCB.2006.887426. [DOI] [PubMed] [Google Scholar]
- 30.Wang Z., Liu Y., Liu X. State estimation for jumping recurrent neural networks with discrete and distributed delays. Neural Netw. 2009;22:41–48. doi: 10.1016/j.neunet.2008.09.015. [DOI] [PubMed] [Google Scholar]
- 31.Wang Q., Chen B., Zhong S. Stability criteria for uncertainty Markovian jumping parameters of BAM neural networks with leakage and discrete delays. Int. J. Math. Comput. Phys. Electr. Comput. Eng. 2014;8(2):391–398. [Google Scholar]
- 32.Balasubramaniam P., Krishnasamy R., Rakkiyappan R. Delay-interval-dependent robust stability results for uncertain stochastic systems with Markovian jumping parameters. Nonlinear Anal. Hybrid Syst. 2011;5:681–691. doi: 10.1016/j.nahs.2011.06.001. [DOI] [Google Scholar]
- 33.Gu K. An integral inequality in the stability problem of time-delay systems; Proceedings of the 39th IEEE Conference on Decision and Control; 2000. [Google Scholar]
- 34.Guo S., Huang L., Dai B., Zhang Z. Global existence of periodic solutions of BAM neural networks with variable coefficients. Phys. Lett. A. 2003;317:97–106. doi: 10.1016/j.physleta.2003.08.019. [DOI] [Google Scholar]
- 35.Haykin S. Neural Networks. New York: Prentice Hall; 1994. [Google Scholar]
- 36.Shi Y., Cao J., Chen G. Exponential stability of complex-valued memristor-based neural networks with time-varying delays. Appl. Math. Comput. 2017;313:222–234. [Google Scholar]
- 37.Wu H. Global exponential stability of Hopfield neural networks with delays and inverse Lipschitz neuron activations. Nonlinear Anal., Real World Appl. 2009;10:2297–2306. doi: 10.1016/j.nonrwa.2008.04.016. [DOI] [Google Scholar]
- 38.Yang X., Cao J. Synchronization of Markovian coupled neural networks with nonidentical node-delays and random coupling strengths. IEEE Trans. Neural Netw. 2012;23:60–71. doi: 10.1109/TNNLS.2011.2177671. [DOI] [PubMed] [Google Scholar]
- 39.Li Y., Wu H. Global stability analysis in Cohen–Grossberg neural networks with delays and inverse Holder neuron activation functions. Inf. Sci. 2010;180:4022–4030. doi: 10.1016/j.ins.2010.06.033. [DOI] [Google Scholar]
- 40.Huang T., Li C. Robust exponential stability of uncertain delayed neural networks with stochastic perturbation and impulse effects. IEEE Trans. Neural Netw. Learn. Syst. 2012;23:867–875. doi: 10.1109/TNNLS.2011.2178037. [DOI] [PubMed] [Google Scholar]
- 41.Balasubramaniam P., Vembarasan V. Robust stability of uncertain fuzzy BAM neural networks of neutral-type with Markovian jumping parameters and impulses. Comput. Math. Appl. 2011;62:1838–1861. doi: 10.1016/j.camwa.2011.06.027. [DOI] [Google Scholar]
- 42.Rakkiyappan R., Chandrasekar A., Lakshmana S., Park J.H. Exponential stability for Markovian jumping stochastic BAM neural networks with mode-dependent probabilistic time-varying delays and impulse control. Complexity. 2015;20(3):39–65. doi: 10.1002/cplx.21503. [DOI] [Google Scholar]
- 43.Park J.H., Park C.H., Kwon O.M., Lee S.M. A new stability criterion for bidirectional associative memory neural networks. Appl. Math. Comput. 2008;199:716–722. [Google Scholar]
- 44.Mohamad S. Lyapunov exponents of convergent Cohen–Grossberg-type BAM networks with delays and large impulses. Appl. Math. Sci. 2008;2(34):1679–1704. [Google Scholar]
- 45.Park J.H. A novel criterion for global asymptotic stability of BAM neural networks with time-delays. Chaos Solitons Fractals. 2006;29:446–453. doi: 10.1016/j.chaos.2005.08.018. [DOI] [Google Scholar]