PLOS One. 2026 Mar 9;21(3):e0343312. doi: 10.1371/journal.pone.0343312

New conditions for stability of multiple delayed Cohen-Grossberg Neural Networks of neutral-type

Neyir Ozcan 1,*
Editor: Benjamin Z Webb2
PMCID: PMC12970979  PMID: 41801997

Abstract

In this research article, we aim to examine the stability properties of a certain type of Cohen-Grossberg neural network involving multiple delay parameters. These delay parameters complicate the dynamical behaviour of the system, increasing the risk of oscillations and chaotic behaviour, which adversely affect system stability. However, under specific constraints on the system parameters, the stability of the system can be ensured. In this study, we develop new sufficient conditions that guarantee global asymptotic stability for neutral-type Cohen-Grossberg artificial neural networks with multiple delays. These conditions, which can serve as an alternative to results in the literature, are derived by utilizing suitable Lyapunov functionals and the Lyapunov stability theorem. The proposed stability conditions are formulated as algebraic inequalities; they can therefore easily be verified by using simple mathematical methods and software tools. By carrying out a detailed analysis of an instructive numerical example, the results obtained in this article are also shown to establish alternative stability criteria to the corresponding stability conditions given in the past literature.

1. Introduction

In recent years, the stability analysis of many different types of neural systems has received particular attention, because these neural networks have proved able to solve important practical engineering problems in areas such as signal and image processing, control theory, intelligent computational systems, fault diagnosis, pattern recognition, optimization, and associative memory applications (e.g., see [1–5]). When using these neural networks in engineering applications, one needs to guarantee the convergence dynamics of the designed neural network models. For this reason, it is crucially important to establish suitable network parameters that ensure the desired equilibrium and stability properties of the implemented neural network. In the recent literature, some neural network classes have been successfully applied to real-time engineering problems. In such real-time applications, the convergence dynamics and stability properties may vary due to delay parameters resulting from the finite switching rates of electronic circuit components as well as the signal transmission times between neurons. For instance, introducing delay parameters into the mathematical model of a stable non-delayed neural network may render the network unstable. For such reasons, stability analysis is a critical issue for dynamical neural systems whose mathematical models involve time delay parameters. Recently, a very large number of papers have carried out stability investigations of neural networks that incorporate time delays into the state variables of the neurons (see, e.g., [6–14]).
It should be remarked at this point that introducing time delay parameters only into the state variables of the network neurons may not provide an appropriate modeling approach for dynamical neural networks, as the time derivatives of the neuron state variables can additionally possess delays of a different type. This class of neural networks may exhibit unexpected and complicated dynamical behaviours, which is why such delays must also be introduced into the mathematical models used for these neural systems. The literature defines delays in the state variables as time delays, whereas delays in the derivatives of the state variables are defined as neutral delays. Neural models of this neutral type have been successfully applied to specific problems including population ecology, distributed systems with lossless transmission lines, and propagation-diffusion models [15–17]. Hence, the stability analysis of neutral-type neural networks is of crucial importance.

In this research paper, our primary aim is to analyze the convergence dynamics and stability characteristics of Cohen-Grossberg neural systems involving multiple delay parameters. The most commonly used mathematical model for these neural systems is given by the following formulation:

\dot{x}_i(t) = d_i(x_i(t))\Big(-c_i(x_i(t)) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau_{ij})) + u_i\Big) + \sum_{j=1}^{n} e_{ij} \dot{x}_j(t-\zeta_{ij}), \quad i = 1, 2, \ldots, n \quad (1)

In this mathematical model, n denotes the number of neurons, x_i(t) is the state variable of neuron i, and c_i(x_i(t)) and d_i(x_i(t)) denote the behaved functions and the amplification functions of the neurons, respectively. a_{ij} and b_{ij} are the constant neuronal interconnection parameters, \tau_{ij} and \zeta_{ij} denote the constant time delays and neutral delays, respectively, and e_{ij} are fixed coefficients. f_i(x_i(t)) and u_i denote the nonlinear activation function and the constant external input of the i-th neuron, respectively. For the neutral-type neural network (1), let \tau = \max_{i,j}(\tau_{ij}), \zeta = \max_{i,j}(\zeta_{ij}) and \mu = \max(\tau, \zeta). The neural system (1) then possesses the initial conditions x_i(t) = \Theta_i(t) and \dot{x}_i(t) = \Delta_i(t) for t \in [-\mu, 0], where \Theta_i and \Delta_i belong to C([-\mu, 0], \mathbb{R}), the set of continuous functions from [-\mu, 0] to \mathbb{R}.

The neural network given in the form described by (1) has n2 time delays and n2 neutral delays. Because of these delay parameters, neural system (1) is said to have multiple delays.

Various subclasses of the neural system (1) can be stated, where the neural system has fewer delay parameters. A neural network that involves n time delays and n neutral delays is said to exhibit discrete delay terms and is modeled by the following mathematical expression:

\dot{x}_i(t) = d_i(x_i(t))\Big(-c_i(x_i(t)) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau_j)) + u_i\Big) + \sum_{j=1}^{n} e_{ij} \dot{x}_j(t-\zeta_j), \quad i = 1, 2, \ldots, n \quad (2)

The following vector and matrix representation can be used to express the delayed neural system defined by (2):

\dot{x}(t) = D(x(t))\big(-C(x(t)) + A f(x(t)) + B f(x(t-\tau)) + u\big) + E\dot{x}(t-\zeta) \quad (3)

in which the \tau_j are discrete time delays, the \zeta_j are discrete neutral delays, A = (a_{ij}), B = (b_{ij}) and E = (e_{ij}) are the system matrices, u = (u_1, u_2, \ldots, u_n)^T is the input vector, x(t) = (x_1(t), x_2(t), \ldots, x_n(t))^T is the state vector of neural system (2), x(t-\tau) = (x_1(t-\tau_1), x_2(t-\tau_2), \ldots, x_n(t-\tau_n))^T is the time-delayed state vector, \dot{x}(t-\zeta) = (\dot{x}_1(t-\zeta_1), \dot{x}_2(t-\zeta_2), \ldots, \dot{x}_n(t-\zeta_n))^T, D(x(t)) = \mathrm{diag}(d_i(x_i(t)) > 0), and C(x(t)) = (c_1(x_1(t)), c_2(x_2(t)), \ldots, c_n(x_n(t)))^T.

If a neural system possesses only one time delay parameter and only one neutral delay term, then this neural network is said to have two delays. Then, a Cohen-Grossberg neural network involving such delay terms will have the mathematical model given by:

\dot{x}_i(t) = d_i(x_i(t))\Big(-c_i(x_i(t)) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau)) + u_i\Big) + \sum_{j=1}^{n} e_{ij} \dot{x}_j(t-\zeta), \quad i = 1, 2, \ldots, n \quad (4)

In formulation (4), the parameter \tau represents a fixed time delay, while \zeta denotes a fixed neutral delay. The neutral neural system defined by equation (4) can be written as

\dot{x}(t) = D(x(t))\big(-C(x(t)) + A f(x(t)) + B f(x(t-\tau)) + u\big) + E\dot{x}(t-\zeta) \quad (5)

with the time-delayed state vector now of the form x(t-\tau) = (x_1(t-\tau), x_2(t-\tau), \ldots, x_n(t-\tau))^T and the neutral-delayed state vector of the form \dot{x}(t-\zeta) = (\dot{x}_1(t-\zeta), \dot{x}_2(t-\zeta), \ldots, \dot{x}_n(t-\zeta))^T.

The network functions d_i(x_i(t)) and c_i(x_i(t)) and the activation functions f_i(x_i(t)) are nonlinear. Some basic properties of these nonlinear functions are generally stated by the following conditions:

H1: The amplification functions d_i(x) satisfy

0 < v_i \le d_i(x) \le \phi_i, \quad \forall x \in \mathbb{R}, \ \forall i

where v_i and \phi_i are positive real constants.

H2: The behaved functions c_i(x) satisfy

0 < \gamma_i (x-y)^2 \le |c_i(x) - c_i(y)||x-y| \le \psi_i (x-y)^2, \quad \forall x, y \in \mathbb{R}, \ x \ne y, \ \forall i

where \gamma_i and \psi_i are positive real constants.

H3: The activation functions f_i(x) satisfy

|f_i(x) - f_i(y)| \le \ell_i |x-y|, \quad \forall x, y \in \mathbb{R}, \ \forall i

where the \ell_i are positive real (Lipschitz) constants.

Since neural network models (2) and (4) can be directly put into suitable vector-matrix forms, their stability analysis is relatively easier than that of neural system (1). In the past literature, various techniques, methods and approaches have been employed to derive useful sufficient criteria for the stability of neural network models (2) and (4). In [18] and [19], integral inequality techniques; in [20], general delay partitioning techniques and the Jensen inequality; in [21], the semimartingale convergence technique; in [22], the delay partitioning method; in [23], functionals with triple or quadruple integral terms; in [24], Wirtinger-type integral inequalities and some convex combination techniques; in [25], the Leibniz-Newton formula; in [26], stochastic analysis theory; and in [27], the descriptor transformation method have been combined with Lyapunov stability theorems. In addition to these stability conditions, which are expressed in different forms of linear matrix inequalities (LMIs), some stability conditions for neural network models (2) and (4) in algebraic form have also been presented in [28–37].

The first results regarding the global stability of neutral-type neural network models with multiple delay terms were given in [38], where neutral-type Hopfield neural networks were analyzed. We note here that when the amplification functions d_i(x_i(t)) are equal to 1 for all i, and c_i(x_i(t)) = c_i x_i(t) for some positive constants c_i, the multiple delayed neutral neural system (1) takes the form of a delayed Hopfield neural network. Hence, the author of [38] initiated the study of stability for neutral neural network models admitting multiple delay parameters. Motivated by the results of [38], other researchers have made important contributions to the investigation of stability conditions for system (1); various sets of stability criteria for neural system (1) have been presented in [39–43]. In this article, we conduct an alternative Lyapunov stability analysis of neural network model (1) and obtain new alternative conditions that ensure the stability of this neural system.

2. Stability analysis

We will proceed, in this section, with obtaining new conditions that assure the global asymptotic stability of the neural network whose mathematical representation is stated in (1). For this purpose, it is helpful to shift an equilibrium point of the neutral neural system (1) to the origin. If the constant vector \hat{x} = (\hat{x}_1, \hat{x}_2, \ldots, \hat{x}_n)^T is an equilibrium point of the neural network (1), then the transformation z_i(t) = x_i(t) - \hat{x}_i, \forall i, shifts this equilibrium point to the origin, and the dynamical behaviour of the transformed neutral-type neural system is defined by the following equation:

\dot{z}_i(t) = \alpha_i(z_i(t))\Big(-\beta_i(z_i(t)) + \sum_{j=1}^{n} a_{ij} g_j(z_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(z_j(t-\tau_{ij}))\Big) + \sum_{j=1}^{n} e_{ij} \dot{z}_j(t-\zeta_{ij}), \quad \forall i \quad (6)

In system (6), the transformed strictly positive functions \alpha_i(z_i(t)) and the transformed strictly increasing functions \beta_i(z_i(t)) are given by \alpha_i(z_i(t)) = d_i(z_i(t) + \hat{x}_i) and \beta_i(z_i(t)) = c_i(z_i(t) + \hat{x}_i) - c_i(\hat{x}_i), respectively, and the transformed nonlinear activation functions are g_i(z_i(t)) = f_i(z_i(t) + \hat{x}_i) - f_i(\hat{x}_i). Considering the original constraints on the system functions, it is obvious that these new functions satisfy the following conditions under H1, H2 and H3:

\tilde{H}_1: 0 < v_i \le \alpha_i(z_i(t)) \le \phi_i, \ \forall i
\tilde{H}_2: \gamma_i z_i^2(t) \le \beta_i(z_i(t)) z_i(t) \le \psi_i z_i^2(t), \ \forall i
\tilde{H}_3: |g_i(z_i(t))| \le \ell_i |z_i(t)|, \ \forall i

In the following theorem, we state the main result of this paper:

2.1. Theorem 1

For system (6), assume that the system functions satisfy assumptions \tilde{H}_1, \tilde{H}_2 and \tilde{H}_3. Then, the transformed neural system (6) is globally asymptotically stable if there exist positive real numbers \varepsilon_1 > 0, \varepsilon_2 > 0, \varepsilon_3 > 0 and q_i > 0, \forall i, such that the criteria given below are satisfied:

\delta_i = \frac{2\xi q_i \gamma_i}{\ell_i} - \sum_{j=1}^{n}\Big(\frac{\xi}{\varepsilon_1} q_i |a_{ij}| + \varepsilon_1 q_j |a_{ji}|\Big) - \sum_{j=1}^{n}\Big(\frac{\xi}{\varepsilon_2} q_i |b_{ij}| + \varepsilon_2 q_j |b_{ji}|\Big) - \frac{\xi}{\varepsilon_3}\sum_{j=1}^{n} \frac{q_i}{v_i}|e_{ij}| > 0, \quad \forall i \quad (7)

and

\kappa_i = \frac{2(1-\xi) q_i}{\phi_i} - \frac{1-\xi}{\varepsilon_1} q_i \sum_{j=1}^{n} |a_{ij}| - \frac{1-\xi}{\varepsilon_2} q_i \sum_{j=1}^{n} |b_{ij}| - \frac{1-\xi}{\varepsilon_3}\sum_{j=1}^{n} \frac{q_i}{v_i}|e_{ij}| - \varepsilon_3 \sum_{j=1}^{n} \frac{q_j}{v_j}|e_{ji}| > 0, \quad \forall i \quad (8)

where 0<ξ<1.
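Because conditions (7) and (8) are purely algebraic, they can be evaluated numerically once the network parameters are fixed. The following sketch (the function name and the small demo parameters are our own illustrative choices, not taken from the paper) evaluates the two criteria as reconstructed above:

```python
import numpy as np

def theorem1_criteria(A, B, E, gamma, ell, v, phi, xi, e1, e2, e3, q):
    """Evaluate the vectors (delta_i) of (7) and (kappa_i) of (8).

    Global asymptotic stability is guaranteed when every entry of
    both returned arrays is strictly positive."""
    A, B, E = np.abs(A), np.abs(B), np.abs(E)
    n = len(q)
    delta, kappa = np.empty(n), np.empty(n)
    for i in range(n):
        delta[i] = (2 * xi * q[i] * gamma[i] / ell[i]
                    - np.sum((xi / e1) * q[i] * A[i, :] + e1 * q * A[:, i])
                    - np.sum((xi / e2) * q[i] * B[i, :] + e2 * q * B[:, i])
                    - (xi / e3) * (q[i] / v[i]) * np.sum(E[i, :]))
        kappa[i] = (2 * (1 - xi) * q[i] / phi[i]
                    - ((1 - xi) / e1) * q[i] * np.sum(A[i, :])
                    - ((1 - xi) / e2) * q[i] * np.sum(B[i, :])
                    - ((1 - xi) / e3) * (q[i] / v[i]) * np.sum(E[i, :])
                    - e3 * np.sum(q * E[:, i] / v))
    return delta, kappa

# Hypothetical weakly coupled two-neuron network.
ones = np.ones(2)
A = 0.02 * np.ones((2, 2))
B = 0.01 * np.ones((2, 2))
E = 0.05 * np.eye(2)
delta, kappa = theorem1_criteria(A, B, E, ones, ones, ones, ones,
                                 xi=0.5, e1=1.0, e2=1.0, e3=1.0, q=ones)
```

For these weak interconnections every δ_i and κ_i is positive, so the reconstructed criteria certify stability for this hypothetical network.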

Proof:

Construct three different functionals:

V_1(t) = 2\sum_{i=1}^{n} \int_{0}^{z_i(t)} \frac{q_i \ell_i s}{\alpha_i(s)}\, ds \quad (9)

V_2(t) = 2\sum_{i=1}^{n} q_i \int_{0}^{z_i(t)} \beta_i(s)\, ds + 2\sum_{i=1}^{n} q_i \int_{0}^{t} \frac{\dot{z}_i^2(\theta)}{\alpha_i(z_i(\theta))}\, d\theta - 2\sum_{i=1}^{n} q_i \int_{0}^{t} \frac{\dot{z}_i^2(\theta)}{\alpha_i(z_i(\theta))}\, d\theta \quad (10)

V_3(t) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\varepsilon_2 \int_{t-\tau_{ji}}^{t} q_j |b_{ji}| \ell_i^2 z_i^2(s)\, ds + \varepsilon_3 \int_{t-\zeta_{ji}}^{t} \frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(s)\, ds\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{p}{n} \int_{t-\tau_{ij}}^{t} z_j^2(s)\, ds + \frac{q}{n} \int_{t-\zeta_{ij}}^{t} \dot{z}_j^2(s)\, ds\Big) \quad (11)

In the equation (11), p and q are positive constants. These values will be determined as needed.

Now, construct a positive real valued Lyapunov functional for neural system expressed by equation (6):

V(t) = \xi V_1(t) + (1-\xi) V_2(t) + V_3(t) \quad (12)

For V1(t), which is one of the functions within the constructed Lyapunov functional, V˙1(t) is calculated and the following mathematical equation is obtained:

\dot{V}_1(t) = 2\sum_{i=1}^{n} \frac{q_i \ell_i z_i(t) \dot{z}_i(t)}{\alpha_i(z_i(t))}
= 2\sum_{i=1}^{n} q_i \ell_i z_i(t)\Big(-\beta_i(z_i(t)) + \sum_{j=1}^{n} a_{ij} g_j(z_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(z_j(t-\tau_{ij}))\Big) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i \ell_i}{\alpha_i(z_i(t))} e_{ij} z_i(t) \dot{z}_j(t-\zeta_{ij})
= -2\sum_{i=1}^{n} q_i \ell_i \beta_i(z_i(t)) z_i(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i \ell_i a_{ij} g_j(z_j(t)) z_i(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i \ell_i b_{ij} g_j(z_j(t-\tau_{ij})) z_i(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i \ell_i}{\alpha_i(z_i(t))} e_{ij} z_i(t) \dot{z}_j(t-\zeta_{ij}) \quad (13)

Condition \tilde{H}_2 ensures that \beta_i(z_i(t)) z_i(t) \ge \gamma_i z_i^2(t). Thus, (13) leads to

\dot{V}_1(t) \le -2\sum_{i=1}^{n} q_i \ell_i \gamma_i z_i^2(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i \ell_i a_{ij} g_j(z_j(t)) z_i(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i \ell_i b_{ij} g_j(z_j(t-\tau_{ij})) z_i(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i \ell_i}{\alpha_i(z_i(t))} e_{ij} z_i(t) \dot{z}_j(t-\zeta_{ij}) \quad (14)

For V2(t), which is another function within the constructed Lyapunov functional, V˙2(t) is calculated:

\dot{V}_2(t) = 2\sum_{i=1}^{n}\Big(q_i \beta_i(z_i(t)) \dot{z}_i(t) + \frac{q_i \dot{z}_i^2(t)}{\alpha_i(z_i(t))} - \frac{q_i \dot{z}_i^2(t)}{\alpha_i(z_i(t))}\Big) \quad (15)

Note that

2\sum_{i=1}^{n} \frac{q_i \dot{z}_i^2(t)}{\alpha_i(z_i(t))} = 2\sum_{i=1}^{n} q_i\Big(-\beta_i(z_i(t)) + \sum_{j=1}^{n}\big(a_{ij} g_j(z_j(t)) + b_{ij} g_j(z_j(t-\tau_{ij}))\big)\Big)\dot{z}_i(t) + 2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i}{\alpha_i(z_i(t))} e_{ij} \dot{z}_i(t) \dot{z}_j(t-\zeta_{ij}) \quad (16)

Inserting (16) into (15) results in:

\dot{V}_2(t) = 2\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(q_i a_{ij} \dot{z}_i(t) g_j(z_j(t)) + q_i b_{ij} \dot{z}_i(t) g_j(z_j(t-\tau_{ij})) + \frac{q_i}{\alpha_i(z_i(t))} e_{ij} \dot{z}_i(t) \dot{z}_j(t-\zeta_{ij})\Big) - 2\sum_{i=1}^{n} \frac{q_i \dot{z}_i^2(t)}{\alpha_i(z_i(t))}
\le 2\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(q_i a_{ij} \dot{z}_i(t) g_j(z_j(t)) + q_i b_{ij} \dot{z}_i(t) g_j(z_j(t-\tau_{ij})) + \frac{q_i}{\alpha_i(z_i(t))} e_{ij} \dot{z}_i(t) \dot{z}_j(t-\zeta_{ij})\Big) - 2\sum_{i=1}^{n} \frac{q_i}{\phi_i} \dot{z}_i^2(t) \quad (17)

One may state the following inequalities, which are key to the proof (each uses 2|u||w| \le \frac{1}{\varepsilon}u^2 + \varepsilon w^2 together with \tilde{H}_1 and \tilde{H}_3):

2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i \ell_i z_i(t) a_{ij} g_j(z_j(t)) \le \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_1} q_i|a_{ij}|\ell_i^2 z_i^2(t) + \varepsilon_1 q_i|a_{ij}|\ell_j^2 z_j^2(t)\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_1} q_i|a_{ij}|\ell_i^2 z_i^2(t) + \varepsilon_1 q_j|a_{ji}|\ell_i^2 z_i^2(t)\Big) \quad (18)

2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i \ell_i b_{ij} z_i(t) g_j(z_j(t-\tau_{ij})) \le \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_2} q_i|b_{ij}|\ell_i^2 z_i^2(t) + \varepsilon_2 q_i|b_{ij}|\ell_j^2 z_j^2(t-\tau_{ij})\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_2} q_i|b_{ij}|\ell_i^2 z_i^2(t) + \varepsilon_2 q_j|b_{ji}|\ell_i^2 z_i^2(t-\tau_{ji})\Big) \quad (19)

2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i \ell_i}{\alpha_i(z_i(t))} e_{ij} z_i(t) \dot{z}_j(t-\zeta_{ij}) \le 2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i \ell_i}{v_i}|e_{ij}||z_i(t)||\dot{z}_j(t-\zeta_{ij})| \le \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_3}\frac{q_i}{v_i}\ell_i^2|e_{ij}| z_i^2(t) + \varepsilon_3\frac{q_i}{v_i}|e_{ij}|\dot{z}_j^2(t-\zeta_{ij})\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_3}\frac{q_i}{v_i}\ell_i^2|e_{ij}| z_i^2(t) + \varepsilon_3\frac{q_j}{v_j}|e_{ji}|\dot{z}_i^2(t-\zeta_{ji})\Big) \quad (20)

2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i a_{ij} \dot{z}_i(t) g_j(z_j(t)) \le \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_1} q_i|a_{ij}|\dot{z}_i^2(t) + \varepsilon_1 q_i|a_{ij}|\ell_j^2 z_j^2(t)\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_1} q_i|a_{ij}|\dot{z}_i^2(t) + \varepsilon_1 q_j|a_{ji}|\ell_i^2 z_i^2(t)\Big) \quad (21)

2\sum_{i=1}^{n}\sum_{j=1}^{n} q_i b_{ij} \dot{z}_i(t) g_j(z_j(t-\tau_{ij})) \le \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_2} q_i|b_{ij}|\dot{z}_i^2(t) + \varepsilon_2 q_i|b_{ij}|\ell_j^2 z_j^2(t-\tau_{ij})\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_2} q_i|b_{ij}|\dot{z}_i^2(t) + \varepsilon_2 q_j|b_{ji}|\ell_i^2 z_i^2(t-\tau_{ji})\Big) \quad (22)

and

2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i}{\alpha_i(z_i(t))} e_{ij} \dot{z}_i(t) \dot{z}_j(t-\zeta_{ij}) \le 2\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i}{v_i}|e_{ij}||\dot{z}_i(t)||\dot{z}_j(t-\zeta_{ij})| \le \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_3}\frac{q_i}{v_i}|e_{ij}|\dot{z}_i^2(t) + \varepsilon_3\frac{q_i}{v_i}|e_{ij}|\dot{z}_j^2(t-\zeta_{ij})\Big) = \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1}{\varepsilon_3}\frac{q_i}{v_i}|e_{ij}|\dot{z}_i^2(t) + \varepsilon_3\frac{q_j}{v_j}|e_{ji}|\dot{z}_i^2(t-\zeta_{ji})\Big) \quad (23)

Based on (18)-(23), (14) and (17) will respectively lead to

\xi\dot{V}_1(t) \le -2\xi\sum_{i=1}^{n} q_i \ell_i \gamma_i z_i^2(t) + \frac{\xi}{\varepsilon_1}\sum_{i=1}^{n}\sum_{j=1}^{n} q_i |a_{ij}| \ell_i^2 z_i^2(t) + \xi\varepsilon_1\sum_{i=1}^{n}\sum_{j=1}^{n} q_j |a_{ji}| \ell_i^2 z_i^2(t) + \frac{\xi}{\varepsilon_2}\sum_{i=1}^{n}\sum_{j=1}^{n} q_i |b_{ij}| \ell_i^2 z_i^2(t) + \xi\varepsilon_2\sum_{i=1}^{n}\sum_{j=1}^{n} q_j |b_{ji}| \ell_i^2 z_i^2(t-\tau_{ji}) + \frac{\xi}{\varepsilon_3}\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i}{v_i} \ell_i^2 |e_{ij}| z_i^2(t) + \xi\varepsilon_3\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t-\zeta_{ji}) \quad (24)

and

(1-\xi)\dot{V}_2(t) \le \sum_{i=1}^{n}\Big(-\frac{2(1-\xi) q_i}{\phi_i}\dot{z}_i^2(t) + \sum_{j=1}^{n}\Big(\frac{1-\xi}{\varepsilon_1} q_i |a_{ij}|\dot{z}_i^2(t) + (1-\xi)\varepsilon_1 q_j |a_{ji}| \ell_i^2 z_i^2(t)\Big)\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1-\xi}{\varepsilon_2} q_i |b_{ij}|\dot{z}_i^2(t) + (1-\xi)\varepsilon_2 q_j |b_{ji}| \ell_i^2 z_i^2(t-\tau_{ji})\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1-\xi}{\varepsilon_3} \frac{q_i}{v_i}|e_{ij}|\dot{z}_i^2(t) + (1-\xi)\varepsilon_3 \frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t-\zeta_{ji})\Big) \quad (25)

Combining (24) and (25) results in

\xi\dot{V}_1(t) + (1-\xi)\dot{V}_2(t) \le \sum_{i=1}^{n}\Big(-2\xi q_i \ell_i \gamma_i z_i^2(t) + \sum_{j=1}^{n}\Big(\frac{\xi}{\varepsilon_1} q_i |a_{ij}| + \varepsilon_1 q_j |a_{ji}|\Big)\ell_i^2 z_i^2(t)\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{\xi}{\varepsilon_2} q_i |b_{ij}| \ell_i^2 z_i^2(t) + \frac{\xi}{\varepsilon_3} \frac{q_i}{v_i} \ell_i^2 |e_{ij}| z_i^2(t) + \varepsilon_2 q_j |b_{ji}| \ell_i^2 z_i^2(t-\tau_{ji})\Big) + \sum_{i=1}^{n}\Big(-\frac{2(1-\xi) q_i}{\phi_i}\dot{z}_i^2(t) + \sum_{j=1}^{n}\Big(\frac{1-\xi}{\varepsilon_1} q_i |a_{ij}| + \frac{1-\xi}{\varepsilon_2} q_i |b_{ij}|\Big)\dot{z}_i^2(t)\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{1-\xi}{\varepsilon_3}\frac{q_i}{v_i}|e_{ij}|\dot{z}_i^2(t) + \varepsilon_3 \frac{q_j}{v_j}|e_{ji}|\dot{z}_i^2(t-\zeta_{ji})\Big) \quad (26)

Finally, the following inequality is obtained for \dot{V}_3(t):

\dot{V}_3(t) = \varepsilon_2\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(q_j |b_{ji}| \ell_i^2 z_i^2(t) - q_j |b_{ji}| \ell_i^2 z_i^2(t-\tau_{ji})\Big) + \varepsilon_3\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t) - \frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t-\zeta_{ji})\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{p}{n}\big(z_j^2(t) - z_j^2(t-\tau_{ij})\big) + \frac{q}{n}\big(\dot{z}_j^2(t) - \dot{z}_j^2(t-\zeta_{ij})\big)\Big)
\le \varepsilon_2\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(q_j |b_{ji}| \ell_i^2 z_i^2(t) - q_j |b_{ji}| \ell_i^2 z_i^2(t-\tau_{ji})\Big) + \varepsilon_3\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t) - \frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t-\zeta_{ji})\Big) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{p}{n} z_j^2(t) + \frac{q}{n} \dot{z}_j^2(t)\Big) \quad (27)

Combining (27) with (26) results in

\dot{V}(t) = \xi\dot{V}_1(t) + (1-\xi)\dot{V}_2(t) + \dot{V}_3(t)
\le -2\xi\sum_{i=1}^{n} q_i\ell_i\gamma_i z_i^2(t) + \frac{\xi}{\varepsilon_1}\sum_{i=1}^{n}\sum_{j=1}^{n} q_i|a_{ij}|\ell_i^2 z_i^2(t) + \varepsilon_1\sum_{i=1}^{n}\sum_{j=1}^{n} q_j|a_{ji}|\ell_i^2 z_i^2(t) + \frac{\xi}{\varepsilon_2}\sum_{i=1}^{n}\sum_{j=1}^{n} q_i|b_{ij}|\ell_i^2 z_i^2(t) + \varepsilon_2\sum_{i=1}^{n}\sum_{j=1}^{n} q_j|b_{ji}|\ell_i^2 z_i^2(t) + \frac{\xi}{\varepsilon_3}\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i}{v_i}\ell_i^2|e_{ij}| z_i^2(t) - 2(1-\xi)\sum_{i=1}^{n} \frac{q_i}{\phi_i}\dot{z}_i^2(t) + \frac{1-\xi}{\varepsilon_1}\sum_{i=1}^{n}\sum_{j=1}^{n} q_i|a_{ij}|\dot{z}_i^2(t) + \frac{1-\xi}{\varepsilon_2}\sum_{i=1}^{n}\sum_{j=1}^{n} q_i|b_{ij}|\dot{z}_i^2(t) + \frac{1-\xi}{\varepsilon_3}\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_i}{v_i}|e_{ij}|\dot{z}_i^2(t) + \varepsilon_3\sum_{i=1}^{n}\sum_{j=1}^{n} \frac{q_j}{v_j}|e_{ji}|\dot{z}_i^2(t) + \sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\frac{p}{n} z_j^2(t) + \frac{q}{n}\dot{z}_j^2(t)\Big)
= -\sum_{i=1}^{n} \ell_i^2\Big(\frac{2\xi q_i\gamma_i}{\ell_i} - \sum_{j=1}^{n}\Big(\frac{\xi}{\varepsilon_1}q_i|a_{ij}| + \varepsilon_1 q_j|a_{ji}|\Big) - \sum_{j=1}^{n}\Big(\frac{\xi}{\varepsilon_2}q_i|b_{ij}| + \varepsilon_2 q_j|b_{ji}|\Big) - \frac{\xi}{\varepsilon_3}\sum_{j=1}^{n} \frac{q_i}{v_i}|e_{ij}|\Big) z_i^2(t) - \sum_{i=1}^{n}\Big(\frac{2(1-\xi)q_i}{\phi_i} - \frac{1-\xi}{\varepsilon_1}\sum_{j=1}^{n} q_i|a_{ij}| - \frac{1-\xi}{\varepsilon_2}\sum_{j=1}^{n} q_i|b_{ij}| - \frac{1-\xi}{\varepsilon_3}\sum_{j=1}^{n} \frac{q_i}{v_i}|e_{ij}| - \varepsilon_3\sum_{j=1}^{n} \frac{q_j}{v_j}|e_{ji}|\Big)\dot{z}_i^2(t) + p\sum_{i=1}^{n} z_i^2(t) + q\sum_{i=1}^{n} \dot{z}_i^2(t)
= -\sum_{i=1}^{n} \ell_i^2 \delta_i z_i^2(t) - \sum_{i=1}^{n}\kappa_i \dot{z}_i^2(t) + p\sum_{i=1}^{n} z_i^2(t) + q\sum_{i=1}^{n}\dot{z}_i^2(t)
\le -\ell_m^2 \delta_m \sum_{i=1}^{n} z_i^2(t) - \kappa_m\sum_{i=1}^{n}\dot{z}_i^2(t) + p\sum_{i=1}^{n} z_i^2(t) + q\sum_{i=1}^{n}\dot{z}_i^2(t)
= -(\ell_m^2\delta_m - p)\|z(t)\|_2^2 - (\kappa_m - q)\|\dot{z}(t)\|_2^2 \quad (28)

in which \ell_m = \min(\ell_i), \delta_m = \min(\delta_i) and \kappa_m = \min(\kappa_i). In (28), choosing the constants such that p < \ell_m^2\delta_m and q < \kappa_m ensures that \dot{V}(t) < 0 whenever z(t) \ne 0. It can likewise be seen that \dot{V}(t) < 0 whenever \dot{z}(t) \ne 0. Moreover, if z(t) = \dot{z}(t) = 0, then \dot{V}_1(t) = 0 and \dot{V}_2(t) = 0, which implies that \dot{V}(t) = \dot{V}_3(t). Hence, (27) directly yields the inequality

\dot{V}(t) = -\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(\varepsilon_2 q_j |b_{ji}| \ell_i^2 z_i^2(t-\tau_{ji}) + \varepsilon_3 \frac{q_j}{v_j}|e_{ji}| \dot{z}_i^2(t-\zeta_{ji})\Big) - \frac{1}{n}\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(p z_j^2(t-\tau_{ij}) + q \dot{z}_j^2(t-\zeta_{ij})\Big) \le -\frac{1}{n}\sum_{i=1}^{n}\sum_{j=1}^{n}\Big(p z_j^2(t-\tau_{ij}) + q \dot{z}_j^2(t-\zeta_{ij})\Big) \quad (29)

It is obvious from inequality (29) that if z_j(t-\tau_{ij}) \ne 0 for some indices i and j, then (29) directly guarantees that \dot{V}(t) < 0. Additionally, if \dot{z}_j(t-\zeta_{ij}) \ne 0 for some indices i and j, then (29) also guarantees that \dot{V}(t) < 0. We can also observe that when z_j(t-\tau_{ij}) = 0 and \dot{z}_j(t-\zeta_{ij}) = 0, \forall i, j, together with z(t) = \dot{z}(t) = 0, it follows that \dot{V}_1(t) = \dot{V}_2(t) = \dot{V}_3(t) = 0, implying \dot{V}(t) = 0; that is, \dot{V}(t) vanishes only at the origin. To establish the global convergence and stability of the origin, we finally need to check whether V(t) is radially unbounded. One may write:

V(t) \ge 2(1-\xi)\sum_{i=1}^{n} q_i \int_{0}^{z_i(t)} \beta_i(s)\, ds \ge 2(1-\xi) q_m \sum_{i=1}^{n}\int_{0}^{z_i(t)} \gamma_i s\, ds \ge 2(1-\xi) q_m \gamma_m \sum_{i=1}^{n}\int_{0}^{z_i(t)} s\, ds = (1-\xi) q_m \gamma_m \|z(t)\|_2^2 \quad (30)

where q_m = \min(q_i) and \gamma_m = \min(\gamma_i).

From (30), we see that V(t) \to \infty as \|z(t)\|_2 \to \infty, proving that the positive-valued functional V(t) is certainly radially unbounded. Q.E.D.

Some useful special cases of the results of Theorem 1 are stated below. For the choice \varepsilon_1 = \varepsilon_2 = \varepsilon_3 = \xi, Theorem 1 directly yields the following stability criteria:

2.2. Corollary 1

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if there exist positive real constants q_i > 0 satisfying the following algebraic criteria:

\hat{\delta}_i = \frac{2\xi q_i\gamma_i}{\ell_i} - \sum_{j=1}^{n}\Big(q_i\big(|a_{ij}| + |b_{ij}|\big) + q_j\big(|a_{ji}| + |b_{ji}|\big)\Big) - \frac{q_i}{v_i}\sum_{j=1}^{n}|e_{ij}| > 0, \quad \forall i \quad (31)

and

\hat{\kappa}_i = \frac{2(1-\xi) q_i}{\phi_i} - \frac{1-\xi}{\xi} q_i \sum_{j=1}^{n}\Big(|a_{ij}| + |b_{ij}| + \frac{1}{v_i}|e_{ij}|\Big) - \xi\sum_{j=1}^{n} \frac{q_j}{v_j}|e_{ji}| > 0, \quad \forall i \quad (32)

where 0 < \xi < 1. Next, set q_i = 1, \forall i, and define r = \min(\gamma_i/\ell_i), v_m = \min(v_i) and \phi_M = \max(\phi_i), together with the choices \varepsilon_1 = \xi\sqrt{\|A\|_1/\|A\|_\infty}, \varepsilon_2 = \xi\sqrt{\|B\|_1/\|B\|_\infty} and \varepsilon_3 = \xi\sqrt{\|E\|_1/\|E\|_\infty}.

The following corollary results from Theorem 1.

2.3. Corollary 2

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if the algebraic criteria given in the following form are satisfied,

\delta = 2\xi r - 2\sqrt{\|A\|_1\|A\|_\infty} - 2\sqrt{\|B\|_1\|B\|_\infty} - \frac{1}{v_m}\sqrt{\|E\|_1\|E\|_\infty} > 0 \quad (33)

and

\kappa = \frac{2(1-\xi^2)}{\phi_M} - \frac{1-\xi^2}{\xi}\Big(\sqrt{\|A\|_1\|A\|_\infty} + \sqrt{\|B\|_1\|B\|_\infty}\Big) - \frac{1}{\xi v_m}\sqrt{\|E\|_1\|E\|_\infty} > 0 \quad (34)

where 0<ξ<1.
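Since the conditions of Corollary 2 involve only matrix norms and a few scalar bounds, they are straightforward to evaluate in software. The sketch below (the function name and the demo values are illustrative assumptions of ours) computes δ of (33) and κ of (34) as reconstructed here:

```python
import numpy as np

def corollary2_criteria(A, B, E, r, v_m, phi_M, xi):
    """Return (delta, kappa) of (33)-(34); both must be positive."""
    def mixed_norm(M):
        # geometric mean of the maximum-column-sum and maximum-row-sum norms
        return np.sqrt(np.linalg.norm(M, 1) * np.linalg.norm(M, np.inf))
    nA, nB, nE = mixed_norm(A), mixed_norm(B), mixed_norm(E)
    delta = 2 * xi * r - 2 * nA - 2 * nB - nE / v_m
    kappa = (2 * (1 - xi ** 2) / phi_M
             - ((1 - xi ** 2) / xi) * (nA + nB)
             - nE / (xi * v_m))
    return delta, kappa

# Hypothetical three-neuron network with weak interconnections.
A = 0.01 * np.ones((3, 3))
B = 0.01 * np.ones((3, 3))
E = 0.02 * np.eye(3)
delta, kappa = corollary2_criteria(A, B, E, r=1.0, v_m=1.0, phi_M=1.0, xi=0.5)
```

Both quantities come out positive here, so this hypothetical network would satisfy the corollary's stability test.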

When Corollary 1 and Corollary 2 are compared, the following conclusions can be drawn: the result of Corollary 2 is a special case of Corollary 1, and Corollary 1 imposes less restrictive constraints on the network parameters of neural system (1) than Corollary 2 does. On the other hand, the conditions of Corollary 1 are more difficult to validate than those of Corollary 2.

2.4. An example and comparisons

The aim of the current section is to analyze an instructive example in order to compare the results of this study with some existing stability conditions proposed in the literature. First, we restate some previous literature results.

2.4.1. Theorem 2 [39].

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if the algebraic criteria given in the following form are satisfied,

\rho_i = \frac{\gamma_i^2}{\ell_i^2} - \sum_{j=1}^{n}\Big|\sum_{k=1}^{n} a_{ki}a_{kj}\Big| - \sum_{j=1}^{n}\sum_{k=1}^{n}\Big(|a_{ji}||b_{jk}| + |a_{ji}||e_{jk}| + |a_{jk}||b_{ji}| + |b_{ji}||e_{jk}| + |b_{ji}||b_{jk}|\Big) > 0 \quad (35)

and

\varrho_{ij} = 1 - n\frac{\phi_j^2}{v_i^2}\sum_{k=1}^{n}\big(|e_{jk}| + |a_{jk}| + |b_{jk}|\big) > 0, \quad \forall i, j. \quad (36)

2.4.2 Theorem 3 [40].

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if the algebraic criteria given in the following form are satisfied,

\nu_i = 2 v_i\gamma_i - \sum_{j=1}^{n}\big(\phi_i\ell_j|a_{ij}| + \phi_j\ell_i|a_{ji}|\big) - \sum_{j=1}^{n}\big(\phi_i\ell_j|b_{ij}| + \phi_j\ell_i|b_{ji}|\big) - \sum_{j=1}^{n}\big(\phi_i\psi_i|e_{ij}| + \phi_j\psi_j|e_{ji}|\big) - \ell_i\sum_{j=1}^{n}\sum_{k=1}^{n}\big(\phi_i|a_{ki}||e_{kj}| + \phi_k|b_{ki}||e_{kj}|\big) - \sum_{j=1}^{n}\sum_{k=1}^{n}|e_{ji}|\ell_k\phi_j\big(|a_{jk}| + |b_{jk}|\big) > 0, \quad \forall i \quad (37)

and

e_i = 1 - \sum_{j=1}^{n}|e_{ji}| > 0, \quad \forall i \quad (38)

2.4.3 Theorem 4 [41].

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if there exist real constants ki>0 satisfying the following algebraic criteria:

m_i = k_i v_i \gamma_i - \ell_i\sum_{j=1}^{n} k_j \phi_j\big(|a_{ji}| + |b_{ji}|\big) > 0, \quad \forall i \quad (39)

and

\hat{e}_i = k_i - \sum_{j=1}^{n} k_j |e_{ji}| > 0, \quad \forall i. \quad (40)

2.4.4 Theorem 5 [42].

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if there exist real constants ki>0 satisfying the following algebraic criteria:

\sigma_i = k_i \gamma_i - \ell_i \sum_{j=1}^{n} k_j\big(|a_{ji}| + |b_{ji}|\big) > 0, \quad \forall i \quad (41)

and

\tilde{e}_i = \sum_{j=1}^{n}|e_{ji}| < \frac{1}{\sigma}, \quad \forall i, \qquad \text{where } \sigma \ge \frac{k_j\phi_j}{k_i v_i}, \ \forall i, j, \quad \text{i.e., } \sigma \ge \frac{k_M\phi_M}{k_m v_m} \ge 1 \quad (42)

2.4.5 Theorem 6 [43].

Let neural network (1) be governed by system functions that satisfy conditions H1, H2 and H3. Under these conditions, the neural network (1) is globally asymptotically stable if there exist real constants Ωi>0, (i=1, 2, 3, 4, 5) satisfying the following algebraic criteria:

\pi_i = 2 v_i\gamma_i - \phi_i^2(\Omega_1 + \Omega_2) - \frac{1}{\Omega_3}\psi_i - (\Omega_3 + \Omega_4 + \Omega_5)\sum_{j=1}^{n}\sum_{k=1}^{n}\phi_j^2|e_{ji}e_{jk}| - \Big(\frac{1}{\Omega_1} + \frac{1}{\Omega_4}\Big)\sum_{j=1}^{n}\Big|\sum_{k=1}^{n} a_{ki}a_{kj}\Big|\ell_i^2 - \Big(\frac{1}{\Omega_2} + \frac{1}{\Omega_5}\Big)\sum_{j=1}^{n}\sum_{k=1}^{n}|b_{ki}b_{kj}|\ell_i^2 > 0, \quad \forall i \quad (43)

The following example will lead us to compare the proposed results with some of the existing stability conditions.

2.4.6 Example.

Consider system (1) that has system parameters:

A = a\begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{bmatrix}, \quad B = b\begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{bmatrix}, \quad E = e\begin{bmatrix} 1 & 1 & 0 & 4 \\ 1 & 0 & 1 & 4 \\ 0 & 1 & 1 & 4 \\ 1 & 1 & 0 & 4 \end{bmatrix}

where a and b are positive real constants, and v_m = 1, \phi_M = 1, r = 1, e = \frac{1}{15}, \psi_1 = \psi_2 = \psi_3 = \psi_4 = 6.

Let \xi = \frac{1}{\sqrt{6}}. Noting that \sqrt{\|A\|_1\|A\|_\infty} = 4a, \sqrt{\|B\|_1\|B\|_\infty} = 4b and \sqrt{\|E\|_1\|E\|_\infty} = \sqrt{16e \cdot 6e} = 4\sqrt{6}e, the conditions in Corollary 2 are calculated as:

\delta = 2r\xi - 8a - 8b - 4\sqrt{6}e = 2\big(\xi - 4a - 4b - 2\sqrt{6}e\big) = 2\Big(\frac{1}{5\sqrt{6}} - 4a - 4b\Big) \quad (44)

where the last equality uses \xi = \frac{1}{\sqrt{6}} and e = \frac{1}{15},

and

\kappa = \frac{5}{3} - 24e - \frac{5}{\sqrt{6}}(4a + 4b) = \frac{5}{3} - \frac{24}{15} - \frac{5}{\sqrt{6}}(4a + 4b) = \frac{1}{15} - \frac{5}{\sqrt{6}}(4a + 4b) \quad (45)

If a + b < \frac{1}{20\sqrt{6}}, then \delta > 0, and if a + b < \frac{1}{50\sqrt{6}}, then \kappa > 0. Since \frac{1}{50\sqrt{6}} < \frac{1}{20\sqrt{6}}, the condition a + b < \frac{1}{50\sqrt{6}} ensures that both conditions stated in Corollary 2 hold.
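The arithmetic of this example can be checked numerically. The script below (our own verification sketch, using the reconstructed value ξ = 1/√6) recomputes δ and κ directly from the matrix norms and confirms the closed forms in (44) and (45):

```python
import numpy as np

a, b, e = 0.003, 0.004, 1 / 15
xi = 1 / np.sqrt(6)            # reconstructed choice of xi in the example
r = v_m = phi_M = 1.0

absA = a * np.ones((4, 4))
absB = b * np.ones((4, 4))
absE = e * np.array([[1, 1, 0, 4], [1, 0, 1, 4], [0, 1, 1, 4], [1, 1, 0, 4]])

def mixed_norm(M):
    # sqrt(||M||_1 * ||M||_inf)
    return np.sqrt(np.linalg.norm(M, 1) * np.linalg.norm(M, np.inf))

nA, nB, nE = mixed_norm(absA), mixed_norm(absB), mixed_norm(absE)   # 4a, 4b, 4*sqrt(6)*e
delta = 2 * xi * r - 2 * nA - 2 * nB - nE / v_m                     # condition (33)/(44)
kappa = (2 * (1 - xi**2) / phi_M
         - ((1 - xi**2) / xi) * (nA + nB)
         - nE / (xi * v_m))                                         # condition (34)/(45)
```

With a = 0.003 and b = 0.004, both δ and κ come out positive, consistent with a + b = 0.007 < 1/(50√6) ≈ 0.00816.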

For this example, let a = 0.003 and b = 0.004. The state responses of system (1) are illustrated graphically for the different network functions given in the following equations:

f(x) = d(x) = \tanh(x), \quad c(x) = \mathrm{sigmoid}(x), \quad \text{and}
f(x) = 0.5\tanh(x), \quad d(x) = 1 - 0.5\cos(x), \quad c(x) = x

We also choose different amplification functions for each neuron, of the following forms:

f(x) = 0.05\tanh(x), \quad c(x) = 0.5x, \quad d_1(x) = 1.5 - |\tanh(x)|, \quad d_2(x) = 1.5 - \tanh(x),

d_3(x) = 1.5 + |\tanh(x)|, \quad d_4(x) = 1.5 + \tanh(x).

The state responses of system (1) for these different choices of network functions are shown in Figs 1–3.

Fig 1. The time response of system (1) with f(x) = d(x) = \tanh(x), c(x) = \mathrm{sigmoid}(x).

Fig 2. The time response of system (1) with f(x) = 0.5\tanh(x), d(x) = 1 - 0.5\cos(x), c(x) = x.

Fig 3. The time response of system (1) with f(x) = 0.05\tanh(x), c(x) = 0.5x, d_1(x) = 1.5 - |\tanh(x)|, d_2(x) = 1.5 - \tanh(x), d_3(x) = 1.5 + |\tanh(x)|, d_4(x) = 1.5 + \tanh(x).

It can be observed from Figs 1–3 that the equilibrium point of system (1) is stable.
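For readers who wish to reproduce responses of the kind shown in Fig 3, the following sketch integrates system (1) with a fixed-step Euler scheme and history buffers. The delay values τ and ζ, the all-positive sign pattern of A and B, zero external inputs u_i, the step size and the horizon are our own illustrative assumptions (the stability criteria themselves are delay-independent):

```python
import numpy as np

h, T = 0.01, 40.0                      # step size and horizon (assumed)
tau, zeta = 0.5, 0.3                   # illustrative delay values
dtau, dzeta = int(tau / h), int(zeta / h)

a, b, e = 0.003, 0.004, 1 / 15
A = a * np.ones((4, 4))                # sign pattern assumed all +1
B = b * np.ones((4, 4))
E = e * np.array([[1, 1, 0, 4], [1, 0, 1, 4], [0, 1, 1, 4], [1, 1, 0, 4]])

f = lambda x: 0.05 * np.tanh(x)        # activation of the Fig 3 setting
c = lambda x: 0.5 * x                  # behaved function
def d(x):                              # amplification functions d_1..d_4
    t = np.tanh(x)
    return np.array([1.5 - abs(t[0]), 1.5 - t[1], 1.5 + abs(t[2]), 1.5 + t[3]])

steps = int(T / h)
hist = max(dtau, dzeta)
xs = np.zeros((hist + steps + 1, 4))
xdots = np.zeros((hist + steps + 1, 4))
xs[: hist + 1] = 0.5                   # constant initial history

for k in range(hist, hist + steps):
    x = xs[k]
    # x'(t) = d(x)(-c(x) + A f(x(t)) + B f(x(t - tau))) + E x'(t - zeta)
    xdots[k] = d(x) * (-c(x) + A @ f(x) + B @ f(xs[k - dtau])) + E @ xdots[k - dzeta]
    xs[k + 1] = x + h * xdots[k]

final_amplitude = np.abs(xs[-1]).max()
```

Under these assumptions all four states decay toward the origin, in agreement with the stability predicted by Corollary 2.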

In Theorem 2, the parameter \varrho_{44} is calculated as:

\varrho_{44} = 1 - n\sum_{k=1}^{4}\big(|e_{4k}| + |a_{4k}| + |b_{4k}|\big) = 1 - 4(6e + 4a + 4b) = 1 - 24e - 16a - 16b \quad (46)

For e = \frac{1}{15}, we have 24e = \frac{8}{5} > 1, so it follows that \varrho_{44} < -16a - 16b < 0. Hence, the conditions of Theorem 2 are not applicable to the parameter values given in the example.

In Theorem 3, the parameter e4 is calculated as:

e_4 = 1 - \sum_{j=1}^{4}|e_{j4}| = 1 - |e_{14}| - |e_{24}| - |e_{34}| - |e_{44}| = 1 - 16e \quad (47)

For e = \frac{1}{15}, it follows that e_4 = 1 - 16e = -\frac{1}{15} < 0. Hence, the conditions of Theorem 3 are not applicable to the parameter values given in the example.
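The decisive quantity here is the largest column sum of |E|, which can be confirmed with a few lines (an illustrative check of ours, not part of the original paper):

```python
import numpy as np

e = 1 / 15
absE = e * np.array([[1, 1, 0, 4], [1, 0, 1, 4], [0, 1, 1, 4], [1, 1, 0, 4]])
col_sums = absE.sum(axis=0)    # col_sums[i-1] = sum_j |e_{ji}|
e4 = 1 - col_sums[3]           # the quantity e_4 of condition (38)
```

Since the fourth column sum equals 16e = 16/15 > 1, condition (38) fails for i = 4.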

In this example, we observe that the first stability condition set in Theorem 4 can be obtained as follows:

m_1 = k_1 - (a+b)\sum_{j=1}^{4} k_j > 0, \qquad m_2 = k_2 - (a+b)\sum_{j=1}^{4} k_j > 0,

m_3 = k_3 - (a+b)\sum_{j=1}^{4} k_j > 0, \qquad m_4 = k_4 - (a+b)\sum_{j=1}^{4} k_j > 0. \quad (48)

We note that, for this example, there exist positive constants k_1, k_2, k_3 and k_4 for which m_1 > 0, m_2 > 0, m_3 > 0 and m_4 > 0 if I - (|A| + |B|) satisfies the nonsingular M-matrix condition (requiring that the eigenvalues of the matrix I - |A| - |B| have positive real parts; see [44] for some useful properties of M-matrices). For this example, I - |A| - |B| is obtained as

I - |A| - |B| = \begin{bmatrix} 1-(a+b) & -(a+b) & -(a+b) & -(a+b) \\ -(a+b) & 1-(a+b) & -(a+b) & -(a+b) \\ -(a+b) & -(a+b) & 1-(a+b) & -(a+b) \\ -(a+b) & -(a+b) & -(a+b) & 1-(a+b) \end{bmatrix}

If the matrix I - |A| - |B| possesses the nonsingular M-matrix property, then at least one column of I - |A| - |B| is strictly diagonally dominant [44]. For the symmetric structure above, this property implies that 4(a+b) < 1, and we may take k_1 = k_2 = k_3 = k_4 = 1. In this case, we obtain the following equation for the conditions in Theorem 4:

\hat{e}_4 = k_4 - \sum_{j=1}^{4} k_j|e_{j4}| = 1 - |e_{14}| - |e_{24}| - |e_{34}| - |e_{44}| = 1 - 16e \quad (49)

For e = \frac{1}{15}, it follows that \hat{e}_4 = 1 - 16e = -\frac{1}{15} < 0. Thus, the conditions of Theorem 4 are not applicable to the parameter values given in the example.
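The nonsingular M-matrix property invoked above can be verified directly from the eigenvalues of I - |A| - |B| (a small illustrative check with the example's values):

```python
import numpy as np

a, b = 0.003, 0.004
M = np.eye(4) - (a + b) * np.ones((4, 4))     # I - |A| - |B| for this example
eigs = np.linalg.eigvals(M)
# A Z-matrix is a nonsingular M-matrix iff all eigenvalues have positive real part.
is_m_matrix = bool(np.all(eigs.real > 0))
min_eig = eigs.real.min()                      # equals 1 - 4(a + b) here
```

The smallest eigenvalue is 1 - 4(a + b) = 0.972 > 0, so the M-matrix condition of Theorem 4 holds; it is condition (40) that fails.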

For the parameters of the given example, the conditions of Theorem 5 are largely identical to those of Theorem 4. Therefore, the conditions \sigma_1 > 0, \sigma_2 > 0, \sigma_3 > 0 and \sigma_4 > 0 are satisfied if and only if 4(a+b) < 1 with k_1 = k_2 = k_3 = k_4 = 1. In this case, \sigma \ge \frac{k_M\phi_M}{k_m v_m} = 1, so let \sigma = 1. Then, we can observe that

1 - \sigma\sum_{j=1}^{4}|e_{j4}| = 1 - |e_{14}| - |e_{24}| - |e_{34}| - |e_{44}| = 1 - 16e \quad (50)

For e = \frac{1}{15}, it follows that 1 - 16e = -\frac{1}{15} < 0, i.e., \tilde{e}_4 = \sum_{j=1}^{4}|e_{j4}| > \frac{1}{\sigma}. Thus, the conditions of Theorem 5 are not applicable to the parameter values given in the example.

In applying the conditions of Theorem 6 to this example, we obtain

\pi_4 = 2 - (\Omega_1 + \Omega_2) - \frac{1}{\Omega_3}\psi_4 - (\Omega_3 + \Omega_4 + \Omega_5)\sum_{j=1}^{4}\sum_{k=1}^{4}|e_{j4}e_{jk}| - \Big(\frac{1}{\Omega_1} + \frac{1}{\Omega_4}\Big)\sum_{j=1}^{4}\Big|\sum_{k=1}^{4} a_{k4}a_{kj}\Big| - \Big(\frac{1}{\Omega_2} + \frac{1}{\Omega_5}\Big)\sum_{j=1}^{4}\sum_{k=1}^{4}|b_{k4}b_{kj}|
= 2 - (\Omega_1 + \Omega_2) - \frac{1}{\Omega_3}\psi_4 - (\Omega_3 + \Omega_4 + \Omega_5)\cdot 96e^2 - \Big(\frac{1}{\Omega_1} + \frac{1}{\Omega_4}\Big)\cdot 16a^2 - \Big(\frac{1}{\Omega_2} + \frac{1}{\Omega_5}\Big)\cdot 16b^2 \quad (51)

For \pi_4, we can derive that \pi_4 < 2 - \frac{1}{\Omega_3}\psi_4 - 96e^2\Omega_3. Note that \frac{1}{\Omega_3}\psi_4 + 96e^2\Omega_3 attains its minimum value 2\sqrt{96\psi_4}\,e = 48e when \Omega_3 = \frac{1}{e}\sqrt{\frac{\psi_4}{96}} = \frac{1}{4e}. In this case,

\pi_4 < 2 - 48e \quad (52)

For e = \frac{1}{15}, it follows that \pi_4 < 2 - 48e = -\frac{18}{15} < 0. Thus, the conditions of Theorem 6 are not applicable to the parameters given in this example.

Based on the above comparisons, it can be concluded that this paper obtains novel and alternative conditions that guarantee the stability of neutral-type Cohen-Grossberg neural networks whose dynamical models admit multiple delay parameters.

3. Conclusions

This paper has examined the stability characteristics of neutral-type Cohen-Grossberg artificial neural networks. By making appropriate modifications to certain classes of Lyapunov functionals, novel sufficient conditions have been formulated to guarantee the global asymptotic stability of Cohen-Grossberg neural systems with multiple constant delay parameters. The obtained stability criteria are in the form of algebraic inequalities that involve only the system parameters of the examined delayed Cohen-Grossberg neural network. Therefore, the proposed stability criteria can be checked by using various simple mathematical methods and software tools. By carrying out a detailed analysis of an instructive numerical example, the results obtained in this article have also been shown to establish alternative stability criteria to some corresponding stability conditions given in the past literature.

Supporting information

S1 Data. This file archive contains the minimal data set required to replicate all study findings, including the raw numerical values used to generate the graphical representations of system responses under different network functions. All data files are provided in CSV format.

(ZIP)

pone.0343312.s001.zip (841.5KB, zip)

Data Availability

All relevant data are within the manuscript and its Supporting Information files.

Funding Statement

The author(s) received no specific funding for this work.

References

  1. Chua LO, Yang L. Cellular neural networks: applications. IEEE Trans Circuits Syst. 1988;35(10):1273–90. doi: 10.1109/31.7601
  2. Cohen MA, Grossberg S. Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans Syst Man Cybern. 1983;SMC-13(5):815–26. doi: 10.1109/tsmc.1983.6313075
  3. Hopfield JJ. Neural networks and physical systems with emergent collective computational abilities. Proc Natl Acad Sci U S A. 1982;79(8):2554–8. doi: 10.1073/pnas.79.8.2554
  4. Guez A, Protopopsecu V, Barhen J. On the stability, storage capacity, and design of nonlinear continuous neural networks. IEEE Trans Syst Man Cybern. 1988;18(1):80–7. doi: 10.1109/21.87056
  5. Wang J, Cai Y, Yin J. Multi-start stochastic competitive Hopfield neural network for frequency assignment problem in satellite communications. Expert Systems with Applications. 2011;38(1):131–45. doi: 10.1016/j.eswa.2010.06.027
  6. Chen Y, Wang Z, Liu Y, Alsaadi FE. Stochastic stability for distributed delay neural networks via augmented Lyapunov–Krasovskii functionals. Applied Mathematics and Computation. 2018;338:869–81. doi: 10.1016/j.amc.2018.05.059
  7. Zhu Q, Cao J, Rakkiyappan R. Exponential input-to-state stability of stochastic Cohen–Grossberg neural networks with mixed delays. Nonlinear Dyn. 2014;79(2):1085–98. doi: 10.1007/s11071-014-1725-2
  8. Manivannan R, Samidurai R, Cao J, Alsaedi A, Alsaadi FE. Stability analysis of interval time-varying delayed neural networks including neutral time-delay and leakage delay. Chaos, Solitons & Fractals. 2018;114:433–45. doi: 10.1016/j.chaos.2018.07.041
  9. Zhu H, Rakkiyappan R, Li X. Delayed state-feedback control for stabilization of neural networks with leakage delay. Neural Netw. 2018;105:249–55. doi: 10.1016/j.neunet.2018.05.013
  10. Ma J, Xu S, Li Y, Chu Y, Zhang Z. Neural networks-based adaptive output feedback control for a class of uncertain nonlinear systems with input delay and disturbances. Journal of the Franklin Institute. 2018;355(13):5503–19. doi: 10.1016/j.jfranklin.2018.05.045
  11. Song Q, Yu Q, Zhao Z, Liu Y, Alsaadi FE. Boundedness and global robust stability analysis of delayed complex-valued neural networks with interval parameter uncertainties. Neural Netw. 2018;103:55–62. doi: 10.1016/j.neunet.2018.03.008
  12. Song Q, Chen X. Multistability analysis of quaternion-valued neural networks with time delays. IEEE Trans Neural Netw Learn Syst. 2018;29(11):5430–40. doi: 10.1109/TNNLS.2018.2801297
  13. Wang J, Jiang H, Ma T, Hu C. Delay-dependent dynamical analysis of complex-valued memristive neural networks: continuous-time and discrete-time cases. Neural Netw. 2018;101:33–46. doi: 10.1016/j.neunet.2018.01.015
  14. Zhu Q, Cao J. Exponential stability analysis of stochastic reaction-diffusion Cohen–Grossberg neural networks with mixed delays. Neurocomputing. 2011;74(17):3084–91. doi: 10.1016/j.neucom.2011.04.030
  15. Chen H, Shi P, Lim C-C, Hu P. Exponential stability for neutral stochastic Markov systems with time-varying delay and its applications. IEEE Trans Cybern. 2016;46(6):1350–62. doi: 10.1109/TCYB.2015.2442274
  16. Kolmanovskii VB, Nosov VR. Stability of Functional Differential Equations. London: Academic Press; 1986.
  17. Kuang Y. Delay Differential Equations with Applications in Population Dynamics. Boston: Academic Press; 1993.
  18. Song Q, Long L, Zhao Z, Liu Y, Alsaadi FE. Stability criteria of quaternion-valued neutral-type delayed neural networks. Neurocomputing. 2020;412:287–94. doi: 10.1016/j.neucom.2020.06.086
  19. Muralisankar S, Manivannan A, Balasubramaniam P. Mean square delay dependent-probability-distribution stability analysis of neutral type stochastic neural networks. ISA Trans. 2015;58:11–9. doi: 10.1016/j.isatra.2015.03.004
  20. Lee SM, Kwon OM, Park JH. A novel delay-dependent criterion for delayed neural networks of neutral type. Physics Letters A. 2010;374(17–18):1843–8. doi: 10.1016/j.physleta.2010.02.043
  21. Shi K, Zhu H, Zhong S, Zeng Y, Zhang Y, Wang W. Stability analysis of neutral type neural networks with mixed time-varying delays using triple-integral and delay-partitioning methods. ISA Trans. 2015;58:85–95. doi: 10.1016/j.isatra.2015.03.006
  22. Zhang Z, Liu K, Yang Y. New LMI-based condition on global asymptotic stability concerning BAM neural networks of neutral type. Neurocomputing. 2012;81:24–32. doi: 10.1016/j.neucom.2011.10.006
  23. Dharani S, Rakkiyappan R, Cao J. New delay-dependent stability criteria for switched Hopfield neural networks of neutral type with additive time-varying delay components. Neurocomputing. 2015;151:827–34. doi: 10.1016/j.neucom.2014.10.014
  24. Shi K, Zhong S, Zhu H, Liu X, Zeng Y. New delay-dependent stability criteria for neutral-type neural networks with mixed random time-varying delays. Neurocomputing. 2015;168:896–907. doi: 10.1016/j.neucom.2015.05.035
  25. Samidurai R, Marshal Anthoni S, Balachandran K. Global exponential stability of neutral-type impulsive neural networks with discrete and distributed delays. Nonlinear Analysis: Hybrid Systems. 2010;4(1):103–12. doi: 10.1016/j.nahs.2009.08.004
  26. Liu P-L. Improved delay-dependent stability of neutral type neural networks with distributed delays. ISA Trans. 2013;52(6):717–24. doi: 10.1016/j.isatra.2013.06.012
  27. Huang H, Du Q, Kang X. Global exponential stability of neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays. ISA Trans. 2013;52(6):759–67. doi: 10.1016/j.isatra.2013.07.016
  28. Arik S. A modified Lyapunov functional with application to stability of neutral-type neural networks with time delays. Journal of the Franklin Institute. 2019;356(1):276–91. doi: 10.1016/j.jfranklin.2018.11.002
  29. Arik S. An analysis of stability of neutral-type neural systems with constant time delays. Journal of the Franklin Institute. 2014;351(11):4949–59. doi: 10.1016/j.jfranklin.2014.08.013
  30. Cheng C-J, Liao T-L, Yan J-J, Hwang C-C. Globally asymptotic stability of a class of neutral-type neural networks with delays. IEEE Trans Syst Man Cybern B Cybern. 2006;36(5):1191–5. doi: 10.1109/tsmcb.2006.874677
  31. Akça H, Covachev V, Covacheva Z. Global asymptotic stability of Cohen–Grossberg neural networks of neutral type. J Math Sci. 2015;205(6):719–32. doi: 10.1007/s10958-015-2278-8
  32. Ozcan N. New conditions for global stability of neutral-type delayed Cohen-Grossberg neural networks. Neural Netw. 2018;106:1–7. doi: 10.1016/j.neunet.2018.06.009
  33. Samli R, Senan S, Yucel E, Orman Z. Some generalized global stability criteria for delayed Cohen-Grossberg neural networks of neutral-type. Neural Netw. 2019;116:198–207. doi: 10.1016/j.neunet.2019.04.023
  34. Zhang Y, Xie T, Ma Y. Robustness analysis of exponential stability of Cohen-Grossberg neural network with neutral terms. AIMS Mathematics. 2025;10(3):4938–54. doi: 10.3934/math.2025226
  35. Samli R, Arik S. New results for global stability of a class of neutral-type neural systems with time delays. Applied Mathematics and Computation. 2009;210(2):564–70. doi: 10.1016/j.amc.2009.01.031
  36. Zhang Z, Liu W, Zhou D. Global asymptotic stability to a generalized Cohen-Grossberg BAM neural networks of neutral type delays. Neural Netw. 2012;25(1):94–105. doi: 10.1016/j.neunet.2011.07.006
  37. Wan L, Zhou Q. Stability analysis of neutral-type Cohen-Grossberg neural networks with multiple time-varying delays. IEEE Access. 2020;8:27618–23. doi: 10.1109/access.2020.2971839
  38. Arik S. New criteria for stability of neutral-type neural networks with multiple time delays. IEEE Trans Neural Netw Learn Syst. 2020;31(5):1504–13. doi: 10.1109/TNNLS.2019.2920672
  39. Faydasicok O. New criteria for global stability of neutral-type Cohen-Grossberg neural networks with multiple delays. Neural Netw. 2020;125:330–7. doi: 10.1016/j.neunet.2020.02.020
  40. Ozcan N. Stability analysis of Cohen-Grossberg neural networks of neutral-type: multiple delays case. Neural Netw. 2019;113:20–7. doi: 10.1016/j.neunet.2019.01.017
  41. Wan L, Zhou Q. Exponential stability of neutral-type Cohen-Grossberg neural networks with multiple time-varying delays. IEEE Access. 2021;9:48914–22. doi: 10.1109/access.2021.3068191
  42. Faydasicok O. An improved Lyapunov functional with application to stability of Cohen-Grossberg neural networks of neutral-type with multiple delays. Neural Netw. 2020;132:532–9. doi: 10.1016/j.neunet.2020.09.023
  43. Zhang Z, Zhang X, Yu T. Global exponential stability of neutral-type Cohen–Grossberg neural networks with multiple time-varying neutral and discrete delays. Neurocomputing. 2022;490:124–31. doi: 10.1016/j.neucom.2022.03.068
  44. Horn RA, Johnson CR. Topics in Matrix Analysis. Cambridge University Press; 1991. doi: 10.1017/cbo9780511840371

