Computational Intelligence and Neuroscience. 2022 Sep 29; 2022:1779582. doi: 10.1155/2022/1779582

Bifurcations of a Fractional-Order Four-Neuron Recurrent Neural Network with Multiple Delays

Yu Fei 1, Rongli Li 1, Xiaofang Meng 1, Zhouhong Li 1,2,
PMCID: PMC9536962  PMID: 36210995

Abstract

This paper investigates the bifurcation of a fractional-order four-neuron recurrent neural network with multiple delays. First, the stability and Hopf bifurcation of the system are studied by analyzing the associated characteristic equation. It is shown that the dynamics of the delayed fractional-order neural network depend heavily on both the communication delays and the fractional order. Second, we numerically demonstrate the effect of the order on the Hopf bifurcation. Finally, two numerical examples illustrate the validity of the theoretical results.

1. Introduction

A recurrent neural network (RNN) is a type of recursive neural network that takes sequence data as input and recurses in the evolution direction of the sequence, with all nodes (recurrent units) connected in a chain. To date, RNNs have been widely applied in various fields such as signal processing, optimization control, image processing, robotics, pattern recognition, and automatic control, and they have therefore attracted extensive attention from researchers in recent years [1–7]. Since these applications depend heavily on the dynamical behavior of the networks, considerable effort has been devoted to studying their dynamical properties, and a large number of useful results have been obtained, including the oscillation, stability, bifurcation, synchronization, and chaos of various RNNs [8–14].

As a matter of fact, time delay has a significant impact on many nonlinear dynamical models; in addition to affecting stability, it causes oscillations and other unstable phenomena such as chaos [15]. Communication delays and the response times of neurons, which originate from the finite switching speed of amplifiers and the noninstantaneous signal transmission between neurons, are considered key factors in the performance of neural networks [16]. In recent years, many scholars have been interested in studying the dynamics of neural networks with such time delays [17–19]. It must be pointed out that the exponential stabilization of memristor-based RNNs with disturbance and mixed time delays by periodically intermittent control has been considered by Wang et al. [20]. Using appropriate Lyapunov–Krasovskii functionals and matrix inequality methods, Zhou [21] discussed the passivity of a class of recurrent neural networks with impulses and multiproportional delays. Zhou and Zhao [22] investigated the exponential synchronization and polynomial synchronization of recurrent neural networks with and without proportional delays. The robust stability of recurrent neural networks is studied in Refs. [23, 24]. Furthermore, time delays are ubiquitous and unavoidable in the real world. Because of delays, a system can become unstable, and the dynamic behavior of nonlinear systems becomes more difficult to analyze. Moreover, since the solution space of a delayed dynamical system is infinite-dimensional, the system becomes more complex and bifurcation occurs. Hence, it is necessary to study the properties and dynamics of neural networks with delays, such as a single time delay [25, 26], multiple delays [27, 28], time-varying delays [29, 30], and so on. In 2013, Zhang and Yang [31] studied a four-neuron recurrent neural network with multiple delays, described as follows:

\dot{x}_1(t) = -x_1(t) + f(x_2(t-\tau_1)),
\dot{x}_2(t) = -x_2(t) + f(x_3(t-\tau_1)),
\dot{x}_3(t) = -x_3(t) + f(x_4(t-\tau_1)),
\dot{x}_4(t) = -x_4(t) + \omega_1 f(x_1(t-\tau_2)) + \omega_2 f(x_2(t-\tau_2)) + \omega_3 f(x_3(t-\tau_2)), (1)

where x_i(t) (i=1,2,3,4) stands for the state of the ith neuron at time t, ω_k ∈ R (k=1,2,3) are the network parameters or weights, f(·) is the connection function between neurons, and τ_j ≥ 0 (j=1,2) are the communication time delays. By analyzing the distribution of the roots of the associated characteristic equation, the Hopf bifurcation and local stability of this four-dimensional RNN with two delays were studied. For more results on recurrent neural networks, see references [5, 10, 12, 20].

Over more than three centuries, fractional calculus has developed into a classical mathematical concept. Studies of nonlinear dynamical systems have shown that it plays an exceptionally important role in generalizing ordinary differentiation and integration to arbitrary noninteger order. Therefore, when memory and hereditary effects are taken into account, fractional-order neural networks are often more realistic and more general than integer-order ones. In recent years, the application of fractional-order neural networks has developed rapidly, and their complex dynamical behaviors have become an important research topic, including stability and multistability, Hopf bifurcation, synchronization, and chaos. For instance, in Ref. [32], the multistability of a delayed fractional-order competitive neural network is investigated by using fractional calculus and a partition of the state space. Lu and Xue [33] investigated adaptive synchronization for fractional-order delayed stochastic neural networks. Yuan and Huang [34] considered the quantitative analysis of fractional-order neural networks with time delay. Udhayakumar and Rajan [35] discussed the Hopf bifurcation of a delayed fractional-order octonion-valued neural network.

We also know that Hopf bifurcations, which include subcritical and supercritical ones, can be used to efficiently design biochemical oscillators. Furthermore, fractional-order neural networks with a single common delay cannot describe the dynamical properties of real-world neural networks as accurately as those with different delays. In recent years, some researchers have considered the dynamical behavior of fractional-order models with time delays [36–48]. In 2019 [49], we also investigated the existence of Hopf bifurcation for four-neuron fractional-order neural networks with leakage delays. To the best of our knowledge, few results on the Hopf bifurcation of four-dimensional fractional-order recurrent neural networks with multiple delays have been reported so far, and the study of Hopf bifurcation of fractional-order dynamical systems with multiple delays therefore remains an open problem.

Based on the above motivations, we are dedicated to presenting a theoretical exploration of stability and Hopf bifurcation for a four-neuron fractional-order recurrent neural network with multiple delays in this work. The main contributions can be highlighted as follows:

  1. A novel delayed fractional-order recurrent neural network with four-neuron and two different delays is studied

  2. Two main dynamical properties of the fractional-order recurrent neural network with two delays are investigated: stability and oscillation

  3. The Hopf bifurcation is discussed in terms of delays and order

The rest of the article is organized as follows. Some lemmas and definitions of fractional-order calculus are given in Section 2, and the model is described in Section 3. In Section 4, the local stability of the trivial steady state of the delayed fractional-order RNN is examined by analyzing the associated characteristic equation, and the Hopf bifurcation of the fractional-order RNN with multiple delays is addressed. In Section 5, two numerical examples are provided to demonstrate the theoretical results. The last section gives some conclusions.

2. Preliminaries

In this section, we give the Caputo definitions and a lemma of fractional calculus as a basis for the theoretical analysis and the simulations.

Definition 1 . —

(see [50]). The fractional integral of order ϕ for a function f(x) is defined as follows:

I^{\phi} f(x) = \frac{1}{\Gamma(\phi)} \int_{x_0}^{x} (x-s)^{\phi-1} f(s)\,\mathrm{d}s, (2)

where ϕ > 0 and Γ(·) is the Gamma function, Γ(s) = \int_{0}^{\infty} x^{s-1} e^{-x}\,\mathrm{d}x.

Definition 2 . —

(see [50]). The Caputo fractional derivative of order ϕ for a function ψ(x) ∈ C^k([x_0, ∞), R) is defined by

D^{\phi}\psi(x) = \frac{1}{\Gamma(k-\phi)} \int_{x_0}^{x} \frac{\psi^{(k)}(s)}{(x-s)^{\phi-k+1}}\,\mathrm{d}s, (3)

where x ≥ x_0 and k − 1 ≤ ϕ < k, k ∈ N^+.

Moreover, when ϕ ∈ (0,1), then

D^{\phi}\psi(x) = \frac{1}{\Gamma(1-\phi)} \int_{x_0}^{x} \frac{\psi'(s)}{(x-s)^{\phi}}\,\mathrm{d}s. (4)
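The following short Python sketch (ours, not part of the original analysis) illustrates Definition 2 numerically for ϕ ∈ (0,1): it evaluates the integral above for ψ(t)=t² with x_0 = 0 and compares the result against the known closed form D^ϕ t² = 2t^{2−ϕ}/Γ(3−ϕ). The function names and test values are illustrative assumptions.

```python
# A minimal numerical check of Definition 2 (Caputo derivative, 0 < phi < 1).
# We compare the integral formula against the closed form
# D^phi t^2 = 2 t^(2-phi) / Gamma(3-phi).
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def caputo_t_squared(x, phi):
    """Caputo derivative of psi(t)=t^2 at t=x via the integral in Definition 2."""
    # D^phi psi(x) = 1/Gamma(1-phi) * int_0^x psi'(s) (x-s)^(-phi) ds, psi'(s) = 2s.
    # The algebraic end-point weight (x-s)^(-phi) is handled by quad's 'alg' weight.
    val, _ = quad(lambda s: 2.0 * s, 0.0, x, weight="alg", wvar=(0.0, -phi))
    return val / gamma(1.0 - phi)

phi, x = 0.9, 1.5
numeric = caputo_t_squared(x, phi)
closed_form = 2.0 * x ** (2.0 - phi) / gamma(3.0 - phi)
print(f"numerical  : {numeric:.8f}")
print(f"closed form: {closed_form:.8f}")   # the two values should agree closely
```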

Lemma 1 (see [51]). —

Consider the following fractional order autonomous model.

D^{\phi}u(t) = Ju(t), \quad u(0) = u_0, (5)

in which 0 < ϕ ≤ 1, u ∈ R^k, and J ∈ R^{k×k}. The zero solution of system (5) is asymptotically stable in the Lyapunov sense if all roots λ_i of the characteristic equation of system (5) satisfy |arg(λ_i)| > ϕπ/2 (i=1,2,…,k), and in this case each component of the state decays towards 0 like t^{−ϕ}. In addition, the system is stable if and only if |arg(λ_i)| ≥ ϕπ/2 and the critical eigenvalues satisfying |arg(λ_i)| = ϕπ/2 have geometric multiplicity one.
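As a quick illustration of Lemma 1, the following sketch (our own, with a hypothetical matrix J) checks the condition |arg(λ_i)| > ϕπ/2 numerically.

```python
# A minimal sketch of the stability test in Lemma 1: the zero solution of
# D^phi u = J u is asymptotically stable if every eigenvalue of J satisfies
# |arg(lambda_i)| > phi*pi/2.  The matrix J below is only an illustrative choice.
import numpy as np

def fractional_linear_stable(J, phi):
    eigvals = np.linalg.eigvals(J)
    margin = np.abs(np.angle(eigvals)) - phi * np.pi / 2.0
    return bool(np.all(margin > 0)), eigvals, margin

J = np.array([[-1.0,  0.5],
              [-0.5, -1.0]])          # example Jacobian (hypothetical)
phi = 0.9
stable, eigvals, margin = fractional_linear_stable(J, phi)
print("eigenvalues :", eigvals)
print("arg margins :", margin)        # positive entries satisfy |arg| > phi*pi/2
print("asymptotically stable:", stable)
```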

3. Mathematics Model Elaboration

This article considers the following four-neuron fractional-order recurrent neural network with two delays:

D^{\phi}x_1(t) = -x_1(t) + f(x_2(t-\tau_1)),
D^{\phi}x_2(t) = -x_2(t) + f(x_3(t-\tau_1)),
D^{\phi}x_3(t) = -x_3(t) + f(x_4(t-\tau_1)),
D^{\phi}x_4(t) = -x_4(t) + \omega_1 f(x_1(t-\tau_2)) + \omega_2 f(x_2(t-\tau_2)) + \omega_3 f(x_3(t-\tau_2)), (6)

where ϕ ∈ (0,1] is the fractional order; x_i(t) (i = 1,2,3,4) are the state variables; ω_i (i = 1,2,3) denote the connection weights; f(·) is the connection function between neurons; and τ_1 and τ_2 are the communication time delays.

Remark 1 . —

In fact, if ϕ=1, the fractional-order delayed neural network (6) reduces to the integer-order neural network (1).

Accordingly, the main purpose of this article is to investigate the stability and the onset of Hopf bifurcations of the neural network (6), taking the different time delays τ_1 and τ_2 as bifurcation parameters, by the method of stability analysis [52]. In addition, the effect of the order on the creation of the Hopf bifurcation for the proposed fractional-order neural network with multiple delays is also discussed numerically.

Throughout this paper, we assume that the following condition holds:

(C1)f(·) ∈ C(R, R), f(0)=0, xf(x) > 0, for x ≠ 0.

4. Main Results

In this section, we choose τ_1 or τ_2 as the bifurcation parameter to carry out the stability analysis and Hopf bifurcation of the fractional-order RNN (6) and to determine the bifurcation points accurately.

4.1. Bifurcation Depending on τ1 in Equation (6)

In this subsection, we first study the effect of τ_1 on the bifurcations of system (6) by fixing τ_2.

Applying the Taylor expansion, the linearization of system (6) at the origin is

D^{\phi}x_1(t) = -x_1(t) + m_1 x_2(t-\tau_1),
D^{\phi}x_2(t) = -x_2(t) + m_2 x_3(t-\tau_1),
D^{\phi}x_3(t) = -x_3(t) + m_3 x_4(t-\tau_1),
D^{\phi}x_4(t) = -x_4(t) + m_4 x_1(t-\tau_2) + m_5 x_2(t-\tau_2) + m_6 x_3(t-\tau_2). (7)

By applying Laplace transformation, its characteristic equation is given as

\det\begin{pmatrix} s^{\phi}+1 & -m_1 e^{-s\tau_1} & 0 & 0 \\ 0 & s^{\phi}+1 & -m_2 e^{-s\tau_1} & 0 \\ 0 & 0 & s^{\phi}+1 & -m_3 e^{-s\tau_1} \\ -m_4 e^{-s\tau_2} & -m_5 e^{-s\tau_2} & -m_6 e^{-s\tau_2} & s^{\phi}+1 \end{pmatrix} = 0, (8)

where m_k = f′(0) (k=1,2,3) and m_{j+3} = ω_j f′(0) (j=1,2,3).

From (8), we have

K_1(s) + K_2(s)e^{-s\tau_1} + K_3(s)e^{-2s\tau_1} + K_4(s)e^{-3s\tau_1} = 0, (9)

where

K_1(s) = s^{4\phi} + 4s^{3\phi} + 6s^{2\phi} + 4s^{\phi} + 1,
K_2(s) = -m_3 m_6\big(s^{2\phi} + 2s^{\phi} + 1\big)e^{-s\tau_2},
K_3(s) = -m_2 m_3 m_5\big(s^{\phi} + 1\big)e^{-s\tau_2},
K_4(s) = -m_1 m_2 m_3 m_4\, e^{-s\tau_2}. (10)
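The expansion of the determinant in (8) into the form (9)-(10) can be verified symbolically. The following SymPy sketch (ours, with the shorthand p for s^ϕ + 1 and E1, E2 for e^{−sτ_1}, e^{−sτ_2}) performs this check.

```python
# A short SymPy check (a sketch, not from the paper) that the determinant in
# equation (8) expands to equation (9) with the K_i(s) of equation (10).
# Here p stands for s^phi + 1, E1 for e^(-s*tau1) and E2 for e^(-s*tau2).
import sympy as sp

p, E1, E2 = sp.symbols("p E1 E2")
m1, m2, m3, m4, m5, m6 = sp.symbols("m1:7")

M = sp.Matrix([
    [p,      -m1*E1, 0,      0     ],
    [0,      p,      -m2*E1, 0     ],
    [0,      0,      p,      -m3*E1],
    [-m4*E2, -m5*E2, -m6*E2, p     ],
])

det = sp.expand(M.det())

# K_i written in terms of p = s^phi + 1, matching equation (10)
K1 = p**4
K2 = -m3*m6*p**2*E2
K3 = -m2*m3*m5*p*E2
K4 = -m1*m2*m3*m4*E2

assert sp.simplify(det - (K1 + K2*E1 + K3*E1**2 + K4*E1**3)) == 0
print("determinant matches K1 + K2*E1 + K3*E1**2 + K4*E1**3")
```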

Multiplying both sides of equation (9) by e^{2sτ_1} and e^{sτ_1}, respectively, we obtain

K_1(s)e^{2s\tau_1} + K_2(s)e^{s\tau_1} + K_3(s) + K_4(s)e^{-s\tau_1} = 0,
K_1(s)e^{s\tau_1} + K_2(s) + K_3(s)e^{-s\tau_1} + K_4(s)e^{-2s\tau_1} = 0. (11)

Let K_1(s)=A_1+iB_1, K_2(s)=A_2+iB_2, K_3(s)=A_3+iB_3, and K_4(s)=A_4+iB_4; then from equation (11), we have

(A_1+iB_1)e^{2s\tau_1} + (A_2+iB_2)e^{s\tau_1} + (A_3+iB_3) + (A_4+iB_4)e^{-s\tau_1} = 0,
(A_1+iB_1)e^{s\tau_1} + (A_2+iB_2) + (A_3+iB_3)e^{-s\tau_1} + (A_4+iB_4)e^{-2s\tau_1} = 0. (12)

Let s = iω = ω(cos(π/2) + i sin(π/2)) (ω > 0) be a purely imaginary root of equation (11). Substituting s into equation (12) and separating the real and imaginary parts yields the following equations:

A_1\cos 2\omega\tau_1 - B_1\sin 2\omega\tau_1 + (A_2+A_4)\cos\omega\tau_1 + (B_4-B_2)\sin\omega\tau_1 = -A_3,
B_1\cos 2\omega\tau_1 + A_1\sin 2\omega\tau_1 + (B_2+B_4)\cos\omega\tau_1 + (A_2-A_4)\sin\omega\tau_1 = -B_3,
A_4\cos 2\omega\tau_1 + B_4\sin 2\omega\tau_1 + (A_1+A_3)\cos\omega\tau_1 + (B_3-B_1)\sin\omega\tau_1 = -A_2,
B_4\cos 2\omega\tau_1 - A_4\sin 2\omega\tau_1 + (B_1+B_3)\cos\omega\tau_1 + (A_1-A_3)\sin\omega\tau_1 = -B_2. (13)

Evidently,

\cos\omega\tau_1 = \frac{F_{12}(\omega)}{F_{11}(\omega)} = F_{c1}(\omega), \quad \sin\omega\tau_1 = \frac{F_{22}(\omega)}{F_{21}(\omega)} = F_{s1}(\omega), (14)

where A_1, A_2, A_3, A_4, B_1, B_2, B_3, B_4, F_{11}, F_{12}, F_{21}, and F_{22} are given in Appendix A. Obviously, from the first and second equations of system (14), it follows that

F_{c1}^{2}(\omega) + F_{s1}^{2}(\omega) = 1. (15)

From equation (13), one can obtain

\tau_1^{(l)} = \frac{1}{\omega}\left[\arccos\!\left(\frac{F_{12}(\omega)}{F_{11}(\omega)}\right) + 2l\pi\right], \quad l = 0, 1, 2, \ldots. (16)

Remark 2 . —

Equation (13) is an inhomogeneous system of linear equations in the unknowns cos(2ωτ_1), sin(2ωτ_1), cos(ωτ_1), and sin(ωτ_1). According to Cramer's rule, if the coefficient determinant of this linear system is not equal to 0, we can easily solve it; that is, we can obtain cos(ωτ_1) and sin(ωτ_1), as well as cos(2ωτ_1) and sin(2ωτ_1).

Define the bifurcation point of the fractional-order neural network with multiple delays (6) as

\tau_{10} = \min\{\tau_1^{(l)}\}, \quad l = 0, 1, 2, \ldots. (17)
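For a given critical frequency ω_0 and the corresponding value of F_{12}(ω_0)/F_{11}(ω_0), equations (16)-(17) can be evaluated directly. The following minimal sketch (with placeholder numbers of our own) lists the candidate delays and takes their minimum.

```python
# A minimal sketch of equations (16)-(17): given a critical frequency w0 and the
# corresponding value of F12(w0)/F11(w0), list the candidate delays tau1^(l) and
# take their minimum as the bifurcation point tau10.  The numbers are placeholders.
import numpy as np

def tau1_candidates(w0, cos_value, l_max=5):
    ls = np.arange(l_max + 1)
    return (np.arccos(cos_value) + 2.0 * np.pi * ls) / w0

w0 = 5.236                 # hypothetical critical frequency
cos_value = -0.05          # hypothetical value of F12(w0)/F11(w0), must lie in [-1, 1]
taus = tau1_candidates(w0, cos_value)
tau10 = taus.min()         # equation (17): the first (smallest) critical delay
print("tau1 candidates:", np.round(taus, 4))
print("tau10 =", round(float(tau10), 4))
```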

If τ1 vanishes, then equation (9) becomes

H_1(s) + H_2(s)e^{-s\tau_2} = 0, (18)

where

H_1(s) = s^{4\phi} + 4s^{3\phi} + 6s^{2\phi} + 4s^{\phi} + 1,
H_2(s) = -m_3 m_6 s^{2\phi} - 2m_3 m_6 s^{\phi} - m_3 m_6 - m_2 m_3 m_5 s^{\phi} - m_2 m_3 m_5 - m_1 m_2 m_3 m_4. (19)

If τ2=0, then the equation (18) becomes

s^{4\phi} + 4s^{3\phi} + 6s^{2\phi} + 4s^{\phi} + 1 - m_3 m_6 s^{2\phi} - 2m_3 m_6 s^{\phi} - m_3 m_6 - m_2 m_3 m_5 s^{\phi} - m_2 m_3 m_5 - m_1 m_2 m_3 m_4 = 0. (20)

Suppose that all roots s of equation (20) satisfy the condition of Lemma 1, that is, |arg(s)| > ϕπ/2, so that the zero equilibrium of system (6) is asymptotically stable when τ_1 = τ_2 = 0.

Denote the real and imaginary parts of H_j(s) (j=1,2) by H_{jR} and H_{jI}, respectively. Multiplying both sides of equation (18) by e^{sτ_2}, we obtain

H_1(s)e^{s\tau_2} + H_2(s) = 0. (21)

Also, s = iv = v(cos(π/2) + i sin(π/2)) (v > 0) is a purely imaginary root of equation (21) if and only if

H_{1R}\cos v\tau_2 - H_{1I}\sin v\tau_2 = -H_{2R},
H_{1I}\cos v\tau_2 + H_{1R}\sin v\tau_2 = -H_{2I}. (22)

This leads to

\cos v\tau_2 = -\frac{H_{1R}H_{2R} + H_{1I}H_{2I}}{H_{1R}^2 + H_{1I}^2} = f_{c1}(v), \quad \sin v\tau_2 = \frac{H_{1I}H_{2R} - H_{1R}H_{2I}}{H_{1R}^2 + H_{1I}^2} = f_{s1}(v). (23)

It is not difficult to see that

f_{c1}^{2}(v) + f_{s1}^{2}(v) = 1. (24)

Additionally, we make the following assumption:

(C2) Equation (24) has at least one positive real root.

From equation (24), the values of v can be obtained using the mathematical software Mathematica 10.0, and then the Hopf bifurcation point τ_{20} of the fractional-order recurrent neural network (6) with τ_1 = 0 can be derived. To demonstrate our main results, we further present the following hypothesis: (C3) (Υ_1Ω_1 + Υ_2Ω_2)/(Ω_1^2 + Ω_2^2) ≠ 0, where

\Upsilon_1 = \omega_0\big[A_2\sin\omega_0\tau_{10} - B_2\cos\omega_0\tau_{10} + 2\big(A_3\sin 2\omega_0\tau_{10} - B_3\cos 2\omega_0\tau_{10}\big) + 3\big(A_4\sin 3\omega_0\tau_{10} - B_4\cos 3\omega_0\tau_{10}\big)\big],
\Upsilon_2 = \omega_0\big[A_2\cos\omega_0\tau_{10} + B_2\sin\omega_0\tau_{10} + 2\big(A_3\cos 2\omega_0\tau_{10} + B_3\sin 2\omega_0\tau_{10}\big) + 3\big(A_4\cos 3\omega_0\tau_{10} + B_4\sin 3\omega_0\tau_{10}\big)\big],
\Omega_1 = A_1' + (A_2' - \tau_{10}A_2)\cos\omega_0\tau_{10} + (B_2' - \tau_{10}B_2)\sin\omega_0\tau_{10} + (A_3' - 2\tau_{10}A_3)\cos 2\omega_0\tau_{10} + (B_3' - 2\tau_{10}B_3)\sin 2\omega_0\tau_{10} + (A_4' - 3\tau_{10}A_4)\cos 3\omega_0\tau_{10} + (B_4' - 3\tau_{10}B_4)\sin 3\omega_0\tau_{10},
\Omega_2 = B_1' + (B_2' - \tau_{10}B_2)\cos\omega_0\tau_{10} - (A_2' - \tau_{10}A_2)\sin\omega_0\tau_{10} + (B_3' - 2\tau_{10}B_3)\cos 2\omega_0\tau_{10} - (A_3' - 2\tau_{10}A_3)\sin 2\omega_0\tau_{10} + (B_4' - 3\tau_{10}B_4)\cos 3\omega_0\tau_{10} - (A_4' - 3\tau_{10}A_4)\sin 3\omega_0\tau_{10}, (25)
where A_j' + iB_j' = K_j'(iω_0) (j = 1,2,3,4) and all quantities are evaluated at τ_1 = τ_{10}.

Lemma 2 . —

Let s(τ_1) = ν(τ_1) + iω(τ_1) be a root of equation (9) near τ_1 = τ_1^{(j)} satisfying ν(τ_1^{(j)}) = 0 and ω(τ_1^{(j)}) = ω_0; then the following transversality condition is satisfied:

\mathrm{Re}\!\left[\frac{\mathrm{d}s}{\mathrm{d}\tau_1}\right]\Bigg|_{\omega=\omega_0,\ \tau_1=\tau_{10}} \neq 0. (26)

Proof —

By the implicit function theorem, differentiating equation (9) with respect to τ_1 gives

0 = K_1'(s)\frac{\mathrm{d}s}{\mathrm{d}\tau_1} + \Big[K_2'(s)\frac{\mathrm{d}s}{\mathrm{d}\tau_1} - K_2(s)\Big(\tau_1\frac{\mathrm{d}s}{\mathrm{d}\tau_1} + s\Big)\Big]e^{-s\tau_1} + \Big[K_3'(s)\frac{\mathrm{d}s}{\mathrm{d}\tau_1} - 2K_3(s)\Big(\tau_1\frac{\mathrm{d}s}{\mathrm{d}\tau_1} + s\Big)\Big]e^{-2s\tau_1} + \Big[K_4'(s)\frac{\mathrm{d}s}{\mathrm{d}\tau_1} - 3K_4(s)\Big(\tau_1\frac{\mathrm{d}s}{\mathrm{d}\tau_1} + s\Big)\Big]e^{-3s\tau_1},
which gives
\frac{\mathrm{d}s}{\mathrm{d}\tau_1} = \frac{\Upsilon(s)}{\Omega(s)}, (27)

where

\Upsilon(s) = s\big[K_2(s)e^{-s\tau_1} + 2K_3(s)e^{-2s\tau_1} + 3K_4(s)e^{-3s\tau_1}\big],
\Omega(s) = K_1'(s) + \big[K_2'(s) - \tau_1 K_2(s)\big]e^{-s\tau_1} + \big[K_3'(s) - 2\tau_1 K_3(s)\big]e^{-2s\tau_1} + \big[K_4'(s) - 3\tau_1 K_4(s)\big]e^{-3s\tau_1}. (28)

We further suppose that Υ1 and Υ2 are the real and imaginary parts of Υ(s), respectively, and Ω1 and Ω2 are the real and imaginary parts of Ω(s), respectively, then

\mathrm{Re}\!\left[\frac{\mathrm{d}s}{\mathrm{d}\tau_1}\right]\Bigg|_{\tau_1=\tau_{10},\ \omega=\omega_0} = \frac{\Upsilon_1\Omega_1 + \Upsilon_2\Omega_2}{\Omega_1^2 + \Omega_2^2}. (29)

From (C3), we conclude that the transversality condition holds true. This completes the proof of Lemma 2.

From the above investigation, we can obtain the following results.

Theorem 1 . —

Assume that (C1)–(C3) hold true; then the following results can be given:

  1. The zero equilibrium point of fractional order four-neuron recurrent neural network with multiple delays (6) is asymptotically stable when τ1 ∈ [0, τ10).

  2. The fractional-order four-neuron recurrent neural network with multiple delays (6) undergoes a Hopf bifurcation at the origin when τ_1=τ_{10}. That is, a branch of periodic solutions bifurcates from the zero equilibrium point at τ_1=τ_{10}.

4.2. Bifurcation Depending on τ2 in Equation (6)

As in the previous subsection, we now take the other delay τ_2 as the bifurcation parameter to study the bifurcation of model (6). It is not hard to see that equation (8) can be rewritten as follows:

q_1(s) + q_2(s)e^{-s\tau_2} = 0, (30)

where

q_1(s) = 1 + 4s^{\phi} + 6s^{2\phi} + 4s^{3\phi} + s^{4\phi},
q_2(s) = -m_3 m_6\big(1 + 2s^{\phi} + s^{2\phi}\big)e^{-s\tau_1} - m_2 m_3 m_5\big(1 + s^{\phi}\big)e^{-2s\tau_1} - m_1 m_2 m_3 m_4\, e^{-3s\tau_1}. (31)

Multiplying both sides of equation (30) by e^{sτ_2}, we obtain

q_1(s)e^{s\tau_2} + q_2(s) = 0. (32)

Suppose q1(s)=a1+ib1 and q2(s)=a2+ib2, and from equation (32), we have

(a_1 + ib_1)e^{s\tau_2} + (a_2 + ib_2) = 0, (33)

where a1, a2, b1, b2 are given in Appendix B.

Then s = iω̃ = ω̃(cos(π/2) + i sin(π/2)) (ω̃ > 0) is a root of equation (33) if and only if

a_1\cos\tilde{\omega}\tau_2 - b_1\sin\tilde{\omega}\tau_2 = -a_2,
b_1\cos\tilde{\omega}\tau_2 + a_1\sin\tilde{\omega}\tau_2 = -b_2, (34)

that is,

\cos\tilde{\omega}\tau_2 = -\frac{a_1 a_2 + b_1 b_2}{a_1^2 + b_1^2} = \rho(\tilde{\omega}), \quad \sin\tilde{\omega}\tau_2 = \frac{a_2 b_1 - a_1 b_2}{a_1^2 + b_1^2} = \varrho(\tilde{\omega}). (35)

It is simple to derive the following equation.

\rho^{2}(\tilde{\omega}) + \varrho^{2}(\tilde{\omega}) = 1. (36)

From (35), one can obtain

\tau_2^{(l)} = \frac{1}{\tilde{\omega}}\big[\arccos\rho(\tilde{\omega}) + 2l\pi\big], \quad l = 0, 1, 2, \ldots. (37)

The bifurcation point of system (6) is defined as

\tau_{20} = \min\{\tau_2^{(l)}\}, \quad l = 0, 1, 2, \ldots, (38)

where τ_2^{(l)} is defined by equation (37).

If τ2=0, then the equation (32) becomes

M_1(s) + M_2(s)e^{-s\tau_1} + M_3(s)e^{-2s\tau_1} + M_4(s)e^{-3s\tau_1} = 0, (39)

where

M_1(s) = 1 + 4s^{\phi} + 6s^{2\phi} + 4s^{3\phi} + s^{4\phi},
M_2(s) = -m_3 m_6\big(1 + 2s^{\phi} + s^{2\phi}\big),
M_3(s) = -m_2 m_3 m_5\big(1 + s^{\phi}\big),
M_4(s) = -m_1 m_2 m_3 m_4. (40)

Assume that all roots s of equation (39) satisfy the condition of Lemma 1, that is, |arg(s)| > ϕπ/2, so that the zero equilibrium of system (6) is asymptotically stable when τ_2 = 0.

Denote the real and imaginary parts of M_i(s) (i = 1,2,3,4) by M_{iR} and M_{iI}, respectively. Multiplying both sides of equation (39) by e^{2sτ_1} and e^{sτ_1}, respectively, yields

M_1(s)e^{2s\tau_1} + M_2(s)e^{s\tau_1} + M_3(s) + M_4(s)e^{-s\tau_1} = 0,
M_1(s)e^{s\tau_1} + M_2(s) + M_3(s)e^{-s\tau_1} + M_4(s)e^{-2s\tau_1} = 0. (41)

Let s = iṽ = ṽ(cos(π/2) + i sin(π/2)) (ṽ > 0) be a solution of equation (41). Substituting s into equation (41) and separating the real and imaginary parts yields the following equations:

M_{1R}\cos 2\tilde{v}\tau_1 - M_{1I}\sin 2\tilde{v}\tau_1 + (M_{2R}+M_{4R})\cos\tilde{v}\tau_1 + (M_{4I}-M_{2I})\sin\tilde{v}\tau_1 = -M_{3R},
M_{1I}\cos 2\tilde{v}\tau_1 + M_{1R}\sin 2\tilde{v}\tau_1 + (M_{2I}+M_{4I})\cos\tilde{v}\tau_1 + (M_{2R}-M_{4R})\sin\tilde{v}\tau_1 = -M_{3I},
M_{4R}\cos 2\tilde{v}\tau_1 + M_{4I}\sin 2\tilde{v}\tau_1 + (M_{1R}+M_{3R})\cos\tilde{v}\tau_1 + (M_{3I}-M_{1I})\sin\tilde{v}\tau_1 = -M_{2R},
M_{4I}\cos 2\tilde{v}\tau_1 - M_{4R}\sin 2\tilde{v}\tau_1 + (M_{1I}+M_{3I})\cos\tilde{v}\tau_1 + (M_{1R}-M_{3R})\sin\tilde{v}\tau_1 = -M_{2I}, (42)

which lead to

\cos\tilde{v}\tau_1 = \frac{E_{12}(\tilde{v})}{E_{11}(\tilde{v})} = C(\tilde{v}), \quad \sin\tilde{v}\tau_1 = \frac{E_{22}(\tilde{v})}{E_{21}(\tilde{v})} = S(\tilde{v}), (43)

where E_{11}(ṽ), E_{12}(ṽ), E_{21}(ṽ), and E_{22}(ṽ) are obtained from system (42) by Cramer's rule, analogously to (14). Obviously, from the first and second equations of (43), we get

C^{2}(\tilde{v}) + S^{2}(\tilde{v}) = 1. (44)

To obtain sufficient conditions for the Hopf bifurcation, we further assume the following:

(C4) Equation (44) has at least one positive real root.

By means of equation (44), the values of ṽ can be obtained using the mathematical software Mathematica 10.0, and then the bifurcation point τ_{10} of the fractional-order four-neuron recurrent neural network (6) with τ_2 = 0 can be derived. As a summary of our main results, we provide the following assumption: (C5) (α_1β_1 + α_2β_2)/(α_1^2 + α_2^2) ≠ 0, where

\alpha_1 = a_1' + (a_2' - \tau_{20}a_2)\cos\tilde{\omega}_0\tau_{20} + (b_2' - \tau_{20}b_2)\sin\tilde{\omega}_0\tau_{20},
\alpha_2 = b_1' + (b_2' - \tau_{20}b_2)\cos\tilde{\omega}_0\tau_{20} - (a_2' - \tau_{20}a_2)\sin\tilde{\omega}_0\tau_{20},
\beta_1 = \tilde{\omega}_0\big(a_2\sin\tilde{\omega}_0\tau_{20} - b_2\cos\tilde{\omega}_0\tau_{20}\big),
\beta_2 = \tilde{\omega}_0\big(a_2\cos\tilde{\omega}_0\tau_{20} + b_2\sin\tilde{\omega}_0\tau_{20}\big), (45)
where a_j' + ib_j' = q_j'(iω̃_0) (j = 1, 2), evaluated at τ_2 = τ_{20}.

Lemma 3 . —

Let s(τ_2) = η(τ_2) + iω̃(τ_2) be a root of equation (30) near τ_2 = τ_2^{(j)} satisfying η(τ_2^{(j)}) = 0 and ω̃(τ_2^{(j)}) = ω̃_0; then the following transversality condition holds:

\mathrm{Re}\!\left[\frac{\mathrm{d}s}{\mathrm{d}\tau_2}\right]\Bigg|_{\tilde{\omega}=\tilde{\omega}_0,\ \tau_2=\tau_{20}} \neq 0. (46)

Proof —

Similar to Lemma 2, by utilizing the implicit function theorem and differentiating (30) with respect to τ_2, we get

0 = q_1'(s)\frac{\mathrm{d}s}{\mathrm{d}\tau_2} + \Big[q_2'(s)\frac{\mathrm{d}s}{\mathrm{d}\tau_2} - q_2(s)\Big(\tau_2\frac{\mathrm{d}s}{\mathrm{d}\tau_2} + s\Big)\Big]e^{-s\tau_2},
which gives
\frac{\mathrm{d}s}{\mathrm{d}\tau_2} = \frac{\beta(s)}{\alpha(s)}, (47)

where

\beta(s) = s\,q_2(s)e^{-s\tau_2},
\alpha(s) = q_1'(s) + \big[q_2'(s) - \tau_2 q_2(s)\big]e^{-s\tau_2}. (48)

We further suppose that α_1 and α_2 are the real and imaginary parts of α(s), respectively, and β_1 and β_2 are the real and imaginary parts of β(s), respectively; then we get

\mathrm{Re}\!\left[\frac{\mathrm{d}s}{\mathrm{d}\tau_2}\right]\Bigg|_{\tilde{\omega}=\tilde{\omega}_0,\ \tau_2=\tau_{20}} = \frac{\alpha_1\beta_1 + \alpha_2\beta_2}{\alpha_1^2 + \alpha_2^2}. (49)

As a direct consequence of (C5), we can conclude that the transversality condition is satisfied. Then the proof of Lemma 3 is complete.

Based on the above analysis, the following conclusions can be drawn.

Theorem 2 . —

By assuming that assumptions (C1), (C4), and (C5) are valid, the following conclusions can be inferred:

  1. The zero equilibrium point of the fractional-order four-neuron recurrent neural network with multiple delays (6) is asymptotically stable when τ_2 ∈ [0, τ_{20})

  2. The fractional order four-neuron recurrent neural network with multiple delays (6) experiences a Hopf bifurcation at its origin when τ2=τ20; that is, a family of periodic solutions can bifurcate from the zero equilibrium point near τ2=τ20

5. Numerical Examples

To demonstrate the validity and feasibility of the conclusions of this paper, we provide two examples. The simulations are based on the Adams–Bashforth–Moulton predictor-corrector scheme [53] with step size h=0.01.
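The following Python sketch outlines one possible implementation of such a delayed fractional predictor-corrector scheme for system (6). It assumes a constant initial history equal to the initial values and delays that are integer multiples of the step size; the default parameter values match Example 1 below. It is an illustrative re-implementation, not the authors' original code.

```python
# A sketch of the Adams-Bashforth-Moulton predictor-corrector scheme [53] for the
# delayed fractional-order system (6), assuming a constant initial history and
# delays that are (approximately) integer multiples of the step size h.
import numpy as np
from math import gamma

def simulate(phi=0.9, w=(2.0, -2.0, -2.0), tau1=0.25, tau2=0.6,
             x0=(0.15, -0.14, 0.1, 0.2), h=0.01, T=60.0):
    f = np.tanh                                    # activation function f(.) = tanh(.)
    N = int(T / h)
    k1, k2 = int(round(tau1 / h)), int(round(tau2 / h))   # delays in grid steps (>= 1)
    x = np.zeros((N + 1, 4))
    x[0] = x0
    rhs = np.zeros((N + 1, 4))                     # stored vector field values

    def delayed(j, k):                             # x(t_j - tau); constant history for t < 0
        return x[j - k] if j - k >= 0 else x[0]

    def field(xc, xd1, xd2):                       # right-hand side of system (6)
        w1, w2, w3 = w
        return np.array([
            -xc[0] + f(xd1[1]),
            -xc[1] + f(xd1[2]),
            -xc[2] + f(xd1[3]),
            -xc[3] + w1 * f(xd2[0]) + w2 * f(xd2[1]) + w3 * f(xd2[2]),
        ])

    rhs[0] = field(x[0], delayed(0, k1), delayed(0, k2))
    for n in range(N):
        j = np.arange(n + 1)
        # predictor weights b_{j,n+1} and corrector weights a_{j,n+1}
        b = (n + 1 - j) ** phi - (n - j) ** phi
        a = (n - j + 2) ** (phi + 1) + (n - j) ** (phi + 1) - 2 * (n - j + 1) ** (phi + 1)
        a[0] = n ** (phi + 1) - (n - phi) * (n + 1) ** phi
        xp = x[0] + h ** phi / gamma(phi + 1) * b @ rhs[: n + 1]          # predictor
        fp = field(xp, delayed(n + 1, k1), delayed(n + 1, k2))
        x[n + 1] = x[0] + h ** phi / gamma(phi + 2) * (fp + a @ rhs[: n + 1])  # corrector
        rhs[n + 1] = field(x[n + 1], delayed(n + 1, k1), delayed(n + 1, k2))
    return np.arange(N + 1) * h, x

t, x = simulate(tau1=0.25)       # tau1 < tau10: trajectories are expected to settle to zero
print("final state:", np.round(x[-1], 5))
```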

5.1. Example 1

Consider the following four-neuron fractional-order recurrent neural network with multiple delays:

D^{\phi}x_1(t) = -x_1(t) + f(x_2(t-\tau_1)),
D^{\phi}x_2(t) = -x_2(t) + f(x_3(t-\tau_1)),
D^{\phi}x_3(t) = -x_3(t) + f(x_4(t-\tau_1)),
D^{\phi}x_4(t) = -x_4(t) + \omega_1 f(x_1(t-\tau_2)) + \omega_2 f(x_2(t-\tau_2)) + \omega_3 f(x_3(t-\tau_2)). (50)

Choose the parameters ϕ=0.9, ω_1=2, ω_2=ω_3=−2, and the activation function f(·)=tanh(·); therefore, f(0)=tanh(0)=0 and f′(0)=1.

Let the initial values of system (50) be (x_1(0), x_2(0), x_3(0), x_4(0))=(0.15, −0.14, 0.1, 0.2). First, fixing τ_2=0.6, a direct computation gives ω_{10}=5.23599 and then τ_{10}=0.312709. It is easy to verify that the conditions of Theorem 1 are satisfied. The numerical simulations in Figures 1 and 2 show that the zero equilibrium point of system (50) is locally asymptotically stable when τ_1=0.25 < τ_{10}=0.312709. Moreover, Figures 3 and 4 show that the zero equilibrium point of system (50) is unstable and a Hopf bifurcation occurs when τ_1=0.36 > τ_{10}=0.312709. The bifurcation diagram is plotted in Figure 5, which illustrates the theoretical results.
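The critical pair (ω_{10}, τ_{10}) reported above can also be approximated directly from the characteristic equation (9). The sketch below (our own; the initial guess is chosen by hand) searches for a purely imaginary root s = iω with τ_2 = 0.6 fixed and the Example 1 parameters; for a reasonable starting point it should return values close to the ω_{10} ≈ 5.236 and τ_{10} ≈ 0.3127 quoted in the text.

```python
# A sketch for locating the critical pair (omega, tau1) of Example 1 numerically:
# we look for a purely imaginary root s = i*omega of the characteristic equation (9)
# with tau2 = 0.6 fixed, using fsolve on its real and imaginary parts.
import numpy as np
from scipy.optimize import fsolve

phi, tau2 = 0.9, 0.6
m1 = m2 = m3 = 1.0                  # f'(0) = 1
m4, m5, m6 = 2.0, -2.0, -2.0        # omega_1, omega_2, omega_3 from Example 1

def char_fun(s, tau1):
    p = s ** phi + 1.0              # principal branch of s^phi for complex s
    return (p ** 4
            - m3 * m6 * p ** 2 * np.exp(-s * (tau1 + tau2))
            - m2 * m3 * m5 * p * np.exp(-s * (2 * tau1 + tau2))
            - m1 * m2 * m3 * m4 * np.exp(-s * (3 * tau1 + tau2)))

def residual(z):
    omega, tau1 = z
    val = char_fun(1j * omega, tau1)
    return [val.real, val.imag]

omega0, tau10 = fsolve(residual, x0=[5.0, 0.3])   # initial guess chosen by hand
print(f"omega_10 ~ {omega0:.5f}, tau_10 ~ {tau10:.6f}")
```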

Figure 1. Time responses of system (50) with ϕ=0.9, τ1=0.25 < τ10=0.312709.

Figure 2. Phase diagrams of system (50) with ϕ=0.9, τ1=0.25 < τ10=0.312709.

Figure 3. Time responses of system (50) with ϕ=0.9, τ1=0.36 > τ10=0.312709.

Figure 4. Phase diagrams of system (50) with ϕ=0.9, τ1=0.36 > τ10=0.312709.

Figure 5. Bifurcation diagram of system (50) with ϕ=0.9, τ1=0.36 > τ10=0.312709.

5.2. Example 2

Similarly to Example 1, let ϕ=0.95, and consider the following four-neuron fractional-order recurrent neural network with two different delays:

D^{0.95}x_1(t) = -x_1(t) + f(x_2(t-\tau_1)),
D^{0.95}x_2(t) = -x_2(t) + f(x_3(t-\tau_1)),
D^{0.95}x_3(t) = -x_3(t) + f(x_4(t-\tau_1)),
D^{0.95}x_4(t) = -x_4(t) + \omega_1 f(x_1(t-\tau_2)) + \omega_2 f(x_2(t-\tau_2)) + \omega_3 f(x_3(t-\tau_2)). (51)

Take ω_1 = 1, ω_2 = ω_3 = −1.5, ϕ = 0.95, and the activation function f(·) = tanh(·), so that f(0) = tanh(0) = 0 and f′(0) = 1. We first fix τ_1 = 0.8; a direct computation then gives ω̃_{20}=1.02089 and τ_{20} = 0.329454. Thus, Theorem 2 yields that the zero solution (0,0,0,0) of system (51) is locally asymptotically stable when τ_2 = 0.22 < τ_{20}, as simulated in Figures 6 and 7. In addition, the zero equilibrium point of system (51) is unstable and a Hopf bifurcation occurs when τ_2 = 0.38 > τ_{20}, as shown in Figures 8 and 9. Moreover, the bifurcation diagram is plotted in Figure 10, which illustrates the theoretical results.

Figure 6. Time responses of system (51) with ϕ=0.95, τ2=0.22 < τ20=0.329454.

Figure 7. Phase diagrams of system (51) with ϕ=0.95, τ2=0.22 < τ20=0.329454.

Figure 8. Time responses of system (51) with ϕ=0.95, τ2=0.38 > τ20=0.329454.

Figure 9. Phase diagrams of system (51) with ϕ=0.95, τ2=0.38 > τ20=0.329454.

Figure 10. Bifurcation diagrams of system (51) with ϕ=0.95, τ2=0.38 > τ20=0.329454.

Remark 3 . —

In fact, to better reflect the influence of the different time delays on the bifurcation points of systems (50) and (51), the corresponding bifurcation points τ_{10} and τ_{20} can be recomputed as the order ϕ is varied. This means that, for certain fixed fractional orders ϕ, systems (50) and (51) with two different delays are prone to an earlier Hopf bifurcation.

6. Conclusion

This paper has examined the Hopf bifurcation problem of a fractional-order recurrent neural network with four neurons and two delays. Using the time delays as bifurcation parameters, several criteria have been established to ensure the Hopf bifurcation of the fractional-order four-neuron recurrent neural network. Based on our analysis, different communication time delays and the fractional order quantitatively change the dynamic behavior of system (6). These results contribute to the understanding of delayed fractional-order recurrent neural networks as a continuation of previous work. The theoretical results are illustrated by two numerical examples.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 11971421, the Yunnan Fundamental Research Projects under Grant 202201AU070170, and the Yunnan Provincial Department of Education Science Research Fund Project under Grants 2022J0480, 2022J0477, and 2022Y489.

Appendix

A

A_1 = \omega^{4\phi}\cos 2\pi\phi + 4\omega^{3\phi}\cos\frac{3\pi\phi}{2} + 6\omega^{2\phi}\cos\pi\phi + 4\omega^{\phi}\cos\frac{\pi\phi}{2} + 1,
B_1 = \omega^{4\phi}\sin 2\pi\phi + 4\omega^{3\phi}\sin\frac{3\pi\phi}{2} + 6\omega^{2\phi}\sin\pi\phi + 4\omega^{\phi}\sin\frac{\pi\phi}{2},
A_2 = -m_3 m_6\Big[\big(\omega^{2\phi}\cos\pi\phi + 2\omega^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\cos\omega\tau_2 + \big(\omega^{2\phi}\sin\pi\phi + 2\omega^{\phi}\sin\frac{\pi\phi}{2}\big)\sin\omega\tau_2\Big],
B_2 = -m_3 m_6\Big[\big(\omega^{2\phi}\sin\pi\phi + 2\omega^{\phi}\sin\frac{\pi\phi}{2}\big)\cos\omega\tau_2 - \big(\omega^{2\phi}\cos\pi\phi + 2\omega^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\sin\omega\tau_2\Big],
A_3 = -m_2 m_3 m_5\Big[\big(\omega^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\cos\omega\tau_2 + \omega^{\phi}\sin\frac{\pi\phi}{2}\sin\omega\tau_2\Big],
B_3 = -m_2 m_3 m_5\Big[\omega^{\phi}\sin\frac{\pi\phi}{2}\cos\omega\tau_2 - \big(\omega^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\sin\omega\tau_2\Big],
A_4 = -m_1 m_2 m_3 m_4\cos\omega\tau_2, \qquad B_4 = m_1 m_2 m_3 m_4\sin\omega\tau_2.
By Cramer's rule applied to system (13) with the unknowns (cos 2ωτ_1, sin 2ωτ_1, cos ωτ_1, sin ωτ_1),
F_{11}(\omega) = F_{21}(\omega) = \det\begin{pmatrix} A_1 & -B_1 & A_2+A_4 & B_4-B_2 \\ B_1 & A_1 & B_2+B_4 & A_2-A_4 \\ A_4 & B_4 & A_1+A_3 & B_3-B_1 \\ B_4 & -A_4 & B_1+B_3 & A_1-A_3 \end{pmatrix},
and F_{12}(\omega) and F_{22}(\omega) are the determinants obtained by replacing the third and fourth columns of this matrix, respectively, with the right-hand side vector (-A_3, -B_3, -A_2, -B_2)^{T}. (A.1)

B

a_1 = \tilde{\omega}^{4\phi}\cos 2\pi\phi + 4\tilde{\omega}^{3\phi}\cos\frac{3\pi\phi}{2} + 6\tilde{\omega}^{2\phi}\cos\pi\phi + 4\tilde{\omega}^{\phi}\cos\frac{\pi\phi}{2} + 1,
b_1 = \tilde{\omega}^{4\phi}\sin 2\pi\phi + 4\tilde{\omega}^{3\phi}\sin\frac{3\pi\phi}{2} + 6\tilde{\omega}^{2\phi}\sin\pi\phi + 4\tilde{\omega}^{\phi}\sin\frac{\pi\phi}{2},
a_2 = -m_3 m_6\Big[\big(\tilde{\omega}^{2\phi}\cos\pi\phi + 2\tilde{\omega}^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\cos\tilde{\omega}\tau_1 + \big(\tilde{\omega}^{2\phi}\sin\pi\phi + 2\tilde{\omega}^{\phi}\sin\frac{\pi\phi}{2}\big)\sin\tilde{\omega}\tau_1\Big] - m_2 m_3 m_5\Big[\big(\tilde{\omega}^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\cos 2\tilde{\omega}\tau_1 + \tilde{\omega}^{\phi}\sin\frac{\pi\phi}{2}\sin 2\tilde{\omega}\tau_1\Big] - m_1 m_2 m_3 m_4\cos 3\tilde{\omega}\tau_1,
b_2 = -m_3 m_6\Big[\big(\tilde{\omega}^{2\phi}\sin\pi\phi + 2\tilde{\omega}^{\phi}\sin\frac{\pi\phi}{2}\big)\cos\tilde{\omega}\tau_1 - \big(\tilde{\omega}^{2\phi}\cos\pi\phi + 2\tilde{\omega}^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\sin\tilde{\omega}\tau_1\Big] - m_2 m_3 m_5\Big[\tilde{\omega}^{\phi}\sin\frac{\pi\phi}{2}\cos 2\tilde{\omega}\tau_1 - \big(\tilde{\omega}^{\phi}\cos\frac{\pi\phi}{2} + 1\big)\sin 2\tilde{\omega}\tau_1\Big] + m_1 m_2 m_3 m_4\sin 3\tilde{\omega}\tau_1. (B.1)

Data Availability

Data sharing is not applicable to this article as no datasets were generated or analysed during the current study.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  • 1.Hopfield J. J. Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences . 1984;81(10):3088–3092. doi: 10.1073/pnas.81.10.3088. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Williams R. J., Zipser D. A Learning algorithm for continually running fully recurrent neural networks. Neural Computation . 1989;1(2):270–280. doi: 10.1162/neco.1989.1.2.270. [DOI] [Google Scholar]
  • 3.Angeline P. J., Saunders G. M., Pollack J. B. An evolutionary algorithm that constructs recurrent neural networks. IEEE Transactions on Neural Networks . 1994;5(1):54–65. doi: 10.1109/72.265960. [DOI] [PubMed] [Google Scholar]
  • 4.Wong K. F. A recurrent network mechanism of time integration in perceptual decisions. Journal of Neuroscience . 2006;26(4):1314–1328. doi: 10.1523/jneurosci.3733-05.2006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Wang Z. D., Liu Y. R., Yu L., Liu X. H. Exponential stability of delayed recurrent neural networks with Markovian jumping parameters. Physics Letters A . 2006;356(4-5):346–352. doi: 10.1016/j.physleta.2006.03.078. [DOI] [Google Scholar]
  • 6.Kobayashi M. Hyperbolic Hopfield neural networks with directional multistate activation function. Neurocomputing . 2018;275:2217–2226. doi: 10.1016/j.neucom.2017.10.053. [DOI] [Google Scholar]
  • 7.Civalleri P. P., Gilli M., Pandolfi L. On stability of cellular neural networks with delay. IEEE Transactions on Circuits and Systems I Fundamental Theory and Applications . 1993;40(3):157–165. doi: 10.1109/81.222796. [DOI] [Google Scholar]
  • 8.Wan L., Zhou Q. H., Liu J. Delay-dependent attractor analysis of hopfield neural networks with time-varying delays. Chaos, Solitons & Fractals . 2017;101:68–72. doi: 10.1016/j.chaos.2017.05.017. [DOI] [Google Scholar]
  • 9.Rech P. C. Chaos and hyperchaos in a Hopfield neural network. Neurocomputing . 2011;74(17):3361–3364. doi: 10.1016/j.neucom.2011.05.016. [DOI] [Google Scholar]
  • 10.Cao J. D., Wang J. Global asymptotic stability of a general class of recurrent neural networks with time-varying delays. IEEE Transactions on Circuits and Systems I Fundamental Theory and Applications . 2003;50(1):34–44. doi: 10.1109/tcsi.2002.807494. [DOI] [Google Scholar]
  • 11.Cheng C. J., Liao T. L., Yan J. J., Hwang C. C. Exponential synchronization of a class of neural networks with time-varying delays. IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics . 2006;36(1):209–215. doi: 10.1109/TSMCB.2005.856144. [DOI] [PubMed] [Google Scholar]
  • 12.Zhang H., Zeng Z. G. Synchronization of recurrent neural networks with unbounded delays and time-varying coefficients via generalized differential inequalities. Neural Networks . 2021;143:161–170. doi: 10.1016/j.neunet.2021.05.022. [DOI] [PubMed] [Google Scholar]
  • 13.Coolen A. C. C., Del Prete V. Statistical mechanics beyond the Hopfield model: solvable problems in neural network theory. Reviews in the Neurosciences . 2003;14(1-2):181–193. doi: 10.1515/revneuro.2003.14.1-2.181. [DOI] [PubMed] [Google Scholar]
  • 14.Liao X. F., Wong K. W., Wu Z. Bifurcation analysis on a two-neuron system with distributed delays. Physica D: Nonlinear Phenomena . 2001;149(1-2):123–141. doi: 10.1016/s0167-2789(00)00197-4. [DOI] [Google Scholar]
  • 15.Gu K., Kharitonov V. L., Jie C. Stability of Time-Delay Systems . Switzerland: Birkhuser Boston; 2003. [Google Scholar]
  • 16.Biggio M., Storace M., Mattia M. Non-instantaneous synaptic transmission in spiking neuron networks and equivalence with delay distribution. BMC Neuroscience . 2013;14(S1):2677–P271. doi: 10.1186/1471-2202-14-s1-p267. [DOI] [Google Scholar]
  • 17.Aouiti C. Oscillation of impulsive neutral delay generalized high-order Hopfield neural networks. Neural Computing & Applications . 2018;29(9):477–495. doi: 10.1007/s00521-016-2558-3. [DOI] [Google Scholar]
  • 18.Smith K., Wang L. Chaos in the discretized analog Hopfield neural network and potential applications to optimization. Protein Science: A Publication of the Protein Society . 1998;2(2):1224–1231. [Google Scholar]
  • 19.Xia Y. H., Romanovski V. G. Bifurcation analysis of a population dynamics in a critical state. Bulletin of the Malaysian Mathematical Sciences Society . 2015;38(2):499–527. doi: 10.1007/s40840-014-0033-9. [DOI] [Google Scholar]
  • 20.Wang J. M., Liu F. Q., Qin S. T. Exponential stabilization of memristor-based recurrent neural networks with disturbance and mixed time delays via periodically intermittent control. International Journal of Control, Automation and Systems . 2021;19(6):2284–2296. doi: 10.1007/s12555-020-0083-8. [DOI] [Google Scholar]
  • 21.Zhou L. Delay-dependent and delay-independent passivity of a class of recurrent neural networks with impulse and multi-proportional delays. Neurocomputing . 2021;308(25):235–244. [Google Scholar]
  • 22.Zhou L. Q., Zhao Z. X. Exponential synchronization and polynomial synchronization of recurrent neural networks with and without proportional delays. Neurocomputing . 2020;372(1):109–116. doi: 10.1016/j.neucom.2019.09.046. [DOI] [Google Scholar]
  • 23.Cao J. D., Wang J. Global Asymptotic and robust stability of recurrent neural networks with time delays. IEEE Transactions on Circuits and Systems I: Regular Papers . 2005;52(2):417–426. doi: 10.1109/tcsi.2004.841574. [DOI] [Google Scholar]
  • 24.Zhang W., Li C. D., Huang T. W. Global robust stability of complex-valued recurrent neural networks with time-delays and uncertainties. International Journal of Biomathematics . 2014;07(02) doi: 10.1142/s1793524514500168.1450016 [DOI] [Google Scholar]
  • 25.Berezansky L., Braverman E. Exponential stability for systems of delay differential equations with block matrices. Applied Mathematics Letters . 2021;121(2) doi: 10.1016/j.aml.2021.107364.107364 [DOI] [Google Scholar]
  • 26.Lv Y. Y., Chen L. J., Chen F. D., Li Z. Stability and bifurcation in an SI epidemic model with additive Allee effect and time delay. International Journal of Bifurcation and Chaos . 2021;31(04) doi: 10.1142/s0218127421500607.2150060 [DOI] [Google Scholar]
  • 27.Chen W. H., Guan Z. H., Lu X. Delay-dependent exponential stability of uncertain stochastic systems with multiple delays: an LMI approach. Systems & Control Letters . 2005;54(6):547–555. doi: 10.1016/j.sysconle.2004.10.005. [DOI] [Google Scholar]
  • 28.Zhang G. D., Shen Y., Yin Q., Sun J. W. Global exponential periodicity and stability of a class of memristor-based recurrent neural networks with multiple delays. Information Sciences . 2013;232:386–396. doi: 10.1016/j.ins.2012.11.023. [DOI] [Google Scholar]
  • 29.Qiu F., Cui B. T., Wu W. Global exponential stability of high order recurrent neural network with time-varying delays. Applied Mathematical Modelling . 2009;33(1):198–210. doi: 10.1016/j.apm.2007.10.021. [DOI] [Google Scholar]
  • 30.Wang Y. J., Yang C. L., Zuo Z. Q. On exponential stability analysis for neural networks with time-varying delays and general activation functions. Communications in Nonlinear Science and Numerical Simulation . 2012;17(3):1447–1459. doi: 10.1016/j.cnsns.2011.08.016. [DOI] [Google Scholar]
  • 31.Zhang Z. Z., Yang H. Z. Hopf bifurcation analysis for a four-dimensional recurrent neural network with two delays. Journal of Applied Mathematics . 2013;2013:1–13. doi: 10.1155/2013/436254.436254 [DOI] [Google Scholar]
  • 32.Zhang F. H., Huang T. W., Wu Q. J., Zeng Z. G. Multistability of delayed fractional-order competitive neural networks. Neural Networks . 2021;140:325–335. doi: 10.1016/j.neunet.2021.03.036. [DOI] [PubMed] [Google Scholar]
  • 33.Lu J. X., Xue H. Adaptive synchronization for fractional stochastic neural network with delay. Advances in Difference Equations . 2021;2021:1–12. [Google Scholar]
  • 34.Yuan J., Huang C. D. Quantitative analysis in delayed fractional-order neural networks. Neural Processing Letters . 2019;51(2):1631–1651. doi: 10.1007/s11063-019-10161-2. [DOI] [Google Scholar]
  • 35.Udhayakumar K., Rajan R. Hopf bifurcation of a fractional-order octonion-valued neural networks with time delays. Discrete & Continuous Dynamical Systems - S . 2020;13(9):2537–2559. doi: 10.3934/dcdss.2020137. [DOI] [Google Scholar]
  • 36.Xiao M., Zheng W. X., Jiang G., Cao J. D. Stability and bifurcation of delayed fractional-order dual congestion control algorithms. IEEE Transactions on Automatic Control . 2017;62(9):4819–4826. doi: 10.1109/tac.2017.2688583. [DOI] [Google Scholar]
  • 37.Xiao M., Zheng W. X., Jiang G. P., Cao J. D. Undamped oscillations generated by Hopf bifurcations in fractional-order recurrent neural networks with caputo derivative. IEEE Transactions on Neural Networks and Learning Systems . 2015;26(12):3201–3214. doi: 10.1109/tnnls.2015.2425734. [DOI] [PubMed] [Google Scholar]
  • 38.Huang C. D., Meng Y., Cao J. D., Alsaedi A., Alsaadi F. E. New bifurcation results for fractional BAM neural network with leakage delay. Chaos, Solitons & Fractals . 2017;100:31–44. doi: 10.1016/j.chaos.2017.04.037. [DOI] [Google Scholar]
  • 39.Xiao M., Jiang G., Cao J. D., Zheng W. Local bifurcation analysis of a delayed fractional-order dynamic model of dual congestion control algorithms. IEEE/CAA Journal of Automatica Sinica . 2017;4(2):361–369. doi: 10.1109/jas.2016.7510151. [DOI] [Google Scholar]
  • 40.Cao Y., Li Y., Ren W., Chen Y. Distributed coordination of networked fractional-order systems. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) . 2010;40(2):362–370. doi: 10.1109/tsmcb.2009.2024647. [DOI] [PubMed] [Google Scholar]
  • 41.Wang H., Yu Y., Wen G., Zhang S. Stability analysis of fractional-order neural networks with time delay. Neural Processing Letters . 2015;42(2):479–500. doi: 10.1007/s11063-014-9368-3. [DOI] [Google Scholar]
  • 42.Huang C. D., Cao J. D., Xiao M. Hybrid control on bifurcation for a delayed fractional gene regulatory network. Chaos, Solitons & Fractals . 2016;87:19–29. doi: 10.1016/j.chaos.2016.02.036. [DOI] [Google Scholar]
  • 43.Sun Q., Xiao M., Tao B., et al. Hopf bifurcation analysis in a fractional-order survival red blood cells model and PDα control. Advances in Differential Equations . 2018;2018:1–10. [Google Scholar]
  • 44.Xu C. J., Liu Z. X., Liao M. X., Li P. H., Xiao Q. M., Yuan S. Fractional-order bidirectional associate memory (BAM) neural networks with multiple delays: the case of Hopf bifurcation. Mathematics and Computers in Simulation . 2021;182:471–494. doi: 10.1016/j.matcom.2020.11.023. [DOI] [Google Scholar]
  • 45.Cao H., Yan D., Xu X. Hopf bifurcation for an SIR model with age structure. Mathematical Modelling of Natural Phenomena . 2021;16:p. 7. doi: 10.1051/mmnp/2021003. [DOI] [Google Scholar]
  • 46.Kaslik E., Rădulescu I. R. Stability and bifurcations in fractional-order gene regulatory networks. Applied Mathematics and Computation . 2022;421 doi: 10.1016/j.amc.2022.126916.126916 [DOI] [Google Scholar]
  • 47.Yu F., Wang Y. S. Hopf bifurcation and Bautin bifurcation in a prey-predator model with prey’s fear cost and variable predator search speed. Mathematics and Computers in Simulation . 2022;196:192–209. doi: 10.1016/j.matcom.2022.01.026. [DOI] [Google Scholar]
  • 48.Soresina C. Hopf Bifurcations in the Full SKT Model and where to Find Them. 2022. https://arxiv.org/abs/2202.04168 .
  • 49.Li Z. H., Huang C. D., Zhang Y. Comparative analysis on bifurcation of four-neuron fractional ring networks without or with leakage delays. Advances in Differential Equations . 2019;179:1–22. [Google Scholar]
  • 50.Podlubny I. Fractional Differential Equations . New York, NY, USA: Academic Press; 1999. [Google Scholar]
  • 51.Matignon D. Stability results for fractional differential equations with applications to control processing. IEEE-SMC Proceedings, Lille, France . 1996;2:963–968. [Google Scholar]
  • 52.Deng W., Li C., Lü J. Stability analysis of linear fractional differential system with multiple time delays. Nonlinear Dynamics . 2007;48(4):409–416. doi: 10.1007/s11071-006-9094-0. [DOI] [Google Scholar]
  • 53.Bhalekar S., Varsha D. A predictor-corrector scheme for solving nonlinear delay differential equations of fractional order. Fractional Calculus and Applied Analysis . 2011;1:1–9. [Google Scholar]
