Entropy
. 2022 Oct 29;24(11):1555. doi: 10.3390/e24111555

Dynamics of Hopfield-Type Neural Networks with Modulo Periodic Unpredictable Synaptic Connections, Rates and Inputs

Marat Akhmet 1,*, Madina Tleubergenova 2,3, Akylbek Zhamanshin 1,2
Editors: Carla MA Pinto, Julio Rebelo, Helena Reis
PMCID: PMC9689789  PMID: 36359644

Abstract

In this paper, we rigorously prove that unpredictable oscillations take place in the dynamics of Hopfield-type neural networks (HNNs) when the synaptic connections, rates and external inputs are modulo periodic unpredictable. The synaptic connections, rates and inputs are synchronized to obtain the convergence of outputs on compact subsets of the real axis. The existence, uniqueness and exponential stability of such motions are discussed. The method of included intervals and the contraction mapping principle are applied to attain the theoretical results. In addition to the analysis, we provide strong simulation support, confirming that all the assumed conditions are satisfied. It is shown how a new parameter, the degree of periodicity, affects the dynamics of the neural network.

Keywords: Hopfield-type neural networks, modulo periodic unpredictable synaptic connections, rates and inputs, unpredictable solutions, exponential stability, numerical simulations

1. Introduction

It is well known that HNNs [1,2] are widely used in the fields of signal and image processing, pattern recognition, associative memory and optimization computation, among others [3,4,5,6,7,8]. Hence, they have been the object of intensive analysis by numerous authors in recent decades. With the ongoing development of neural networks, the aforementioned systems are being modernized, and the dynamics of models with various types of coefficients are being investigated [9,10,11,12,13]. Special attention is being paid to the problem of the existence and stability of periodic and almost periodic solutions of HNNs [14,15,16,17,18,19,20,21], for which appropriate coefficients and conditions are necessary.

A few years ago, the boundaries of the classical theory of dynamical systems, founded by H. Poincare [22] and G. Birkhoff [23], were expanded by the concepts of unpredictable points and unpredictable functions [24]. It was proven that the existence of an unpredictable point implies chaos in a quasi-minimal set; that is, a proof of unpredictability simultaneously confirms Poincare chaos. Unpredictable functions were defined as unpredictable points of the Bebutov dynamical system [25], where the topology of convergence on compact sets of the real axis is used instead of a metric. The use of such convergence significantly simplifies the problem of proving the existence of unpredictable solutions for differential equations and neural networks, and a new method of included intervals has been introduced and developed in several papers [26,27,28,29,30,31].

Let us commence with the main definitions.

Definition 1

([25]). A uniformly continuous and bounded function $\psi:\mathbb{R}\to\mathbb{R}$ is unpredictable if there exist positive numbers $\epsilon_0$, $\delta$ and sequences $t_n$, $s_n$, both of which diverge to infinity, such that $|\psi(t+t_n)-\psi(t)|\to 0$ as $n\to\infty$ uniformly on compact subsets of $\mathbb{R}$ and $|\psi(t+t_n)-\psi(t)|>\epsilon_0$ for each $t\in[s_n-\delta,s_n+\delta]$ and $n\in\mathbb{N}$.

In Definition 1, the sequences $t_n$, $s_n$, $n=1,2,\dots$, are called the convergence and divergence sequences of the function $\psi(t)$, respectively. The uniform convergence on compact subsets of $\mathbb{R}$ is called the convergence property, and the existence of the sequence $s_n$ and the positive numbers $\epsilon_0$, $\delta$ is called the separation property. It is known [32] that a function with the convergence property but without the separation property is said to be a Poisson stable function.

Let us introduce a new type of unpredictable functions, which are important objects for investigation in the paper.

Definition 2.

The sum $\phi(t)+\psi(t)$ is said to be a modulo periodic unpredictable function if $\phi(t)$ is a continuous periodic function and $\psi(t)$ is an unpredictable function.

In this study, we focus on the Hopfield-type neural network with two-component coefficients and inputs:

$$x_i'(t)=-(a_i(t)+b_i(t))x_i(t)+\sum_{j=1}^{p}(c_{ij}(t)+d_{ij}(t))f_j(x_j(t))+u_i(t)+v_i(t),\quad i=1,2,\dots,p, \tag{1}$$

where $x_i(t)$ stands for the state of the $i$th unit at time $t$. The synaptic connections, rates and external inputs are modulo periodic unpredictable; they consist of two components such that $a_i(t)$, $c_{ij}(t)$, $u_i(t)$ are periodic and $b_i(t)$, $d_{ij}(t)$, $v_i(t)$ are unpredictable. The functions $c_{ij}(t)$ and $d_{ij}(t)$ denote the components of the synaptic connection weight of the $j$th unit on the $i$th unit at time $t$, and $f_j(x_j(t))$ denotes the measure of activation of the $j$th unit in response to its incoming potentials at time $t$.

Consider the convergence sequence $t_n$ of the unpredictable function $\psi(t)$. For a fixed real number $\omega>0$, one can write $t_n\equiv\tau_n\pmod{\omega}$, where $0\le\tau_n<\omega$ for all $n\ge 1$. The boundedness of the sequence $\tau_n$ implies that there exists a subsequence $\tau_{n_l}$ which converges to a number $\tau_\omega$, $0\le\tau_\omega<\omega$. That is, there exist a subsequence $t_{n_l}$ of the convergence sequence $t_n$ and a number $\tau_\omega$ such that $t_{n_l}\to\tau_\omega\pmod{\omega}$ as $l\to\infty$. We call the number $\tau_\omega$ the Poisson shift for the convergence sequence $t_n$ with respect to $\omega$ [33]. Denote by $T_\omega$ the set of all Poisson shifts. The number $\kappa_\omega=\inf T_\omega$, $0\le\kappa_\omega<\omega$, is said to be the Poisson number for the convergence sequence $t_n$. If $\kappa_\omega=0$, then we say that the sequence $t_n$ satisfies the kappa property.
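For a concrete convergence sequence, the kappa property can be checked numerically. The sketch below is our own illustration, not part of the paper; the choices $t_n=nh$ with $h=4\pi$ and the period $\omega=\pi/2$ mirror Example 1 below. It computes the residues $\tau_n=t_n \bmod \omega$ and estimates their infimum:

```python
import numpy as np

# Convergence sequence t_n = n*h with step h = 4*pi, and period w = pi/2.
# Every t_n is an integer multiple of w, so all residues tau_n = t_n mod w
# vanish, and the Poisson number kappa_w = inf T_w is 0: the kappa property holds.
h = 4 * np.pi
w = np.pi / 2
t_n = h * np.arange(1, 1001)
tau_n = np.mod(t_n, w)                    # residues in [0, w)
# A residue close to w is also a shift near 0 modulo w.
dist_to_zero = np.minimum(tau_n, w - tau_n)
kappa_estimate = dist_to_zero.min()
print(bool(kappa_estimate < 1e-9))        # True: the kappa property is satisfied
```

For a period incommensurable with $h$ (say $\omega=1$ with $h=4\pi$), the residues fill $[0,\omega)$ densely, and the same estimate still tends to zero as more terms are taken, illustrating why the kappa property is a mild requirement.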

2. Methods

Due to the development of neural networks and their applications, classical classes of functions, such as periodic and almost periodic ones, are no longer sufficient for studying their dynamics. This is especially evident in the analysis of the chaotic behavior of the systems. Therefore, richer classes of functions are needed. To meet this demand, we have combined periodic and unpredictable components in the rates and inputs: the periodicity serves stability, while the unpredictability guarantees chaotic dynamics. According to Definition 1, verification of the convergence and separation properties is necessary to prove the existence of unpredictable solutions. To provide constructive conditions for the existence of unpredictable solutions, we have introduced the special kappa property of the convergence sequence $t_n$ with respect to the period $\omega$.

The method of included intervals, which was introduced in paper [26] and has been developed in [27,28,29,33], is a powerful instrument for verifying the convergence property. This technique has been applied in the study of continuous unpredictable solutions of Hopfield-type neural networks with delayed and advanced arguments [30] and in the study of discontinuous unpredictable solutions of impulsive neural networks with Hopfield structure [31]. All the previous models in [30,31] were considered with constant rates, while in the present research the rates are variable, and we have designed a new model of Hopfield-type neural networks with modulo periodic unpredictable rates $a_i(t)+b_i(t)$, connection weights $c_{ij}(t)+d_{ij}(t)$ and external inputs $u_i(t)+v_i(t)$. The periodic components $a_i(t)$ ensure the stability of the model, while the unpredictable components $b_i(t)$ and $v_i(t)$ cause chaotic outputs.

3. Main Results

Throughout the paper, we will use the norm $\|v\|=\max_{1\le i\le p}|v_i|$, where $|\cdot|$ is the absolute value, $v=(v_1,\dots,v_p)$ and $v_i\in\mathbb{R}$, $i=1,2,\dots,p$.

Following the results in [34], it can be shown that the function y(t)=(y1(t),y2(t),,yp(t)) is a solution of (1) if and only if it satisfies the following integral equation:

$$y_i(t)=\int_{-\infty}^{t}e^{-\int_s^t a_i(u)du}\Big[-b_i(s)y_i(s)+\sum_{j=1}^{p}(c_{ij}(s)+d_{ij}(s))f_j(y_j(s))+u_i(s)+v_i(s)\Big]ds, \tag{2}$$

for all i=1,,p.

Denote by $S$ the set of vector-functions $\varphi=(\varphi_1,\varphi_2,\dots,\varphi_p)$ whose coordinates $\varphi_i$, $i=1,2,\dots,p$, satisfy the convergence property with the common convergence sequence $t_n$ and, moreover, $|\varphi_i|<H$, $i=1,2,\dots,p$, where $H$ is a positive number. On the set $S$, we use the norm $\|\varphi(t)\|_0=\max_{i}|\varphi_i(t)|$.

Define on $S$ the operator $T$ such that $T\phi(t)=(T_1\phi(t),T_2\phi(t),\dots,T_p\phi(t))$, where:

$$T_i\phi(t)\equiv\int_{-\infty}^{t}e^{-\int_s^t a_i(u)du}\Big[-b_i(s)\phi_i(s)+\sum_{j=1}^{p}(c_{ij}(s)+d_{ij}(s))f_j(\phi_j(s))+u_i(s)+v_i(s)\Big]ds, \tag{3}$$

for each $i=1,2,\dots,p$. We will need the following conditions:

  • (C1)

    The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are continuous $\omega$-periodic, and $\int_0^{\omega}a_i(u)du>0$ for each $i,j=1,\dots,p$;

  • (C2)

    The functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$, $i,j=1,2,\dots,p$, are unpredictable with the same convergence and divergence sequences $t_n$, $s_n$. Moreover, $|v_i(t+t_n)-v_i(t)|>\epsilon_0$ for all $t\in[s_n-\delta,s_n+\delta]$, $i=1,2,\dots,p$, and positive numbers $\delta$, $\epsilon_0$;

  • (C3)

    The convergence sequence $t_n$ satisfies the kappa property with respect to the period $\omega$;

  • (C4)

    There exists a positive number $m_f$ such that $\sup_{|s|<H}|f(s)|=m_f$;

  • (C5)

    There exists a positive number $L$ such that the function $f(s)$ satisfies the inequality $|f(s_1)-f(s_2)|\le L|s_1-s_2|$ if $|s_1|<H$, $|s_2|<H$.

According to condition (C1), for all $i=1,\dots,p$, there exist numbers $K_i\ge 1$ and $\lambda_i>0$ such that

$$e^{-\int_s^t a_i(u)du}\le K_i e^{-\lambda_i(t-s)},\quad t\ge s. \tag{4}$$

For convenience, we introduce the following notations:

$$m_i^a=\sup_{t\in\mathbb{R}}|a_i(t)|,\quad m_i^b=\sup_{t\in\mathbb{R}}|b_i(t)|,\quad m_i^u=\sup_{t\in\mathbb{R}}|u_i(t)|,\quad m_i^v=\sup_{t\in\mathbb{R}}|v_i(t)|,\quad m_{ij}^c=\sup_{t\in\mathbb{R}}|c_{ij}(t)|,\quad m_{ij}^d=\sup_{t\in\mathbb{R}}|d_{ij}(t)|,$$

for each $i,j=1,2,\dots,p$.

The following conditions are required:

  • (C6)

    $\dfrac{K_i}{\lambda_i-K_im_i^b}\Big(\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big)<H$;

  • (C7)

    $K_i\Big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\Big)<\lambda_i$;

  • (C8)

    $Hm_i^b+\sum_{j=1}^{p}m_{ij}^dm_f<\dfrac{\epsilon_0}{4}$;

for all i,j=1,,p.

Lemma 1.

The set S is a complete space.

Proof. 

Consider a Cauchy sequence $\phi^k(t)$ in $S$, which converges to a limit function $\phi(t)$ on $\mathbb{R}$. Fix a closed and bounded interval $I\subset\mathbb{R}$. We obtain:

$$\|\phi(t+t_n)-\phi(t)\|\le\|\phi(t+t_n)-\phi^k(t+t_n)\|+\|\phi^k(t+t_n)-\phi^k(t)\|+\|\phi^k(t)-\phi(t)\|. \tag{5}$$

One can choose sufficiently large $n$ and $k$ such that each term on the right-hand side of (5) is smaller than $\epsilon/3$ for an arbitrary $\epsilon>0$ and $t\in I$. Thus, the sequence $\phi(t+t_n)$ converges uniformly to $\phi(t)$ on $I$. That is, the set $S$ is complete. □

Lemma 2.

The operator T is invariant in S.

Proof. 

For a function $\varphi(t)\in S$ and fixed $i=1,2,\dots,p$, we have that

$$\begin{aligned}|T_i\varphi(t)|&=\Big|\int_{-\infty}^{t}e^{-\int_s^t a_i(u)du}\Big[-b_i(s)\varphi_i(s)+\sum_{j=1}^{p}(c_{ij}(s)+d_{ij}(s))f_j(\varphi_j(s))+u_i(s)+v_i(s)\Big]ds\Big|\\ &\le\int_{-\infty}^{t}K_ie^{-\lambda_i(t-s)}\Big[|b_i(s)\varphi_i(s)|+\sum_{j=1}^{p}\big(|c_{ij}(s)|+|d_{ij}(s)|\big)|f_j(\varphi_j(s))|+|u_i(s)|+|v_i(s)|\Big]ds\\ &\le\frac{K_i}{\lambda_i}\Big[m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big].\end{aligned}$$

The last inequality and condition (C6) imply that $\|T\varphi\|_0<H$.

Next, applying the method of included intervals, we will show that $T\varphi(t+t_n)\to T\varphi(t)$ as $n\to\infty$ uniformly on compact subsets of $\mathbb{R}$.

Let us fix an arbitrary $\epsilon>0$ and an interval $[\alpha,\beta]$, $-\infty<\alpha<\beta<\infty$. There exist numbers $\gamma<\alpha$ and $\xi>0$ which satisfy the following inequalities:

$$\frac{K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big[m_i^bH+\frac{1}{4}\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)(LH+m_f)+m_i^u+m_i^v\Big]<\frac{\epsilon}{8}, \tag{6}$$

$$\frac{K_i}{\lambda_i}\big(e^{\xi(\beta-\gamma)}-1\big)\Big[m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big]<\frac{\epsilon}{4}, \tag{7}$$

and

$$\frac{K_i\xi}{\lambda_i}\Big[m_i^b+H+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L+2pm_f+2\Big]<\frac{\epsilon}{4}, \tag{8}$$

for all $i=1,2,\dots,p$.

Since the functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$, $i,j=1,2,\dots,p$, are unpredictable, $\varphi(t)$ belongs to $S$, and the convergence sequence $t_n$ is common to all of them and satisfies the kappa property, the following inequalities hold for sufficiently large $n$: $|b_i(t+t_n)-b_i(t)|<\xi$, $|d_{ij}(t+t_n)-d_{ij}(t)|<\xi$, $|v_i(t+t_n)-v_i(t)|<\xi$, $|\varphi_i(t+t_n)-\varphi_i(t)|<\xi$ for $t\in[\gamma,\beta]$. Moreover, applying condition (C3), one can attain that $|a_i(t+t_n)-a_i(t)|<\xi$, $|c_{ij}(t+t_n)-c_{ij}(t)|<\xi$ and $|u_i(t+t_n)-u_i(t)|<\xi$ for $t\in\mathbb{R}$, $i,j=1,2,\dots,p$. We have that:

$$\begin{aligned}|T_i\varphi(t+t_n)-T_i\varphi(t)|\le{}&\int_{-\infty}^{t}\Big|e^{-\int_s^t a_i(u+t_n)du}-e^{-\int_s^t a_i(u)du}\Big|\,\Big|{-b_i(s+t_n)\varphi_i(s+t_n)}\\ &+\sum_{j=1}^{p}(c_{ij}(s+t_n)+d_{ij}(s+t_n))f_j(\varphi_j(s+t_n))+u_i(s+t_n)+v_i(s+t_n)\Big|ds\\ &+\int_{-\infty}^{t}e^{-\int_s^t a_i(u)du}\Big|{-b_i(s+t_n)\varphi_i(s+t_n)}+b_i(s)\varphi_i(s)\\ &+\sum_{j=1}^{p}(c_{ij}(s+t_n)+d_{ij}(s+t_n))\big(f_j(\varphi_j(s+t_n))-f_j(\varphi_j(s))\big)\\ &+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)+d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\varphi_j(s))\\ &+u_i(s+t_n)+v_i(s+t_n)-u_i(s)-v_i(s)\Big|ds,\end{aligned}$$

for all $i=1,2,\dots,p$. Consider the two integrals in the last inequality separately, denoting by $I_1$ the part over $(-\infty,\gamma]$ and by $I_2$ the part over $(\gamma,t]$. By using inequalities (6)–(8), we obtain:

$$\begin{aligned}I_1={}&\int_{-\infty}^{\gamma}\Big|e^{-\int_s^t a_i(u+t_n)du}-e^{-\int_s^t a_i(u)du}\Big|\,\Big|{-b_i(s+t_n)\varphi_i(s+t_n)}+\sum_{j=1}^{p}(c_{ij}(s+t_n)+d_{ij}(s+t_n))f_j(\varphi_j(s+t_n))\\ &+u_i(s+t_n)+v_i(s+t_n)\Big|ds+\int_{-\infty}^{\gamma}e^{-\int_s^t a_i(u)du}\Big|{-b_i(s+t_n)\varphi_i(s+t_n)}+b_i(s)\varphi_i(s)\\ &+\sum_{j=1}^{p}(c_{ij}(s+t_n)+d_{ij}(s+t_n))\big(f_j(\varphi_j(s+t_n))-f_j(\varphi_j(s))\big)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)+d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\varphi_j(s))\\ &+u_i(s+t_n)+v_i(s+t_n)-u_i(s)-v_i(s)\Big|ds\\ \le{}&\int_{-\infty}^{\gamma}2K_ie^{-\lambda_i(t-s)}\Big[m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big]ds\\ &+\int_{-\infty}^{\gamma}K_ie^{-\lambda_i(t-s)}\Big[2m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)LH+2\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+2m_i^u+2m_i^v\Big]ds\\ \le{}&\frac{2K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big[m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big]+\frac{K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big[2m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)(LH+2m_f)+2m_i^u+2m_i^v\Big]\\ \le{}&\frac{4K_i}{\lambda_i}e^{-\lambda_i(\alpha-\gamma)}\Big[m_i^bH+\frac{1}{4}\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)(LH+m_f)+m_i^u+m_i^v\Big]<\frac{\epsilon}{2},\end{aligned}$$

and

$$\begin{aligned}I_2={}&\int_{\gamma}^{t}\Big|e^{-\int_s^t a_i(u+t_n)du}-e^{-\int_s^t a_i(u)du}\Big|\,\Big|{-b_i(s+t_n)\varphi_i(s+t_n)}+\sum_{j=1}^{p}(c_{ij}(s+t_n)+d_{ij}(s+t_n))f_j(\varphi_j(s+t_n))\\ &+u_i(s+t_n)+v_i(s+t_n)\Big|ds+\int_{\gamma}^{t}e^{-\int_s^t a_i(u)du}\Big|{-b_i(s+t_n)\varphi_i(s+t_n)}+b_i(s)\varphi_i(s)\\ &+\sum_{j=1}^{p}(c_{ij}(s+t_n)+d_{ij}(s+t_n))\big(f_j(\varphi_j(s+t_n))-f_j(\varphi_j(s))\big)+\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)+d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\varphi_j(s))\\ &+u_i(s+t_n)+v_i(s+t_n)-u_i(s)-v_i(s)\Big|ds\\ \le{}&\int_{\gamma}^{t}K_ie^{-\lambda_i(t-s)}\big(e^{\xi(\beta-\gamma)}-1\big)\Big[m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big]ds\\ &+\int_{\gamma}^{t}K_ie^{-\lambda_i(t-s)}\Big[(m_i^b+H)\xi+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\xi+2\xi pm_f+2\xi\Big]ds\\ \le{}&\frac{K_i}{\lambda_i}\big(e^{\xi(\beta-\gamma)}-1\big)\Big[m_i^bH+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)m_f+m_i^u+m_i^v\Big]+\frac{K_i}{\lambda_i}\Big[(m_i^b+H)\xi+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\xi+2\xi pm_f+2\xi\Big]\\ <{}&\frac{\epsilon}{4}+\frac{\epsilon}{4}=\frac{\epsilon}{2},\end{aligned}$$

for each $i=1,2,\dots,p$. Therefore, for all $t\in[\alpha,\beta]$ and $i=1,2,\dots,p$, we have that $|T_i\varphi(t+t_n)-T_i\varphi(t)|\le I_1+I_2<\epsilon$. So, the function $T\varphi(t+t_n)$ converges uniformly to $T\varphi(t)$ on compact subsets of $\mathbb{R}$, and it is true that $T:S\to S$. □

Lemma 3.

The operator $T$ is contractive in $S$, provided that conditions (C1)–(C7) are valid.

Proof. 

For two functions $\varphi,\psi\in S$ and fixed $i=1,2,\dots,p$, it is true that

$$\begin{aligned}|T_i\varphi(t)-T_i\psi(t)|&\le\int_{-\infty}^{t}e^{-\int_s^t a_i(u)du}\Big[|b_i(s)||\varphi_i(s)-\psi_i(s)|+\sum_{j=1}^{p}\big(|c_{ij}(s)|+|d_{ij}(s)|\big)|f_j(\varphi_j(s))-f_j(\psi_j(s))|\Big]ds\\ &\le\int_{-\infty}^{t}K_ie^{-\lambda_i(t-s)}\Big[m_i^b|\varphi_i(s)-\psi_i(s)|+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L|\varphi_j(s)-\psi_j(s)|\Big]ds\\ &\le\frac{K_i}{\lambda_i}\Big[m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\Big]\|\varphi-\psi\|_0.\end{aligned}$$

The last inequality yields $\|T\varphi(t)-T\psi(t)\|_0\le\max_i\frac{K_i}{\lambda_i}\big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\big)\|\varphi(t)-\psi(t)\|_0$. Hence, in accordance with condition (C7), the operator $T$ is contractive in $S$. □

Theorem 1.

The neural network (1) admits a unique exponentially stable unpredictable solution, provided that conditions (C1)–(C8) are fulfilled.

Proof. 

By Lemma 1, the set $S$ is complete; by Lemma 2, the operator $T$ is invariant in $S$; and by Lemma 3, the operator $T$ is contractive in $S$. Applying the contraction mapping principle, we obtain that there exists a fixed point $\omega\in S$ of the operator $T$, which is a solution of the neural network (1) and satisfies the convergence property.

Next, we show that the solution ω(t) of (1) satisfies the separation property.

Applying the relations

$$\begin{aligned}\omega_i(t)={}&\omega_i(s_n)-\int_{s_n}^{t}a_i(s)\omega_i(s)ds-\int_{s_n}^{t}b_i(s)\omega_i(s)ds+\int_{s_n}^{t}\sum_{j=1}^{p}c_{ij}(s)f_j(\omega_j(s))ds\\ &+\int_{s_n}^{t}\sum_{j=1}^{p}d_{ij}(s)f_j(\omega_j(s))ds+\int_{s_n}^{t}u_i(s)ds+\int_{s_n}^{t}v_i(s)ds\end{aligned}$$

and

$$\begin{aligned}\omega_i(t+t_n)={}&\omega_i(s_n+t_n)-\int_{s_n}^{t}a_i(s+t_n)\omega_i(s+t_n)ds-\int_{s_n}^{t}b_i(s+t_n)\omega_i(s+t_n)ds\\ &+\int_{s_n}^{t}\sum_{j=1}^{p}c_{ij}(s+t_n)f_j(\omega_j(s+t_n))ds+\int_{s_n}^{t}\sum_{j=1}^{p}d_{ij}(s+t_n)f_j(\omega_j(s+t_n))ds\\ &+\int_{s_n}^{t}u_i(s+t_n)ds+\int_{s_n}^{t}v_i(s+t_n)ds,\end{aligned}$$

we obtain:

$$\begin{aligned}\omega_i(t+t_n)-\omega_i(t)={}&\omega_i(s_n+t_n)-\omega_i(s_n)-\int_{s_n}^{t}a_i(s+t_n)\big(\omega_i(s+t_n)-\omega_i(s)\big)ds-\int_{s_n}^{t}\omega_i(s)\big(a_i(s+t_n)-a_i(s)\big)ds\\ &-\int_{s_n}^{t}b_i(s+t_n)\big(\omega_i(s+t_n)-\omega_i(s)\big)ds-\int_{s_n}^{t}\omega_i(s)\big(b_i(s+t_n)-b_i(s)\big)ds\\ &+\int_{s_n}^{t}\sum_{j=1}^{p}c_{ij}(s+t_n)\big(f_j(\omega_j(s+t_n))-f_j(\omega_j(s))\big)ds+\int_{s_n}^{t}\sum_{j=1}^{p}\big(c_{ij}(s+t_n)-c_{ij}(s)\big)f_j(\omega_j(s))ds\\ &+\int_{s_n}^{t}\sum_{j=1}^{p}d_{ij}(s+t_n)\big(f_j(\omega_j(s+t_n))-f_j(\omega_j(s))\big)ds+\int_{s_n}^{t}\sum_{j=1}^{p}\big(d_{ij}(s+t_n)-d_{ij}(s)\big)f_j(\omega_j(s))ds\\ &+\int_{s_n}^{t}\big(u_i(s+t_n)-u_i(s)\big)ds+\int_{s_n}^{t}\big(v_i(s+t_n)-v_i(s)\big)ds.\end{aligned}$$

There exist a positive number $\delta_1$ and integers $l$, $k$ such that, for each $i=1,2,\dots,p$, the following inequalities are satisfied:

$$\frac{6}{l}<\delta_1<\delta; \tag{9}$$

$$|a_i(t+s)-a_i(t)|<\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad t\in\mathbb{R},\ |s|<\delta_1; \tag{10}$$

$$|c_{ij}(t+s)-c_{ij}(t)|<\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad t\in\mathbb{R},\ |s|<\delta_1; \tag{11}$$

$$|u_i(t+s)-u_i(t)|<\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad t\in\mathbb{R},\ |s|<\delta_1; \tag{12}$$

$$\Big(m_i^a+H+m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L+1\Big)\Big(\frac{1}{l}+\frac{2}{k}\Big)<\frac{1}{4}; \tag{13}$$

$$|\omega_i(t+s)-\omega_i(t)|<\epsilon_0\min\Big(\frac{1}{k},\frac{1}{4l}\Big),\quad t\in\mathbb{R},\ |s|<\delta_1. \tag{14}$$

Let the numbers $\delta_1$, $l$ and $k$, as well as $n\in\mathbb{N}$ and $i=1,\dots,p$, be fixed. Consider the following two alternatives: (i) $|\omega_i(s_n+t_n)-\omega_i(s_n)|<\epsilon_0/l$; (ii) $|\omega_i(s_n+t_n)-\omega_i(s_n)|\ge\epsilon_0/l$.

(i) Using (14), one can show that

$$|\omega_i(t+t_n)-\omega_i(t)|\le|\omega_i(t+t_n)-\omega_i(s_n+t_n)|+|\omega_i(s_n+t_n)-\omega_i(s_n)|+|\omega_i(s_n)-\omega_i(t)|<\frac{\epsilon_0}{l}+\frac{\epsilon_0}{k}+\frac{\epsilon_0}{k}=\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big),\quad i=1,2,\dots,p, \tag{15}$$

if $t\in[s_n,s_n+\delta_1]$. Therefore, condition (C8) and inequalities (9)–(15) imply that

$$\begin{aligned}|\omega_i(t+t_n)-\omega_i(t)|\ge{}&\int_{s_n}^{t}|v_i(s+t_n)-v_i(s)|ds-\int_{s_n}^{t}|\omega_i(s)||b_i(s+t_n)-b_i(s)|ds-\int_{s_n}^{t}\sum_{j=1}^{p}|d_{ij}(s+t_n)-d_{ij}(s)||f_j(\omega_j(s))|ds\\ &-\int_{s_n}^{t}|a_i(s+t_n)||\omega_i(s+t_n)-\omega_i(s)|ds-\int_{s_n}^{t}|\omega_i(s)||a_i(s+t_n)-a_i(s)|ds\\ &-\int_{s_n}^{t}|b_i(s+t_n)||\omega_i(s+t_n)-\omega_i(s)|ds-\int_{s_n}^{t}\sum_{j=1}^{p}|c_{ij}(s+t_n)||f_j(\omega_j(s+t_n))-f_j(\omega_j(s))|ds\\ &-\int_{s_n}^{t}\sum_{j=1}^{p}|c_{ij}(s+t_n)-c_{ij}(s)||f_j(\omega_j(s))|ds-\int_{s_n}^{t}\sum_{j=1}^{p}|d_{ij}(s+t_n)||f_j(\omega_j(s+t_n))-f_j(\omega_j(s))|ds\\ &-\int_{s_n}^{t}|u_i(s+t_n)-u_i(s)|ds-|\omega_i(s_n+t_n)-\omega_i(s_n)|\\ \ge{}&\delta_1\epsilon_0-2\delta_1Hm_i^b-2\delta_1\sum_{j=1}^{p}m_{ij}^dm_f-\delta_1m_i^a\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1H\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1m_i^b\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)\\ &-\delta_1\sum_{j=1}^{p}m_{ij}^cL\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1\sum_{j=1}^{p}m_{ij}^dL\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\delta_1\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)-\frac{\epsilon_0}{l}\\ ={}&\delta_1\Big[\epsilon_0-2Hm_i^b-2\sum_{j=1}^{p}m_{ij}^dm_f-\Big(m_i^a+H+m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L+1\Big)\epsilon_0\Big(\frac{1}{l}+\frac{2}{k}\Big)\Big]-\frac{\epsilon_0}{l}>\frac{\epsilon_0}{2l}\end{aligned}$$

for $t\in[s_n,s_n+\delta_1]$.

(ii) If $|\omega_i(s_n+t_n)-\omega_i(s_n)|\ge\epsilon_0/l$, it is not difficult to find that (14) implies:

$$|\omega_i(t+t_n)-\omega_i(t)|\ge|\omega_i(s_n+t_n)-\omega_i(s_n)|-|\omega_i(s_n)-\omega_i(t)|-|\omega_i(t+t_n)-\omega_i(s_n+t_n)|>\frac{\epsilon_0}{l}-\frac{\epsilon_0}{4l}-\frac{\epsilon_0}{4l}=\frac{\epsilon_0}{2l},\quad i=1,2,\dots,p, \tag{16}$$

if $t\in[s_n-\delta_1,s_n+\delta_1]$ and $n\in\mathbb{N}$. Thus, it can be concluded that $\omega(t)$ is an unpredictable solution with the sequences $t_n$, $s_n$ and the positive numbers $\delta_1/2$ and $\epsilon_0/(2l)$.

Next, we will prove the stability of the solution ω(t). It is true that

$$\omega_i(t)=e^{-\int_{t_0}^{t}a_i(u)du}\omega_i(t_0)+\int_{t_0}^{t}e^{-\int_s^t a_i(u)du}\Big[-b_i(s)\omega_i(s)+\sum_{j=1}^{p}(c_{ij}(s)+d_{ij}(s))f_j(\omega_j(s))+u_i(s)+v_i(s)\Big]ds,$$

for all i=1,,p.

Let $y(t)=(y_1(t),y_2(t),\dots,y_p(t))$ be another solution of system (1). Then,

$$y_i(t)=e^{-\int_{t_0}^{t}a_i(u)du}y_i(t_0)+\int_{t_0}^{t}e^{-\int_s^t a_i(u)du}\Big[-b_i(s)y_i(s)+\sum_{j=1}^{p}(c_{ij}(s)+d_{ij}(s))f_j(y_j(s))+u_i(s)+v_i(s)\Big]ds,$$

for all i=1,,p.

Making use of the relation

$$\begin{aligned}y_i(t)-\omega_i(t)={}&e^{-\int_{t_0}^{t}a_i(u)du}\big(y_i(t_0)-\omega_i(t_0)\big)\\ &+\int_{t_0}^{t}e^{-\int_s^t a_i(u)du}\Big[-b_i(s)\big(y_i(s)-\omega_i(s)\big)+\sum_{j=1}^{p}(c_{ij}(s)+d_{ij}(s))\big(f_j(y_j(s))-f_j(\omega_j(s))\big)\Big]ds,\end{aligned}$$

we obtain that:

$$|y_i(t)-\omega_i(t)|\le K_ie^{-\lambda_i(t-t_0)}|y_i(t_0)-\omega_i(t_0)|+\int_{t_0}^{t}K_ie^{-\lambda_i(t-s)}\Big(m_i^b+\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)L\Big)|y_i(s)-\omega_i(s)|ds,$$

for all i=1,2,,p.

Applying the Gronwall–Bellman Lemma, one can obtain:

$$|y_i(t)-\omega_i(t)|\le K_i|y_i(t_0)-\omega_i(t_0)|\,e^{\big(K_i\big(m_i^b+L\sum_{j=1}^{p}(m_{ij}^c+m_{ij}^d)\big)-\lambda_i\big)(t-t_0)}, \tag{17}$$

for each $i=1,2,\dots,p$. So, condition (C7) implies that $\omega(t)=(\omega_1(t),\omega_2(t),\dots,\omega_p(t))$ is an exponentially stable unpredictable solution of the neural network (1). The theorem is proven. □

4. Numerical Examples

Let $\psi_i$, $i\in\mathbb{Z}$, be a solution of the logistic discrete equation:

$$\psi_{i+1}=\mu\psi_i(1-\psi_i), \tag{18}$$

with $\mu=3.91$.

In the paper [25], an example of an unpredictable function, $\Theta(t)$, was constructed: $\Theta(t)=\int_{-\infty}^{t}e^{-3(t-s)}\Omega(s)ds$, where $\Omega(t)$ is the piecewise constant function defined on the real axis by $\Omega(t)=\psi_i$ for $t\in[i,i+1)$, $i\in\mathbb{Z}$.

In what follows, we will define the piecewise constant function $\Omega(t)$ by $\Omega(t)=\psi_i$ for $t\in[ih,(i+1)h)$, where $i\in\mathbb{Z}$ and $h$ is a positive real number. The number $h$ is said to be the length of step of the functions $\Omega(t)$ and $\Theta(t)$. We call the ratio of the period to the length of step, $\omega/h$, the degree of periodicity.
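Because $\Omega$ is constant on each step, the function $\Theta(t)$ can be evaluated exactly on the grid $t=ih$. The following minimal sketch is our own illustration; the seed $\psi_0=0.4$ and the truncation of the tail integral before $t=0$ are assumptions:

```python
import numpy as np

def logistic_orbit(n, psi0=0.4, mu=3.91):
    """Iterates psi_{i+1} = mu * psi_i * (1 - psi_i) of Equation (18)."""
    psi = np.empty(n)
    psi[0] = psi0
    for i in range(n - 1):
        psi[i + 1] = mu * psi[i] * (1.0 - psi[i])
    return psi

def theta_on_grid(psi, h, decay=3.0):
    """Values of Theta(t) = int_{-inf}^t e^{-decay*(t-s)} Omega(s) ds at t = i*h,
    where Omega(s) = psi_i on [i*h, (i+1)*h).  Since Omega is constant on each
    step, the recursion
        Theta((i+1)*h) = Theta(i*h)*e^{-decay*h} + psi_i*(1 - e^{-decay*h})/decay
    holds exactly; the tail before t = 0 is dropped, as it decays exponentially."""
    e = np.exp(-decay * h)
    theta = np.zeros(len(psi) + 1)
    for i, p in enumerate(psi):
        theta[i + 1] = theta[i] * e + p * (1.0 - e) / decay
    return theta

th = theta_on_grid(logistic_orbit(500), h=1.0)
# Since 0 <= psi_i <= 1, the values satisfy 0 <= Theta <= 1/decay = 1/3.
print(bool(th.min() >= 0.0 and th.max() <= 1.0 / 3.0))  # True
```

The same routine with a different step `h` produces the functions $\Theta(t)$ used in the examples below; only the decay rate and `h` change.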

Below, using numerical simulations, we will show how the degree of periodicity affects the dynamics of a neural network.

Example 1.

Let us consider the following Hopfield-type neural network:

$$x_i'(t)=-(a_i(t)+b_i(t))x_i(t)+\sum_{j=1}^{3}(c_{ij}(t)+d_{ij}(t))f_j(x_j(t))+u_i(t)+v_i(t), \tag{19}$$

where $i=1,2,3$ and $f(x)=0.2\tanh(x)$. The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are $\pi/2$-periodic, with $a_1(t)=2+\sin^2(2t)$, $a_2(t)=3+\cos(4t)$, $a_3(t)=4+\cos^2(2t)$, $c_{11}(t)=0.1\cos(4t)$, $c_{12}(t)=0.3\sin(2t)$, $c_{13}(t)=0.1\cos(8t)$, $c_{21}(t)=0.2\sin(8t)$, $c_{22}(t)=0.05\cos(4t)$, $c_{23}(t)=0.4\sin(2t)$, $c_{31}(t)=0.3\cos(2t)$, $c_{32}(t)=0.5\sin(4t)$, $c_{33}(t)=0.1\sin(8t)$, $u_1(t)=\sin(8t)$, $u_2(t)=\sin(4t)$, $u_3(t)=\cos(4t)$. The unpredictable functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$ are such that $b_1(t)=0.2\Theta(t)$, $b_2(t)=0.6\Theta(t)$, $b_3(t)=0.4\Theta(t)$, $d_{11}(t)=0.02\Theta(t)$, $d_{12}(t)=0.05\Theta(t)$, $d_{13}(t)=0.03\Theta(t)$, $d_{21}(t)=0.04\Theta(t)$, $d_{22}(t)=0.01\Theta(t)$, $d_{23}(t)=0.06\Theta(t)$, $d_{31}(t)=0.06\Theta(t)$, $d_{32}(t)=0.06\Theta(t)$, $d_{33}(t)=0.05\Theta(t)$, $v_1(t)=3\Theta(t)$, $v_2(t)=5\Theta(t)$, $v_3(t)=4\Theta(t)$, where $\Theta(t)=\int_{-\infty}^{t}e^{-2.5(t-s)}\Omega(s)ds$ with the length of step $h=4\pi$. Condition (C1) is valid, with $K_i=1$, $i=1,2,3$, $\lambda_1=5\pi/4$, $\lambda_2=6\pi/4$, $\lambda_3=9\pi/4$. Since the elements of the convergence sequence are multiples of $h=4\pi$ and the period $\omega$ is equal to $\pi/2$, condition (C3) is valid. The degree of periodicity is equal to $\omega/h=1/8$. Conditions (C4)–(C8) are satisfied with $H=1$, $m_f=0.2$, $L=0.2$, $m_1^b=0.08$, $m_2^b=0.24$, $m_3^b=0.16$, $m_{11}^c=0.1$, $m_{12}^c=0.3$, $m_{13}^c=0.1$, $m_{21}^c=0.2$, $m_{22}^c=0.05$, $m_{23}^c=0.4$, $m_{31}^c=0.3$, $m_{32}^c=0.5$, $m_{33}^c=0.1$, $m_{11}^d=0.008$, $m_{12}^d=0.02$, $m_{13}^d=0.012$, $m_{21}^d=0.016$, $m_{22}^d=0.004$, $m_{23}^d=0.024$, $m_{31}^d=0.024$, $m_{32}^d=0.024$, $m_{33}^d=0.02$, $m_1^u=m_2^u=m_3^u=1$, $m_1^v=1.2$, $m_2^v=2$, $m_3^v=1.6$. According to Theorem 1, the neural network (19) admits a unique exponentially stable unpredictable solution $\omega(t)=(\omega_1(t),\omega_2(t),\omega_3(t))$. Figure 1 and Figure 2 show the coordinates and the trajectory of the solution of the neural network (19), which asymptotically converge to the coordinates and trajectory of the unpredictable solution $\omega(t)$. Moreover, utilizing (17), we have that:

$$|x_1(t)-\omega_1(t)|\le|x_1(0)-\omega_1(0)|e^{-3.62(t-t_0)}\le 2e^{-3.62(t-t_0)},\quad |x_2(t)-\omega_2(t)|\le|x_2(0)-\omega_2(0)|e^{-4.26(t-t_0)}\le 2e^{-4.26(t-t_0)},\quad |x_3(t)-\omega_3(t)|\le|x_3(0)-\omega_3(0)|e^{-6.72(t-t_0)}\le 2e^{-6.72(t-t_0)}.$$

Thus, if $t>\frac{1}{3.62}(5\ln 10+\ln 2)\approx 3.38$, then $\|x(t)-\omega(t)\|_0<10^{-5}$. In other words, what is seen in Figure 1 and Figure 2 for sufficiently large time can be accepted as parts of the graph and trajectory of the unpredictable solution.
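Simulations of this kind can be reproduced with a simple Euler scheme in which $\Theta(t)$ is advanced through its differential form $\Theta'=-2.5\,\Theta+\Omega(t)$. The sketch below of system (19) is our own illustration, not the authors' code; the logistic seed $\psi_0=0.4$, the step size and the starting value $\Theta(0)=0$ are assumptions:

```python
import numpy as np

# Euler sketch of system (19); Omega is piecewise constant on steps of length h = 4*pi.
f = lambda x: 0.2 * np.tanh(x)

def a(t):
    return np.array([2 + np.sin(2*t)**2, 3 + np.cos(4*t), 4 + np.cos(2*t)**2])

def c(t):
    return np.array([[0.1*np.cos(4*t), 0.3*np.sin(2*t), 0.1*np.cos(8*t)],
                     [0.2*np.sin(8*t), 0.05*np.cos(4*t), 0.4*np.sin(2*t)],
                     [0.3*np.cos(2*t), 0.5*np.sin(4*t), 0.1*np.sin(8*t)]])

def u(t):
    return np.array([np.sin(8*t), np.sin(4*t), np.cos(4*t)])

B = np.array([0.2, 0.6, 0.4])            # b_i(t) = B_i * Theta(t)
D = np.array([[0.02, 0.05, 0.03],
              [0.04, 0.01, 0.06],
              [0.06, 0.06, 0.05]])       # d_ij(t) = D_ij * Theta(t)
V = np.array([3.0, 5.0, 4.0])            # v_i(t) = V_i * Theta(t)

dt, T, h, mu = 0.001, 20.0, 4 * np.pi, 3.91
steps = int(T / dt)
theta, psi = 0.0, 0.4                    # Theta(0) and logistic seed (assumed)
x = np.array([0.5, 0.7, 0.3])            # initial conditions of Figure 1
traj = np.empty((steps, 3))
for k in range(steps):
    t = k * dt
    dx = -(a(t) + B * theta) * x + (c(t) + D * theta) @ f(x) + u(t) + V * theta
    x = x + dt * dx
    theta += dt * (-2.5 * theta + psi)   # Theta' = -2.5*Theta + Omega(t)
    if int(t / h) != int((t + dt) / h):  # crossed a step boundary: next Omega value
        psi = mu * psi * (1 - psi)
    traj[k] = x
print(bool(np.all(np.abs(traj) < 10)))   # the outputs stay bounded: True
```

Plotting `traj` against time reproduces the qualitative picture of Figure 1; a smaller step `dt` or a higher-order scheme can be substituted for better accuracy.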

Figure 1. The time series of the coordinates $x_1(t)$, $x_2(t)$ and $x_3(t)$ of the solution of system (19) with the initial conditions $x_1(0)=0.5$, $x_2(0)=0.7$, $x_3(0)=0.3$ and degree of periodicity $\omega/h=1/8$.

Figure 2. The trajectory of the neural network (19).

Example 2.

Let us show the simulation results for the following Hopfield-type neural network:

$$y_i'(t)=-(a_i(t)+b_i(t))y_i(t)+\sum_{j=1}^{3}(c_{ij}(t)+d_{ij}(t))f_j(y_j(t))+u_i(t)+v_i(t), \tag{20}$$

where $i=1,2,3$ and $f(y)=0.5\arctan(y)$.

The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are periodic with common period $\omega=1$, and $a_1(t)=5+\cos(2\pi t)$, $a_2(t)=4+\sin^2(\pi t)$, $a_3(t)=6+0.5\sin(2\pi t)$, $c_{11}(t)=0.4\cos(2\pi t)$, $c_{12}(t)=0.2\sin(4\pi t)$, $c_{13}(t)=0.1\cos(8\pi t)$, $c_{21}(t)=0.1\cos(4\pi t)$, $c_{22}(t)=0.4\cos(2\pi t)$, $c_{23}(t)=0.4\sin(4\pi t)$, $c_{31}(t)=0.3\sin(2\pi t)$, $c_{32}(t)=0.5\cos(4\pi t)$, $c_{33}(t)=0.2\cos(2\pi t)$, $u_1(t)=\cos(2\pi t)$, $u_2(t)=0.5\sin(4\pi t)$, $u_3(t)=\sin(2\pi t)$. The functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$ are unpredictable, such that $b_1(t)=0.5\Theta(t)$, $b_2(t)=0.3\Theta(t)$, $b_3(t)=0.7\Theta(t)$, $d_{11}(t)=0.3\Theta(t)$, $d_{12}(t)=0.6\Theta(t)$, $d_{13}(t)=0.2\Theta(t)$, $d_{21}(t)=0.3\Theta(t)$, $d_{22}(t)=0.5\Theta(t)$, $d_{23}(t)=0.3\Theta(t)$, $d_{31}(t)=0.1\Theta(t)$, $d_{32}(t)=0.2\Theta(t)$, $d_{33}(t)=0.5\Theta(t)$, $v_1(t)=6\Theta(t)$, $v_2(t)=8\Theta(t)$, $v_3(t)=7\Theta(t)$, where $\Theta(t)=\int_{-\infty}^{t}e^{-3(t-s)}\Omega(s)ds$ with the length of step $h=1$. Condition (C1) is valid, with $K_i=1$, $i=1,2,3$, $\lambda_1=5$, $\lambda_2=4.5$, $\lambda_3=6$. Conditions (C2) and (C3) are satisfied, since the elements of the convergence sequence are multiples of $h=1$ and the period $\omega$ is equal to 1. The degree of periodicity equals $\omega/h=1$. Conditions (C4)–(C8) are satisfied with $H=1$, $m_f=\pi/4$, $L=0.5$, $m_1^b=1/6$, $m_2^b=1/10$, $m_3^b=7/30$, $m_{11}^c=0.4$, $m_{12}^c=0.2$, $m_{13}^c=0.1$, $m_{21}^c=0.1$, $m_{22}^c=0.4$, $m_{23}^c=0.4$, $m_{31}^c=0.3$, $m_{32}^c=0.5$, $m_{33}^c=0.2$, $m_{11}^d=0.1$, $m_{12}^d=0.2$, $m_{13}^d=0.07$, $m_{21}^d=0.1$, $m_{22}^d=0.17$, $m_{23}^d=0.1$, $m_{31}^d=0.034$, $m_{32}^d=0.07$, $m_{33}^d=0.17$, $m_1^u=1$, $m_2^u=0.5$, $m_3^u=1$, $m_1^v=2$, $m_2^v=8/3$, $m_3^v=7/3$. Figure 3 and Figure 4 demonstrate the coordinates and the trajectory of the solution $y(t)=(y_1(t),y_2(t),y_3(t))$ of the neural network (20), with initial values $y_1(0)=0.2$, $y_2(0)=0.4$, $y_3(0)=0.6$. The solution $y(t)$ asymptotically converges to the unpredictable solution $\omega(t)$. By estimation (17), one can obtain that $\|y(t)-\omega(t)\|_0<10^{-6}$ for $t>\frac{1}{4.175}(6\ln 10+\ln 2)\approx 3.48$.
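The convergence-time threshold quoted above follows from solving $2e^{-4.175t}<10^{-6}$ for $t$; a quick arithmetic check (the rate $4.175$ is taken from the estimate above):

```python
import math

# Threshold from estimate (17) for Example 2: solve 2*exp(-4.175*t) < 1e-6,
# which gives t* = (6*ln 10 + ln 2) / 4.175.
t_star = (6 * math.log(10) + math.log(2)) / 4.175
print(round(t_star, 2))                                # 3.48
print(2 * math.exp(-4.175 * t_star) <= 1.0000001e-6)   # True
```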

Figure 3. The time series of the coordinates $y_1(t)$, $y_2(t)$ and $y_3(t)$ of the solution of system (20) with the initial conditions $y_1(0)=0.5$, $y_2(0)=0.7$, $y_3(0)=0.3$ and degree of periodicity $\omega/h=1$.

Figure 4. The trajectory of the neural network (20).

Example 3.

Finally, we will show how a degree of periodicity $\omega/h>1$ affects the dynamics of the Hopfield-type neural network:

$$z_i'(t)=-(a_i(t)+b_i(t))z_i(t)+\sum_{j=1}^{3}(c_{ij}(t)+d_{ij}(t))f_j(z_j(t))+u_i(t)+v_i(t), \tag{21}$$

where $i=1,2,3$ and $f(z)=0.25\arctan(z)$. The functions $a_i(t)$, $c_{ij}(t)$ and $u_i(t)$ are periodic with common period $\omega=10\pi$, and $a_1(t)=5+\sin(2t)$, $a_2(t)=6+\cos(4t)$, $a_3(t)=4+0.5\sin(2t)$, $c_{11}(t)=0.01\sin(2t)$, $c_{12}(t)=0.04\cos(4t)$, $c_{13}(t)=0.02\sin(8t)$, $c_{21}(t)=0.05\cos(4t)$, $c_{22}(t)=0.03\sin(2t)$, $c_{23}(t)=0.03\cos(8t)$, $c_{31}(t)=0.02\sin(4t)$, $c_{32}(t)=0.05\cos(2t)$, $c_{33}(t)=0.01\cos(4t)$, $u_1(t)=\sin(0.4t)$, $u_2(t)=\cos(0.4t)$, $u_3(t)=\cos(0.2t)$. The unpredictable functions $b_i(t)$, $d_{ij}(t)$ and $v_i(t)$ are such that $b_1(t)=0.8\Theta(t)$, $b_2(t)=0.3\Theta(t)$, $b_3(t)=0.4\Theta(t)$, $d_{11}(t)=0.04\Theta(t)$, $d_{12}(t)=0.05\Theta(t)$, $d_{13}(t)=0.02\Theta(t)$, $d_{21}(t)=0.05\Theta(t)$, $d_{22}(t)=0.01\Theta(t)$, $d_{23}(t)=0.06\Theta(t)$, $d_{31}(t)=0.01\Theta(t)$, $d_{32}(t)=0.06\Theta(t)$, $d_{33}(t)=0.03\Theta(t)$, $v_1(t)=1.6\Theta(t)$, $v_2(t)=1.4\Theta(t)$, $v_3(t)=1.8\Theta(t)$, where $\Theta(t)=\int_{-\infty}^{t}e^{-2(t-s)}\Omega(s)ds$ with the length of step $h=0.1\pi$. All conditions (C1)–(C8) are valid with $K_i=1$, $i=1,2,3$, $\lambda_1=50\pi$, $\lambda_2=40\pi$, $\lambda_3=60\pi$, $H=1$, $m_f=\pi/4$, $L=0.25$, $m_1^b=0.4$, $m_2^b=0.15$, $m_3^b=0.2$, $m_{11}^c=0.01$, $m_{12}^c=0.04$, $m_{13}^c=0.02$, $m_{21}^c=0.05$, $m_{22}^c=0.03$, $m_{23}^c=0.03$, $m_{31}^c=0.02$, $m_{32}^c=0.05$, $m_{33}^c=0.01$, $m_{11}^d=0.02$, $m_{12}^d=0.025$, $m_{13}^d=0.01$, $m_{21}^d=0.025$, $m_{22}^d=0.005$, $m_{23}^d=0.03$, $m_{31}^d=0.005$, $m_{32}^d=0.03$, $m_{33}^d=0.015$, $m_1^u=m_2^u=m_3^u=1$, $m_1^v=0.8$, $m_2^v=0.7$, $m_3^v=0.9$. The degree of periodicity is equal to $\omega/h=100$. In Figure 5 and Figure 6, we depict the coordinates and the trajectory of the solution $z(t)=(z_1(t),z_2(t),z_3(t))$ of the neural network (21), with initial values $z_1(0)=0.8$, $z_2(0)=0.2$, $z_3(0)=0.5$. The solution $z(t)$ asymptotically converges to the unpredictable solution $\omega(t)$.

Figure 5. The coordinates $z_1(t)$, $z_2(t)$ and $z_3(t)$ of the solution of system (21) with the initial conditions $z_1(0)=0.8$, $z_2(0)=0.2$, $z_3(0)=0.5$ and degree of periodicity $\omega/h=100$.

Figure 6. The trajectory of the neural network (21).

Observing the graphs in Figure 1 and Figure 3, corresponding to $\omega/h\le 1$, we see that the unpredictability prevails. More precisely, periodicity appears only locally on separated intervals if $\omega/h<1$, and is not seen at all for $\omega/h=1$. Conversely, if $\omega/h>1$, one can see in Figure 5 that the solution admits a clear periodic shape, which is enveloped by the unpredictability.

5. Conclusions

In this paper, we consider HNNs with variable two-component synaptic connections, rates and external inputs. Sufficient conditions are obtained to ensure the existence of exponentially stable unpredictable solutions for HNNs. We introduced and utilized a quantitative characteristic, the degree of periodicity, which differentiates the contributions of the two components, periodicity and unpredictability, in the outputs of the model. The obtained results make it possible to detect effects of periodicity in chaotic oscillations, which is very important for the synchronization, stabilization and control of chaos.

Author Contributions

M.A.: conceptualization; investigation; validation; writing—original draft. M.T.: investigation; writing—review and editing. A.Z.: investigation; software; writing—original draft. All authors have read and agreed to the published version of the manuscript.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Funding Statement

M.A. and A.Z. have been supported by the 2247-A National Leading Researchers Program of TUBITAK, Turkey, N 120C138. M. Tleubergenova has been supported by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan, grant No. AP08856170.

Footnotes

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Hopfield J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA. 1982;79:2554–2558. doi: 10.1073/pnas.79.8.2554. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Hopfield J.J. Neurons with graded response have collective computational properties like those of two-stage neurons. Proc. Natl. Acad. Sci. USA. 1984;81:3088–3092. doi: 10.1073/pnas.81.10.3088. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Pajares G. A Hopfield Neural Network for Image Change Detection. IEEE Trans. Neural Netw. 2006;17:1250–1264. doi: 10.1109/TNN.2006.875978. [DOI] [PubMed] [Google Scholar]
  • 4.Koss J.E., Newman F.D., Johnson T.K., Kirch D.L. Abdominal organ segmentation using texture transforms and a Hopfield neural network. IEEE Trans. Med. Imaging. 1999;18:640–648. doi: 10.1109/42.790463. [DOI] [PubMed] [Google Scholar]
  • 5.Cheng K.C., Lin Z.C., Mao C.W. The Application of Competitive Hopfield Neural Network to Medical Image Segmentation. IEEE Trans. Med. Imaging. 1996;15:560–567. doi: 10.1109/42.511759. [DOI] [PubMed] [Google Scholar]
  • 6.Soni N., Sharma E.K., Kapoor A. Application of Hopfield neural network for facial image recognition. IJRTE. 2019;8:3101–3105. [Google Scholar]
  • 7.Sang N., Zhang T. Segmentation of FLIR images by Hopfield neural network with edge constraint. Pattern Recognit. 2001;34:811–821. doi: 10.1016/S0031-3203(00)00041-8. [DOI] [Google Scholar]
  • 8.Amartur S.C., Piraino D., Takefuji Y. Optimization neural networks for the segmentation of magnetic resonance images. IEEE Trans. Med. Imaging. 1992;11:215–220. doi: 10.1109/42.141645. [DOI] [PubMed] [Google Scholar]
  • 9.Mohammad S. Exponential stability in Hopfield-type neural networks with impulses. Chaos Solitons Fractals. 2007;32:456–467. doi: 10.1016/j.chaos.2006.06.035. [DOI] [Google Scholar]
  • 10.Chen T., Amari S.I. Stability of asymmetric Hopfield networks. IEEE Trans. Neural Netw. 2001;12:159–163. doi: 10.1109/72.896806. [DOI] [PubMed] [Google Scholar]
  • 11.Shi P.L., Dong L.Z. Existence and exponential stability of anti-periodic solutions of Hopfield neural networks with impulses. Appl. Math. Comput. 2010;216:623–630. doi: 10.1016/j.amc.2010.01.095. [DOI] [Google Scholar]
  • 12.Juang J. Stability analysis of Hopfield type neural networks. IEEE Trans. Neural Netw. 1999;10:1366–1374. doi: 10.1109/72.809081. [DOI] [PubMed] [Google Scholar]
  • 13.Yang H., Dillon T.S. Exponential stability and oscillation of Hopfield graded response neural network. IEEE Trans. Neural Netw. 1994;5:719–729. doi: 10.1109/72.317724. [DOI] [PubMed] [Google Scholar]
  • 14.Liu B. Almost periodic solutions for Hopfield neural networks with continuously distributed delays. Math. Comput. Simul. 2007;73:327–335. doi: 10.1016/j.matcom.2006.05.027. [DOI] [Google Scholar]
  • 15.Liu Y., Huang Z., Chen L. Almost periodic solution of impulsive Hopfield neural networks with finite distributed delays. Neural Comput. Appl. 2012;21:821–831. doi: 10.1007/s00521-011-0655-x. [DOI] [Google Scholar]
  • 16.Guo S.J., Huang L.H. Periodic oscillation for a class of neural networks with variable coefficients. Nonlinear Anal. Real World Appl. 2005;6:545–561. doi: 10.1016/j.nonrwa.2004.11.004. [DOI] [Google Scholar]
  • 17.Liu B.W., Huang L.H. Existence and exponential stability of almost periodic solutions for Hopfield neural networks with delays. Neurocomputing. 2005;68:196–207. doi: 10.1016/j.neucom.2005.05.002. [DOI] [Google Scholar]
  • 18.Liu Y.G., You Z.S., Cao L.P. On the almost periodic solution of generalized Hopfield neural networks with time-varying delays. Neurocomputing. 2006;69:1760–1767. doi: 10.1016/j.neucom.2005.12.117. [DOI] [Google Scholar]
  • 19.Yang X.F., Liao X.F., Evans D.J., Tang Y. Existence and stability of periodic solution in impulsive Hopfield neural networks with finite distributed delays. Phys. Lett. A. 2005;343:108–116. doi: 10.1016/j.physleta.2005.06.008. [DOI] [Google Scholar]
  • 20.Zhang H., Xia Y. Existence and exponential stability of almost periodic solution for Hopfield type neural networks with impulse. Chaos Solitons Fractals. 2008;37:1076–1082. doi: 10.1016/j.chaos.2006.09.085. [DOI] [Google Scholar]
  • 21.Bai C. Existence and stability of almost periodic solutions of Hopfield neural networks with continuously distributed delays. Nonlinear Anal. Theory Methods Appl. 2009;71:5850–5859. doi: 10.1016/j.na.2009.05.008. [DOI] [Google Scholar]
  • 22.Poincare H. New Methods of Celestial Mechanics. Dover Publications; New York, NY, USA: 1957. [Google Scholar]
  • 23.Birkhoff G. Dynamical Systems. American Mathematical Society; Providence, RI, USA: 1927. [Google Scholar]
  • 24.Akhmet M., Fen M.O. Unpredictable points and chaos. Commun. Nonlinear Sci. Numer. Simul. 2016;40:1–5. doi: 10.1016/j.cnsns.2016.04.007. [DOI] [Google Scholar]
  • 25.Akhmet M., Fen M.O. Poincare chaos and unpredictable functions. Commun. Nonlinear Sci. Numer. Simul. 2017;48:85–94. doi: 10.1016/j.cnsns.2016.12.015. [DOI] [Google Scholar]
  • 26.Akhmet M., Tleubergenova M., Zhamanshin A. Poincare chaos for a hyperbolic quasilinear system. Miskolc Math. Notes. 2019;20:33–44. doi: 10.18514/MMN.2019.2826. [DOI] [Google Scholar]
  • 27.Akhmet M., Seilova R., Tleubergenova M., Zhamanshin A. Shunting inhibitory cellular neural networks with strongly unpredictable oscillations. Commun. Nonlinear Sci. Numer. Simul. 2020;89:105287. doi: 10.1016/j.cnsns.2020.105287. [DOI] [Google Scholar]
  • 28.Akhmet M., Tleubergenova M., Akylbek Z. Inertial neural networks with unpredictable oscillations. Mathematics. 2020;8:1797. doi: 10.3390/math8101797. [DOI] [Google Scholar]
  • 29.Akhmet M. Domain Structured Dynamics: Unpredictability, Chaos, Randomness, Fractals, Differential Equations and Neural Networks. IOP Publishing; Bristol, UK: 2021. [Google Scholar]
  • 30.Akhmet M., ÇinÇin D.A., Tleubergenova M., Nugayeva Z. Unpredictable oscillations for Hopfield–type neural networks with delayed and advanced arguments. Mathematics. 2020;9:571. doi: 10.3390/math9050571. [DOI] [Google Scholar]
  • 31.Akhmet M., Tleubergenova M., Nugayeva Z. Unpredictable Oscillations of Impulsive Neural Networks with Hopfield Structure. Lect. Notes Data Eng. Commun. Technol. 2021;76:625–642. [Google Scholar]
  • 32.Sell G. Topological Dynamics and Ordinary Differential Equations. Van Nostrand Reinhold Company; London, UK: 1971. [Google Scholar]
  • 33.Akhmet M., Tleubergenova M., Zhamanshin A. Modulo periodic Poisson stable solutions of quasilinear differential equations. Entropy. 2021;23:1535. doi: 10.3390/e23111535. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Hartman P. Ordinary Differential Equations. Birkhauser; Boston, MA, USA: 2002. [Google Scholar]



Articles from Entropy are provided here courtesy of Multidisciplinary Digital Publishing Institute (MDPI)
