Abstract
In this paper, we rigorously prove that unpredictable oscillations take place in the dynamics of Hopfield-type neural networks (HNNs) whose synaptic connections, rates and external inputs are modulo periodic unpredictable. The synaptic connections, rates and inputs are synchronized to obtain the convergence of outputs on compact subsets of the real axis. The existence, uniqueness, and exponential stability of such motions are discussed. The method of included intervals and the contraction mapping principle are applied to attain the theoretical results. In addition to the analysis, we provide supporting numerical simulations in which all the assumed conditions are satisfied. It is shown how a new parameter, the degree of periodicity, affects the dynamics of the neural network.
Keywords: Hopfield-type neural networks, modulo periodic unpredictable synaptic connections, rates and inputs, unpredictable solutions, exponential stability, numerical simulations
1. Introduction
It is well-known that HNNs [1,2] are widely used in the fields of signal and image processing, pattern recognition, associative memory and optimization computation, among others [3,4,5,6,7,8]. Hence, they have been the object of intensive analysis by numerous authors in recent decades. As neural networks continue to develop, the aforementioned systems are being modernized, and the dynamics of models with various types of coefficients are being investigated [9,10,11,12,13]. Special attention is being paid to the problem of the existence and stability of periodic and almost periodic solutions of HNNs [14,15,16,17,18,19,20,21], for which appropriate coefficients and conditions are necessary.
A few years ago, the boundaries of the classical theory of dynamical systems, founded by H. Poincare [22] and G. Birkhoff [23], were expanded by the concepts of unpredictable points and unpredictable functions [24]. It was proven that the unpredictable point leads to the existence of chaos in quasi-minimal sets. That is, the proof of the unpredictability simultaneously confirms Poincare chaos. Unpredictable functions were defined as unpredictable points in the Bebutov dynamical system [25], where the topology of convergence on compact sets of the real axis is used instead of the metric space. The use of such convergence significantly simplifies the problem of proving the existence of unpredictable solutions for differential equations and neural networks, and a new method of included intervals has been introduced and developed in several papers [26,27,28,29,30,31].
Let us commence with the main definitions.
Definition 1
([25]). A uniformly continuous and bounded function is unpredictable if there exist positive numbers and sequences , both of which diverge to infinity such that as uniformly on compact subsets of and for each and .
In Definition 1, the sequences are said to be the convergence and divergence sequences of the function, respectively. We call the uniform convergence on compact subsets the convergence property, and the existence of the divergence sequence and the positive numbers is called the separation property. It is known [32] that an unpredictable function without the separation property is a Poisson stable function.
Let us introduce a new type of unpredictable functions, which are important objects for investigation in the paper.
Definition 2.
The sum is said to be a modulo periodic unpredictable function if is a continuous periodic function and is an unpredictable function.
In this study, we focus on the Hopfield-type neural network with two-component coefficients and inputs:
| x_i'(t) = -a_i(t) x_i(t) + \sum_{j=1}^{m} b_{ij}(t) f_j(x_j(t)) + v_i(t), \quad i = 1, \ldots, m, | (1) |
where stands for the state of the ith unit at time t. The synaptic connections, rates and external inputs are modulo periodic unpredictable; each consists of two components, one periodic and one unpredictable. The coefficients denote the components of the synaptic connection weight of the jth unit on the ith unit at time t, and the functions denote the activation of unit j in response to its incoming potentials at time t.
Consider the convergence sequence of the unpredictable function. For a fixed real number, one can write that where for all The boundedness of the sequence implies that there exists a subsequence which converges to a number. That is, there exists a subsequence of the convergence sequence and a number such that as We call this number the Poisson shift for the convergence sequence with respect to the period [33]. Denote by the set of all Poisson shifts. The number is said to be the Poisson number for the convergence sequence. If then we say that the sequence satisfies the kappa property.
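For illustration, the kappa property can be checked numerically. The sketch below is not part of the paper's analysis; the sequence, period, and tolerance are hypothetical choices. It samples the residues of a convergence sequence modulo a period and collects their approximate limit points, which play the role of Poisson shifts:

```python
import math

def poisson_shifts(t_seq, omega, tol=1e-6):
    """Collect approximate limit points of t_n mod omega for a finite
    sample of a convergence sequence; these approximate the set of
    Poisson shifts with respect to the period omega."""
    residues = []
    for t in t_seq:
        r = math.fmod(t, omega)
        if omega - r < tol:      # residues just below omega are taken as 0,
            r = 0.0              # since shifts are defined modulo the period
        residues.append(r)
    residues.sort()
    shifts = [residues[0]]
    for r in residues[1:]:
        if r - shifts[-1] > tol:
            shifts.append(r)
    return shifts

# Hypothetical data: sequence elements are multiples of 8*pi, period pi/4.
# The period divides every element, so the only Poisson shift is 0 and
# the Poisson number is 0: the kappa property holds.
t_seq = [8 * math.pi * n for n in range(1, 200)]
shifts = poisson_shifts(t_seq, math.pi / 4)
```

With an incommensurate pair, e.g. integer sequence elements against the period sqrt(2), the residues accumulate at many distinct points and the kappa property fails.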
2. Methods
Due to the development of neural networks and their applications, classical types of functions, such as periodic and almost periodic ones, are no longer sufficient to describe their dynamics. This is especially evident in the analysis of chaotic behavior. Therefore, richer classes of functions are needed. To meet this demand, we have combined periodic and unpredictable components in the rates and inputs: the periodicity serves stability, while the unpredictability guarantees chaotic dynamics. According to Definition 1, verification of the convergence and separation properties is necessary to prove the existence of unpredictable solutions. To provide constructive conditions for their existence, we have determined the special kappa property of the convergence sequence with respect to the period
The method of included intervals, which was introduced in paper [26] and has been developed in [27,28,29,33], is a powerful instrument for verifying convergence properties. This technique has been applied in the study of continuous unpredictable solutions of Hopfield-type neural networks with delayed and advanced arguments [30] and in the study of discontinuous unpredictable solutions of impulsive neural networks with Hopfield structures [31]. All the previous models in [30,31] were considered with constant rates, while in the present research the rates are variable, and we have designed a new model of Hopfield-type neural networks with modulo periodic unpredictable rates, connection weights and external inputs. The periodic components serve the stability of the model, while the unpredictable components cause chaotic outputs.
3. Main Results
Throughout the paper, we will use the norm where is the absolute value, and .
Following the results in [34], it can be shown that the function is a solution of (1) if and only if it satisfies the following integral equation:
| x_i(t) = \int_{-\infty}^{t} e^{-\int_s^t a_i(u)\,du} \Big[ \sum_{j=1}^{m} b_{ij}(s) f_j(x_j(s)) + v_i(s) \Big] ds | (2) |
for all
Denote by the set of vector-functions where satisfy the convergence property with the common convergence sequence Moreover, where H is a positive number. On the set, we determine the norm
Define on the set the operator T such that , where:
| (T\varphi)_i(t) = \int_{-\infty}^{t} e^{-\int_s^t a_i(u)\,du} \Big[ \sum_{j=1}^{m} b_{ij}(s) f_j(\varphi_j(s)) + v_i(s) \Big] ds | (3) |
for each We will need the following conditions:
-
(C1)
The functions and are continuous ω-periodic, and for each
-
(C2)
The functions and are unpredictable with the same convergence and divergence sequences Moreover, for all and positive numbers
-
(C3)
The convergence sequence satisfies the kappa property with respect to the period
-
(C4)
There exists a positive number such that
-
(C5)
There exists a positive number L such that the function satisfies the inequality if
According to the condition, for all the numbers and exist such that
| (4) |
For convenience, we introduce the following notations:
for each .
The following conditions are required:
-
(C6)
;
-
(C7)
;
-
(C8)
;
for all
Lemma 1.
The set is a complete space.
Proof.
Consider a Cauchy sequence in , which converges to a limit function on . Fix a closed and bounded interval We obtain:
(5) One can choose sufficiently large n and such that each term on the right-hand side of (5) is smaller than for an arbitrary and . Thus, we conclude that the sequence converges uniformly to on That is, the set is complete. □
Lemma 2.
The operator T is invariant in
Proof.
For a function and fixed , we have that
The last inequality and condition (C6) imply that
Next, applying the method of included intervals, we will show that as uniformly on compact subsets of
Let us fix an arbitrary and a section There exist numbers such that and which satisfy the following inequalities:
(6)
(7) and
(8) for all
Since the functions and are unpredictable and belongs to , the convergence sequence is common to all of them and satisfies the kappa property. Then, the following inequalities are true: for Moreover, applying condition (C3), one can obtain that and for We have that:
for all Consider the terms in the last inequality separately on intervals and By using inequalities (6)–(8), we obtain:
and
for each Therefore, for all and we have that Thus, the function converges uniformly to on compact subsets of and it is true that □
Lemma 3.
The operator T is contractive in provided that the conditions – are valid.
Proof.
For two functions and fixed it is true that
The last inequality yields . Hence, in accordance with condition (C7), the operator T is contractive in □
Theorem 1.
The neural network (1) admits a unique exponentially stable unpredictable solution provided that conditions – are fulfilled.
Proof.
By Lemma 1, the set is complete; by Lemma 2, the operator T is invariant in ; and by Lemma 3, the operator T is contractive in Applying the contraction mapping theorem, we obtain that there exists a fixed point of the operator, which is a solution of the neural network (1) and satisfies the convergence property.
Next, we show that the solution of (1) satisfies the separation property.
Applying the relations
and
we obtain:
There exist positive numbers and integers such that, for each the following inequalities are satisfied:
(9)
(10)
(11)
(12)
(13)
(14) Let the numbers and k, as well as numbers and , be fixed. Consider the following two alternatives: (i) (ii)
(i) Using (14), one can show that
(15) if Therefore, the condition (C8) and inequalities (9)–(15) imply that
for
(ii) If , it is not difficult to find that (14) implies:
(16) if and Thus, it can be concluded that is an unpredictable solution with sequences and positive numbers
Next, we will prove the stability of the solution It is true that
for all
Let be another solution of system (1). Then,
for all
Making use of the relation:
we obtain that:
for all
Applying the Gronwall–Bellman Lemma, one can obtain:
(17) for each So, (C7) implies that is an exponentially stable unpredictable solution of the neural network (1). The theorem is proven. □
4. Numerical Examples
Let be a solution of the logistic discrete equation:
| \lambda_{i+1} = \mu \lambda_i (1 - \lambda_i), \quad i \in \mathbb{Z}, | (18) |
with
In the paper [25], an example of an unpredictable function was constructed: the function where is a piecewise constant function defined on the real axis through the equation for
In what follows, we will define the piecewise constant function for where and h is a positive real number. The number h is said to be the length of step of the functions and . We call the ratio of the period and the length of step the degree of periodicity.
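This construction can be sketched as follows. The parameter values below (μ = 3.91, λ₀ = 0.4, h = 0.25, ω = 2π) are hypothetical illustrations, not the paper's exact choices: the logistic orbit supplies the values of the piecewise constant unpredictable component, and adding a continuous periodic term yields a modulo periodic unpredictable input with degree of periodicity ω/h.

```python
import math

def logistic_orbit(n, mu=3.91, lam0=0.4):
    """Iterates of the logistic map lam_{k+1} = mu*lam_k*(1 - lam_k);
    for these parameters the orbit stays inside (0, 1)."""
    lam, orbit = lam0, []
    for _ in range(n):
        orbit.append(lam)
        lam = mu * lam * (1.0 - lam)
    return orbit

def piecewise_constant(h, n, mu=3.91, lam0=0.4):
    """Theta(t) = lam_i for t in [i*h, (i+1)*h): a piecewise constant
    function with length of step h driven by the logistic orbit."""
    orbit = logistic_orbit(n, mu, lam0)
    def theta(t):
        return orbit[int(t // h)]
    return theta

omega = 2 * math.pi        # hypothetical period of the periodic component
h = 0.25                   # hypothetical length of step
degree = omega / h         # degree of periodicity, omega / h

theta = piecewise_constant(h, 1000)

def modulo_periodic_unpredictable(t):
    """Sum of a continuous periodic component and the unpredictable one."""
    return math.sin(2 * math.pi * t / omega) + theta(t)
```

Shrinking h while keeping ω fixed raises the degree of periodicity's denominator, so the unpredictable component switches faster relative to one period of the smooth term.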
Below, using numerical simulations, we will show how the degree of periodicity affects the dynamics of a neural network.
Example 1.
Let us consider the following Hopfield-type neural network:
(19) where The functions and are ω-periodic such that The unpredictable functions and are such that where with the length of step Condition (C1) is valid, and Since the elements of the convergence sequence are multiples of , and the period ω is equal to , condition (C3) is valid. The degree of periodicity is equal to 1/8. Conditions (C4)–(C8) are satisfied with According to Theorem 1, the neural network (19) admits a unique exponentially stable unpredictable solution In Figure 1 and Figure 2, the coordinates and the trajectory of the neural network (19) are shown, which asymptotically converge to the coordinates and trajectory of the unpredictable solution Moreover, utilizing (17), we have that:
Thus, if then In other words, what is seen in Figure 1 and Figure 2 for sufficiently large time can be accepted as parts of the graph and trajectory of the unpredictable solution.
Figure 1.
The time series of the coordinates , and of the solution of system (19) with the initial conditions , , and .
Figure 2.
The trajectory of the neural network (19).
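A simulation in the spirit of this example can be sketched as below. This is a hypothetical forward-Euler experiment with illustrative coefficients, not the exact system (19): a three-neuron Hopfield-type network whose rates and inputs combine a periodic term with a piecewise constant unpredictable term is integrated from two different initial points, and exponential stability makes the trajectories merge.

```python
import math

def logistic_orbit(n, mu=3.91, lam0=0.4):
    """Logistic-map iterates supplying the unpredictable component."""
    lam, orbit = lam0, []
    for _ in range(n):
        orbit.append(lam)
        lam = mu * lam * (1.0 - lam)
    return orbit

def simulate(x0, T=40.0, dt=1e-3, h=0.1):
    """Forward-Euler integration of a hypothetical three-neuron network
    x_i' = -a_i(t) x_i + sum_j b_ij * tanh(x_j) + v_i(t), where a_i and
    v_i are modulo periodic unpredictable (periodic part plus a
    piecewise constant term from the logistic orbit)."""
    orbit = logistic_orbit(int(T / h) + 1)
    b = [[0.1, -0.2, 0.1],
         [0.2, 0.1, -0.1],
         [-0.1, 0.1, 0.2]]
    x = list(x0)
    for k in range(int(T / dt)):
        t = k * dt
        theta = orbit[int(t / h)]        # unpredictable component
        new_x = []
        for i in range(3):
            a = 2.0 + 0.5 * math.sin(2 * math.pi * t) + 0.2 * theta
            v = 0.5 * math.cos(2 * math.pi * t) + 0.3 * theta
            s = sum(b[i][j] * math.tanh(x[j]) for j in range(3))
            new_x.append(x[i] + dt * (-a * x[i] + s + v))
        x = new_x
    return x

# Two trajectories with different initial data end up numerically
# indistinguishable, as exponential stability predicts.
x = simulate([0.8, -0.5, 0.3])
u = simulate([-0.6, 0.9, -0.4])
gap = max(abs(xi - ui) for xi, ui in zip(x, u))
```

The rate term here dominates the connection weights (a ≥ 1.5 against row sums of at most 0.4 with the Lipschitz constant 1 of tanh), which is the numerical analogue of the contraction conditions of the theorem.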
Example 2.
Let us show the simulation results for the following Hopfield-type neural network:
(20) where
The functions and are periodic with common period and The functions and are unpredictable such that where with the length of step Condition (C1) is valid, and Conditions (C2) and (C3) are satisfied since the elements of the convergence sequence are multiples of and the period ω is equal to The degree of periodicity equals 1. Conditions (C4)–(C8) are satisfied with Figure 3 and Figure 4 demonstrate the coordinates and the trajectory of the solution of the neural network (20), with initial values The solution asymptotically converges to the unpredictable solution By the estimate (17), one can obtain that for .
Figure 3.
The time series of the coordinates , and of the solution of system (20) with the initial conditions , , and .
Figure 4.
The trajectory of the neural network (20).
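The exponential convergence expressed by estimates of type (17) can also be observed directly. In the hypothetical two-neuron sketch below (illustrative coefficients, not system (20)), the gap between two solutions started from different initial values is sampled along the integration and shrinks roughly geometrically from sample to sample, i.e., exponentially in time.

```python
import math

def euler_step(x, t, dt, theta):
    """One Euler step of a hypothetical two-neuron Hopfield-type system
    x_i' = -a(t) x_i + 0.2*tanh(x_j) + v(t), j != i, with a modulo
    periodic unpredictable rate a and input v."""
    a = 1.5 + 0.3 * math.sin(t) + 0.1 * theta
    v = math.cos(t) + 0.2 * theta
    return [x[0] + dt * (-a * x[0] + 0.2 * math.tanh(x[1]) + v),
            x[1] + dt * (-a * x[1] + 0.2 * math.tanh(x[0]) + v)]

def gap_samples(T=20.0, dt=1e-3, h=0.25, mu=3.91):
    """Integrate two solutions from different initial values and sample
    the maximum componentwise gap once per unit of time."""
    lam, next_jump = 0.4, h
    x, u = [1.0, -1.0], [-1.0, 1.0]
    gaps, t = [], 0.0
    for k in range(int(T / dt)):
        x = euler_step(x, t, dt, lam)
        u = euler_step(u, t, dt, lam)
        t += dt
        if t >= next_jump:               # refresh the unpredictable value
            lam = mu * lam * (1.0 - lam)
            next_jump += h
        if k % 1000 == 0:
            gaps.append(max(abs(a - b) for a, b in zip(x, u)))
    return gaps

gaps = gap_samples()
```

Fitting a line to the logarithms of the sampled gaps would recover a numerical decay exponent comparable to the theoretical one in (17).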
Example 3.
Finally, we will show how the degree of periodicity affects the dynamics of the Hopfield-type neural network:
(21) where The functions and are periodic with common period and The unpredictable functions and are such that where with the length of step All conditions (C1)–(C8) are valid with The degree of periodicity is equal to 100. In Figure 5 and Figure 6, we depict the coordinates and the trajectory of the solution of the neural network (21), with initial values The solution asymptotically converges to the unpredictable solution
Figure 5.
The coordinates , and of the solution of system (21) with the initial conditions , , and .
Figure 6.
The trajectory of the neural network (21).
Observing the graphs in Figure 1 and Figure 3, one can see that for small degrees of periodicity the unpredictability prevails. More precisely, periodicity appears only locally on separated intervals for the degree equal to 1 and is not seen at all for the degree 1/8. In contrast, for the degree equal to 100, one can see in Figure 5 that the solution admits a clear periodic shape, which is enveloped by the unpredictability.
5. Conclusions
In this paper, we consider HNNs with a variable two-component connection matrix, rates and external inputs. Sufficient conditions are obtained to ensure the existence of exponentially stable unpredictable solutions for HNNs. We introduce and utilize a quantitative characteristic, the degree of periodicity, which differentiates the contributions of the two components, periodicity and unpredictability, in the outputs of the model. The obtained results make it possible to identify effects of periodicity in chaotic oscillations, which is very important for the synchronization, stabilization and control of chaos.
Author Contributions
M.A.: conceptualization; investigation; validation; writing—original draft. M.T.: investigation; writing—review and editing. A.Z.: investigation; software; writing—original draft. All authors have read and agreed to the published version of the manuscript.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflict of interest.
Funding Statement
M.A. and A.Z. have been supported by the 2247-A National Leading Researchers Program of TUBITAK, Turkey, N 120C138. M. Tleubergenova has been supported by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan, grant No. AP08856170.
Footnotes
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Hopfield J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA. 1982;79:2554–2558. doi: 10.1073/pnas.79.8.2554. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Hopfield J.J. Neurons with graded response have collective computational properties like those of two-stage neurons. Proc. Natl. Acad. Sci. USA. 1984;81:3088–3092. doi: 10.1073/pnas.81.10.3088. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Pajares G. A Hopfield Neural Network for Image Change Detection. IEEE Trans. Neural Netw. 2006;17:1250–1264. doi: 10.1109/TNN.2006.875978. [DOI] [PubMed] [Google Scholar]
- 4.Koss J.E., Newman F.D., Johnson T.K., Kirch D.L. Abdominal organ segmentation using texture transforms and a Hopfield neural network. IEEE Trans. Med. Imaging. 1999;18:640–648. doi: 10.1109/42.790463. [DOI] [PubMed] [Google Scholar]
- 5.Cheng K.C., Lin Z.C., Mao C.W. The Application of Competitive Hopfield Neural Network to Medical Image Segmentation. IEEE Trans. Med. Imaging. 1996;15:560–567. doi: 10.1109/42.511759. [DOI] [PubMed] [Google Scholar]
- 6.Soni N., Sharma E.K., Kapoor A. Application of Hopfield neural network for facial image recognition. IJRTE. 2019;8:3101–3105. [Google Scholar]
- 7.Sang N., Zhang T. Segmentation of FLIR images by Hopfield neural network with edge constraint. Pattern Recognit. 2001;34:811–821. doi: 10.1016/S0031-3203(00)00041-8. [DOI] [Google Scholar]
- 8.Amartur S.C., Piraino D., Takefuji Y. Optimization neural networks for the segmentation of magnetic resonance images. IEEE Trans. Med. Imaging. 1992;11:215–220. doi: 10.1109/42.141645. [DOI] [PubMed] [Google Scholar]
- 9.Mohammad S. Exponential stability in Hopfield-type neural networks with impulses. Chaos Solitons Fractals. 2007;32:456–467. doi: 10.1016/j.chaos.2006.06.035. [DOI] [Google Scholar]
- 10.Chen T., Amari S.I. Stability of asymmetric Hopfield networks. IEEE Trans. Neural Netw. 2001;12:159–163. doi: 10.1109/72.896806. [DOI] [PubMed] [Google Scholar]
- 11.Shi P.L., Dong L.Z. Existence and exponential stability of anti-periodic solutions of Hopfield neural networks with impulses. Appl. Math. Comput. 2010;216:623–630. doi: 10.1016/j.amc.2010.01.095. [DOI] [Google Scholar]
- 12.Juang J. Stability analysis of Hopfield type neural networks. IEEE Trans. Neural Netw. 1999;10:1366–1374. doi: 10.1109/72.809081. [DOI] [PubMed] [Google Scholar]
- 13.Yang H., Dillon T.S. Exponential stability and oscillation of Hopfield graded response neural network. IEEE Trans. Neural Netw. 1994;5:719–729. doi: 10.1109/72.317724. [DOI] [PubMed] [Google Scholar]
- 14.Liu B. Almost periodic solutions for Hopfield neural networks with continuously distributed delays. Math. Comput. Simul. 2007;73:327–335. doi: 10.1016/j.matcom.2006.05.027. [DOI] [Google Scholar]
- 15.Liu Y., Huang Z., Chen L. Almost periodic solution of impulsive Hopfield neural networks with finite distributed delays. Neural Comput. Appl. 2012;21:821–831. doi: 10.1007/s00521-011-0655-x. [DOI] [Google Scholar]
- 16.Guo S.J., Huang L.H. Periodic oscillation for a class of neural networks with variable coefficients. Nonlinear Anal. Real World Appl. 2005;6:545–561. doi: 10.1016/j.nonrwa.2004.11.004. [DOI] [Google Scholar]
- 17.Liu B.W., Huang L.H. Existence and exponential stability of almost periodic solutions for Hopfield neural networks with delays. Neurocomputing. 2005;68:196–207. doi: 10.1016/j.neucom.2005.05.002. [DOI] [Google Scholar]
- 18.Liu Y.G., You Z.S., Cao L.P. On the almost periodic solution of generalized Hopfield neural networks with time-varying delays. Neurocomputing. 2006;69:1760–1767. doi: 10.1016/j.neucom.2005.12.117. [DOI] [Google Scholar]
- 19.Yang X.F., Liao X.F., Evans D.J., Tang Y. Existence and stability of periodic solution in impulsive Hopfield neural networks with finite distributed delays. Phys. Lett. A. 2005;343:108–116. doi: 10.1016/j.physleta.2005.06.008. [DOI] [Google Scholar]
- 20.Zhang H., Xia Y. Existence and exponential stability of almost periodic solution for Hopfield type neural networks with impulse. Chaos Solitons Fractals. 2008;37:1076–1082. doi: 10.1016/j.chaos.2006.09.085. [DOI] [Google Scholar]
- 21.Bai C. Existence and stability of almost periodic solutions of Hopfield neural networks with continuously distributed delays. Nonlinear Anal. Theory Methods Appl. 2009;71:5850–5859. doi: 10.1016/j.na.2009.05.008. [DOI] [Google Scholar]
- 22.Poincare H. New Methods of Celestial Mechanics. Dover Publications; New York, NY, USA: 1957. [Google Scholar]
- 23.Birkhoff G. Dynamical Systems. American Mathematical Society; Providence, RI, USA: 1927. [Google Scholar]
- 24.Akhmet M., Fen M.O. Unpredictable points and chaos. Commun. Nonlinear Sci. Numer. Simul. 2016;40:1–5. doi: 10.1016/j.cnsns.2016.04.007. [DOI] [Google Scholar]
- 25.Akhmet M., Fen M.O. Poincare chaos and unpredictable functions. Commun. Nonlinear Sci. Numer. Simul. 2017;48:85–94. doi: 10.1016/j.cnsns.2016.12.015. [DOI] [Google Scholar]
- 26.Akhmet M., Tleubergenova M., Zhamanshin A. Poincare chaos for a hyperbolic quasilinear system. Miskolc Math. Notes. 2019;20:33–44. doi: 10.18514/MMN.2019.2826. [DOI] [Google Scholar]
- 27.Akhmet M., Seilova R., Tleubergenova M., Zhamanshin A. Shunting inhibitory cellular neural networks with strongly unpredictable oscillations. Commun. Nonlinear Sci. Numer. Simul. 2020;89:105287. doi: 10.1016/j.cnsns.2020.105287. [DOI] [Google Scholar]
- 28.Akhmet M., Tleubergenova M., Akylbek Z. Inertial neural networks with unpredictable oscillations. Mathematics. 2020;8:1797. doi: 10.3390/math8101797. [DOI] [Google Scholar]
- 29.Akhmet M. Domain Structured Dynamics: Unpredictability, Chaos, Randomness, Fractals, Differential Equations and Neural Networks. IOP Publishing; Bristol, UK: 2021. [Google Scholar]
- 30.Akhmet M., ÇinÇin D.A., Tleubergenova M., Nugayeva Z. Unpredictable oscillations for Hopfield–type neural networks with delayed and advanced arguments. Mathematics. 2020;9:571. doi: 10.3390/math9050571. [DOI] [Google Scholar]
- 31.Akhmet M., Tleubergenova M., Nugayeva Z. Unpredictable Oscillations of Impulsive Neural Networks with Hopfield Structure. Lect. Notes Data Eng. Commun. Technol. 2021;76:625–642. [Google Scholar]
- 32.Sell G. Topological Dynamics and Ordinary Differential Equations. Van Nostrand Reinhold Company; London, UK: 1971. [Google Scholar]
- 33.Akhmet M., Tleubergenova M., Zhamanshin A. Modulo periodic Poisson stable solutions of quasilinear differential equations. Entropy. 2021;23:1535. doi: 10.3390/e23111535. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34.Hartman P. Ordinary Differential Equations. Birkhauser; Boston, MA, USA: 2002. [Google Scholar]