Abstract
A deep neural network is built to simulate the activity of the human brain and uses multiple layers to learn increasingly complex patterns. It currently provides the best solutions to many problems in image recognition, speech recognition, and natural language processing. The present study deals with the topological properties of deep neural networks. A topological index is a numeric quantity associated with the connectivity of the network and is correlated with the efficiency and accuracy of the network's output. Different degree-related topological indices, such as the Zagreb index, Randić index, atom-bond connectivity index, geometric-arithmetic index, forgotten index, multiple Zagreb indices, and hyper-Zagreb index, are computed in this study for a deep neural network with a finite number of hidden layers.
1. Introduction
Neural networks are not only studied in artificial intelligence but also have important applications in intrusion detection systems, image processing, localization, medicine, and the chemical and environmental sciences [1–3]. Neural networks are used to model and learn complex and nonlinear relationships, which is very important in practice because many relationships between inputs and outputs are nonlinear and complex. Artificial neural networks are the backbone of robotics, defense technology, and chemistry. Neural networks are not only widely used as a tool for predictive analysis but have also been trained successfully to model chemical processes including crystallization, adsorption, distillation, gasification, dry reforming, and filtration [4–8].
A topological index associates a unique number to a graph or network, which correlates with the physicochemical properties of the network. A degree-based topological index depends on the connectivity of the network. The first degree-based topological index, called the Randić index, was formulated by Milan Randić [9] while analyzing the boiling points of paraffins. Over the last three decades, researchers have formulated hundreds of topological indices, which are helpful in studying properties of chemical graphs such as reactivity, stability, boiling point, enthalpy of formation, and the Kováts constant, and which reflect physical properties of materials such as stress, strain, elasticity, and mechanical strength.
Bollobás and Erdős [10] introduced the general Randić index given by equation (2). The first and second Zagreb indices were introduced by Gutman and Trinajstić [11] in 1972, appearing in the analysis of the π-electron energy of molecules. The multiplicative versions of the Zagreb indices (the first and second multiplicative Zagreb indices) were formulated by Ghorbani and Azimi [12]. Shirdel et al. [13] introduced a new variant of the Zagreb indices named the hyper-Zagreb index. The widely used atom-bond connectivity (ABC) index was introduced by Estrada et al. [14]. Zhou and Trinajstić [15] proposed the sum-connectivity index (SCI). The geometric-arithmetic index was introduced by Vukičević and Furtula [16]. Javaid et al. [17] investigated degree-based topological indices for probabilistic neural networks in 2017. Topological indices for multilayered probabilistic neural networks and recurrent neural networks have also been computed recently [18–21]. For more work related to the computation and bounds of topological indices, see [22–29].
Consider a graph G with node set V and edge set E. The degree of a node v, denoted dv, is the number of nodes connected to v by an edge. The degree-based topological indices of a graph G considered in this study are defined as follows:
Randić index
R(G) = ∑uv∈E 1/√(dudv) (1)
General Randić index
Rα(G) = ∑uv∈E (dudv)α (2)
First Zagreb index
M1(G) = ∑uv∈E (du + dv) (3)
Second Zagreb index
M2(G) = ∑uv∈E dudv (4)
First multiple Zagreb index
PM1(G) = ∏uv∈E (du + dv) (5)
Second multiple Zagreb index
PM2(G) = ∏uv∈E dudv (6)
Hyper-Zagreb index
HM(G) = ∑uv∈E (du + dv)2 (7)
Atom-bond connectivity index
ABC(G) = ∑uv∈E √((du + dv − 2)/(dudv)) (8)
Sum connectivity index
SCI(G) = ∑uv∈E 1/√(du + dv) (9)
Geometric-arithmetic index
GA(G) = ∑uv∈E 2√(dudv)/(du + dv) (10)
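For concreteness, these degree-based definitions can be evaluated directly from a graph's edge list. The following Python sketch is our own illustration (the function names are not from the paper) and implements a few of the indices above:

```python
import math

def degrees(edges):
    """Map each node to its degree, given a list of (u, v) edge pairs."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg

def general_randic(edges, alpha):
    """General Randic index (2); alpha = -1/2 gives the Randic index (1)."""
    d = degrees(edges)
    return sum((d[u] * d[v]) ** alpha for u, v in edges)

def first_zagreb(edges):
    """First Zagreb index (3): sum of d_u + d_v over all edges."""
    d = degrees(edges)
    return sum(d[u] + d[v] for u, v in edges)

def second_zagreb(edges):
    """Second Zagreb index (4): sum of d_u * d_v over all edges."""
    d = degrees(edges)
    return sum(d[u] * d[v] for u, v in edges)

def abc_index(edges):
    """Atom-bond connectivity index (8)."""
    d = degrees(edges)
    return sum(math.sqrt((d[u] + d[v] - 2) / (d[u] * d[v])) for u, v in edges)

# Example: the 4-cycle, where every node has degree 2.
c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(first_zagreb(c4))          # 4 edges, each contributing 2 + 2, i.e. 16
print(general_randic(c4, -0.5))  # 4 * (2*2)^(-1/2) = 2.0
```

The remaining indices follow the same pattern, replacing the per-edge function accordingly.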
2. Methodology
A deep neural network (DNN) can be represented by a graph Z := (V, E), where V denotes the set of nodes of the network and E denotes the set of edges between the nodes. We consider a DNN with an input layer of M nodes; r hidden layers, in which the i-th layer has Ni nodes (i = 1, 2, …, r), so that the first hidden layer has N1 nodes, the second N2 nodes, and the r-th layer Nr nodes; and an output layer of N nodes. Such a network is denoted DNN(N1N2 … Nr). Each node of every layer is connected to all nodes of the next layer. For instance, Figure 1 shows a DNN with an input layer of four nodes, five hidden layers, and an output layer of three nodes.
Figure 1.

A deep neural network with five hidden layers: an input layer of 4 nodes, hidden layers of sizes (4, 5, 6, 4, 3), and an output layer of 3 nodes.
We first partition the edges of the graph of the DNN according to the degrees of the end vertices. We analyze the structure of the graph by considering the connectivity of the vertices of each layer to the next layer. In a DNN, each node of every layer is connected to all nodes of the next layer; this fact is used to count the degree of each vertex. Consider a deep neural network DNN(N1N2 … Nr). Each node in the input layer has degree N1 because every input node is connected to each of the N1 nodes of the first hidden layer. All N1 nodes of the first hidden layer have the same degree, M + N2. Nodes of the second hidden layer have degree N1 + N3. In general, the nodes of the i-th hidden layer have degree Ni−1 + Ni+1. The nodes of the output layer have degree Nr.
We compute the topological indices using the edge partition method, classifying the edges by the degrees of their end-nodes. There are MN1 edges connecting the input layer to the first hidden layer, and their end-nodes have degrees N1 and M + N2. The NiNi+1 edges connecting the i-th hidden layer to the (i+1)-st have end-nodes of degrees Ni−1 + Ni+1 and Ni + Ni+2. Similarly, the NrN edges connecting the last hidden layer to the output layer have end-nodes of degrees Nr−1 + N and Nr. These findings are summarized in Table 1, which is used below to compute the topological indices.
Table 1.
The edge partition of DNN(N1N2 … Nr) based on degrees of end nodes.
| (du, dv), uv ∈ E(Z) | Number of edges of the form (du, dv) |
|---|---|
| (N1, M+N2) | MN1 |
| (M+N2, N1+N3) | N1N2 |
| (N1+N3, N2+N4) | N2N3 |
| ⋮ | ⋮ |
| (Nr−3+Nr−1, Nr−2+Nr) | Nr−2Nr−1 |
| (Nr−2+Nr, Nr−1+N) | Nr−1Nr |
| (Nr−1+N, Nr) | NrN |
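The degree pattern and edge counts in Table 1 can be checked programmatically. The following Python sketch is our own (the node labels (layer, position) are an illustrative convention); it builds the graph of the network shown in Figure 1 and verifies the degree claims above:

```python
from itertools import product

def dnn_edges(M, hidden, N):
    """Edge list of a fully connected feed-forward network with layer
    sizes [M, N_1, ..., N_r, N]; nodes are labelled (layer, position)."""
    sizes = [M] + list(hidden) + [N]
    return [((k, i), (k + 1, j))
            for k in range(len(sizes) - 1)
            for i, j in product(range(sizes[k]), range(sizes[k + 1]))]

# Network of Figure 1: input M = 4, hidden layers (4, 5, 6, 4, 3), output N = 3.
edges = dnn_edges(4, [4, 5, 6, 4, 3], 3)
assert len(edges) == 4*4 + 4*5 + 5*6 + 6*4 + 4*3 + 3*3  # 111 edges in total

deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1

assert deg[(0, 0)] == 4      # input node: degree N_1
assert deg[(1, 0)] == 4 + 5  # first hidden layer: M + N_2
assert deg[(3, 0)] == 5 + 4  # third hidden layer: N_2 + N_4
assert deg[(6, 0)] == 3      # output node: degree N_r
```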
3. Results and Discussions
In this section, we derive expressions for computing the topological indices of the deep neural network. These results depend on the connectivity of the nodes of the DNN.
Theorem 1 . —
Let Z ≅ DNN(N1N2 … Nr) be a deep neural network. Then the general Randić index Rα(Z) of the DNN is given below; the Randić index is the special case R(Z) = R−1/2(Z):
Rα(Z)=(MN1)((N1)(M+N2))α+(N1N2)((M+N2)(N1+N3))α+(∑i=2r−2NiNi+1((Ni−1+Ni+1)(Ni+Ni+2))α)+(Nr−1Nr)((Nr−2+Nr)(Nr−1+N))α+(NrN)((Nr−1+N)(Nr))α.
Proof —
The degrees of the end nodes of every edge of DNN(N1N2 … Nr) are given in Table 1.
- (i) By definition (2), Rα(Z) = ∑uv∈E (dudv)α. Splitting the sum over the edge classes of Table 1, with each class of m edges whose end nodes have degrees (du, dv) contributing m(dudv)α, gives the stated closed form for Rα(Z).
- (ii) Setting α = −1/2 in the closed form of (i) yields the Randić index R(Z).
Theorem 2 . —
Let Z ≅ DNN(N1N2 … Nr) be a deep neural network. Then the first Zagreb index M1(Z), second Zagreb index M2(Z), first multiplicative Zagreb index PM1(Z), and second multiplicative Zagreb index PM2(Z) of the DNN are given as follows:
M1(Z)=(MN1)(N1+M+N2)+(N1N2)(M+N2+N1+N3)+∑i=2r−2NiNi+1(Ni−1+Ni+1+Ni+Ni+2)+(Nr−1Nr)(Nr−2+Nr+Nr−1+N)+(NrN)(Nr−1+N+Nr)
M2(Z)=(MN1)[(N1)(M+N2)]+(N1N2)[(M+N2)(N1+N3)]+∑i=2r−2NiNi+1(Ni−1+Ni+1)(Ni+Ni+2)+(Nr−1Nr)[(Nr−2+Nr)(Nr−1+N)]+(NrN)[(Nr−1+N)(Nr)]
PM1(Z)=(N1+M+N2)MN1 × (M+N2+N1+N3)N1N2 × ∏i=2r−2(Ni−1+Ni+1+Ni+Ni+2)NiNi+1 × (Nr−2+Nr+Nr−1+N)Nr−1Nr × (Nr−1+N+Nr)NrN
PM2(Z)=[(N1)(M+N2)]MN1 × [(M+N2)(N1+N3)]N1N2 × ∏i=2r−2[(Ni−1+Ni+1)(Ni+Ni+2)]NiNi+1 × [(Nr−2+Nr)(Nr−1+N)]Nr−1Nr × [(Nr−1+N)(Nr)]NrN
Proof —
To compute the topological indices of the DNN, we use the edge partition method; Table 1 gives the degrees of the end nodes of each edge of DNN(N1N2 … Nr). Throughout, a class of m edges whose end nodes have degrees (du, dv) is weighted by m.
- (i) By definition (3), M1(Z) = ∑uv∈E (du + dv); summing m(du + dv) over the classes of Table 1 gives the stated formula for M1(Z).
- (ii) By definition (4), M2(Z) = ∑uv∈E dudv; summing m·dudv over the classes gives the stated formula for M2(Z).
- (iii) By definition (5), PM1(Z) = ∏uv∈E (du + dv). The product factors over the edge classes (as a product of factors, not a sum), each class contributing the factor (du + dv)m, which gives the stated formula for PM1(Z).
- (iv) By definition (6), PM2(Z) = ∏uv∈E dudv; each class contributes the factor (dudv)m, which gives the stated formula for PM2(Z).
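The closed forms of Theorem 2 can be cross-checked numerically. The Python sketch below is our own (layer_degrees and zagreb_closed are illustrative names); it evaluates M1 and M2 via the edge partition of Table 1 and compares the result with a brute-force computation on the actual graph:

```python
def layer_degrees(sizes):
    """Degree of a node in each layer; sizes = [M, N_1, ..., N_r, N]."""
    n = len(sizes)
    return [(sizes[k - 1] if k > 0 else 0) + (sizes[k + 1] if k < n - 1 else 0)
            for k in range(n)]

def zagreb_closed(sizes):
    """First and second Zagreb indices via the edge partition (Theorem 2)."""
    deg = layer_degrees(sizes)
    m1 = m2 = 0
    for k in range(len(sizes) - 1):
        count = sizes[k] * sizes[k + 1]  # edges between layers k and k+1
        m1 += count * (deg[k] + deg[k + 1])
        m2 += count * deg[k] * deg[k + 1]
    return m1, m2

# Brute-force check on the graph of the network with layer sizes 4; 4,5,6,4,3; 3.
sizes = [4, 4, 5, 6, 4, 3, 3]
edges = [((k, i), (k + 1, j))
         for k in range(len(sizes) - 1)
         for i in range(sizes[k])
         for j in range(sizes[k + 1])]
d = {}
for u, v in edges:
    d[u] = d.get(u, 0) + 1
    d[v] = d.get(v, 0) + 1
m1 = sum(d[u] + d[v] for u, v in edges)
m2 = sum(d[u] * d[v] for u, v in edges)
assert (m1, m2) == zagreb_closed(sizes)  # (1872, 7965) for this network
```

The same two-line loop extends to any edge-additive index by swapping the per-class weight.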
Theorem 3 . —
Let Z ≅ DNN(N1N2 … Nr) be a deep neural network. Then the forgotten index F(Z) = ∑uv∈E (du2 + dv2) and the hyper-Zagreb index HM(Z) of the DNN are given as follows:
F(Z)=(MN1)((N1)2+(M+N2)2)+(N1N2)((M+N2)2+(N1+N3)2)+∑i=2r−2NiNi+1((Ni−1+Ni+1)2+(Ni+Ni+2)2)+(Nr−1Nr)((Nr−2+Nr)2+(Nr−1+N)2)+(NrN)((Nr−1+N)2+(Nr)2).
HM(Z)=(MN1)(N1+M+N2)2+(N1N2)(M+N2+N1+N3)2+∑i=2r−2NiNi+1(Ni−1+Ni+1+Ni+Ni+2)2+(Nr−1Nr)(Nr−2+Nr+Nr−1+N)2+(NrN)(Nr−1+N+Nr)2.
Proof —
To compute these indices, we again use the edge partition method; Table 1 gives the degrees of the end nodes of every edge of DNN(N1N2 … Nr).
- (i) F(Z) = ∑uv∈E (du2 + dv2). Each class of m edges whose end nodes have degrees (du, dv) in Table 1 contributes m(du2 + dv2); summing over the classes gives the stated formula for F(Z).
- (ii) By definition (7), HM(Z) = ∑uv∈E (du + dv)2. The same partition, with each class contributing m(du + dv)2, gives the stated formula for HM(Z).
Theorem 4 . —
Let Z ≅ DNN(N1N2 … Nr) be a deep neural network. The atom-bond connectivity index ABC(Z), geometric-arithmetic index GA(Z), sum connectivity index SCI(Z), and augmented Zagreb index AZI(Z) = ∑uv∈E (dudv/(du + dv − 2))3 of the DNN are obtained by summing the corresponding edge functions over the edge classes of Table 1, each weighted by the number of edges in its class; in particular,
AZI(Z)=(MN1)((N1)(M+N2)/(N1+M+N2−2))3+(N1N2)((M+N2)(N1+N3)/(M+N2+N1+N3−2))3+∑i=2r−2NiNi+1((Ni−1+Ni+1)(Ni+Ni+2)/(Ni−1+Ni+1+Ni+Ni+2−2))3+(Nr−1Nr)((Nr−2+Nr)(Nr−1+N)/(Nr−2+Nr+Nr−1+N−2))3+(NrN)((Nr−1+N)(Nr)/(Nr−1+N+Nr−2))3
Proof —
All four indices are computed from the edge partition of Table 1; a class of m edges whose end nodes have degrees (du, dv) contributes m times the corresponding edge function.
- (i) By definition (8), ABC(Z) = ∑uv∈E √((du + dv − 2)/(dudv)); summing over the classes of Table 1 gives the closed form for ABC(Z).
- (ii) By definition (10), GA(Z) = ∑uv∈E 2√(dudv)/(du + dv); the same partition gives the closed form for GA(Z).
- (iii) By definition (9), SCI(Z) = ∑uv∈E 1/√(du + dv); the same partition gives the closed form for SCI(Z).
- (iv) AZI(Z) = ∑uv∈E (dudv/(du + dv − 2))3; substituting the degrees and class sizes from Table 1 gives the stated formula for AZI(Z).
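As a numerical sanity check on the indices of Theorem 4, they can be evaluated from the edge classes of Table 1; note that the standard augmented Zagreb index uses du + dv − 2 in its denominator. The Python sketch below is our own illustration, not code from the paper:

```python
import math

def edge_classes(sizes):
    """(d_u, d_v, count) for each consecutive layer pair, as in Table 1."""
    n = len(sizes)
    deg = [(sizes[k - 1] if k > 0 else 0) + (sizes[k + 1] if k < n - 1 else 0)
           for k in range(n)]
    return [(deg[k], deg[k + 1], sizes[k] * sizes[k + 1]) for k in range(n - 1)]

def abc(sizes):
    return sum(c * math.sqrt((du + dv - 2) / (du * dv))
               for du, dv, c in edge_classes(sizes))

def sci(sizes):
    return sum(c / math.sqrt(du + dv) for du, dv, c in edge_classes(sizes))

def ga(sizes):
    return sum(c * 2 * math.sqrt(du * dv) / (du + dv)
               for du, dv, c in edge_classes(sizes))

def azi(sizes):
    # Note the "-2" in the denominator of the augmented Zagreb index.
    return sum(c * (du * dv / (du + dv - 2)) ** 3
               for du, dv, c in edge_classes(sizes))

# Sanity check on K_{2,3} (a degenerate "DNN" with no hidden layers, M = 2, N = 3):
# every edge has end degrees (3, 2), so ABC = 6 * sqrt(3/6) = 3 * sqrt(2).
assert abs(abc([2, 3]) - 3 * math.sqrt(2)) < 1e-9
```

For K_{2,3} the other indices are likewise easy to verify by hand: SCI = 6/√5, GA = 12√6/5, and AZI = 6·(6/3)³ = 48.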
4. Conclusions
The deep neural network is helpful in modeling compounds with desirable physical and chemical properties based on the structure of the compounds. This paper gives computational insight into degree-dependent topological indices, including the Randić index, Zagreb indices, multiplicative Zagreb indices, forgotten index, hyper-Zagreb index, ABC index, GA index, and sum-connectivity index, of a general DNN with r hidden layers. These indices correlate the structure with properties such as boiling point, molar refractivity (MR), molar volume (MV), polar surface area, surface tension, enthalpy of vaporization, and flash point, among others. The results computed in the above theorems are closed formulas that can be used to compute the topological indices of the networks under study by substituting specific values of the input parameters. The values of the computed indices grow with the number of hidden layers and also depend on the number of nodes in each layer.
A deep neural network is an important tool in experimental design, data reduction, fault diagnosis, and process control. QSAR studies can be integrated with the neural network approach to achieve a more physical understanding of the system. The use of DNNs provides an alternative way of predicting physical properties, and their linkage with topological indices can further enhance theoretical results.
This study can be extended further by analyzing the distance-based topological indices such as the Wiener index, Harary index, and PI index. Computation of spectral invariants of deep neural networks such as energy, Estrada energy, and Kirchhoff index is also open for further research in this area.
Acknowledgments
The study was supported by the Science & Technology Bureau of Chengdu (2020-YF09-00005-SN), the Sichuan Science and Technology Program (2021YFH0107), the Erasmus+ SHYFTE Project (598649-EPP-1-2018-1-FR-EPPKA2-CBHE-JP), and the National Key Research and Development Program (2018YFB0904205).
Contributor Information
Nazeran Idrees, Email: nazeranidrees@gcuf.edu.pk.
Salma Kanwal, Email: salma.kanwal055@gmail.com.
Data Availability
No data were used to support the findings of this study.
Conflicts of Interest
The authors declare that they have no conflicts of interest regarding the publication of this paper.
References
- 1.Cao J., Li R. Fixed-time synchronization of delayed memristor-based recurrent neural networks. Science China Information Sciences . 2017;60(3):032201–032215. doi: 10.1007/s11432-016-0555-2. [DOI] [Google Scholar]
- 2.Tran T. P., Nguyen T. T. S., Tsai P., Kong X. BSPNN: boosted subspace probabilistic neural network for email security. Artificial Intelligence Review . 2011;35(4):369–382. doi: 10.1007/s10462-010-9198-2. [DOI] [Google Scholar]
- 3.Tran T. P., Cao L., Tran D., Nguyen C. D. Novel intrusion detection using probabilistic neural network and adaptive boosting. International Journal of Computer Science and Information Security . 2009;6(1):83–91. [Google Scholar]
- 4.Yang M., Wei H. Application of a neural network for the prediction of crystallization kinetics. Industrial & Engineering Chemistry Research . 2006;45(1):70–75. doi: 10.1021/ie0487944. [DOI] [Google Scholar]
- 5.Kharitonova O. S., Bronskaya V. V., Ignashina T. V., Al-Muntaser A. A., Khairullina L. E. Modeling of absorption process using neural networks. IOP Conference Series: Earth and Environmental Science . 2019;315(3):032025–032026. doi: 10.1088/1755-1315/315/3/032025. [DOI] [Google Scholar]
- 6.Velásco-Mejía A., Vallejo-Becerra V., Chávez-Ramírez A. U., Torres-González J., Reyes-Vidal Y., Castañeda-Zaldivar F. Modeling and optimization of a pharmaceutical crystallization process by using neural networks and genetic algorithms. Powder Technology . 2016;292:122–128. doi: 10.1016/j.powtec.2016.01.028. [DOI] [Google Scholar]
- 7.Azzam M., Aramouni N. A. K., Ahmad M. N., Awad M., Kwapinski W., Zeaiter J. Dynamic optimization of dry reformer under catalyst sintering using neural networks. Energy Conversion and Management . 2018;157:146–156. doi: 10.1016/j.enconman.2017.11.089. [DOI] [Google Scholar]
- 8.Bagheri M., Akbari A., Mirbagheri S. A. Advanced control of membrane fouling in filtration systems using artificial intelligence and machine learning techniques: a critical review. Process Safety and Environmental Protection . 2019;123:229–252. doi: 10.1016/j.psep.2019.01.013. [DOI] [Google Scholar]
- 9.Randić M. Characterization of molecular branching. Journal of the American Chemical Society . 1975;97(23):6609–6615. doi: 10.1021/ja00856a001. [DOI] [Google Scholar]
- 10.Bollobas B., Erdos P. Graphs of extremal weights. Ars Combinatoria . 1998;50:225–233. [Google Scholar]
- 11.Gutman I., Trinajstić N. Graph theory and molecular orbitals. Total π-electron energy of alternant hydrocarbons. Chemical Physics Letters . 1972;17(4):535–538. doi: 10.1016/0009-2614(72)85099-1. [DOI] [Google Scholar]
- 12.Ghorbani M., Hosseinzadeh M. A new version of Zagreb indices. Filomat . 2012;26(1):93–100. doi: 10.2298/fil1201093g. [DOI] [Google Scholar]
- 13.Shirdel G. H., Rezapour H., Sayadi A. M. The hyper-zagreb index of graph operations. Iranian Journal of Mathematical Chemistry . 2013;4:213–220. [Google Scholar]
- 14.Estrada E., Torres L., Rodriguez L., Gutman I. An atom-bond connectivity index: modelling the enthalpy of formation of alkanes. Indian Journal of Chemistry - Section A Inorganic, Physical, Theoretical and Analytical Chemistry . 1998;37:849–855. [Google Scholar]
- 15.Zhou B., Trinajstić N. On general sum-connectivity index. Journal of Mathematical Chemistry . 2010;47(1):210–218. doi: 10.1007/s10910-009-9542-4. [DOI] [Google Scholar]
- 16.Vukičević D., Furtula B. Topological index based on the ratios of geometrical and arithmetical means of end-vertex degrees of edges. Journal of Mathematical Chemistry . 2009;46(4):1369–1376. doi: 10.1007/s10910-009-9520-x. [DOI] [Google Scholar]
- 17.Javaid M., Cao J. Computing topological indices of probabilistic neural network. Neural Computing & Applications . 2018;30(12):3869–3876. doi: 10.1007/s00521-017-2972-1. [DOI] [Google Scholar]
- 18.Mondal S., De N., Pal A. Molecular descriptors of neural networks with chemical significance. Revue Roumaine de Chimie . 2021;65(11):1031–1044. doi: 10.33224/rrch.2020.65.11.08. [DOI] [Google Scholar]
- 19.Liu J. B., Raza Z., Javaid M. Zagreb connection numbers for cellular neural networks. Discrete Dynamics in Nature and Society . 2020;2020:8. doi: 10.1155/2020/8038304.8038304 [DOI] [Google Scholar]
- 20.Javaid M., Abbas M., Liu J. B., Teh W. C., Cao J. Topological properties of four-layered neural networks. Journal of Artificial Intelligence and Soft Computing Research . 2019;9(2):111–122. doi: 10.2478/jaiscr-2018-0028. [DOI] [Google Scholar]
- 21.Liu J. B., Zhao J., Wang S., Javaid M., Cao J. On the topological properties of the certain neural networks. Journal of Artificial Intelligence and Soft Computing Research . 2018;8(4):257–268. doi: 10.1515/jaiscr-2018-0016. [DOI] [Google Scholar]
- 22.Wang C., Liu J. B., Wang S. Sharp upper bounds for multiplicative Zagreb indices of bipartite graphs with given diameter. Discrete Applied Mathematics . 2017;227:156–165. doi: 10.1016/j.dam.2017.04.037. [DOI] [Google Scholar]
- 23.Khalid R., Idrees N., Jawwad Saif M. Topological characterization of Book graph and stacked Book graph. Computers, Materials & Continua . 2019;60(1):41–54. doi: 10.32604/cmc.2019.06554. [DOI] [Google Scholar]
- 24.Gao W., Wang W., Farahani M. R. Topological indices study of molecular structure in anticancer drugs. Journal of Chemistry . 2016;2016:8. doi: 10.1155/2016/3216327.3216327 [DOI] [Google Scholar]
- 25.Idrees N., Jawwad Saif M., Sadiq A., Rauf A., Hussain F. Topological indices of H-naphtalenic nanosheet. Open Chemistry . 2018;16(1):1184–1188. doi: 10.1515/chem-2018-0131. [DOI] [Google Scholar]
- 26.Idrees N., Saif M. J., Rauf A., Mustafa S. First and second Zagreb eccentricity indices of thorny graphs. Symmetry . 2017;9(1):p. 7. doi: 10.3390/sym9010007. [DOI] [Google Scholar]
- 27.Liu J. B., Bao Y., Zheng W. T., Hayat S. Network coherence analysis on a family of nested weighted n-polygon networks. Fractals . 2021;29(08):2150260–2150276. doi: 10.1142/s0218348x21502601. [DOI] [Google Scholar]
- 28.Liu J. B., Zhang T., Wang Y., Lin W. The Kirchhoff index and spanning trees of Möbius/cylinder octagonal chain. Discrete Applied Mathematics . 2022;307:22–31. doi: 10.1016/j.dam.2021.10.004. [DOI] [Google Scholar]
- 29.Afridi S., Yasin Khan M., Ali G. On generalized topological indices for some special graphs. Journal of Mathematics . 2022;2022:p. 21. doi: 10.1155/2022/1369490. [DOI] [Google Scholar]