Computational Intelligence and Neuroscience
. 2022 Aug 1;2022:5692427. doi: 10.1155/2022/5692427

Adolescent Identity Search Algorithm Based on Fast Search and Balance Optimization for Numerical and Engineering Design Problems

Wentao Wang 1, Hao Liu 1, Quanqin He 1
PMCID: PMC9359833  PMID: 35958778

Abstract

This paper proposes a fast convergence and balanced adolescent identity search algorithm (FCBAISA) for numerical and engineering design problems. The main contributions are as follows. Firstly, a hierarchical optimization strategy is proposed to better balance exploration and exploitation. Secondly, a fast search strategy is proposed to avoid local optima and improve the accuracy of the algorithm; that is, the current optimal solution, combined with the random disturbance of Brownian motion, guides the other adolescents. Thirdly, the Chebyshev functional-link network (CFLN) is improved by recursive least squares estimation (RLSE), so that the optimal solution can be found more effectively. Fourthly, a terminal bounce strategy is designed to keep the algorithm from falling into local optima in the later stage of iteration. Fifthly, FCBAISA and the comparison algorithms are tested on the CEC2017 and CEC2022 benchmark functions and applied to practical engineering problems. The results show that FCBAISA is superior to the other algorithms in all aspects, with high precision, fast convergence speed, and excellent performance.

1. Introduction

Optimization is an important part of finding better solutions to many scientific problems [1]. Many practical problems ultimately boil down to finding a set of decision variables that makes the objective function attain its optimal value. Researchers have found that meta-heuristic algorithms can solve many practical problems within a specified error range, which greatly improves efficiency. Therefore, a variety of meta-heuristic algorithms have been proposed to find approximate solutions to many complex problems. Based on studies by many researchers, meta-heuristic approaches can be divided into four main categories [2], as shown in Figure 1.

Figure 1. Classification of meta-heuristic techniques (meta-heuristic diamond).

Evolutionary computation is a mature global optimization method with good stability and wide applicability. Evolutionary algorithms (EAs) are inspired by the evolutionary processes of organisms in nature. They exhibit self-organization, adaptation, and learning, so they can effectively solve complex problems that are difficult for traditional optimization algorithms. Some renowned algorithms are the genetic algorithm (GA) [3], differential evolution (DE) [4], and estimation of distribution algorithms (EDA) [5].

Swarm intelligence mainly simulates the group behavior of insects, herds, birds, and fish. Each member of the population constantly changes direction by learning from its own experience and that of other members. This phenomenon has stimulated the design of algorithms and distributed problem solving. There are many such algorithms, for example, particle swarm optimization (PSO) [6], the artificial bee colony algorithm (ABC) [7], the bacterial foraging algorithm (BFA) [8], the Harris hawks optimization algorithm (HHO) [9], the firefly algorithm (FA) [10], the fruit fly optimization algorithm (FOA) [11], the krill herd algorithm (KH) [12], the crow search algorithm (CSA) [13], the grass fibrous root optimization algorithm (FRO) [14], the flamingo search algorithm (FSA) [15], the flow direction algorithm (FDA) [16], the grey wolf optimizer (GWO) [17], the battle royale optimization algorithm (BRO) [18], and coot swarm optimization (CSO) [19].

The third category of meta-heuristic algorithms is based on physics or chemistry, formed by observing physical or chemical phenomena and exploiting their laws, including gravity, potential energy, ecosystems, and motion. Simulated annealing (SA) [20], the gravitational search algorithm (GSA) [21], noisy intermediate-scale quantum algorithms (NISQ) [22], chemical reaction optimization (CRO) [23], charged system search (CSS) [24], black hole (BH) [25], the ions motion algorithm [26], the multiverse optimizer (MVO) [27], and vortex search (VS) [26] are some optimizers in this category.

The last kind of meta-heuristic algorithm is based on human behavior, habits, thought, and logic. It is very popular for solving many problems; examples include Tabu search (TS) [28], the mine blast algorithm (MBA) [29], teaching–learning-based optimization (TLBO) [30], the interior search algorithm (ISA) [31], the exchange market algorithm (EMA) [32], and the heuristic genetic algorithm (HGA) [33].

Optimization is applied to various real-life applications to reduce the waste of resources, save costs, and maximize benefits. Researchers have developed a large number of new or hybrid algorithms to solve real-life problems. In product development, the newly developed political optimization algorithm (POA) was used to minimize product cost, offering industrial companies a new idea for their product design stage [34]. A new optimizer based on the ecogeography-based optimization algorithm (EBO) was applied to vehicle design for the first time, and better design results were obtained [35]. A hybrid of the grasshopper optimization algorithm and the Nelder–Mead algorithm (HGOANM) was developed for the design of a robot gripper mechanism; the results showed that this algorithm can solve practical engineering problems quickly [36]. A new hybrid Taguchi salp swarm algorithm (HTSSA) was designed to speed up the optimization of industrial structure design, and the results reflected its superior ability to optimize the product design process [37]. A new optimizer based on seagull optimization (SOA) was developed, and its performance was verified on large-scale industrial engineering problems [38].

With continued research and development, the diversity of optimization algorithms keeps growing. A novel meta-heuristic approach based on human behavior for solving various complex optimization problems, called the adolescent identity search algorithm (AISA) [39], was first proposed by Esref Bogar and Selami Beyhan in 2020. This paper makes a series of improvements to AISA to improve its performance. The main contributions can be summarized as follows:

  1. This work divides the iteration into three layers and makes full use of the update mechanism of each layer to obtain the best adolescent identity, which enriches population diversity and better balances exploration and exploitation.

  2. The current optimal solution, combined with Brownian motion, guides the other adolescents, which accelerates the convergence of the algorithm and prevents it from falling into local optima.

  3. Recursive least squares estimation (RLSE) is proposed to estimate the weight factor better. Optimizing the improved CFLN strengthens exploration and exploitation, so the optimal solution can be found more effectively by the algorithm.

  4. To prevent AISA from falling into local optima in the late iterations, a terminal bounce strategy is proposed.

The structure of this paper is as follows. Section 2 introduces the adolescent identity search algorithm (AISA). Section 3 describes FCBAISA in detail. The experimental comparison between FCBAISA and other algorithms is presented and discussed in Section 4. Practical engineering problems are solved by FCBAISA and the comparison algorithms in Section 5. Section 6 summarizes this paper and lists future work based on FCBAISA.

2. The Canonical AISA

AISA, an optimization algorithm constructed on human behavior, was proposed by Esref Bogar and Selami Beyhan in 2020. By observing the formation process of adolescent identity and modeling it mathematically, a creative algorithm was formed. This section briefly describes AISA; details can be found in Reference [39].

2.1. Population Random Initialization

In AISA, a random initial population is generated by

x_j^i = lb_j + U(0,1)·(ub_j − lb_j),  i = 1, 2, …, N;  j = 1, 2, …, n, (1)

where x_j^i is the jth identity feature of the ith adolescent and U(0,1) is a random number distributed uniformly in the range [0, 1]. lb is the lower boundary vector of the search space, and ub is the upper one.
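As a concrete illustration, equation (1) can be sketched in NumPy as follows (the function name and the seeded generator are our own choices, not from the paper):

```python
import numpy as np

def init_population(lb, ub, N, seed=None):
    """Random initialization of N adolescents per equation (1):
    x_j^i = lb_j + U(0,1) * (ub_j - lb_j)."""
    rng = np.random.default_rng(seed)
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    # One independent uniform draw per identity feature of each adolescent
    return lb + rng.random((N, lb.size)) * (ub - lb)

pop = init_population(lb=[-100.0] * 5, ub=[100.0] * 5, N=30, seed=0)
```

Each row of `pop` is one adolescent; the bounds are broadcast over all rows.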

2.2. Creating a New Identity

According to the characteristics of adolescent identity exploration, it is assumed that a situation is randomly selected during the iterative update. The three cases of adolescent identity feature selection in this algorithm are as follows:

Case 1. —

Teenagers form their identities by observing the surrounding society, judging social values, and choosing the correct beliefs and attitudes. Specifically, the Chebyshev functional-link network (CFLN) [40] approximation model is introduced to find the best adolescent identity, and the modeling process is as follows.

Chebyshev polynomials are shown in the following equation:

T_s(x) = 1 if s = 0;  x if s = 1;  2x·T_{s−1}(x) − T_{s−2}(x) if s ≥ 2, (2)

where s is the degree of Chebyshev polynomials.
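The recurrence in (2) can be implemented iteratively; a minimal sketch (the function name is ours):

```python
def chebyshev(s, x):
    """First-kind Chebyshev polynomial T_s(x) via the recurrence in equation (2)."""
    if s == 0:
        return 1.0
    t_prev, t = 1.0, x  # T_0(x), T_1(x)
    for _ in range(2, s + 1):
        t_prev, t = t, 2.0 * x * t - t_prev  # T_s = 2x*T_{s-1} - T_{s-2}
    return t
```

For s = 1 the loop body never runs and the function simply returns x.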

The input samples (population) for the CFLN model are normalized to [−1, 1] by the following equation:

x̂_j^i = 2·(x_j^i − lb_j)/(ub_j − lb_j) − 1, (3)

where x̂_j^i is the normalized value of the jth identity feature of the ith adolescent. lb and ub are the lower and upper boundary vectors of the search space. The identity is represented by the following normalized input matrix:

X̂ = [x̂_1^1 x̂_2^1 ⋯ x̂_n^1; x̂_1^2 x̂_2^2 ⋯ x̂_n^2; ⋮; x̂_1^N x̂_2^N ⋯ x̂_n^N]_{N×n}. (4)

Then, according to (2), the regression matrix Ψ of each input element is obtained by (5).

Ψ = [T_1(x̂_1^1) ⋯ T_s(x̂_1^1) ⋯ T_1(x̂_n^1) ⋯ T_s(x̂_n^1); T_1(x̂_1^2) ⋯ T_s(x̂_1^2) ⋯ T_1(x̂_n^2) ⋯ T_s(x̂_n^2); ⋮; T_1(x̂_1^N) ⋯ T_s(x̂_1^N) ⋯ T_1(x̂_n^N) ⋯ T_s(x̂_n^N)]_{N×(n·s)} = [ψ_1^1 ⋯ ψ_n^1; ψ_1^2 ⋯ ψ_n^2; ⋮; ψ_1^N ⋯ ψ_n^N]. (5)

Weighting factors are estimated by using the least squares estimation (LSE) in the approximate model as follows:

ω̂ = (Ψ^T Ψ)^{−1} Ψ^T f(Ψ) = [ω̂_1^1, …, ω̂_s^1, …, ω̂_1^n, …, ω̂_s^n] = [ω_1, …, ω_n]_{1×(n·s)}, (6)

where ωj represents the weight vector of the jth input.
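Equation (6) is an ordinary least squares fit of the weights to the fitness targets; a sketch in NumPy (we use `lstsq` rather than the explicit normal-equation inverse for numerical stability, and the data below are synthetic):

```python
import numpy as np

def lse_weights(Psi, f):
    """Least squares estimate of the CFLN weights, equation (6):
    w = (Psi^T Psi)^(-1) Psi^T f, computed via lstsq."""
    w, *_ = np.linalg.lstsq(Psi, f, rcond=None)
    return w

# Toy regression matrix and targets generated from known weights [2, 3]
Psi = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
f = Psi @ np.array([2.0, 3.0])
w = lse_weights(Psi, f)
```

Because the targets are consistent with a single weight vector, the fit recovers it exactly.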

After all elements in (4) are normalized, the fitness values are calculated by (7) and stored in the matrix F̂:

f̂_j^i = ψ_j^i ω_j, (7)
F̂ = [f̂_1^1 f̂_2^1 ⋯ f̂_n^1; f̂_1^2 f̂_2^2 ⋯ f̂_n^2; ⋮; f̂_1^N f̂_2^N ⋯ f̂_n^N]_{N×n}. (8)

Finally, the fitness values of the elements of the random initialization matrix are calculated through the approximate model, and the row index of the minimum value in each column of the matrix is found to form the best identity vector of the present population, as shown in the following equation:

x_j* = x_j^{m_j},  m_j = argmin_l { f̂_j^l | l = 1, 2, …, N },  ∀j. (9)

In Case 1, the new identity of the ith adolescent is defined as

x_new^i = x^i − r_1·(x^i − x*), (10)

where r_1 ∈ [0,1] is a random number and x* is the best identity vector formed in (9). Equation (10) represents a new identity that adolescents strive to acquire from well-behaved members of their peer group.

Case 2. —

Adolescents imitate a role model to form their new identity because they believe that the role model has noble qualities and good style.

Therefore, adolescents can choose a better individual than themselves to learn from. In this case, the updating formula for generating a new identity is written as the following equation:

x_new^i = x^i − r_2·(x^p − x_rm), (11)

where r_2 ∈ [0,1] is a random number and x_rm is the role model, i.e., the best individual. When p ≠ rm, x^p is an adolescent selected randomly from the population.

Case 3. —

Adolescents may be negatively affected by the group and form bad identity choices such as smoking, dropping out of school, and fighting. In this case, the updating formula for obtaining the new identity of the ith adolescent is written as the following equation:

x_new^i = x^i − r_3·(x^i − x^q), (12)

where r_3 is an n-dimensional vector of uniformly distributed numbers in the interval [0, 1], and x^q is a negative identity vector written as the following equation:

x^q = [x_u, x_u, …, x_u]^T_{1×n}, (13)

where x_u is a negative identity feature, an element randomly selected from the population matrix, which gives the algorithm its exploration capability.
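The three case updates (10)-(12) can be sketched together; the sign convention for Case 2 is our reading of the garbled original formula, and the helper name is ours:

```python
import numpy as np

def update_identity(x_i, case, x_star=None, x_p=None, x_rm=None, x_q=None, rng=None):
    """One identity update following the three AISA cases, equations (10)-(12).
    x_star plays the role of the best identity vector x* from equation (9)."""
    rng = np.random.default_rng(rng)
    r = rng.random()  # scalar random factor; the paper also allows vectors
    if case == 1:                       # learn from the best identities, eq. (10)
        return x_i - r * (x_i - x_star)
    if case == 2:                       # imitate a role model, eq. (11)
        return x_i - r * (x_p - x_rm)
    return x_i - r * (x_i - x_q)        # negative identity, eq. (12)
```

Note the fixed-point behavior: if an adolescent already equals the vector it is pulled toward, the update leaves it unchanged.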

3. The Proposed FCBAISA

Unlike other meta-heuristic algorithms, AISA estimates the fitness of adolescents using CFLN optimization. AISA performs well in exploration, exploitation, avoidance of local optima, and convergence. However, there are also some problems, such as unbalanced exploration and exploitation, falling into local optima, and premature convergence. Adolescent identity development is a complex concept that can integrate different network structures. Therefore, new ideas can still be injected into the algorithm.

3.1. Hierarchical Optimization Strategy

The CFLN optimization method used in Case 1 is novel and effective for exploration. To exploit CFLN more fully, this paper divides the iteration into three layers that execute each update mechanism separately. This strategy increases the diversity of the population and better balances exploration and exploitation. In addition, the improved CFLN topology in Section 3.3 has better exploration ability, as shown in Figure 2.

Figure 2. Topology of improved CFLN.

3.2. Quick Search Strategy

This paper uses the current optimal solution (Gbest) to guide the other adolescents throughout the search process and exploits the fact that Brownian motion follows the standard normal distribution to design a fast search strategy that speeds up convergence. Guided by Gbest, the other adolescents update, and in most cases the optimal solution is found faster. In addition, Brownian motion [41] is introduced to form a new update mechanism, because it can replace the random disturbance and effectively accelerate convergence. This method enables teenagers to obtain the best adolescent identity as soon as possible, as shown in Figure 3.

Figure 3. Brownian motion.

Based on the current optimal solution (Gbest) and Brownian motion, the formula of Case 1 is changed to equations (14) and (15):

x_pb^i = b_1·(Gbest − x^i), (14)
x_new^i = x^i − b_2·r_1·(x^i − x*) + b_3·x_pb^i, (15)

where b_1 is an n-dimensional Brownian motion vector, r_1 ∈ [0,1] is a random number, and b_2 and b_3 are two random numbers generated by Brownian motion.
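The quick-search update (14)-(15) can be sketched as follows; the sign on the b_3 term and the exact sampling of b_2 and b_3 are our reading of the text (Brownian increments are drawn as standard normals):

```python
import numpy as np

def fast_search_update(x_i, x_star, gbest, rng=None):
    """Gbest-guided Case 1 update with Brownian disturbance, equations (14)-(15)."""
    rng = np.random.default_rng(rng)
    n = x_i.size
    b1 = rng.standard_normal(n)       # n-dimensional Brownian increment
    b2, b3 = rng.standard_normal(2)   # two Brownian random numbers
    r1 = rng.random()
    x_pb = b1 * (gbest - x_i)         # equation (14): pull toward Gbest
    return x_i - b2 * r1 * (x_i - x_star) + b3 * x_pb  # equation (15)
```

If an adolescent already coincides with both x* and Gbest, both correction terms vanish and the position is a fixed point of the update.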

In addition, Brownian motion is integrated into Cases 2 and 3, and the corresponding update formulas are changed to (16) and (17), respectively:

x_new^i = x^i − randn·(x^p − x_rm), (16)
x_new^i = x^i − randn·(x^i − x^q). (17)

3.3. RLSE Weight Factor Strategy

The classical least squares estimator (LSE) can be written as follows:

A_0 X_0 = b_0, (18)
[A_0; A_1] X_1 = [b_0; b_1], (19)

where A_0 is an N × n matrix, X_0 = [x_1, x_2, …, x_n]^T is an n × 1 parameter vector, and b_0 = [b_1, b_2, …, b_N]^T is an output vector. The LSE solutions are given by the following equations:

X_0 = (A_0^T A_0)^{−1} A_0^T b_0, (20)
X_1 = ([A_0; A_1]^T [A_0; A_1])^{−1} [A_0; A_1]^T [b_0; b_1]. (21)

In AISA, the weight factor is estimated by the classical LSE. Here, recursive least squares estimation (RLSE) [42] is used to optimize the least squares estimation and estimate the weight factor of its approximate model:

G_1 = [A_0; A_1]^T [A_0; A_1] = G_0 + A_1^T A_1,
[A_0; A_1]^T [b_0; b_1] = G_1 X_0 + A_1^T (b_1 − A_1 X_0),
X_1 = G_1^{−1} (G_1 X_0 + A_1^T (b_1 − A_1 X_0)) = X_0 + G_1^{−1} A_1^T (b_1 − A_1 X_0),
X_{s+1} = X_s + G_{s+1}^{−1} A_{s+1}^T (b_{s+1} − A_{s+1} X_s), (22)

where the variables A_0 and b_0 are eliminated, with G_0 = A_0^T A_0 and X_0 = G_0^{−1} A_0^T b_0.
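The recursion in (22) can be checked numerically: one recursive step applied to a new observation must match the batch LSE on the stacked system. A minimal sketch (helper names and toy data are ours):

```python
import numpy as np

def rlse_step(X, G, A_new, b_new):
    """One recursive least squares update from equation (22):
    G_{s+1} = G_s + A^T A,  X_{s+1} = X_s + G_{s+1}^{-1} A^T (b - A X_s)."""
    G_next = G + A_new.T @ A_new
    X_next = X + np.linalg.solve(G_next, A_new.T @ (b_new - A_new @ X))
    return X_next, G_next

# Batch LSE on an initial block (equations (18) and (20))
A0 = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b0 = A0 @ np.array([1.0, 2.0])      # targets from known parameters [1, 2]
G0 = A0.T @ A0
X0 = np.linalg.solve(G0, A0.T @ b0)

# Fold in one new observation recursively
A1 = np.array([[2.0, 1.0]])
b1 = np.array([4.0])                # consistent with the same true parameters
X1, G1 = rlse_step(X0, G0, A1, b1)
```

Because only G and X are carried forward, each new block is absorbed without re-touching A_0 and b_0, which is the point of the recursion.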

In AISA, CFLN uses the LSE to estimate the weight factor of the approximate model. In this paper, a dynamic way to estimate the weight factor of the approximate model is built on the LSE by learning from the recursive derivation of RLSE. RLSE can dynamically estimate the weight factor and make CFLN more efficient, as shown in Figure 2. FCBAISA can find the optimal solution more efficiently by modifying the approximate model to affect the algorithm's update mechanism. The formulas are changed to (23)–(25):

ω = (Ψ^T Ψ)^{−1} Ψ^T f, (23)
ω = ω + (Ψ^T Ψ)^{−1} Ψ^T (f − Ψω), (24)
ω̂ = [ω̂_1^1, …, ω̂_s^1, …, ω̂_1^n, …, ω̂_s^n]_{1×(n·s)} = [ω_1, …, ω_n]. (25)

3.4. Terminal Bounce Mechanism

In this paper, a terminal bounce mechanism is designed to keep the algorithm from falling into local optima in the later stage of iteration. As the number of iterations increases, especially in the later stage, the algorithm may fall into a local optimum while the global optimum value does not change within a specified number of iterations. In this paper, a counter is added to monitor the change of the global optimum value, and its threshold is set to 20: the terminal disturbance mechanism is triggered when the global optimum value has not changed for 20 iterations, which makes the algorithm jump out of the local optimum. To achieve a better disturbance effect, two individuals are selected randomly from the population and Gbest is added for guidance when designing the terminal disturbance strategy. The pseudocode is given by Algorithm 1, and the specific design of the terminal bounce mechanism is given by formula (26).

x = r·Gbest + (1 − r)·rand·(x^{ind(1)} − x^{ind(2)}), (26)

where Gbest represents the current optimal solution, r ∈ [0,1] denotes a random number, and ind(1) and ind(2) are two indexes generated randomly from the population.

In summary, a fast convergence and balanced AISA (FCBAISA) is proposed. Algorithm 2 and Figure 4 give the pseudocode and flowchart of FCBAISA, respectively.

Figure 4. Flowchart of FCBAISA.

4. Experimental Results and Analysis

4.1. Benchmark Function and Comparison Algorithm

The CEC2017 benchmark functions are applied to check the performance of FCBAISA in this paper. Among the CEC2017 benchmark functions [43], {f1, f3}, {f4–f10}, {f11–f20}, and {f21–f30} are unimodal functions, simple multimodal functions, hybrid functions, and composite functions, respectively. f2 has not been tested because of its instability in high dimensions; details can be found in Reference [44]. The CEC2022 benchmark functions include a unimodal function, basic functions, hybrid functions, and composition functions. These benchmark functions are detailed in Tables 1 and 2.

Table 1.

The information of the CEC2022 benchmark functions used in this paper.

No. Functions f opt
Unimodal function 1 Shifted and full-rotated Zakharov function 300

Basic functions 2 Shifted and full-rotated Rosenbrock's function 400
3 Shifted and full-rotated expanded Schaffer's f6 function 600
4 Shifted and full-rotated noncontinuous Rastrigin's function 800
5 Shifted and full-rotated Levy function 900

Hybrid functions 6 Hybrid function 1 (N = 3) 1800
7 Hybrid function 2 (N = 6) 2000
8 Hybrid function 3 (N = 5) 2200

Composition functions 9 Composition function 1 (N = 5) 2300
10 Composition function 2 (N = 4) 2400
11 Composition function 3 (N = 5) 2600
12 Composition function 4 (N = 6) 2700

Search range: [−100,100]

Table 2.

The information of the CEC2017 benchmark functions used in this paper.

Fun Function Range f opt
f 1 Shifted and rotated bent cigar function [−100,100] 100
f 3 Shifted and rotated Zakharov function [−100,100] 300
f 4 Shifted and rotated Rosenbrock's function [−100,100] 400
f 5 Shifted and rotated Rastrigin's function [−100,100] 500
f 6 Shifted and rotated expanded Scaffer's function [−100,100] 600
f 7 Shifted and rotated Lunacek bi-Rastrigin function [−100,100] 700
f 8 Shifted and rotated noncontinuous Rastrigin's function [−100,100] 800
f 9 Shifted and rotated Levy function [−100,100] 900
f 10 Shifted and rotated Schwefel's function [−100,100] 1000
f 11 Hybrid function 1 (N = 3) [−100,100] 1100
f 12 Hybrid function 2 (N = 3) [−100,100] 1200
f 13 Hybrid function 3 (N = 3) [−100,100] 1300
f 14 Hybrid function 4 (N = 4) [−100,100] 1400
f 15 Hybrid function 5 (N = 4) [−100,100] 1500
f 16 Hybrid function 6 (N = 4) [−100,100] 1600
f 17 Hybrid function 7 (N = 5) [−100,100] 1700
f 18 Hybrid function 8 (N = 5) [−100,100] 1800
f 19 Hybrid function 9 (N = 5) [−100,100] 1900
f 20 Hybrid function 10 (N = 6) [−100,100] 2000
f 21 Composition function 1 (N = 3) [−100,100] 2100
f 22 Composition function 2 (N = 3) [−100,100] 2200
f 23 Composition function 3 (N = 4) [−100,100] 2300
f 24 Composition function 4 (N = 4) [−100,100] 2400
f 25 Composition function 5 (N = 5) [−100,100] 2500
f 26 Composition function 6 (N = 5) [−100,100] 2600
f 27 Composition function 7 (N = 6) [−100,100] 2700
f 28 Composition function 8 (N = 6) [−100,100] 2800
f 29 Composition function 9 (N = 3) [−100,100] 2900
f 30 Composition function 10 (N = 3) [−100,100] 3000

To check the effectiveness and superiority of FCBAISA, it is compared with eight evolutionary algorithms. To make the comparison fair and reasonable, the comparison algorithms include both classical algorithms and new, excellent algorithms: the transient search algorithm (TSO) [45], the archerfish hunting optimizer (AHO) [46], the butterfly optimization algorithm (BOA) [47], dynamic differential annealed optimization (DDAO) [48], PSO [6], the owl search algorithm (OSA) [49], and the gravitational search algorithm (GSA) [50]. The parameters of these algorithms are shown in Table 3. To compare the performance of the algorithms fairly, on the CEC2017 benchmark functions the population size (N) of all algorithms is 30, the dimension (n) is 30, each algorithm runs 50 times independently, the maximum number of function evaluations is 30000, and the maximum number of iterations is 1000. On the CEC2022 benchmark functions, the population size (N) is 30, the dimension (n) is 20, each algorithm runs 50 times independently, the maximum number of function evaluations is 100000, and the maximum number of iterations is 3334.

Table 3.

Relevant parameter values of the algorithm.

Algorithm Years Parameter information Values
AISA [39] 2020 Number of Chebyshev polynomials (s) 3
TSO [45] 2020 Scaling factor 0.85
AHO [46] 2021 Theta pi/12
Omega 0.01
maxCount 10

BOA [47] 2019 p 0.8
a 0.1
c 0.01

DDAO [48] 2020 MaxSubIt 10
T 0 2000
Alpha 0.995

PSO [6] 1998 c1, c2 2
w 0.9–0.4

OSA [49] 2018 Beta 0–1.9
Epsilon 1e-16

GSA [50] 2009 Rnorm 2
ElitistCheck 1
Minflag 1

FCBAISA presented 2021 Number of Chebyshev polynomials (s) 30

4.2. Comparison between FCBAISA and Other Algorithms

To make the comparison fair and reasonable, the comparison algorithms include both classical and new, excellent algorithms. The results on the CEC2017 and CEC2022 benchmark functions are shown in Tables 4 and 5, respectively. Among the CEC2017 benchmark functions, FCBAISA ranks first on 21 and second on 6, and first overall after the comprehensive comparison. Among the other algorithms, PSO performs better overall, ranking third. From the mean comparison, FCBAISA performs better on 21 benchmark functions, while GSA and PSO perform better on four and three test functions, respectively. From the comparison of standard deviations, the stability of FCBAISA is weaker, but it still ranks first on 15 benchmark functions, and on complex problems its stability improves. Comparing the optimal solutions of each algorithm, FCBAISA finds a better optimal solution on 16 of the CEC2017 benchmark functions. On the CEC2022 benchmark functions, FCBAISA ranks first on 10 benchmark functions, second on 2, and first overall after the comprehensive comparison. From the mean comparison, FCBAISA performs better on 10 benchmark functions. It can be concluded that FCBAISA effectively solves both simple and complex problems and is better than the other algorithms especially on complex problems. In general, FCBAISA performs better in all aspects and can find the optimal solution quickly and efficiently on most benchmark functions.

Table 4.

Experimental results of FCBAISA and other algorithms in CEC2017 benchmark functions.

Function TSO AHO BOA DDAO PSO OSA GSA AISA FCBAISA
f 1 Mean 5.04E + 10 1.19E + 11 6.05E + 10 5.20E + 10 1.73E + 10 5.68E + 10 4.50E + 06 5.14E + 09 9.23E + 03
Std 7.51E + 09 1.44E + 10 3.99E + 09 4.89E + 09 8.06E + 09 6.01E + 09 1.65E + 07 3.25E + 09 2.35E + 04
Best 2.86E + 10 7.76E + 10 4.81E + 10 3.89E + 10 1.03E + 09 4.71E + 10 4.80E + 02 5.26E + 08 1.99E + 02
5 9 8 6 4 7 2 3 1

f 3 Mean 9.33E + 04 2.12E + 05 9.14E + 04 9.49E + 04 1.08E + 05 9.33E + 04 8.42E + 04 2.39E + 04 1.59E + 04
Std 1.04E + 03 3.66E + 04 4.26E + 03 1.50E + 04 3.20E + 04 1.43E + 03 2.65E + 03 8.67E + 03 1.06E + 04
Best 8.89E + 04 1.51E + 05 7.78E + 04 6.06E + 04 5.59E + 04 8.65E + 04 7.76E + 04 8.53E + 03 2.50E + 03
6 9 4 7 8 5 3 2 1

f 4 Mean 1.22E + 04 4.17E + 04 2.03E + 04 1.45E + 04 1.67E + 03 1.35E + 04 5.86E + 02 1.03E + 03 5.03E + 02
Std 2.83E + 03 6.73E + 03 6.22E + 02 2.37E + 03 1.06E + 03 1.63E + 03 3.67E + 01 3.67E + 02 2.62E + 01
Best 7.08E + 03 2.95E + 04 1.90E + 04 8.60E + 03 7.43E + 02 9.85E + 03 5.33E + 02 6.10E + 02 4.00E + 02
5 9 8 7 4 6 2 3 1

f 5 Mean 8.67E + 02 1.04E + 03 9.34E + 02 9.39E + 02 6.69E + 02 9.67E + 02 7.57E + 02 7.76E + 02 7.05E + 02
Std 4.03E + 01 4.02E + 01 1.79E + 01 2.49E + 01 4.02E + 01 1.79E + 01 1.16E + 01 3.37E + 01 3.27E + 01
Best 7.85E + 02 9.46E + 02 8.90E + 02 8.77E + 02 5.94E + 02 9.15E + 02 7.22E + 02 6.92E + 02 6.34E + 02
5 9 6 7 1 8 3 4 2

f 6 Mean 6.77E + 02 7.12E + 02 6.83E + 02 6.96E + 02 6.22E + 02 6.97E + 02 6.62E + 02 6.55E + 02 6.50E + 02
Std 7.44E + 00 8.62E + 00 7.87E + 00 6.57E + 00 6.98E + 00 6.82E + 00 2.63E + 00 9.24E + 00 8.74E + 00
Best 6.58E + 02 6.88E + 02 6.68E + 02 6.70E + 02 6.11E + 02 6.74E + 02 6.56E + 02 6.22E + 02 6.31E + 02
5 9 6 7 1 8 4 3 2

f 7 Mean 1.43E + 03 3.22E + 03 1.34E + 03 1.43E + 03 1.04E + 03 1.49E + 03 1.01E + 03 1.15E + 03 9.77E + 02
Std 4.65E + 01 2.45E + 02 2.15E + 01 4.20E + 01 1.54E + 02 2.93E + 01 4.50E + 01 6.49E + 01 4.67E + 01
Best 1.34E + 03 2.60E + 03 1.29E + 03 1.34E + 03 8.10E + 02 1.42E + 03 9.12E + 02 1.03E + 03 8.94E + 02
6 9 5 7 3 8 2 4 1

f 8 Mean 1.14E + 03 1.35E + 03 1.13E + 03 1.18E + 03 9.63E + 02 1.17E + 03 9.63E + 02 1.01E + 03 9.77E + 02
Std 2.40E + 01 3.35E + 01 1.65E + 01 1.54E + 01 3.93E + 01 1.80E + 01 1.098E + 01 2.55E + 01 3.32E + 01
Best 1.08E + 03 1.28E + 03 1.09E + 03 1.13E + 03 8.97E + 02 1.14E + 03 9.41E + 02 9.50E + 02 9.11E + 02
6 9 5 8 2 7 1 4 3

f 9 Mean 1.03E + 04 2.66E + 04 1.20E + 04 1.35E + 04 7.47E + 03 1.34E + 04 4.29E + 03 6.55E + 03 3.96E + 03
Std 1.31E + 03 2.66E + 03 7.66E + 02 1.63E + 03 2.69E + 03 1.48E + 03 3.25E + 02 1.46E + 03 1.76E + 03
Best 7.40E + 03 2.08E + 04 1.05E + 04 1.05E + 04 3.84E + 03 1.00E + 04 3.67E + 03 3.45E + 03 1.84E + 03
5 9 6 8 4 7 2 3 1

f 10 Mean 8.98E + 03 9.42E + 03 9.01E + 03 9.09E + 03 5.51E + 03 9.08E + 03 4.30E + 03 6.89E + 03 6.70E + 03
Std 7.05E + 02 2.94E + 02 2.88E + 02 2.96E + 02 7.47E + 02 4.92E + 02 2.87E + 02 5.29E + 02 7.84E + 02
Best 7.35E + 03 8.85E + 03 8.16E + 03 7.96E + 03 2.74E + 03 7.27E + 03 3.64E + 03 5.91E + 03 5.01E + 03
5 9 6 8 2 7 1 3 4

f 11 Mean 1.13E + 04 2.62E + 04 8.41E + 03 1.26E + 04 2.25E + 03 1.29E + 04 4.34E + 03 1.44E + 03 1.27E + 03
Std 2.60E + 03 7.03E + 03 6.61E + 02 2.83E + 03 1.02E + 03 2.27E + 03 1.00E + 03 1.08E + 02 6.46E + 01
Best 5.36E + 03 1.32E + 04 7.33E + 03 6.07E + 03 1.35E + 03 8.69E + 03 2.55E + 03 1.27E + 03 1.17E + 03
6 9 5 7 3 8 4 2 1

f 12 Mean 1.03E + 10 2.57E + 10 2.02E + 10 1.02E + 10 2.14E + 09 1.57E + 10 2.21E + 08 2.15E + 07 6.32E + 05
Std 3.66E + 09 5.25E + 09 1.75E + 09 1.83E + 09 1.35E + 09 1.90E + 09 1.73E + 08 4.30E + 07 6.09E + 05
Best 3.64E + 09 1.53E + 10 1.71E + 10 5.28E + 09 6.29E + 06 1.21E + 10 2.48E + 06 4.53E + 05 9.95E + 03
6 9 8 5 4 7 3 2 1

f 13 Mean 3.13E + 09 1.87E + 10 2.14E + 10 5.36E + 09 5.45E + 08 7.51E + 09 5.52E + 04 3.12E + 04 9.24E + 03
Std 3.00E + 09 6.86E + 09 5.91E + 09 2.06E + 09 8.21E + 08 2.73E + 09 1.13E + 04 3.05E + 04 6.37E + 03
Best 2.34E + 08 7.73E + 09 1.11E + 10 1.68E + 09 1.60E + 05 3.86E + 09 2.83E + 04 7.79E + 03 3.46E + 03
5 8 9 6 4 7 3 2 1

f 14 Mean 8.89E + 06 8.66E + 06 2.44E + 06 3.06E + 06 1.69E + 05 2.35E + 07 1.19E + 06 1.62E + 03 1.62E + 03
Std 2.80E + 06 5.63E + 06 1.58E + 06 1.64E + 06 1.57E + 05 1.17E + 07 2.04E + 05 7.11E + 01 5.87E + 01
Best 2.65E + 06 1.39E + 06 4.24E + 05 3.18E + 05 1.12E + 04 3.61E + 06 7.90E + 05 1.50E + 03 1.51E + 03
8 7 5 6 3 9 4 2 1

f 15 Mean 2.02E + 08 3.26E + 09 6.89E + 08 5.54E + 08 7.47E + 04 8.94E + 08 1.39E + 04 2.52E + 03 2.02E + 03
Std 2.42E + 08 1.43E + 09 1.98E + 08 2.05E + 08 5.55E + 04 2.83E + 08 2.16E + 03 7.65E + 02 1.79E + 02
Best 6.13E + 05 8.12E + 08 1.65E + 08 6.43E + 07 1.29E + 04 6.51E + 08 8.68E + 03 1.87E + 03 1.76E + 03
5 9 7 6 4 8 3 2 1

f 16 Mean 5.30E + 03 6.80E + 03 9.95E + 03 5.35E + 03 2.93E + 03 6.32E + 03 3.68E + 03 3.25E + 03 2.98E + 03
Std 6.47E + 02 9.04E + 02 4.55E + 02 3.90E + 02 4.42E + 02 7.14E + 02 2.10E + 02 3.68E + 02 2.52E + 02
Best 3.55E + 03 5.13E + 03 9.23E + 03 4.24E + 03 2.02E + 03 5.23E + 03 3.26E + 03 2.50E + 03 2.30E + 03
5 8 9 6 1 7 4 3 2

f 17 Mean 4.41E + 03 8.30E + 03 1.79E + 04 3.68E + 03 2.40E + 03 2.84E + 04 2.87E + 03 2.15E + 03 2.13E + 03
Std 4.01E + 03 6.47E + 03 2.23E + 03 2.50E + 02 2.58E + 02 1.63E + 04 1.99E + 02 1.84E + 02 1.31E + 02
Best 2.38E + 03 3.13E + 03 1.40E + 04 3.21E + 03 2.03E + 03 5.49E + 03 2.52E + 03 1.82E + 03 1.86E + 03
6 7 8 5 3 9 4 2 1

f 18 Mean 1.53E + 08 1.71E + 08 9.62E + 07 3.88E + 07 5.07E + 06 2.76E + 08 1.76E + 06 2.72E + 03 3.77E + 03
Std 6.49E + 07 1.28E + 08 3.16E + 07 2.03E + 07 3.53E + 06 1.39E + 08 4.33E + 05 9.95E + 02 5.78E + 03
Best 7.80E + 06 7.75E + 06 2.23E + 07 6.27E + 06 1.80E + 05 1.17E + 07 8.06E + 05 2.06E + 03 2.23E + 03
7 8 6 5 4 9 3 1 2

f 19 Mean 4.82E + 08 4.34E + 09 1.47E + 09 6.72E + 08 1.19E + 07 8.44E + 08 2.54E + 04 2.51E + 03 2.04E + 03
Std 3.82E + 08 2.23E + 09 4.66E + 08 2.41E + 08 3.16E + 07 5.06E + 08 1.01E + 04 2.13E + 03 5.89E + 01
Best 6.66E + 06 5.38E + 08 2.09E + 08 2.92E + 08 2.16E + 03 2.14E + 08 1.20E + 04 1.97E + 03 1.96E + 03
5 9 8 6 4 7 3 2 1

f 20 Mean 3.02E + 03 3.29E + 03 3.11E + 03 3.11E + 03 2.52E + 03 3.28E + 03 3.43E + 03 2.53E + 03 2.49E + 03
Std 2.13E + 02 1.57E + 02 1.16E + 02 1.22E + 02 1.78E + 02 1.79E + 02 1.43E + 02 1.17E + 02 8.70E + 01
Best 2.45E + 03 2.92E + 03 2.65E + 03 2.67E + 03 2.14E + 03 2.90E + 03 3.11E + 03 2.26E + 03 2.30E + 03
4 8 5 6 3 7 9 2 1

f 21 Mean 2.71E + 03 2.87E + 03 2.75E + 03 2.74E + 03 2.45E + 03 2.78E + 03 2.61E + 03 2.51E + 03 2.49E + 03
Std 3.46E + 01 4.51E + 01 1.63E + 01 2.79E + 01 3.88E + 01 3.94E + 01 2.16E + 01 4.08E + 01 3.35E + 01
Best 2.65E + 03 2.76E + 03 2.71E + 03 2.67E + 03 2.39E + 03 2.68E + 03 2.57E + 03 2.44E + 03 2.43E + 03
5 9 7 6 1 8 4 3 2

f 22 Mean 8.22E + 03 1.06E + 04 6.38E + 03 8.87E + 03 6.87E + 03 1.02E + 04 7.35E + 03 3.37E + 03 2.30E + 03
Std 8.42E + 02 6.88E + 02 3.04E + 02 7.69E + 02 8.79E + 02 4.47E + 02 2.65E + 02 6.38E + 02 3.51E + 00
Best 6.50E + 03 8.60E + 03 5.64E + 03 7.00E + 03 5.06E + 03 9.06E + 03 6.86E + 03 2.67E + 03 2.30E + 03
6 9 3 7 4 8 5 2 1

f 23 Mean 3.50E + 03 3.62E + 03 3.60E + 03 3.40E + 03 2.97E + 03 3.70E + 03 3.77E + 03 2.97E + 03 2.96E + 03
Std 8.07E + 01 1.06E + 02 5.72E + 01 8.23E + 01 6.53E + 01 1.40E + 02 1.38E + 02 6.95E + 01 6.43E + 01
Best 3.36E + 03 3.35E + 03 3.50E + 03 3.17E + 03 2.85E + 03 3.39E + 03 3.54E + 03 2.84E + 03 2.83E + 03
5 7 6 4 3 8 9 2 1

f 24 Mean 4.36E + 03 4.03E + 03 4.28E + 03 3.59E + 03 3.18E + 03 3.97E + 03 3.38E + 03 3.16E + 03 3.12E + 03
Std 2.56E + 02 2.48E + 02 6.10E + 01 9.75E + 01 5.83E + 01 8.84E + 01 6.03E + 01 7.98E + 01 9.18E + 01
Best 3.75E + 03 3.46E + 03 4.12E + 03 3.42E + 03 3.01E + 03 3.76E + 03 3.24E + 03 3.01E + 03 2.99E + 03
9 7 8 5 3 6 4 2 1

f 25 Mean 4.48E + 03 1.64E + 04 6.44E + 03 5.31E + 03 3.49E + 03 4.96E + 03 2.98E + 03 3.19E + 03 2.90E + 03
Std 3.21E + 02 2.74E + 03 1.35E + 02 3.66E + 02 4.81E + 02 3.89E + 02 1.24E + 01 1.26E + 02 1.63E + 01
Best 3.89E + 03 1.14E + 04 6.13E + 03 4.25E + 03 2.94E + 03 4.13E + 03 2.95E + 03 3.02E + 03 2.88E + 03
5 9 8 7 4 6 2 3 1

f 26 Mean 1.18E + 04 1.52E + 04 1.04E + 04 1.09E + 04 5.44E + 03 1.23E + 04 7.81E + 03 6.83E + 03 5.26E + 03
Std 1.10E + 03 1.78E + 03 1.55E + 02 6.27E + 02 9.26E + 02 8.66E + 02 3.71E + 02 8.40E + 02 1.25E + 03
Best 9.62E + 03 1.06E + 04 9.75E + 03 8.93E + 03 4.16E + 03 1.06E + 04 7.15E + 03 4.22E + 03 2.81E + 03
7 9 5 6 2 8 4 3 1

f 27 Mean 4.23E + 03 4.66E + 03 4.12E + 03 4.08E + 03 3.31E + 03 4.81E + 03 5.02E + 03 3.28E + 03 3.26E + 03
Std 1.55E + 02 3.19E + 02 1.63E + 02 1.62E + 02 4.57E + 01 6.22E + 02 1.74E + 02 4.52E + 01 4.85E + 01
Best 3.86E + 03 3.87E + 03 3.83E + 03 3.71E + 03 3.24E + 03 4.10E + 03 4.45E + 03 3.21E + 03 3.19E + 03
6 7 5 4 3 8 9 2 1

f 28 Mean 6.64E + 03 1.10E + 04 8.53E + 03 7.08E + 03 5.98E + 03 6.87E + 03 3.51E + 03 3.78E + 03 3.23E + 03
Std 7.57E + 02 1.33E + 03 9.24E + 01 4.71E + 02 1.19E + 03 5.67E + 02 1.49E + 02 2.27E + 02 2.62E + 01
Best 5.00E + 03 7.96E + 03 8.30E + 03 6.06E + 03 3.51E + 03 5.72E + 03 3.36E + 03 3.42E + 03 3.20E + 03
5 9 8 7 4 6 2 3 1

f 29 Mean 5.91E + 03 8.88E + 03 1.81E + 04 6.51E + 03 4.14E + 03 8.08E + 03 4.93E + 03 4.26E + 03 4.05E + 03
Std 5.43E + 02 3.35E + 03 2.44E + 03 4.38E + 02 2.95E + 02 1.28E + 03 2.35E + 02 2.60E + 02 2.17E + 02
Best 4.87E + 03 5.72E + 03 1.24E + 04 5.48E + 03 3.68E + 03 6.22E + 03 4.50E + 03 3.59E + 03 3.66E + 03
5 8 9 6 2 7 4 3 1

f 30 Mean 1.85E + 08 1.28E + 09 3.01E + 09 6.99E + 08 1.28E + 07 2.75E + 09 9.00E + 05 2.48E + 05 3.07E + 04
Std 9.84E + 07 3.65E + 08 1.46E + 09 2.42E + 08 1.50E + 07 9.75E + 07 4.35E + 05 1.30E + 06 2.07E + 04
Best 5.82E + 07 3.33E + 08 8.00E + 08 1.93E + 08 2.94E + 04 2.29E + 09 1.26E + 05 9.35E + 03 7.49E + 03
5 7 9 6 4 8 3 2 1

Total rank 163 244 192 181 92 214 106 74 39
Final rank 5 9 7 6 3 8 4 2 1

Table 5.

Experimental results of FCBAISA and other algorithms in CEC2022 benchmark functions.

Function TSO AHO BOA DDAO20 PSO OSA GSA AISA FCBAISA
f 1 Mean 5.32E + 04 5.95E + 04 5.11E + 04 4.29E + 04 1.81E + 04 9.45E + 04 2.42E + 04 5.59E + 02 4.85E + 02
Std 3.58E + 04 9.11E + 03 1.66E + 04 7.39E + 03 1.56E + 04 3.95E + 04 4.60E + 03 2.83E + 02 2.23E + 02
Best 2.00E + 04 3.48E + 04 2.48E + 04 2.57E + 04 3.09E + 02 3.11E + 04 1.59E + 04 3.21E + 02 3.01E + 02
7 8 6 5 3 9 4 2 1

f 2 Mean 1.90E + 03 5.48E + 03 3.66E + 03 2.02E + 03 5.86E + 02 3.12E + 03 4.68E + 02 5.57E + 02 4.47E + 02
Std 5.89E + 02 1.64E + 03 7.47E + 02 2.44E + 02 1.14E + 02 7.16E + 02 1.96E + 01 6.14E + 01 1.02E + 01
Best 9.08E + 02 2.10E + 03 2.33E + 03 1.30E + 03 4.46E + 02 2.05E + 03 4.01E + 02 4.69E + 02 4.07E + 02
5 9 8 6 4 7 2 3 1

f 3 Mean 6.73E + 02 6.92E + 02 6.67E + 02 6.75E + 02 6.15E + 02 6.87E + 02 6.36E + 02 6.34E + 02 6.04E + 02
Std 1.15E + 01 1.04E + 01 1.13E + 01 5.62E + 00 5.96E + 00 9.85E + 00 9.58E + 00 1.08E + 01 1.64E + 00
Best 6.50E + 02 6.69E + 02 6.34E + 02 6.63E + 02 6.04E + 02 6.65E + 02 6.07E + 02 6.12E + 02 6.01E + 02
6 9 5 7 2 8 4 3 1

f 4 Mean 9.58E + 02 1.05E + 03 9.68E + 02 9.83E + 02 8.73E + 02 9.80E + 02 8.75E + 02 8.75E + 02 8.51E + 02
Std 1.74E + 01 1.82E + 01 1.02E + 01 1.07E + 01 2.31E + 01 1.56E + 01 1.06E + 01 1.40E + 01 1.47E + 01
Best 9.22E + 02 1.01E + 03 9.44E + 02 9.40E + 02 8.36E + 02 9.38E + 02 8.51E + 02 8.37E + 02 8.22E + 02
5 9 6 8 2 7 4 3 1

f 5 Mean 3.25E + 03 8.40E + 03 3.24E + 03 3.96E + 03 1.75E + 03 3.81E + 03 9.64E + 02 1.82E + 03 9.79E + 02
Std 3.95E + 02 1.15E + 03 3.46E + 02 4.39E + 02 6.04E + 02 4.03E + 02 1.16E + 02 4.29E + 02 5.29E + 01
Best 2.43E + 03 6.38E + 03 2.25E + 03 2.95E + 03 9.02E + 02 2.97E + 03 9.00E + 02 1.11E + 03 9.08E + 02
6 9 5 8 3 7 1 4 2

f 6 Mean 1.16E + 09 3.51E + 09 2.62E + 09 1.08E + 09 1.79E + 07 3.40E + 09 3.06E + 03 1.96E + 03 1.89E + 03
Std 1.08E + 09 1.30E + 09 1.29E + 09 3.55E + 08 1.91E + 07 1.14E + 09 1.22E + 03 6.52E + 01 4.50E + 01
Best 8.62E + 06 6.99E + 08 1.91E + 08 3.33E + 08 2.13E + 03 1.33E + 09 1.95E + 03 1.87E + 03 1.82E + 03
6 9 7 5 4 8 3 2 1

f 7 Mean 2.20E + 03 2.26E + 03 2.17E + 03 2.19E + 03 2.08E + 03 2.25E + 03 2.36E + 03 2.08E + 03 2.05E + 03
Std 3.63E + 01 4.42E + 01 2.47E + 01 2.54E + 01 3.96E + 01 5.84E + 01 6.65E + 01 2.62E + 01 1.34E + 01
Best 2.11E + 03 2.17E + 03 2.12E + 03 2.13E + 03 2.03E + 03 2.17E + 03 2.20E + 03 2.03E + 03 2.03E + 03
6 8 4 5 3 7 9 2 1

f 8 Mean 2.30E + 03 2.69E + 03 5.42E + 03 2.41E + 03 2.27E + 03 2.34E + 03 2.51E + 03 2.23E + 03 2.23E + 03
Std 9.10E + 01 2.81E + 02 6.76E + 03 8.65E + 01 6.12E + 01 1.13E + 02 1.06E + 02 3.28E + 00 2.21E + 00
Best 2.23E + 03 2.26E + 03 2.33E + 03 2.27E + 03 2.22E + 03 2.24E + 03 2.23E + 03 2.22E + 03 2.23E + 03
4 8 9 6 3 5 7 2 1

f 9 Mean 2.96E + 03 3.23E + 03 3.99E + 03 2.87E + 03 2.59E + 03 3.73E + 03 2.51E + 03 2.49E + 03 2.48E + 03
Std 1.99E + 02 1.85E + 02 5.33E + 02 8.58E + 01 9.58E + 01 3.09E + 02 1.60E + 01 1.03E + 01 2.14E-01
Best 2.65E + 03 2.88E + 03 3.05E + 03 2.70E + 03 2.49E + 03 3.10E + 03 2.49E + 03 2.48E + 03 2.48E + 03
6 7 9 5 4 8 3 2 1

f 10 Mean 5.82E + 03 6.21E + 03 3.23E + 03 2.80E + 03 4.08E + 03 6.56E + 03 4.75E + 03 2.90E + 03 2.66E + 03
Std 1.29E + 03 1.36E + 03 1.44E + 03 2.07E + 02 9.76E + 02 7.91E + 02 6.23E + 02 5.75E + 02 5.58E + 02
Best 2.58E + 03 2.69E + 03 2.52E + 03 2.55E + 03 2.52E + 03 3.29E + 03 2.50E + 03 2.50E + 03 2.50E + 03
7 8 4 2 5 9 6 3 1

f 11 Mean 7.73E + 03 1.08E + 04 9.05E + 03 7.14E + 03 5.05E + 03 9.04E + 03 2.91E + 03 3.84E + 03 3.01E + 03
Std 1.13E + 03 1.43E + 03 5.81E + 02 7.14E + 02 1.01E + 03 5.09E + 02 1.05E + 02 4.94E + 02 1.05E + 02
Best 4.83E + 03 7.04E + 03 7.13E + 03 5.03E + 03 3.34E + 03 7.87E + 03 2.60E + 03 3.10E + 03 2.82E + 03
6 9 8 5 4 7 1 3 2

f 12 Mean 3.56E + 03 3.64E + 03 3.30E + 03 3.41E + 03 3.03E + 03 4.36E + 03 3.71E + 03 3.01E + 03 2.94E + 03
Std 3.28E + 02 1.70E + 02 1.09E + 02 7.92E + 01 6.48E + 01 3.84E + 02 2.46E + 02 5.77E + 01 3.73E + 00
Best 3.06E + 03 3.25E + 03 3.09E + 03 3.22E + 03 2.95E + 03 3.55E + 03 3.15E + 03 2.95E + 03 2.93E + 03
6 7 4 5 3 9 8 2 1

Total rank 70 100 75 67 40 91 52 31 14
Final rank 6 9 7 5 3 8 4 2 1

4.3. Convergence Rate

In the CEC2017 benchmark functions, f1 and f3 are unimodal functions, on which FCBAISA converges very quickly. f9 is a simple multimodal function: at the beginning of the iteration, the convergence curve of GSA falls faster than that of FCBAISA, but in the later stage FCBAISA overtakes GSA, indicating the strong exploitation ability of the improved algorithm. f12, f13, and f19 are hybrid functions, on which the fast convergence of FCBAISA gives it a clear advantage. f22, f28, and f30 are composite functions; FCBAISA converges fastest on all three, indicating strong performance on complex problems. The convergence curves are shown in Figure 5. On the CEC2022 benchmark functions, FCBAISA also performs better on most functions; details are given in Figure 6.

Figure 5. Convergence curves of 9 algorithms in CEC2017 benchmark functions. (a) f1. (b) f3. (c) f9. (d) f12. (e) f13. (f) f19. (g) f22. (h) f28. (i) f30.

Figure 6. Convergence curves of 9 algorithms in CEC2022 benchmark functions. (a) f1. (b) f2. (c) f3. (d) f4. (e) f5. (f) f6. (g) f7. (h) f8. (i) f9. (j) f10. (k) f11. (l) f12.

4.4. Statistical Analysis

To further assess FCBAISA against the above experimental results, statistical analysis is carried out using the Wilcoxon rank test, the Friedman test, and the Quade test. The Wilcoxon rank test compares the performance of FCBAISA with each algorithm one by one. The Friedman and Quade tests evaluate all algorithms together, compare their performance from an overall point of view, and give a ranking and a p_value. Through these tests, the performance of the improved algorithm can be examined thoroughly.

For the Wilcoxon rank test, the significance level is 0.05. When p_value ≤ 0.05 and R+ < R−, the result is marked "+," meaning that FCBAISA is significantly better than the compared algorithm; when p_value ≤ 0.05 and R+ > R−, it is marked "−," meaning that FCBAISA performs worse. If there is no significant difference between FCBAISA and the compared algorithm, the result is marked "=." In Tables 6 and 7, the last row gives the count of each tag, from which it can be judged whether FCBAISA has a significant advantage over each comparison algorithm. FCBAISA obtains "+" on at least 25 benchmark functions against every algorithm except AISA (22 "+") and PSO (19 "+"), and at most four "−" against any single algorithm. To sum up, these tests show that the performance of FCBAISA is better than that of the comparison algorithms.
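The "+"/"−"/"=" labels above are derived from the signed-rank sums R+ and R− reported in Tables 6 and 7. As a minimal sketch of how those sums are computed for one pair of algorithms (plain Python with averaged tied ranks; the sample data below are hypothetical, not results from the paper):

```python
def signed_rank_sums(a, b):
    """Wilcoxon signed-rank sums R+ and R- for paired samples a, b.

    R+ sums the ranks of pairs where a > b, R- where a < b.
    Tied pairs (a == b) are dropped, following the standard procedure;
    tied magnitudes |d| share their average rank.
    """
    diffs = [x - y for x, y in zip(a, b) if x != y]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of equal magnitudes
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j + 2) / 2.0  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    r_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    r_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return r_plus, r_minus
```

With per-run errors of two algorithms as `a` and `b` (lower is better), a small R+ relative to R− means the first algorithm is rarely worse, matching the "+" criterion above.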

Table 6.

The results of Wilcoxon rank test for the benchmark functions of CEC2017.

Function FCBAISA vs AISA FCBAISA vs PSO FCBAISA vs BOA FCBAISA vs DDAO
p_value R+ R− +/=/− p_value R+ R− +/=/− p_value R+ R− +/=/− p_value R+ R− +/=/−
f 1 7.56E-10 0 1257 + 7.56E-10 0 1275 + 7.55E-10 0 1230 + 7.56E-10 0 1275 +
f 3 6.42E-04 284 766 + 7.55E-10 0 1274 + 7.56E-10 0 1275 + 7.55E-10 0 1274 +
f 4 7.50E-10 0 1200 + 7.56E-10 0 1173 + 7.55E-10 0 1184 + 7.55E-10 0 1275 +
f 5 1.85E-09 15 1095 + 7.71E-05 1047 180 7.55E-10 0 1260 + 7.56E-10 0 1275 +
f 6 4.30E-03 342 758 + 7.55E-10 1253 0 8.03E-10 1 1258 + 7.56E-10 0 1273 +
f 7 7.50E-10 0 1230 + 9.80E-03 370 905 + 7.55E-10 0 1258 + 7.55E-10 0 1272 +
f 8 2.88E-07 80 1169 + 7.18E-02 777 430 = 7.55E-10 0 1268 + 7.54E-10 0 1270 +
f 9 3.43E-08 66 1014 + 1.07E-07 87 1142 + 7.55E-10 0 1186 + 7.53E-10 0 1271 +
f 10 4.37E-01 618 557 = 4.51E-09 1182 26 7.54E-10 0 1240 + 8.03E-10 1 1274 +
f 11 4.76E-09 31 1158 + 7.55E-10 0 1234 + 7.55E-10 0 1233 + 7.55E-10 0 1270 +
f 12 8.01E-10 1 1162 + 7.55E-10 0 1258 + 7.55E-10 0 1268 + 7.56E-10 0 1272 +
f 13 7.77E-08 81 1054 + 7.55E-10 0 1275 + 7.55E-10 0 1235 + 7.55E-10 0 1271 +
f 14 9.96E-01 637 582 = 7.55E-10 0 1230 + 7.55E-10 0 1212 + 7.56E-10 0 1270 +
f 15 8.35E-06 96 1099 + 7.56E-10 0 1275 + 7.55E-10 0 1226 + 7.56E-10 0 1270 +
f 16 1.02E-04 235 835 + 3.88E-01 706 509 = 7.55E-10 0 1194 + 7.56E-10 0 1270 +
f 17 1.54E-01 460 785 = 6.99E-08 79 1145 + 7.55E-10 0 1217 + 7.55E-10 0 1270 +
f 18 6.00E-03 860 353 7.56E-10 0 1240 + 7.55E-10 0 1173 + 7.55E-10 0 1270 +
f 19 9.78E-05 234 836 + 7.55E-10 0 1232 + 7.55E-10 0 1250 + 7.55E-10 0 1270 +
f 20 3.57E-01 530 733 = 2.04E-01 506 769 = 7.56E-10 0 1275 + 7.55E-10 0 1270 +
f 21 4.30E-03 297 894 + 4.53E-05 1060 215 7.54E-10 0 1179 + 7.55E-10 0 1270 +
f 22 7.54E-10 0 1227 + 7.56E-10 0 1226 + 7.56E-10 0 1226 + 7.56E-10 0 1270 +
f 23 9.50E-01 631 609 = 3.82E-01 547 716 = 7.55E-10 0 1225 + 7.56E-10 0 1275 +
f 24 2.67E-02 213 867 + 1.63E-04 170 1028 + 7.55E-10 0 1260 + 7.56E-10 0 1273 +
f 25 7.50E-10 0 1225 + 7.55E-10 0 1228 + 7.55E-10 0 1214 + 7.56E-10 0 1274 +
f 26 5.91E-08 76 1034 + 9.96E-01 553 638 = 7.56E-10 0 1275 + 7.56E-10 0 1271 +
f 27 1.36E-01 483 703 = 6.44E-04 284 981 + 7.56E-10 0 1233 + 7.56E-10 0 1273 +
f 28 7.52E-10 0 1209 + 7.56E-10 0 1275 + 7.55E-10 0 1182 + 7.56E-10 0 1268 +
f 29 2.20E-05 198 1011 + 7.81E-02 455 770 = 7.54E-10 0 1101 + 7.56E-10 0 1275 +
f 30 2.50E-03 324 796 + 7.55E-10 0 1192 + 7.56E-10 0 1245 + 7.56E-10 0 1275 +
Total 22/6/1 19/6/4 29/0/0 29/0/0

Table 7.

The results of Wilcoxon rank test for the benchmark functions of CEC2017.

Function FCBAISA vs AHO FCBAISA vs OSA FCBAISA vs TSO FCBAISA vs GSA
p_value R+ R− +/=/− p_value R+ R− +/=/− p_value R+ R− +/=/− p_value R+ R− +/=/−
f 1 7.55E-10 0 1270 + 7.55E-10 0 1221 + 7.56E-10 0 1270 + 3.82E-01 547 728 =
f 3 7.52E-10 0 1232 + 7.55E-10 0 1180 + 7.56E-10 0 1270 + 7.55E-10 0 1178 +
f 4 7.50E-10 0 1270 + 7.54E-10 0 1161 + 7.55E-10 0 1270 + 8.01E-10 1 1172 +
f 5 7.53E-10 0 1270 + 7.55E-10 0 1168 + 7.55E-10 0 1270 + 1.30E-09 9 1188 +
f 6 7.55E-10 0 1270 + 7.55E-10 0 1228 + 8.03E-10 1 1274 + 1.26E-08 48 1162 +
f 7 7.52E-10 3 1268 + 7.54E-10 0 1230 + 7.54E-10 5 1162 + 1.50E-03 267 966 +
f 8 7.53E-10 7 1270 + 7.55E-10 0 1267 + 7.55E-10 0 1275 + 9.30E-03 760 368
f 9 7.48E-10 0 1275 + 7.55E-10 0 1132 + 7.46E-10 0 1252 + 5.41E-02 438 792 =
f 10 7.32E-10 0 1275 + 7.55E-10 0 1204 + 9.07E-10 3 1272 + 7.55E-10 1222 0
f 11 7.43E-10 9 1275 + 7.54E-10 0 1143 + 7.49E-10 0 1260 + 7.38E-10 0 1152 +
f 12 7.81E-10 0 1237 + 7.55E-10 0 1196 + 6.58E-10 0 1255 + 7.39E-10 0 1128 +
f 13 7.42E-10 2 1270 + 7.55E-10 0 1197 + 7.51E-10 0 1267 + 7.53E-10 0 1209 +
f 14 7.49E-10 0 1271 + 7.55E-10 0 1226 + 7.53E-10 0 1273 + 7.51E-10 0 1194 +
f 15 7.45E-10 0 1238 + 7.55E-10 0 1203 + 7.54E-10 0 1245 + 7.55E-10 0 1266 +
f 16 7.48E-10 0 1235 + 7.55E-10 0 1243 + 7.48E-10 0 1236 + 7.57E-10 0 1161 +
f 17 7.37E-10 0 1272 + 7.55E-10 0 1156 + 7.49E-10 0 1269 + 7.50E-10 0 1266 +
f 18 7.51E-10 0 1269 + 7.54E-10 0 1053 + 7.48E-10 0 1270 + 7.49E-10 0 1170 +
f 19 7.47E-10 0 1272 + 7.52E-10 0 1217 + 7.52E-10 0 1270 + 7.53E-10 0 1227 +
f 20 7.52E-10 0 1262 + 7.55E-10 0 1180 + 8.03E-10 1 1274 + 7.55E-10 0 1238 +
f 21 7.43E-10 0 1254 + 7.55E-10 0 1145 + 7.46E-10 0 1267 + 7.54E-10 0 1167 +
f 22 7.49E-10 0 1270 + 7.55E-10 0 1204 + 7.43E-10 0 1252 + 7.55E-10 0 1206 +
f 23 7.39E-10 0 1273 + 7.54E-10 0 1173 + 7.48E-10 0 1243 + 7.54E-10 0 1131 +
f 24 7.51E-10 0 1270 + 7.55E-10 0 1268 + 7.50E-10 0 1241 + 1.38E-09 4 1265 +
f 25 7.52E-10 0 1265 + 7.55E-10 0 1225 + 7.39E-10 4 1270 + 7.55E-10 0 1216 +
f 26 7.44E-10 0 1267 + 7.55E-10 0 1190 + 7.48E-10 0 1269 + 7.55E-10 0 1202 +
f 27 7.50E-10 0 1267 + 7.55E-10 0 1210 + 7.52E-10 0 1263 + 7.54E-10 0 1167 +
f 28 7.41E-10 0 1251 + 7.52E-10 0 1133 + 7.53E-10 0 1268 + 7.54E-10 0 1173 +
f 29 7.47E-10 0 1235 + 7.52E-10 0 1239 + 7.51E-10 1 1270 + 8.01E-10 1 1193 +
f 30 7.52E-10 3 1272 + 7.54E-10 0 1122 + 7.52E-10 2 1273 + 7.54E-10 0 1140 +
Total 29/0/0 29/0/0 29/0/0 25/2/2

For the Friedman and Quade tests, the significance level is 0.05; a p_value ≤ 0.05 indicates that the result is statistically significant. The results of the two tests, shown in Tables 8 and 9, both rank FCBAISA first. In the Friedman test (Table 8), FCBAISA obtains a score of 8.6552 with a p_value of 1.2942E-10. In the Quade test (Table 9), FCBAISA again ranks first, with a score of 8.8229 and a p_value of 2.2054E-43. In summary, the three statistical tests show that FCBAISA has significant advantages over the eight comparison algorithms, including the improved algorithms and the other excellent algorithms.
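The scores in Tables 8 and 9 are average ranks across the benchmark functions. A minimal sketch of the Friedman-style mean-rank computation (assuming no exactly tied errors within a row; the example matrix is hypothetical):

```python
def friedman_mean_ranks(results):
    """Average rank of each algorithm across problems.

    results[p][a] is algorithm a's error on problem p (lower is better).
    Rank 1 goes to the worst algorithm and rank n to the best, matching
    the convention of Tables 8-9 where a HIGHER score is a better algorithm.
    """
    n_alg = len(results[0])
    totals = [0.0] * n_alg
    for row in results:
        # sort descending: largest error gets rank 1, smallest gets rank n
        order = sorted(range(n_alg), key=lambda a: -row[a])
        for rank, a in enumerate(order, start=1):
            totals[a] += rank
    return [t / len(results) for t in totals]
```

The Friedman and Quade test statistics are then computed from these ranks (the Quade test additionally weights problems by their range of errors).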

Table 8.

The results of Friedman test for the benchmark functions of CEC2017.

Name Score Rank
AHO 1.5517 9
OSA 2.6724 8
BOA 3.3621 7
DDAO 3.7759 6
TSO 4.3620 5
GSA 6.3793 4
PSO 6.8448 3
AISA 7.3965 2
FCBAISA 8.6552 1
p_value 1.2942E-10

Table 9.

The results of Quade test for the benchmark functions of CEC2017.

Name Score Rank
AHO 1.5264 9
OSA 2.7701 8
BOA 3.0414 7
DDAO 3.8103 6
TSO 4.3229 5
PSO 6.3506 4
GSA 6.7701 3
AISA 7.5851 2
FCBAISA 8.8229 1
p_value 2.2054E-43

5. Practical Engineering Problems

This section uses FCBAISA and all comparison algorithms to solve several engineering design problems; the superiority of FCBAISA is further verified by analyzing the experimental results. The engineering design problems include pressure vessel design [51], welded beam design [52], gear train design [53, 54], and speed reducer design, detailed as follows.

5.1. The Problem of Pressure Vessel Design

A pressure vessel is designed to minimize its cost. The vessel, shown in Figure 7, consists of a cylindrical center section capped by hemispherical heads at both ends, where L (x4), Ts (x1), Th (x2), and R (x3) are the length of the cylindrical part, the thickness of the shell, the thickness of the head, and the inner radius, respectively. The problem has four constraints, three linear inequalities and one nonlinear inequality, and its model is given in the following equation.

$$
\begin{aligned}
\min\ f(\mathbf{x}) ={}& 0.6224x_1x_3x_4 + 1.7781x_2x_3^2 + 3.1611x_1^2x_4 + 19.84x_1^2x_3,\\
\text{s.t.}\quad y_1(\mathbf{x}) ={}& -x_1 + 0.0193x_3 \le 0,\\
y_2(\mathbf{x}) ={}& -x_2 + 0.00954x_3 \le 0,\\
y_3(\mathbf{x}) ={}& -\pi x_3^2x_4 - \tfrac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0,\\
y_4(\mathbf{x}) ={}& x_4 - 240 \le 0,\\
& 0.1 \le x_1, x_2 \le 99,\quad 10 \le x_3, x_4 \le 200.
\end{aligned}
\tag{27}
$$
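As a sanity check, the objective and constraints in (27) can be evaluated directly. The sketch below uses a simple static penalty, which is an illustrative constraint-handling choice rather than the method used in the paper; the variables are ordered x = [Ts, Th, R, L] as in (27):

```python
import math

def pressure_vessel(x, penalty=1e6):
    """Penalized cost for the pressure vessel problem of equation (27).

    x = [Ts, Th, R, L]. Each violated constraint adds `penalty` times its
    violation magnitude to the cost (penalty weight is an arbitrary choice).
    """
    x1, x2, x3, x4 = x
    cost = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1611 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = [
        -x1 + 0.0193 * x3,                                            # shell thickness
        -x2 + 0.00954 * x3,                                           # head thickness
        -math.pi * x3**2 * x4 - (4 / 3) * math.pi * x3**3 + 1_296_000,  # volume
        x4 - 240,                                                     # length limit
    ]
    return cost + penalty * sum(max(0.0, gi) for gi in g)
```

A feasible point returns the raw cost; an infeasible one is pushed far above any feasible cost, which is what drives the search back toward the feasible region.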

Figure 7. Model design.

Table 10 shows that the best solution to the pressure vessel design problem, 6.06E+03, is obtained by FCBAISA, which outperforms all the other algorithms. The convergence curves of the algorithms considered in this paper on the pressure vessel problem are shown in Figure 8.

Table 10.

Experimental results of pressure vessel design problem.

TSO AHO BOA DDAO PSO OSA GSA AISA FCBAISA
x 1 0.9129 0.8948 1.139 0.8572 0.7851 0.7859 0.9297 0.7866 0.7824
x 2 0.4688 0.4708 0.5321 0.481 0.4064 0.4724 0.4822 0.4063 0.4076
x 3 46.6098 40.6415 53.8072 42.2844 40.3196 40.9759 47.5163 42.0984 42.0909
x 4 11.4509 12.8559 10.0000 10.0000 10.0000 45.7413 56.4759 62.7144 140.3024
Min 9.98E + 03 1.071E + 04 1.05E + 04 9.42E + 03 6.45E + 03 9.39E + 03 1.07E + 04 6.14E + 03 6.06E + 03

Figure 8. Convergence curves of 9 algorithms in pressure vessel design.

5.2. The Problem of Welded Beam Design

In this design problem, the main factors restricting the design cost of the welded beam are the shear stress (τ), the bending stress in the beam (σ), the buckling load on the bar (Pc), the end deflection of the beam (δ), and side constraints. The design variables are h (x1), l (x2), t (x3), and b (x4), as shown in Figure 9. The problem has seven constraints, two linear and five nonlinear inequalities; its model is given in equation (28), where x = [x1, x2, x3, x4] = [h, l, t, b].

$$
\begin{aligned}
\min\ f(\mathbf{x}) ={}& 1.10471x_1^2x_2 + 0.04811x_3x_4(14 + x_2),\\
\text{s.t.}\quad g_1(\mathbf{x}) ={}& \tau(\mathbf{x}) - \tau_{\max} \le 0,\\
g_2(\mathbf{x}) ={}& \sigma(\mathbf{x}) - \sigma_{\max} \le 0,\\
g_3(\mathbf{x}) ={}& \delta(\mathbf{x}) - \delta_{\max} \le 0,\\
g_4(\mathbf{x}) ={}& x_1 - x_4 \le 0,\\
g_5(\mathbf{x}) ={}& P - P_c(\mathbf{x}) \le 0,\\
g_6(\mathbf{x}) ={}& 0.125 - x_1 \le 0,\\
g_7(\mathbf{x}) ={}& 0.10471x_1^2 + 0.04811x_3x_4(14 + x_2) - 5 \le 0,\\
& 0.1 \le x_1, x_4 \le 2,\quad 0.1 \le x_2, x_3 \le 10,
\end{aligned}
\tag{28}
$$

where $\tau(\mathbf{x}) = \sqrt{(\tau')^2 + 2\tau'\tau''\,x_2/(2R) + (\tau'')^2}$, $\tau' = P/(\sqrt{2}x_1x_2)$, $\tau'' = MR/J$, $M = P(L + x_2/2)$, $R = \sqrt{x_2^2/4 + ((x_1 + x_3)/2)^2}$, $J = 2\{\sqrt{2}x_1x_2[x_2^2/4 + ((x_1 + x_3)/2)^2]\}$, $\sigma(\mathbf{x}) = 6PL/(x_4x_3^2)$, $\delta(\mathbf{x}) = 4PL^3/(Ex_4x_3^3)$, and $P_c(\mathbf{x}) = \bigl(4.013E\sqrt{x_3^2x_4^6/36}/L^2\bigr)\bigl(1 - (x_3/2L)\sqrt{E/(4G)}\bigr)$, with P = 6000 lb, L = 14 in, E = 30 × 10^6 psi, G = 12 × 10^6 psi, τmax = 13600 psi, σmax = 30000 psi, and δmax = 0.25 in.
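The cost and constraint functions of (28) translate directly to code. A minimal evaluation sketch using the constants listed above (the test point used below is a commonly cited near-optimal design from the literature, not a result of this paper):

```python
import math

def welded_beam(x):
    """Cost and constraint values for the welded beam model of (28).

    x = [h, l, t, b]. Returns (cost, [g1..g7]); the design is feasible
    iff every g_i <= 0.
    """
    x1, x2, x3, x4 = x
    P, L, E, G = 6000.0, 14.0, 30e6, 12e6
    tau_max, sigma_max, delta_max = 13600.0, 30000.0, 0.25
    tau_p = P / (math.sqrt(2) * x1 * x2)                 # primary shear
    M = P * (L + x2 / 2)
    R = math.sqrt(x2**2 / 4 + ((x1 + x3) / 2) ** 2)
    J = 2 * (math.sqrt(2) * x1 * x2 * (x2**2 / 4 + ((x1 + x3) / 2) ** 2))
    tau_pp = M * R / J                                   # torsional shear
    tau = math.sqrt(tau_p**2 + 2 * tau_p * tau_pp * x2 / (2 * R) + tau_pp**2)
    sigma = 6 * P * L / (x4 * x3**2)                     # bending stress
    delta = 4 * P * L**3 / (E * x4 * x3**3)              # end deflection
    Pc = (4.013 * E * math.sqrt(x3**2 * x4**6 / 36) / L**2
          * (1 - x3 / (2 * L) * math.sqrt(E / (4 * G))))  # buckling load
    cost = 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14 + x2)
    g = [tau - tau_max, sigma - sigma_max, delta - delta_max,
         x1 - x4, P - Pc, 0.125 - x1,
         0.10471 * x1**2 + 0.04811 * x3 * x4 * (14 + x2) - 5]
    return cost, g
```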

Figure 9. Model design.

Table 11 shows that the best solution to the welded beam design problem is 2.0632, obtained by FCBAISA and matched only by PSO, so FCBAISA is at least as good as every other algorithm. The convergence curves of the algorithms considered in this paper on the welded beam design problem are shown in Figure 10.

Table 11.

Experimental results of welded beam design problem.

TSO AHO BOA DDAO PSO OSA GSA AISA FCBAISA
x 1 0.1544 0.1501 0.2578 0.1557 0.166 0.1518 0.2088 0.1658 0.1659
x 2 3.3508 2.6955 3.3616 3.7372 8.2326 3.0559 3.8553 7.4115 8.2326
x 3 4.9116 6.8373 5.7222 7.0623 9.9971 4.3309 5.7837 9.9936 9.9971
x 4 0.1733 0.1687 0.2518 0.1841 0.168 0.1696 0.1879 0.168 0.168
Min 2.9727 2.8871 3.337 2.8895 2.0632 2.677 2.8778 2.0638 2.0632

Figure 10. Convergence curves of 9 algorithms in welded beam design problem.

5.3. The Problem of Gear Train Engineering Design

The gear train design problem is to find the numbers of teeth that bring the gear ratio as close as possible to the required value without affecting efficiency, as shown in Figure 11. The number of teeth must be an integer, so the design variables of this problem are discrete, and because the constraints act directly on these discrete variables, the complexity of the problem increases. In this design problem, nA, nB, nD, and nF are the decision variables, each an integer with lower bound 12 and upper bound 60. The gear ratio is defined as (nBnD)/(nFnA), and the problem is modelled as (29), where x = [x1, x2, x3, x4] = [nA, nB, nD, nF].

$$
\min\ f(\mathbf{x}) = \left(\frac{1}{6.931} - \frac{x_2x_3}{x_1x_4}\right)^2,\qquad \text{s.t. } 12 \le x_i \le 60,\ x_i \in \mathbb{N},\ i = 1, 2, 3, 4.
\tag{29}
$$
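Because the variables in (29) are small integers, the model can even be checked by exhaustive enumeration. The sketch below searches a narrowed window of the 12-60 range for brevity (the window bounds are our choice; a full search would cover 49^4 combinations):

```python
def gear_error(x):
    """Squared deviation of the gear ratio from 1/6.931, equation (29).

    x = (nA, nB, nD, nF); the ratio is (nB * nD) / (nF * nA).
    """
    nA, nB, nD, nF = x
    return (1 / 6.931 - (nB * nD) / (nF * nA)) ** 2

# Exhaustive search over a narrowed window of the integer domain
# (window chosen for brevity; it contains the well-known best design).
best = min((gear_error((a, b, d, f)), (a, b, d, f))
           for a in range(40, 61) for b in range(12, 21)
           for d in range(12, 21) for f in range(40, 61))
```

Note that nB and nD (and likewise nA and nF) are interchangeable in the ratio, so the minimum is attained by several symmetric tooth assignments.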

Figure 11. Model design.

Table 12 shows that the best solution to the gear train design problem, 2.23E-10, is obtained by FCBAISA, which again outperforms all the other algorithms. The convergence curves of the algorithms considered in this paper on the gear train design problem are shown in Figure 12.

Table 12.

Experimental results of gear train design problem.

TSO AHO BOA DDAO PSO OSA GSA AISA FCBAISA
x 1 12.0222 12.0000 12.0000 12.37856 12.0000 12.0000 13.2554 12.0000 12.0000
x 2 12.8303 12.0000 12.0000 12.0000 12.0000 20.0455 12.2594 12.0693 12.3333
x 3 51.9823 19.03763 24.0755 29.2456 23.1030 35.1536 43.7907 23.0162 34.3642
x 4 12.8303 20.97001 21.6255 24.4759 18.0335 31.4768 43.5781 23.3093 34.1505
Min 0.0020 3.20E-08 0.0178 6.40E-08 4.14E-08 3.57E-02 6.81E-10 3.09E-10 2.23E-10

Figure 12. Convergence curves of 9 algorithms in gear train design problem.

5.4. The Problem of Speed Reducer Design

In this constrained optimization problem (see Figure 13), the variables x1, x2, and x3 are the face width (b), the teeth module (m), and the number of teeth (z); x4 and x5 are the lengths of the first and second shafts between bearings (l1, l2); and x6 and x7 are the diameters of the first and second shafts (d1, d2). The problem consists of 4 linear and 7 nonlinear inequalities, and its model is given in (30), where x = [x1, x2, x3, x4, x5, x6, x7] = [b, m, z, l1, l2, d1, d2].

$$
\begin{aligned}
\min\ f(\mathbf{x}) ={}& 0.7854x_1x_2^2\left(3.3333x_3^2 + 14.9334x_3 - 43.0934\right) - 1.508x_1\left(x_6^2 + x_7^2\right)\\
& + 7.4777\left(x_6^3 + x_7^3\right) + 0.7854\left(x_4x_6^2 + x_5x_7^2\right),\\
\text{s.t.}\quad y_1(\mathbf{x}) ={}& \frac{27.0}{x_1x_2^2x_3} - 1 \le 0,\qquad
y_2(\mathbf{x}) = \frac{397.50}{x_1x_2^2x_3^2} - 1 \le 0,\\
y_3(\mathbf{x}) ={}& \frac{1.93x_4^3}{x_2x_6^4x_3} - 1 \le 0,\qquad
y_4(\mathbf{x}) = \frac{1.93x_5^3}{x_2x_7^4x_3} - 1 \le 0,\\
y_5(\mathbf{x}) ={}& \frac{\sqrt{\left(745.0x_4/(x_2x_3)\right)^2 + 16.9\times 10^6}}{110.0x_6^3} - 1 \le 0,\\
y_6(\mathbf{x}) ={}& \frac{\sqrt{\left(745.0x_5/(x_2x_3)\right)^2 + 157.5\times 10^6}}{85.0x_7^3} - 1 \le 0,\\
y_7(\mathbf{x}) ={}& \frac{x_2x_3}{40} - 1 \le 0,\qquad
y_8(\mathbf{x}) = \frac{5x_2}{x_1} - 1 \le 0,\qquad
y_9(\mathbf{x}) = \frac{x_1}{12x_2} - 1 \le 0,\\
y_{10}(\mathbf{x}) ={}& \frac{1.50x_6 + 1.90}{x_4} - 1 \le 0,\qquad
y_{11}(\mathbf{x}) = \frac{1.10x_7 + 1.90}{x_5} - 1 \le 0,\\
& 2.60 \le x_1 \le 3.60,\quad 0.70 \le x_2 \le 0.80,\quad 17.0 \le x_3 \le 28.0,\quad 7.30 \le x_4 \le 8.30,\\
& 7.50 \le x_5 \le 8.30,\quad 2.90 \le x_6 \le 3.90,\quad 5.0 \le x_7 \le 5.50.
\end{aligned}
\tag{30}
$$
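The objective and the eleven constraints in (30) can likewise be evaluated directly; a minimal sketch follows (feasibility means every y_i ≤ 0, and the test point used below is the FCBAISA design from Table 13):

```python
import math

def speed_reducer(x):
    """Weight objective and constraints for the speed reducer model of (30).

    x = [b, m, z, l1, l2, d1, d2]. Returns (f, [y1..y11]); the design is
    feasible iff every y_i <= 0.
    """
    x1, x2, x3, x4, x5, x6, x7 = x
    f = (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
         - 1.508 * x1 * (x6**2 + x7**2)
         + 7.4777 * (x6**3 + x7**3)
         + 0.7854 * (x4 * x6**2 + x5 * x7**2))
    g = [
        27 / (x1 * x2**2 * x3) - 1,                      # bending stress of teeth
        397.5 / (x1 * x2**2 * x3**2) - 1,                # surface stress of teeth
        1.93 * x4**3 / (x2 * x6**4 * x3) - 1,            # deflection of shaft 1
        1.93 * x5**3 / (x2 * x7**4 * x3) - 1,            # deflection of shaft 2
        math.sqrt((745 * x4 / (x2 * x3))**2 + 16.9e6) / (110 * x6**3) - 1,
        math.sqrt((745 * x5 / (x2 * x3))**2 + 157.5e6) / (85 * x7**3) - 1,
        x2 * x3 / 40 - 1,
        5 * x2 / x1 - 1,
        x1 / (12 * x2) - 1,
        (1.5 * x6 + 1.9) / x4 - 1,
        (1.1 * x7 + 1.9) / x5 - 1,
    ]
    return f, g
```

At the FCBAISA design, the two shaft-stress constraints (y5, y6) are essentially active, which is characteristic of the known optimum of this problem.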

Figure 13. Model design.

Table 13 shows that the best solution to the speed reducer design problem, 3.00E+03, is obtained by FCBAISA, which is clearly better than all the other algorithms. The convergence curves of the algorithms considered in this paper on the speed reducer design problem are shown in Figure 14.

Table 13.

Experimental results of speed reducer design problem.

TSO AHO BOA DDAO PSO OSA GSA AISA FCBAISA
x 1 2.6025 3.5012 2.7400 3.5039 2.6000 2.7698 3.5177 2.9963 3.5000
x 2 0.7000 0.7000 0.7000 0.7000 0.7000 0.7878 0.7006 0.7000 0.7000
x 3 17.0000 17.0000 17.0000 17.0000 17.0000 18.0818 17.0330 17.0000 17.0000
x 4 7.3000 7.3000 7.3000 7.3000 7.3000 7.7211 7.3282 7.3000 7.3000
x 5 7.8000 7.8000 7.8000 7.8139 7.8000 8.2789 7.8080 7.8000 7.8000
x 6 3.3492 3.355 3.3543 3.3517 3.3486 8.4332 3.3528 3.3497 3.3502
x 7 5.2864 5.2895 5 5.2891 5.2862 5.3459 5.2947 5.2864 5.2865
Min 7.63E + 05 3.14E + 03 9.43E + 05 3.31E + 03 4.03E + 05 1.00E + 06 3.53E + 03 6.30E + 04 3.00E + 03

Figure 14. Convergence curves of 9 algorithms in speed reducer design problem.

6. Conclusion

A fast convergence and balanced adolescent identity search algorithm (FCBAISA) is proposed in this work for numerical and engineering design problems, improving on the quality of AISA. To better balance the exploration and exploitation of FCBAISA, a layered optimization strategy is proposed. A fast search strategy is proposed to help the algorithm escape local optima and converge to the optimal value faster. The CFLN is improved by RSLE to obtain the optimal result more effectively. A terminal disturbance strategy is designed to prevent the algorithm from falling into local optima in the later iterations. The CEC2017 benchmark functions, the CEC2022 benchmark functions, and several engineering design problems are used to verify the quality of FCBAISA. The results show that FCBAISA has high precision, fast convergence speed, and strong exploration and exploitation abilities, with a good balance between the two. Future research can proceed along the following lines:

  1. Further improvement of FCBAISA, including the Chebyshev approximation model and other effective alternative models.

  2. Applying FCBAISA to multi-objective optimization problems and to specific practical problems, including scheduling optimization and engineering problems.

Algorithm 1. Terminal bounce mechanism.

Algorithm 2. Pseudocode of FCBAISA.

Acknowledgments

The authors pay great respect to the contributions made by Esref Bogar and Selami Beyhan, the proposers of the adolescent identity search algorithm (AISA), and thank them for sharing the algorithm code. The authors wish to acknowledge the National Natural Science Foundation of China (Grant no. U1731128), the Natural Science Foundation of Liaoning Province (Grant no. 2019-MS-174), and the Foundation of Liaoning Province Education Administration (Grant nos. LJKZ0279 and 2019LNJC12) for their financial support.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

  • 1.Abed-alguni B., Paul D. J. Hybridizing the cuckoo search algorithm with different mutation operators for numerical optimization problems. Journal of Intelligent Systems . 2020;29(1):1043–1062. doi: 10.1515/jisys-2018-0331. [DOI] [Google Scholar]
  • 2.Mirjalili S., Mirjalili S., Hatamlou A. Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Computing & Applications . 2016;27(2):495–513. doi: 10.1007/s00521-015-1870-7. [DOI] [Google Scholar]
  • 3.Yan C. Audience evaluation and analysis of symphony performance effects based on the genetic neural network algorithm for the multilayer perceptron (ga-mlp-nn) Computational Intelligence and Neuroscience . 2021;2021 doi: 10.1155/2021/4133892.4133892 [DOI] [PMC free article] [PubMed] [Google Scholar] [Retracted]
  • 4.Storn R., Price K. Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization . 1997;11(4):341–359. doi: 10.1023/a:1008202821328. [DOI] [Google Scholar]
  • 5.Yao S., Wang L., Liu M., Xu Y. An effective estimation of distribution algorithm for solving the distributed permutation flow-shop scheduling problem. International Journal of Production Economics . 2013;145(1):387–396. doi: 10.1016/j.ijpe.2013.05.004. [DOI] [Google Scholar]
  • 6.Kennedy J., Eberhart R. Particle swarm optimization. Proceedings of the ICNN’95 - International Conference on Neural Networks; November 1995; Perth, WA, Australia. IEEE; [DOI] [Google Scholar]
  • 7.Karaboga D., Gorkemli B., Ozturk C., Karaboga N. A comprehensive survey: artificial bee colony (abc) algorithm and applications. Artificial Intelligence Review . 2014;42(1):21–57. doi: 10.1007/s10462-012-9328-0. [DOI] [Google Scholar]
  • 8.Dan Y., Tao J. Knowledge worker scheduling optimization model based on bacterial foraging algorithm. Future Generation Computer Systems . 2021;124:330–337. doi: 10.1016/j.future.2021.05.028. [DOI] [Google Scholar]
  • 9.Heidari A. A., Mirjalili S., Faris H., Aljarah I., Mafarja M., Chen H. Harris hawks optimization: algorithm and applications. Future Generation Computer Systems . 2019;97:849–872. doi: 10.1016/j.future.2019.02.028. [DOI] [Google Scholar]
  • 10.Kumar V., Kumar D. A systematic review on firefly algorithm: past, present, and future. Archives of Computational Methods in Engineering . 2021;28(4):3269–3291. doi: 10.1007/s11831-020-09498-y. [DOI] [Google Scholar]
  • 11.Wang L., Xiong Y., Li S., Zeng Y. R. New fruit fly optimization algorithm with joint search strategies for function optimization problems. Knowledge-Based Systems . 2019;176:77–96. doi: 10.1016/j.knosys.2019.03.028. [DOI] [Google Scholar]
  • 12.Wang G. G., Guo L., Gandomi A. H., Hao G. S., Wang H. Chaotic krill herd algorithm. Information Sciences . 2014;274:17–34. doi: 10.1016/j.ins.2014.02.123. [DOI] [Google Scholar]
  • 13.Hussien A. G., Amin M., Wang M., et al. Crow search algorithm: theory, recent advances, and applications. IEEE Access . 2020;8:173548–173565. doi: 10.1109/access.2020.3024108. [DOI] [Google Scholar]
  • 14.Akkar H. A. R., Mahdi F. R., Mahdi F. R. Grass fibrous root optimization algorithm. International Journal of Intelligent Systems and Applications . 2017;9(6):15–23. doi: 10.5815/ijisa.2017.06.02. [DOI] [Google Scholar]
  • 15.Zhiheng W., Jianhua L. Flamingo search algorithm: a new swarm intelligence optimization algorithm. IEEE Access . 2021;9:88564–88582. doi: 10.1109/access.2021.3090512. [DOI] [Google Scholar]
  • 16.Karami H., Anaraki M. V., Farzin S., Mirjalili S. Flow direction algorithm (fda): a novel optimization approach for solving optimization problems. Computers & Industrial Engineering . 2021;156 doi: 10.1016/j.cie.2021.107224.107224 [DOI] [Google Scholar]
  • 17.Sulaiman M. H., Mustaffa Z., Mohamed M. R., Aliman O. Using the gray wolf optimizer for solving optimal reactive power dispatch problem. Applied Soft Computing . 2015;32:286–292. doi: 10.1016/j.asoc.2015.03.041. [DOI] [Google Scholar]
  • 18.Mirjalili S., Lewis A. The whale optimization algorithm. Advances in Engineering Software . 2016;95:51–67. doi: 10.1016/j.advengsoft.2016.01.008. [DOI] [Google Scholar]
  • 19.Naruei I., Keynia F. A new optimization method based on coot bird natural life model. Expert Systems with Applications . 2021;183 doi: 10.1016/j.eswa.2021.115352.115352 [DOI] [Google Scholar]
  • 20.van Laarhoven P. J. M., Aarts E. H. L. Simulated Annealing: Theory and Applications . Dordrecht, Netherlands: Springer; 1987. Simulated annealing; pp. 7–15. [DOI] [Google Scholar]
  • 21.Cheema S. S., Singh A., Gritli H. Optimal crop selection using gravitational search algorithm. Mathematical Problems in Engineering . 2021;2021 doi: 10.1155/2021/5549992.5549992 [DOI] [Google Scholar]
  • 22.Bharti K., Cervera-Lierta A., Kyaw T. H., et al. Noisy intermediate-scale quantum algorithms. Reviews of Modern Physics . 2022;94(1) doi: 10.1103/revmodphys.94.015004.015004 [DOI] [Google Scholar]
  • 23.Lam A. Y. S., Li V. O. K., Yu J. J. Q. Real-coded chemical reaction optimization. IEEE Transactions on Evolutionary Computation . 2012;16(3):339–353. doi: 10.1109/tevc.2011.2161091. [DOI] [Google Scholar]
  • 24.Kaveh A., Talatahari S. Hybrid charged system search and particle swarm optimization for engineering design problems. Engineering Computations . 2011;28(4):423–440. doi: 10.1108/02644401111131876. [DOI] [Google Scholar]
  • 25.Hatamlou A. Black hole: a new heuristic optimization approach for data clustering. Information Sciences . 2013;222:175–184. doi: 10.1016/j.ins.2012.08.023. [DOI] [Google Scholar]
  • 26.Javidy B., Hatamlou A., Mirjalili S. Ions motion algorithm for solving optimization problems. Applied Soft Computing . 2015;32:72–79. doi: 10.1016/j.asoc.2015.03.035. [DOI] [Google Scholar]
  • 27.Rahkar Farshi T. Battle royale optimization algorithm. Neural Computing & Applications . 2021;33(4):1139–1157. doi: 10.1007/s00521-020-05004-4. [DOI] [Google Scholar]
  • 28.Battiti R., Tecchiolli G. The reactive tabu search. ORSA Journal on Computing . 1994;6(2):126–140. doi: 10.1287/ijoc.6.2.126. [DOI] [Google Scholar]
  • 29.Sadollah A., Eskandar H., Bahreininejad A., Kim J. H. Water cycle, mine blast and improved mine blast algorithms for discrete sizing optimization of truss structures. Computers & Structures . 2015;149:1–16. doi: 10.1016/j.compstruc.2014.12.003. [DOI] [Google Scholar]
  • 30.Singh M., Panigrahi B. K., Abhyankar A. R. Optimal coordination of directional over-current relays using teaching learning-based optimization (tlbo) algorithm. International Journal of Electrical Power & Energy Systems . 2013;50:33–41. doi: 10.1016/j.ijepes.2013.02.011. [DOI] [Google Scholar]
  • 31.Amir H. Gandomi. Interior search algorithm (isa): a novel approach for global optimization. ISA Transactions . 2014;53(4):1168–1183. doi: 10.1016/j.isatra.2014.03.018. [DOI] [PubMed] [Google Scholar]
  • 32.Ghorbani N., Babaei E., Sadikoglu F. Bema: binary exchange market algorithm. Procedia Computer Science . 2017;120:656–663. doi: 10.1016/j.procs.2017.11.292. [DOI] [Google Scholar]
  • 33.Li-Wang C., Wang Y., Zeng Z. Y., Lin C. Y., Yu Q. L. Research on logistics distribution vehicle scheduling based on heuristic genetic algorithm. Complexity . 2021;2021 doi: 10.1155/2021/8275714.8275714 [DOI] [Google Scholar]
  • 34.Yıldız B. S., Pholdee N., Bureerat S., Erdaş M. U., Yıldız A. R., Sait S. M. Comparision of the political optimization algorithm, the Archimedes optimization algorithm and the Levy flight algorithm for design optimization in industry. Materials Testing . 2021;63(4):356–359. doi: 10.1515/mt-2020-0053. [DOI] [Google Scholar]
  • 35.Yıldız B. S., Patel V., Pholdee N., Sait S. M., Bureerat S., Yıldız A. R. Conceptual comparison of the ecogeography-based algorithm, equilibrium algorithm, marine predators algorithm and slime mold algorithm for optimal product design. Materials Testing . 2021;63(4):336–340. doi: 10.1515/mt-2020-0049. [DOI] [Google Scholar]
  • 36.Yildiz B. S., Pholdee N., Bureerat S., Yildiz A. R., Sait S. M. Robust design of a robot gripper mechanism using new hybrid grasshopper optimization algorithm. Expert Systems . 2021;38(3) doi: 10.1111/exsy.12666. [DOI] [Google Scholar]
  • 37.Yıldız A. R., Erdaş M. U. A new hybrid taguchi-salp swarm optimization algorithm for the robust design of real-world engineering problems. Materials Testing . 2021;63(2):157–162. doi: 10.1515/mt-2020-0022. [DOI] [Google Scholar]
  • 38.Yildiz B. S., Pholdee N., Bureerat S., Yildiz A. R., Sait S. M. Enhanced grasshopper optimization algorithm using elite opposition-based learning for solving real-world engineering problems. Engineering with Computers . 2021:1–13. doi: 10.1007/s00366-021-01368-w. [DOI] [Google Scholar]
  • 39.Bogar E., Beyhan S. Adolescent identity search algorithm (AISA): a novel metaheuristic approach for solving optimization problems. Applied Soft Computing . 2020;95 doi: 10.1016/j.asoc.2020.106503. [DOI] [Google Scholar]
  • 40.Çetin M., Bahtiyar B., Beyhan S. Adaptive uncertainty compensation-based nonlinear model predictive control with real-time applications. Neural Computing & Applications . 2019;31(S2):1029–1043. doi: 10.1007/s00521-017-3068-7. [DOI] [Google Scholar]
  • 41.Abdechiri M., Meybodi M. R., Bahrami H. Gases Brownian motion optimization: an algorithm for optimization (GBMO). Applied Soft Computing . 2013;13(5):2932–2946. doi: 10.1016/j.asoc.2012.03.068. [DOI] [Google Scholar]
  • 42.Hu H., Wong W. E., Chang-Hai J., Yuan C. K. A Case Study of the Recursive Least Squares Estimation Approach to Adaptive Testing for Software Components. Proceedings of the Fifth International Conference on Quality Software (QSIC’05); September 2005; Melbourne, VIC, Australia. IEEE; pp. 135–141. [DOI] [Google Scholar]
  • 43.Liu H., Zhang X. W., Tu L. P. A modified particle swarm optimization using adaptive strategy. Expert Systems with Applications . 2020;152 doi: 10.1016/j.eswa.2020.113353. [DOI] [Google Scholar]
  • 44.Awad N. H., Ali M. Z., Suganthan P. N. Ensemble Sinusoidal Differential Covariance Matrix Adaptation with Euclidean Neighborhood for Solving CEC2017 Benchmark Problems. Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC); June 2017; Donostia, Spain. IEEE; pp. 372–379. [DOI] [Google Scholar]
  • 45.Qais M. H., Hasanien H. M., Alghuwainem S. Transient search optimization: a new meta-heuristic optimization algorithm. Applied Intelligence . 2020;50(11):3926–3941. doi: 10.1007/s10489-020-01727-y. [DOI] [Google Scholar]
  • 46.Zitouni F., Harous S., Belkeram A., Hammou L. E. B. The Archerfish Hunting Optimizer: A Novel Metaheuristic Algorithm for Global Optimization. Arabian Journal for Science and Engineering . 2022;47:2513–2553. doi: 10.1007/s13369-021-06208-z. [DOI] [Google Scholar]
  • 47.Arora S., Singh S. Butterfly optimization algorithm: a novel approach for global optimization. Soft Computing . 2019;23(3):715–734. doi: 10.1007/s00500-018-3102-4. [DOI] [Google Scholar]
  • 48.Ghafil H. N., Jármai K. Dynamic differential annealed optimization: new metaheuristic optimization algorithm for engineering applications. Applied Soft Computing . 2020;93 doi: 10.1016/j.asoc.2020.106392. [DOI] [Google Scholar]
  • 49.Jain M., Maurya S., Rani A., Singh V. Owl search algorithm: a novel nature-inspired heuristic paradigm for global optimization. Journal of Intelligent and Fuzzy Systems . 2018;34(3):1573–1582. doi: 10.3233/jifs-169452. [DOI] [Google Scholar]
  • 50.Rashedi E., Nezamabadi-Pour H., Saryazdi S. GSA: a gravitational search algorithm. Information Sciences . 2009;179(13):2232–2248. doi: 10.1016/j.ins.2009.03.004. [DOI] [Google Scholar]
  • 51.Kannan B. K., Kramer S. N. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. Journal of Mechanical Design . 1994;116(2):405–411. doi: 10.1115/1.2919393. [DOI] [Google Scholar]
  • 52.Coello C. A. C. Use of a self-adaptive penalty approach for engineering optimization problems. Computers in Industry . 2000;41(2):113–127. doi: 10.1016/s0166-3615(99)00046-9. [DOI] [Google Scholar]
  • 53.Sadollah A., Bahreininejad A., Eskandar H., Hamdi M. Mine blast algorithm: a new population based algorithm for solving constrained engineering optimization problems. Applied Soft Computing . 2013;13(5):2592–2612. doi: 10.1016/j.asoc.2012.11.026. [DOI] [Google Scholar]
  • 54.Kamboj V. K., Nandi A., Bhadoria A., Sehgal S. An intensify Harris hawks optimizer for numerical and engineering optimization problems. Applied Soft Computing . 2020;89 doi: 10.1016/j.asoc.2019.106018. [DOI] [Google Scholar]

Data Availability Statement

The data used to support the findings of this study are included within the article.


Articles from Computational Intelligence and Neuroscience are provided here courtesy of Wiley