1. Introduction
In the past twenty years, many stochastic swarm intelligence (SI) algorithms have been proposed, and the field of SI has produced a variety of optimization algorithms [1-4]. The fireworks algorithm (FWA) is a novel evolutionary algorithm. In 2010, inspired by the natural phenomenon of fireworks exploding in the night sky, Tan [5] proposed the fireworks algorithm. As a high-efficiency SI algorithm, FWA has attracted much research interest and has been applied in engineering and science [6-10].
In FWA, fireworks are launched into the potential search space and a shower of sparks fills the surrounding local space [11-13]. However, FWA does not take full advantage of the information carried by fireworks and sparks in the whole population. The critical feature of FWA is the random detonation process: "good" fireworks with small fitness values produce a large number of explosion sparks within a small explosion amplitude, whereas "bad" fireworks with large fitness values produce a small number of explosion sparks over a wider range [14]. Therefore, the performance of FWA can be improved by dynamically changing the number of sparks and the explosion amplitude.
The main contributions of this paper are as follows. First, a dynamic coefficient formula is devised. Second, DE is improved with commensal learning (CDE). Then, we propose a hybrid fireworks algorithm with dynamic coefficients and improved differential evolution (HDEFWA). The dynamic coefficient balances global and local search. CDE avoids premature convergence and improves the diversity of the population. Experiments on 13 benchmark functions show that the performance of FWA can be significantly enhanced by making better use of the information carried by fireworks and sparks.
2. Related Research
2.1 Bare-Bones FWA
FWA is a new meta-heuristic algorithm based on swarm intelligence for solving complex optimization problems [5]. The bare-bones FWA is described in Algorithm 1.
Algorithm 1. The bare-bones FWA
There are two important factors in the process of the explosion [15,16]. The number of sparks generated by each firework is defined as follows:
\(N_{i}=E C \cdot \frac{y_{\max }-f\left(x_{i}\right)+\delta}{\sum_{j=1}^{p}\left(y_{\max }-f\left(x_{j}\right)\right)+\delta}\) (1)
The amplitude of a firework explosion is defined as follows:
\(A_{\mathrm{i}}=A C \cdot \frac{f\left(x_{i}\right)-y_{\min }+\delta}{\sum_{j=1}^{p}\left(f\left(x_{j}\right)-y_{\min }\right)+\delta}\) (2)
where AC and EC are parameters used to control the explosion amplitude and the number of sparks generated by the fireworks, f(xi) denotes the fitness of firework xi, ymin is the minimum fitness value, ymax is the maximum fitness value, and δ is the smallest machine constant, which is used to prevent the denominator from being zero.
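As a minimal illustration of Eqs. (1) and (2), the following Python sketch computes the spark count and amplitude for a vector of fitness values (the function name and the constant values are our own, not taken from the paper); the fittest firework receives the most sparks and the smallest amplitude:

```python
import numpy as np

def explosion_params(fitness, EC=50.0, AC=40.0, delta=np.finfo(float).eps):
    """Spark count per firework (Eq. 1) and explosion amplitude (Eq. 2).

    fitness : 1-D array of objective values f(x_i) for the current fireworks
    EC, AC  : control constants for the total sparks and the total amplitude
    delta   : machine epsilon, prevents a zero denominator
    """
    y_max, y_min = fitness.max(), fitness.min()
    # Smaller (better) fitness -> more sparks
    N = EC * (y_max - fitness + delta) / (np.sum(y_max - fitness) + delta)
    # Smaller (better) fitness -> smaller amplitude
    A = AC * (fitness - y_min + delta) / (np.sum(fitness - y_min) + delta)
    return N, A

# Five fireworks: the third one (fitness 1.2) gets the most sparks
# and the smallest amplitude.
num_sparks, amplitudes = explosion_params(np.array([3.0, 7.5, 1.2, 9.8, 4.4]))
```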
2.2 Bare-bones Differential Evolution
Differential evolution (DE) is a heuristic search method and a simple yet effective algorithm for global optimization [17]. It maintains a population of P individuals Y1,N, Y2,N, ..., YP,N, where each individual represents one candidate solution, N denotes the generation index, and P denotes the population size [17]. The process of DE is as follows.
2.2.1 Mutation
DE selects an initial population Y1,0, Y2,0, ..., YP,0 randomly in the search space. Then, in each generation, the optimal solution is searched for by mutation, crossover, and selection. In this paper, only the minimization of the objective function is considered, without loss of generality.
① DE/best/1
\(T_{j, \mathrm{~N}}=Y_{\text {best }, N}+C \cdot\left(Y_{j 1, N}-Y_{j 2, N}\right)\) (3)
② DE/rand/1
\(T_{j, N}=Y_{j 1, N}+C \cdot\left(Y_{j 2, N}-Y_{j 3, N}\right)\) (4)
③ DE/rand/2
\(\begin{aligned} T_{j, N}=& Y_{j 1, N}+C \cdot\left(Y_{j 2, N}-Y_{j 3, N}\right) \\ &+C \cdot\left(Y_{j 4, N}-Y_{j 5, N}\right) \end{aligned}\) (5)
④ DE/current-to-best/1
\(\begin{aligned} T_{i, N}=& Y_{j, N}+C \cdot\left(Y_{\text {best }, N}-Y_{j, N}\right) \\ &+C \cdot\left(Y_{j 1, N}-Y_{\mathrm{j} 2, N}\right) \end{aligned}\) (6)
⑤ DE/current-to-rand/1
\(\begin{aligned} C_{j, N}=& Y_{j, N}+\text { rand } \cdot\left(Y_{j 1, N}-Y_{j, N}\right) \\ &+C \cdot\left(Y_{j 2, N}-Y_{j 3, N}\right) \end{aligned}\) (7)
⑥ DE/best/2
\(\begin{aligned} T_{j, N}=& Y_{\text {best }, N}+C \cdot\left(Y_{j 1, N}-Y_{j 2, N}\right) \\ &+C \cdot\left(Y_{j 3, N}-Y_{j 4, N}\right) \end{aligned}\) (8)
where the indices j1, j2, j3, j4, and j5 are mutually distinct integers randomly chosen from {1, 2, ..., P} and all different from j. C is the mutation factor, a constant in [0, 2]. Ybest,N is the optimal individual of the population at the Nth generation. Note that, instead of a mutant vector Tj,N, "DE/current-to-rand/1" directly produces a trial vector Cj,N with no need for a crossover operation.
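A compact Python sketch of three of these strategies is given below (function and variable names are our own); Y is the P×D population matrix, best_idx is the index of the current best individual, and the indices j1, j2, j3 are drawn mutually distinct from the target index j, assuming P ≥ 4:

```python
import numpy as np

def mutate(Y, best_idx, C=0.5, strategy="rand/1", rng=None):
    """Produce one mutant (or, for current-to-rand/1, trial) vector per individual."""
    rng = np.random.default_rng() if rng is None else rng
    P, D = Y.shape
    T = np.empty_like(Y)
    for j in range(P):
        # three mutually distinct indices, all different from j
        j1, j2, j3 = rng.choice([k for k in range(P) if k != j], size=3, replace=False)
        if strategy == "best/1":                    # Eq. (3)
            T[j] = Y[best_idx] + C * (Y[j1] - Y[j2])
        elif strategy == "rand/1":                  # Eq. (4)
            T[j] = Y[j1] + C * (Y[j2] - Y[j3])
        elif strategy == "current-to-rand/1":       # Eq. (7), no crossover afterwards
            T[j] = Y[j] + rng.random() * (Y[j1] - Y[j]) + C * (Y[j2] - Y[j3])
        else:
            raise ValueError(f"unknown strategy: {strategy}")
    return T
```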
2.2.2 Crossover
The binomial crossover operator combines components of the target vector Yi,N and the mutant vector Ti,N to generate a trial vector Ci,N, as shown in Eq. (9).
\(C_{i, j, N}=\left\{\begin{array}{ll} T_{i, j, N}, & \text { if } \text { rand }_{j} \leq C R \text { or } j=j_{\text {rand }} \\ Y_{i, j, N}, & \text { otherwise } \end{array}\right.\) (9)
where j = 1, 2, ..., D, D is the dimension of the mutant vector, jrand denotes a random integer in {1, 2, ..., D}, randj denotes a uniformly distributed random number in [0, 1] drawn for the jth dimension, and CR represents the crossover rate, whose value ranges from 0 to 1.
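A vectorised sketch of Eq. (9) might look as follows (the names are ours); the forced j_rand column guarantees that at least one component is always taken from the mutant vector:

```python
import numpy as np

def binomial_crossover(Y, T, CR=0.9, rng=None):
    """Binomial crossover (Eq. 9): take the mutant component where
    rand_j <= CR or j == j_rand, otherwise keep the target component."""
    rng = np.random.default_rng() if rng is None else rng
    P, D = Y.shape
    take_mutant = rng.random((P, D)) <= CR
    take_mutant[np.arange(P), rng.integers(0, D, size=P)] = True  # forced j_rand gene
    return np.where(take_mutant, T, Y)
```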
2.2.3 Selection
One-to-one comparison is adopted by the selection operator. Yi,N and Ci,N denote the target vector and the trial vector, respectively; Yi,N is compared with Ci,N, and the one with the better fitness value is admitted to the next generation [18], as shown in Eq. (10).
\(Y_{i, N+1}= \begin{cases}C_{i, N}, & \text { if } f\left(C_{i, N}\right)<f\left(Y_{i, N}\right) \\ Y_{i, N}, & \text { otherwise }\end{cases}\) (10)
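The greedy one-to-one selection of Eq. (10) can be sketched as below (again an illustrative sketch, not the paper's code), where f is the objective function applied row-wise:

```python
import numpy as np

def select(Y, C_trial, f):
    """One-to-one selection (Eq. 10): keep the trial vector only when it is better."""
    f_target = np.apply_along_axis(f, 1, Y)
    f_trial = np.apply_along_axis(f, 1, C_trial)
    keep_trial = f_trial < f_target
    return np.where(keep_trial[:, None], C_trial, Y)
```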
3. Hybrid Fireworks Algorithm
In this section, we introduce commensal learning into differential evolution (CDE) and hybridize CDE with FWA through a dynamic coefficient method. HDEFWA enhances information sharing among individuals and balances the exploration and exploitation abilities of FWA.
3.1 Commensal Learning
As is known, the performance of DE is sensitive to its parameter settings; moreover, the mutation, crossover, and selection of DE depend on the population to some extent. In existing methods, the parameter settings and mutation strategies are changed automatically during the search process [19]. Therefore, in this section, two-sided Gaussian distributions with different mean values are constructed to update the scale factor C and the crossover probability CR. The "two-sided Gaussian distribution" parameter setting [17] is used to update C and CR dynamically, as given in Eqs. (11) and (12).
\(\text { Lower }=\left\{\begin{array}{l} C_{\text {lower }}=N(0.5,0.1) \\ C R_{\text {lower }}=N(0.1,0.1) \end{array}\right.\) (11)
\(\text { Upper }=\left\{\begin{array}{l} C_{\text {upper }}=N(0.8,0.1) \\ C R_{\text {upper }}=N(0.9,0.1) \end{array}\right.\) (12)
where Upper and Lower indicate the upper and lower parameter settings, respectively, and N denotes the Gaussian distribution.
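For illustration, the two settings could be sampled as follows (a sketch; the clipping of C to [0, 2] and CR to [0, 1] is our own safeguard and is not specified in the paper):

```python
import numpy as np

def sample_parameters(side, rng=None):
    """Draw (C, CR) from the two-sided Gaussian settings of Eqs. (11)-(12)."""
    rng = np.random.default_rng() if rng is None else rng
    if side == "lower":
        C, CR = rng.normal(0.5, 0.1), rng.normal(0.1, 0.1)
    else:  # "upper"
        C, CR = rng.normal(0.8, 0.1), rng.normal(0.9, 0.1)
    # keep the sampled values in valid ranges (our safeguard)
    return float(np.clip(C, 0.0, 2.0)), float(np.clip(CR, 0.0, 1.0))
```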
Three mutation strategies with different characteristics are carefully selected.
① DE/rand/1.
② DE/best/1.
③ DE/current-to-rand/1.
We combine the "two-sided Gaussian distribution" parameter setting with the above three strategies, generating six trial vector schemes, as follows:
ⓐ DE/best/1, Upper;
ⓑ DE/best/1, Lower;
ⓒ DE/rand/1, Upper;
ⓓ DE/rand/1, Lower;
ⓔ DE/target-to-rand/1, Upper;
ⓕ DE/target-to-rand/1, Lower.
Here, "DE/target-to-rand/1" is a modification of "DE/current-to-rand/1": the "two-sided Gaussian distribution" parameter setting is combined with the "DE/target-to-rand/1" strategy, and binomial crossover is then conducted, as shown in Eq. (13).
\(\begin{aligned} T_{i, N}=& Y_{i, N}+\text { rand } \cdot\left(Y_{i 1, N}-Y_{i, N}\right) \\ &+C \cdot\left(Y_{i 2, N}-Y_{i 3, N}\right) \end{aligned}\) (13)
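Reusing the mutate, binomial_crossover, and sample_parameters sketches above, the six schemes ⓐ-ⓕ can be enumerated and applied as follows (an illustrative sketch; how commensal learning chooses among the schemes is addressed later and is not shown here):

```python
import itertools

# Scheme pool (a)-(f): each mutation strategy paired with one parameter side.
STRATEGIES = ("best/1", "rand/1", "current-to-rand/1")  # the last plays the role of target-to-rand/1
SIDES = ("upper", "lower")
SCHEMES = list(itertools.product(STRATEGIES, SIDES))     # six combinations

def build_trials(Y, best_idx, scheme, rng):
    """Sample (C, CR) per Eqs. (11)-(12), mutate per the chosen strategy
    (Eq. 13 for target-to-rand/1), then apply binomial crossover (Eq. 9)."""
    strategy, side = scheme
    C, CR = sample_parameters(side, rng)
    T = mutate(Y, best_idx, C=C, strategy=strategy, rng=rng)
    return binomial_crossover(Y, T, CR=CR, rng=rng)
```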
3.2 Dynamic Coefficient
In FWA, two essential factors affect the performance of fireworks: the explosion amplitude and the number of explosion sparks. Yet, the coefficients of the explosion sparks and the explosion amplitude are fixed in the bare-bones FWA. Dynamically adjusting these coefficients over the iterations corresponds more closely to an actual fireworks explosion. A large explosion amplitude is conducive to exploring new search space in the initial phase, while a smaller amplitude with more sparks makes the fireworks more brilliant and is conducive to local search in the middle and late stages of the explosion. We propose an approach to control the explosion spark coefficient and the amplitude coefficient, shown in Eq. (14) and Eq. (15), which are used to calculate the number of explosion sparks and the explosion amplitude of each firework in Eq. (1) and Eq. (2), respectively.
\(E C(t)=40 \cdot e^{0.69 \cdot\left(\frac{t-\max }{\max }\right)}\) (14)
\(A C(t)=\frac{40}{1+e^{0.015 \cdot(t-\max )}}\) (15)
where max represents the maximum number of evaluations and t denotes the current number of evaluations.
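The two schedules can be sketched directly from Eqs. (14) and (15); with max = 10,000 evaluations, EC(t) grows from roughly 20 to 40 while AC(t) shrinks from roughly 40 to 20, matching the intent of wide early exploration and fine late exploitation:

```python
import numpy as np

def dynamic_coefficients(t, max_evals):
    """Dynamic spark-count coefficient EC(t) (Eq. 14) and amplitude coefficient AC(t) (Eq. 15)."""
    EC = 40.0 * np.exp(0.69 * (t - max_evals) / max_evals)   # increases with t
    AC = 40.0 / (1.0 + np.exp(0.015 * (t - max_evals)))      # decreases with t
    return EC, AC

print(dynamic_coefficients(0, 10_000))       # ~ (20.1, 40.0): explore widely at the start
print(dynamic_coefficients(10_000, 10_000))  # (40.0, 20.0): many sparks in a small range at the end
```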
3.3 The Framework of HDEFWA
In the bare-bones FWA, global and local search are performed by controlling the amplitude and the number of sparks when fireworks explode. However, this mechanism is not flexible enough in terms of the diversity and search ability of the population. First, to better reflect the diversity of fireworks explosions, the amplitude and the number of sparks should change with the number of evaluations. Second, more information about other individuals in the population should be shared; as far as swarm intelligence is concerned, individuals in the bare-bones FWA do not have adequate access to information about the population.
Algorithm 2. The HDEFWA
Based on the above analysis, CDE and dynamic coefficients are used to improve FWA. The coefficients of the explosion amplitude and the explosion sparks are adjusted dynamically with the number of evaluations. In each generation, CDE and the dynamic coefficients are used to search for new feasible solutions, which avoids premature convergence of FWA and further improves its diversity. The framework of HDEFWA is given in Algorithm 2. The algorithm sets the parameters and initializes the fireworks (lines 1-2). Next, the termination condition is set and the main loop begins (line 3). First, each firework explodes according to its amplitude and number of sparks, and the explosion sparks are generated (lines 4-6). Second, Gaussian sparks are generated (lines 7-8). Subsequently, the positions and fitness values of the two types of sparks are obtained (line 9). The best spark of both kinds is selected as a candidate for the next generation of fireworks (line 10). Then P-1 sparks are randomly selected from both kinds of sparks (line 11). The selected P sparks are improved by the mutation, crossover, and selection of CDE, generating the next generation (line 12). Finally, the termination condition is checked (line 13).
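The following sketch strings the earlier helper sketches (explosion_params, dynamic_coefficients, SCHEMES, build_trials, select) together into the loop of Algorithm 2. It is a simplification under our own assumptions: spark placement, Gaussian sparks, bound handling, and the commensal-learning choice of scheme are reduced to their essentials and are not the paper's exact implementation.

```python
import numpy as np

def cde_step(Y, f, rng, low, high):
    """One CDE refinement pass (line 12): here a scheme is picked at random;
    the commensal-learning selection based on each scheme's recent success is omitted."""
    scheme = SCHEMES[rng.integers(len(SCHEMES))]
    best_idx = int(np.argmin(np.apply_along_axis(f, 1, Y)))
    trials = np.clip(build_trials(Y, best_idx, scheme, rng), low, high)
    return select(Y, trials, f)                                  # Eq. (10)

def hdefwa(f, low, high, P=5, max_evals=10_000, seed=0):
    rng = np.random.default_rng(seed)
    D = len(low)
    fireworks = rng.uniform(low, high, size=(P, D))              # lines 1-2: initialisation
    evals = 0
    while evals < max_evals:                                     # line 3: main loop
        fit = np.apply_along_axis(f, 1, fireworks); evals += P
        EC, AC = dynamic_coefficients(evals, max_evals)          # Sec. 3.2
        N, A = explosion_params(fit, EC=EC, AC=AC)               # Eqs. (1)-(2)
        sparks = []
        for i in range(P):                                       # lines 4-6: explosion sparks
            for _ in range(max(1, int(round(N[i])))):
                sparks.append(np.clip(fireworks[i] + rng.uniform(-A[i], A[i], D), low, high))
        for i in range(P):                                       # lines 7-8: Gaussian sparks
            sparks.append(np.clip(fireworks[i] * rng.normal(1.0, 1.0, D), low, high))
        pool = np.array(sparks)
        pool_fit = np.apply_along_axis(f, 1, pool); evals += len(pool)   # line 9
        keep = [pool[np.argmin(pool_fit)]]                       # line 10: best spark survives
        keep += list(pool[rng.choice(len(pool), P - 1, replace=False)])  # line 11: random P-1
        # line 12: CDE refinement (its internal evaluations are not counted in this sketch)
        fireworks = cde_step(np.array(keep), f, rng, low, high)
    return fireworks[np.argmin(np.apply_along_axis(f, 1, fireworks))]

# Usage: minimise the sphere function in 30 dimensions
best = hdefwa(lambda x: float(np.sum(x ** 2)), low=np.full(30, -100.0), high=np.full(30, 100.0))
```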
4. Experimental Simulation and Analysis
4.1 Experimental Parameters Setting and Test Function
Thirteen benchmark functions are selected to test the performance of HDEFWA: f1–f5 are unimodal functions and f6–f13 are basic multimodal functions; they are described in detail in reference [20].
In the experimental simulation, the dimension of the test functions is 30. The maximum number of function evaluations is set to 10,000, which serves as the termination criterion of the algorithms. HDEFWA and the compared algorithms are run independently 30 times on each test function. The hardware environment is a 3.00 GHz Intel Core i7-9700 processor with 16 GB RAM; the software environment is the Windows 10 operating system and Matlab R2018a.
So far, researchers have not agreed on a standard population size. In FWA, the population P is determined by the fireworks of each generation. The population size P of CDE is the same as that of HDEFWA; both are set to 5. Each algorithm is run 30 times on the benchmark functions, and the standard deviation and mean are recorded. To evaluate the results objectively and impartially, the Wilcoxon rank-sum test and the Friedman test [18] are used to analyze the experimental results.
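As an illustration of how such an analysis can be run (the result values below are synthetic placeholders, not the paper's data), SciPy provides both tests directly:

```python
import numpy as np
from scipy import stats

# Hypothetical result matrices: 30 independent runs per algorithm on one
# benchmark function (synthetic placeholder values).
rng = np.random.default_rng(1)
results = {"CDE": rng.lognormal(0, 1, 30), "FWA": rng.lognormal(-1, 1, 30),
           "VACUFWA": rng.lognormal(-2, 1, 30), "HDEFWA": rng.lognormal(-3, 1, 30)}

# Pairwise Wilcoxon rank-sum test of each competitor against HDEFWA
for name in ("CDE", "FWA", "VACUFWA"):
    stat, p = stats.ranksums(results[name], results["HDEFWA"])
    print(f"HDEFWA vs {name}: p = {p:.4f}")

# Friedman test across all four algorithms
stat, p = stats.friedmanchisquare(*results.values())
print(f"Friedman test: chi2 = {stat:.2f}, p = {p:.4f}")
```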
4.2 Experimental Results and Comparison
To illustrate the performance of HDEFWA, it is compared with CDE, FWA, and VACUFWA [21]. The Friedman test results of CDE, FWA, VACUFWA, and HDEFWA are 3.85, 2.77, 2.04, and 1.35, respectively; HDEFWA obtains the best result. The Wilcoxon rank-sum test results are shown in Table 1. The symbols "≈", "−", and "+" indicate that the other three algorithms are, respectively, similar to, inferior to, and superior to HDEFWA in performance. As can be observed from the last rows of Table 1 and Table 2, among the four algorithms on the 13 test functions, HDEFWA is the best one in general. HDEFWA is superior to CDE, FWA, and VACUFWA on almost all test functions, and is only inferior to VACUFWA on f5 when the dimension is 30 and inferior to FWA when the dimension is 100.
(Table 1) Comparison of CDE, FWA, VACUFWA, and HDEFWA for 13 functions (D=30, EavTimes=10000)
(Table 2) Comparison of CDE, FWA, VACUFWA, and HDEFWA for 13 functions (D=100, EavTimes=10000)
To observe the convergence speed of each algorithm more intuitively, Figure 1 shows the convergence characteristics on 9 representative functions. The convergence curves of CDE, FWA, VACUFWA, and HDEFWA are plotted with four lines of different styles. To make the figures clearer, the ordinate represents the mean of the objective function over the 30 runs in the form log10(f), and the abscissa represents the number of function evaluations.
As shown in Figure 1, by comparing the convergence curves of CDE, FWA, VACUFWA, and HDEFWA, the following results can be obtained. Firstly, the convergence of FWA is significantly better than that of CDE, and the convergence of HDEFWA is the best among the four algorithms. Secondly, in the early phase of evolution, the convergence rate of HDEFWA is similar to those of VACUFWA and FWA on the 9 functions; in the middle and late stages, for the functions f1, f2, f3, f8, f10, and f11, the dynamic coefficients of HDEFWA cause the explosion amplitude to decrease and more sparks to be generated within a small explosion range, so the convergence speed of HDEFWA is faster than those of FWA and VACUFWA. Thirdly, for the functions f4, f12, and f13, the convergence rate of HDEFWA is similar to those of VACUFWA and FWA, and HDEFWA has better diversity.
(Figure 1) The convergence curves of the standard CDE, FWA, VACUFWA, and HDEFWA on 12 test functions
4.3 The Effectiveness of the Two Components in HDEFWA
As mentioned above, two new methods, i.e., dynamic coefficients and differential evolution with commensal learning, are used to improve HDEFWA. To investigate the impact of these two components, we conducted a set of experiments on the 13 benchmark functions and compared the performance of four algorithms: FWA, FWA_CDE, FWA_DC, and HDEFWA. FWA_CDE is the bare-bones FWA combined with differential evolution and commensal learning; FWA_DC is the bare-bones FWA with the dynamic coefficient. The parameter settings of FWA_CDE and FWA_DC are the same as those of FWA and HDEFWA.
As shown in Table 3, HDEFWA obtains the best results among the four algorithms on the 13 test functions. The performance of HDEFWA is improved by CDE and DC: on 10 of the 13 test functions the solution quality is significantly better than that of the bare-bones FWA. The statistical results show that HDEFWA surpasses FWA_CDE on 9 test functions and surpasses FWA_DC on 6 test functions. By combining differential evolution with commensal learning, individuals can make more effective use of the beneficial information in the population, and thus HDEFWA produces better-quality solutions. The dynamic coefficient enables the fireworks to balance the exploration and exploitation abilities of HDEFWA over the number of evaluations.
(Table 3) Experimental results of FWA, FWA_CDE, FWA_DC, and HDEFWA for 13 functions (D=30, EavTimes=10000)
4.4 Complexity Analysis
The runtime complexity of the classic FWA is O(E*Ps*(S*D+SN)), where E is the maximum number of function evaluations, Ps is the number of fireworks in each iteration, S is the number of sparks generated by a firework in the explosion, D is the dimension of the fireworks population, and SN is the total number of sparks generated by the current fireworks. The runtime complexity of the classic DE is O(Gmax*NP*D) [22], where Gmax is the maximum number of generations, NP is the population size, and D is the dimension of the problem. In HDEFWA, Gmax is set equal to E, so the total complexity of HDEFWA is O(E*(Ps*(S*D+SN)+NP*DP)), where the population size NP is set to the same value as Ps and DP is also the dimension of the fireworks population. Hence, the runtime complexity of HDEFWA does not increase in order.
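A compact way to see this, under the stated substitutions NP = Ps and DP = D, is that the added CDE term is dominated by the explosion term (assuming S ≥ 1):
\(O\left(E \cdot\left(P_{s}\left(S \cdot D+S N\right)+N P \cdot D P\right)\right)=O\left(E \cdot\left(P_{s} \cdot S \cdot D+P_{s} \cdot S N+P_{s} \cdot D\right)\right)=O\left(E \cdot P_{s}\left(S \cdot D+S N\right)\right)\)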
5. Conclusions
HDEFWA has the smallest mean among the four algorithms and a higher convergence rate than FWA and VACUFWA. The probable reason is that CDE and the dynamic coefficients improve FWA: the CDE operators enhance the diversity of the FWA search and increase the information sharing between fireworks and sparks, while the dynamic coefficient balances local exploitation and global exploration. Comparisons with CDE, FWA, and VACUFWA on the 13 benchmark test functions verify the superior performance of HDEFWA. Our ongoing work is implementing FWA experiments on the Spark platform. In the future, we will also explore further hybridization strategies for FWA and investigate more diverse behaviors.
References
- Gandomi A H, Yang X S., "Chaotic bat algorithm" Journal of Computational Science, 5(2): 224-232, 2014. https://doi.org/10.1016/j.jocs.2013.10.002
- Gandomi A H, Yang X S, Talatahari S, et al. "Firefly algorithm with chaos," Communications in Nonlinear Science and Numerical Simulation, 18(1): 89-98, 2013. https://doi.org/10.1016/j.cnsns.2012.06.009
- Gandomi A H, Yang X S, Alavi A H., "Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems," Engineering with computers, 29(1): 17-35. 2013. http://dx.doi.org/10.1007/s00366-012-0308-4
- Liu J, Lampinen J., "A fuzzy adaptive differential evolution algorithm," Soft Computing, 9(6): 448-462, 2005. https://doi.org/10.1007/s00500-004-0363-x
- Tan Y, Zhu Y., "Fireworks algorithm for optimization," International conference in swarm intelligence, Springer, Berlin, Heidelberg, 355-364, 2010. https://doi.org/10.1007/978-3-642-13495-1_44
- Imran, A.M., Kowsalya, M., Kothari, D.P., "A novel integration technique for optimal network reconfiguration and distributed generation placement in power distribution networks," International Journal of Electrical Power & Energy Systems, 63, 461-472, 2014. https://doi.org/10.1016/j.ijepes.2014.06.011
- Luo H, Xu W, Tan Y., "A discrete fireworks algorithm for solving large-scale travel salesman problem," 2018 IEEE Congress on Evolutionary Computation (CEC). IEEE, 1-8, 2018. https://doi.org/10.1109/CEC.2018.8477992
- Lana I, Del Ser J, Velez M., "A novel Fireworks Algorithm with wind inertia dynamics and its application to traffic forecasting," 2017 IEEE Congress on Evolutionary Computation (CEC), IEEE, 706-713, 2017. https://doi.org/10.1109/CEC.2017.7969379
- Zhou X, Zhao Q, Zhang D., "Discrete Fireworks Algorithm for Welding Robot Path Planning," Journal of Physics: Conference Series, IOP Publishing, 1267(1): 012003, 2019. https://doi.org/10.1088/1742-6596/1267/1/012003
- Tuba E, Strumberger I, Bacanin N, et al., "Bare bones fireworks algorithm for feature selection and SVM optimization," 2019 IEEE Congress on Evolutionary Computation (CEC), IEEE, 2207-2214, 2019. https://doi.org/10.1109/CEC.2019.8790033
- S. Zheng, A. Janecek, and Y. Tan, "Enhanced Fireworks Algorithm," 2013 IEEE Congress on Evolutionary Computation (CEC), pp. 2069-2077, 2013. https://doi.org/10.1109/CEC.2013.6557813
- Xue Y, Zhao B, Ma T, et al., "A self-adaptive fireworks algorithm for classification problems," IEEE Access, 6:44406-44416, 2018. https://doi.org/10.1109/ACCESS.2018.2858441
- Li J, Tan Y., "A Comprehensive Review of the Fireworks Algorithm," ACM Computing Surveys (CSUR), 52(6):1-28, 2019. https://doi.org/10.1145/3362788
- Barraza J, Melin P, Valdez F, et al., "Iterative fireworks algorithm with fuzzy coefficients," 2017 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), IEEE, 1-6, 2017. https://doi.org/10.1109/FUZZ-IEEE.2017.8015524
- Zheng Y J, Xu X L, Ling H F, et al., "A hybrid fireworks optimization method with differential evolution operators," Neurocomputing, 148: 75-82, 2015. https://doi.org/10.1016/j.neucom.2012.08.075
- Gong C., "Opposition-based adaptive fireworks algorithm," Algorithms, 9(3): 43, 2016. https://doi.org/10.3390/a9030043
- Price K V., "Differential Evolution," in Handbook of Optimization, Springer, Berlin, Heidelberg, 187-214, 2013. https://doi.org/10.1007/978-3-642-30504-7_8
- Peng H, Wu Z, Deng C., "Enhancing differential evolution with commensal learning and uniform local search," Chinese Journal of Electronics, 26(4): 725-733, 2017. https://doi.org/10.1049/cje.2016.11.010
- Zhang J, Sanderson A C., "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on evolutionary computation, 13(5): 945-958, 2009. https://doi.org/10.1109/TEVC.2009.2014613
- X. J. Wang, H. Peng, C. S. Deng, et al., "Firefly algorithm based on uniform local search and variable step size," Journal of Computer Applications, vol. 38, no. 3, pp. 715-721, 2018. http://en.cnki.com.cn/Article_en/CJFDTotal-JSJY201803020.htm
- Lixian Li, Jaewan Lee. "The Variable Amplitude Coefficient Fireworks Algorithm with Uniform Local Search Operator," Journal of Internet Computing and Services, vol. 21, no. 3, pp. 21-28, 2020. https://doi.org/10.7472/jksii.2020.21.3.21
- S. Das, A. Abraham, U. K. Chakraborty and A. Konar, "Differential Evolution Using a Neighborhood-Based Mutation Operator," in IEEE Transactions on Evolutionary Computation, vol. 13, no. 3, pp. 526-553, June 2009. https://doi.org/10.1109/TEVC.2008.2009457