An Improved Gravitational Search Algorithm For Optimization Problems
Wei Li1,2
1. School of Computer Science and Engineering, Xi’an University of Technology, Xi’an 710048, China
2. Shaanxi Key Laboratory for Network Computing and Security Technology, Xi’an 710048, China
E-mail: [email protected]
Abstract: Although GSA is an effective optimization algorithm, the best fitness found by GSA cannot be improved in every generation. In order to improve the performance of GSA, this paper proposes an improved gravitational search algorithm (IGSA). In IGSA, a crossover operation is introduced so that each solution can inherit some useful information from the global best solution, which greatly enhances the exploitation capability of the algorithm. In addition, a linearly decreasing inertia weight is used to balance the global and local search abilities. To verify the effectiveness of IGSA, numerical experiments are carried out on ten benchmark problems from CEC2014. The experimental results show that IGSA is competitive with the compared algorithms for solving optimization problems.
Key Words: Gravitational Search Algorithm, Crossover, Weight, Optimization Algorithm
978-1-7281-0106-4/19/$31.00 ©2019 IEEE    2605
G^g = G0 × exp(−τ × g / MaxGen)    (3)

where G0 is set to 1 and τ is set to 23.

In order to avoid trapping in a local optimum, Kbest is introduced in the GSA. Kbest represents the set of the top K best solutions, where K is a linearly decreasing function of time. The total gravitational force that acts on mass i in different directions is obtained by the following formula [10]:

F_{i,j}^g = Σ_{k∈Kbest, k≠i} rand_k × F_{ik,j}^g    (4)

where rand_k is a uniformly distributed random number in the range (0,1).

According to the law of motion, the acceleration of the mass i at generation g is given by the following formula [10]:

a_{i,j}^g = F_{i,j}^g / M_i^g    (5)

where M_i^g is the inertial mass of the ith solution. M_i^g is updated by the following formulas [10]:

m_i^g = (f(x_i^g) − f(x_worst^g)) / (f(x_best^g) − f(x_worst^g))    (6)

M_i^g = m_i^g / Σ_{j=1}^{NP} m_j^g    (7)

where f(x_i^g) represents the fitness of the solution x_i at generation g, and f(x_best^g) and f(x_worst^g) represent the best fitness value and the worst fitness value in the current population, respectively.

Afterward, the velocity and the position of each solution are calculated by the following formulas [10]:

v_{i,j}^{g+1} = rand_i × v_{i,j}^g + a_{i,j}^g    (8)

x_{i,j}^{g+1} = x_{i,j}^g + v_{i,j}^{g+1}    (9)

The details of GSA can be found in [10].

3 IMPROVED GRAVITATIONAL SEARCH ALGORITHM

Traditional GSA employs the acceleration and velocity to update the position of each solution during the search process. This search mechanism shows good performance in maintaining the population diversity [13]. However, the global optimal solution, which is beneficial to promoting exploitation, is not kept in the population all the time. In order to make full use of the guidance of the best solution, a crossover operation between the best solution found so far and the current solution is introduced into GSA to improve the exploitation ability. In addition, in order to restrict the change of velocities and control the scope of the search, an inertia weight is introduced instead of rand_i. The idea comes from the particle swarm optimization (PSO) algorithm. The velocity of each solution is updated as follows:

v_{i,j}^{g+1} = ω × v_{i,j}^g + a_{i,j}^g    (10)

ω = ω_max − (ω_max − ω_min) / MaxGen × g    (11)

where ω_max and ω_min are set to 0.9 and 0.4, respectively, g is the current generation, and MaxGen is the maximum generation.

The proposed crossover operation is calculated as follows:

u_{i,j}^{g+1} = { x_{best,j}^g,   if rand < CR or j = jrand
              { x_{i,j}^{g+1},  otherwise    (12)

where CR is the crossover rate which is used to control the probability of inheriting from the global best solution at generation g, rand is a uniformly distributed random number in the range (0,1), and jrand ∈ {1,2,…,D} is a randomly chosen index. With the crossover operation, the solution inherits some information from the best solution, which can improve the exploitation ability of the algorithm. x_{i,j}^{g+1} is calculated according to Eq.(9).

Subsequently, a greedy selection scheme is used:

x_i^{g+1} = { u_i^{g+1},   if f(u_i^{g+1}) < f(x_best^g)
            { x_i^{g+1},   otherwise    (13)

The pseudo-code of IGSA is illustrated in Algorithm 1.

Algorithm 1. IGSA algorithm
01: Set g = 0, CR = 0.9, FES = 0, G0 = 1
02: Initialize population
03: while termination condition is not satisfied
04:     Evaluate the fitness value of each solution
05:     Update x_best^g, x_worst^g, M_i^g (i = 1,2,…,NP)
06:     for i = 1 to NP
07:         Compute the total force of the ith solution according to Eq.(4)
08:     endfor
09:     for i = 1 to NP
10:         Compute the acceleration of each solution according to Eq.(5)
11:         Update the velocity of each solution according to Eq.(10)
12:         Update the position of the ith solution according to Eq.(9)
13:         Generate jrand
14:         for j = 1 to D
15:             Compute u_{i,j}^{g+1} according to Eq.(12)
16:         endfor
17:         if f(u_i^{g+1}) < f(x_best^g)
18:             x_i^{g+1} = u_i^{g+1}
19:         endif
20:     endfor
21: endwhile

4 EXPERIMENTS AND DISCUSSIONS

In order to examine the performance of IGSA, 10 benchmark problems from CEC2014 are selected. They are the unimodal problems f1-f3 and the simple multimodal problems f4-f10. In this section, IGSA is compared with two optimization algorithms, GSA and PSOGSA; the simulation results and comparisons are presented in the following.

For all experiments, 30 independent runs are carried out on the same machine with an Intel Core 3.40 GHz CPU, 4 GB memory, and the Windows 7 operating system with Matlab R2013a, and each run is terminated after D×10,000 function evaluations (FES). In our experimental studies, the mean value (Fmean) and standard deviation (SD) of the solution error measure [14], which is defined as f(x) − f(x*), are recorded for evaluating the performance of each algorithm, where f(x) is the best fitness value found by an algorithm in a run, and f(x*) is the real global optimum.

2606 The 31st Chinese Control and Decision Conference (2019 CCDC)
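For illustration, the complete IGSA procedure of Algorithm 1 (Eqs.(3)-(13)) can be sketched in Python. This is a minimal reading of the procedure described above, not the authors' experimental code: the population size, search bounds, Kbest schedule, and the Euclidean-distance force term are illustrative assumptions.

```python
import math
import random

def igsa(f, dim, bounds, pop_size=20, max_gen=200, cr=0.9, g0=1.0, tau=23.0, seed=0):
    """Minimal IGSA sketch: GSA with inertia weight (Eqs. 10-11),
    best-solution crossover (Eq. 12) and greedy acceptance (Eq. 13)."""
    rng = random.Random(seed)
    lo, hi = bounds
    eps = 1e-12
    # Initialize positions and velocities, then track the global best
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    v = [[0.0] * dim for _ in range(pop_size)]
    fit = [f(xi) for xi in x]
    best_idx = min(range(pop_size), key=lambda i: fit[i])
    x_best, f_best = x[best_idx][:], fit[best_idx]

    for g in range(max_gen):
        G = g0 * math.exp(-tau * g / max_gen)        # Eq. (3)
        w = 0.9 - (0.9 - 0.4) * g / max_gen          # Eq. (11), w_max=0.9, w_min=0.4
        worst, best = max(fit), min(fit)
        # Inertial masses, Eqs. (6)-(7) (minimization form)
        m = [(worst - fi) / (worst - best + eps) for fi in fit]
        s = sum(m) + eps
        M = [mi / s for mi in m]
        # Kbest: linearly decreasing number of attracting masses (assumed schedule)
        k = max(1, round(pop_size * (1 - g / max_gen)))
        kbest = sorted(range(pop_size), key=lambda i: fit[i])[:k]
        for i in range(pop_size):
            # Total force (Eq. 4) folded into the acceleration (Eq. 5)
            a = [0.0] * dim
            for j in kbest:
                if j == i:
                    continue
                r = math.dist(x[i], x[j]) + eps
                for d in range(dim):
                    # a_d += rand * G * M_j * (x_j - x_i) / R  (M_i cancels in F/M_i)
                    a[d] += rng.random() * G * M[j] * (x[j][d] - x[i][d]) / r
            # Velocity (Eq. 10) and position (Eq. 9) updates, clamped to bounds
            for d in range(dim):
                v[i][d] = w * v[i][d] + a[d]
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            # Crossover with the global best solution (Eq. 12)
            jrand = rng.randrange(dim)
            u = [x_best[d] if (rng.random() < cr or d == jrand) else x[i][d]
                 for d in range(dim)]
            # Greedy acceptance against the best solution (Eq. 13, Alg. 1 line 17)
            fu = f(u)
            if fu < f_best:
                x[i], x_best, f_best = u[:], u[:], fu
            fit[i] = f(x[i])
            if fit[i] < f_best:
                x_best, f_best = x[i][:], fit[i]
    return x_best, f_best
```

Because the trial vector is accepted only when it improves on the best solution found so far, the best fitness is non-increasing over generations, which mirrors the exploitation argument made for the crossover operation.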
All results are compared at the 0.05 significance level, and three marks "+", "−" and "≈" are used to evaluate the results clearly. "+", "−" and "≈" denote that the performance of IGSA is better than, worse than, and similar to that of the compared algorithm, respectively.

4.1 Comparison of IGSA with other optimization algorithms on 10 Dimensions

From the statistical results of Table 1, we can see that IGSA performs well on f1-f10 except f2. GSA performs well on f2. IGSA beats GSA and PSOGSA on nine and eight test problems, respectively. On the contrary, GSA and PSOGSA outperform IGSA only on one and zero test problems, respectively. The reason that IGSA has the better performance may be that the crossover operation enhances the exploitation ability. Fig.1 and Fig.2 display the convergence graphs of problems f1 and f8 in terms of the mean errors (in logarithmic scale) achieved by each of the three algorithms for the CEC2014 problems versus the number of FES for 10 dimensions. It can be obviously observed that IGSA shows a fast convergence rate on problems f1 and f8. It is worth mentioning that GSA and PSOGSA tend to suffer from slow exploitation on these problems.

Table 1. Experimental results of GSA, PSOGSA and IGSA on f1-f10 (10 dimensions)

Function   GSA (Mean±SD)         PSOGSA (Mean±SD)      IGSA (Mean±SD)
f1         4.23e+06±1.37e+06     8.81e+06±2.81e+07     3.88e+04±3.40e+04
f2         3.42e+02±3.74e+02     6.22e+08±1.24e+09     2.68e+03±2.77e+03
f3         1.81e+04±4.26e+03     1.75e+04±2.63e+04     4.51e+03±3.57e+03
f4         6.31e+01±1.15e+01     6.29e+01±1.11e+02     1.81e+01±2.08e+01
f5         2.04e+01±9.47e-02     2.01e+01±1.49e-01     2.00e+01±5.45e-02
f6         1.09e+01±7.66e-01     4.76e+00±2.17e+00     4.25e+00±1.66e+00
f7         2.29e+01±1.80e+01     1.85e+01±2.55e+01     2.94e-01±1.54e-01
f8         4.09e+01±1.30e+01     2.86e+01±1.51e+01     9.84e-01±1.80e+00
f9         3.96e+01±9.81e+00     4.42e+01±2.08e+01     1.38e+01±5.13e+00
f10        1.22e+03±3.41e+02     6.66e+02±3.40e+02     1.66e+02±1.12e+02
+          9                     8                     /
−          1                     0                     /
≈          0                     2                     /

Fig.1 Convergence graphs of f1

Fig.2 Convergence graphs of f8

4.2 Comparison of IGSA with other optimization algorithms on 30 Dimensions

From the statistical results of Table 2, we can see that IGSA performs well on all the problems except f2 and f5. GSA performs well on f2. PSOGSA performs well on f5. IGSA beats GSA and PSOGSA on nine and nine test problems, respectively. On the contrary, GSA and PSOGSA outperform IGSA only on zero and one test problems, respectively. Thus, IGSA provides better results than the compared algorithms. Fig.3 and Fig.4 display the convergence graphs of problems f3 and f6 in terms of the mean errors (in logarithmic scale) achieved by each of the three algorithms for the CEC2014 problems versus the number of FES for 30 dimensions. As seen, IGSA shows a fast convergence rate on problems f3 and f6. GSA tends to suffer from trapping in local optima on these problems. Therefore, it can be concluded that the proposed crossover operation in IGSA can achieve better performance.

Table 2. Experimental results of GSA, PSOGSA and IGSA on f1-f10 (30 dimensions)

Function   GSA (Mean±SD)         PSOGSA (Mean±SD)      IGSA (Mean±SD)
f1         4.25e+06±9.95e+05     1.17e+08±2.07e+08     5.62e+05±4.09e+05
f2         8.45e+03±2.82e+03     9.64e+09±1.09e+10     1.09e+04±8.56e+03
f3         8.80e+04±1.14e+04     3.67e+04±3.74e+04     1.22e+03±1.35e+03
f4         1.77e+02±4.69e+01     6.61e+02±8.48e+02     9.64e+01±3.93e+01
f5         2.09e+01±6.07e-02     2.02e+01±1.58e-01     2.03e+01±1.62e-01
f6         3.61e+01±2.64e+00     2.37e+01±4.26e+00     1.57e+01±3.27e+00
f7         7.05e+01±6.55e+01     8.23e+01±9.23e+01     2.72e-01±2.78e-01
f8         1.55e+02±1.77e+01     1.82e+02±4.33e+01     7.05e+00±9.16e+00
f9         1.86e+02±2.65e+01     2.91e+02±7.60e+01     1.13e+02±2.54e+01
f10        3.88e+03±6.33e+02     3.66e+03±6.58e+02     5.46e+02±3.38e+02
+          9                     9                     /
−          0                     1                     /
≈          1                     0                     /

Fig.3 Convergence graphs of f3

Fig.4 Convergence graphs of f6

5 CONCLUSIONS

An improved optimization algorithm based on GSA (IGSA) is proposed in this paper. With the crossover operation, the dimensions of each solution learn from the global best solution under certain conditions; otherwise they are updated from the solution's own search. Therefore, the exploitation ability of the algorithm can be improved. Moreover, the linearly decreasing weight is used to balance the global and local search abilities. The performance of IGSA is tested on ten benchmark problems from CEC2014. The comparisons with two other algorithms indicate that IGSA achieves better performance in most cases. Future work will apply IGSA to solving multimodal optimization problems.

REFERENCES

[1] S. Mirjalili, A. H. Gandomi, Chaotic gravitational constants for the gravitational search algorithm, Applied Soft Computing, Vol. 53, 407-419, 2017.
[2] C. Segura, C. A. Coello Coello, A. G. Hernández-Díaz, Improving the vector generation strategy of Differential Evolution for large-scale optimization, Information Sciences, Vol. 323, 106-129, 2015.
[3] R. Moeini, M. Soltani-nezhad, M. Daei, Constrained gravitational search algorithm for large scale reservoir operation optimization problem, Engineering Applications of Artificial Intelligence, Vol. 62, 222-233, 2017.
[4] G. Y. Sun, A. Z. Zhang, X. P. Jia, X. D. Li, S. Y. Ji, Z. J. Wang, DMMOGSA: Diversity-enhanced and memory-based multi-objective gravitational search algorithm, Information Sciences, Vol. 363, 52-71, 2016.
[5] M. Alswaitti, M. K. Ishak, N. A. M. Isa, Optimized gravitational-based data clustering algorithm, Engineering Applications of Artificial Intelligence, Vol. 73, 126-148, 2018.
[6] M. F. Zaman, S. M. Elsayed, T. Ray, R. A. Sarker, Evolutionary Algorithms for Dynamic Economic Dispatch Problems, IEEE Trans. on Power Systems, Vol. 31, No. 2, 1486-1495, 2016.
[7] H. Garg, A hybrid GSA-GA algorithm for constrained optimization problems, Information Sciences, Vol. 478, 499-523, 2019.
[8] D. Pelusi, R. Mascella, L. Tallini, J. Nayak, B. Naik, A. Abraham, Neural network and fuzzy system for the tuning of Gravitational Search Algorithm parameters, Expert Systems with Applications, Vol. 102, 234-244, 2018.
[9] C. Shou-Hsiung, C. Shyi-Ming, J. Wen-Shan, Fuzzy time series forecasting based on fuzzy logical relationships and similarity measures, Information Sciences, Vol. 327, 272-287, 2016.
[10] E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, GSA: A Gravitational Search Algorithm, Intelligent Information Management, Vol. 4, No. 6, 390-395, 2012.
[11] S. Mirjalili, A. Lewis, Adaptive gbest-guided gravitational search algorithm, Neural Computing and Applications, Vol. 25, No. 7, 1569-1584, 2014.
[12] J. J. Liang, B. Y. Qu, P. N. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization, Zhengzhou University and Nanyang Technological University, Tech. Rep., 2013.
[13] S. Darzi, T. S. Kiong, M. T. Islam, H. R. Soleymanpour, S. Kibria, A memory-based gravitational search algorithm for enhancing minimum variance distortionless response beamforming, Applied Soft Computing, Vol. 47, 103-118, 2016.
[14] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y. P. Chen, A. Auger, S. Tiwari, Problem definitions and evaluation criteria for the CEC2005 special session on real-parameter optimization, Tech. Rep., 2005, URL <http://www.ntu.edu.sg/home/EPNSugan>.
[15] R. V. Rao, V. J. Savsani, D. P. Vakharia, Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems, Computer-Aided Design, Vol. 43, No. 3, 303-315, 2011.