[1] J. Cheng, G. Yuan, M. Zhou, S. Gao, C. Liu, and H. Duan, “A fluid mechanics-based data flow model to estimate VANET capacity,” IEEE Transactions on Intelligent Transportation Systems, 2019, doi: 10.1109/TITS.2019.2921074.
[2] Y. Wang, Y. Yu, S. Cao, X. Zhang, and S. Gao, “A review of applications of artificial intelligent algorithms in wind farms,” Artificial Intelligence Review, 2019, doi: 10.1007/s10462-019-09768-7.
[3] S. Gao, Q. Cao, Z. Zhang, and Z. Tang, “A chaotic clonal selection algorithm and its application to synthesize multiple-valued logic functions,” IEEJ Transactions on Electrical and Electronic Engineering, vol. 5, no. 1, pp. 105–114, 2010.
[4] S. Gao, S. Song, J. Cheng, Y. Todo, and M. Zhou, “Incorporation of solvent effect into multi-objective evolutionary algorithm for improved protein structure prediction,” IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 4, pp. 1365–1378, 2018.
[5] J. Wang, B. Cen, S. Gao, Z. Zhang, and Y. Zhou, “Cooperative evolutionary framework with focused search for many-objective optimization,” IEEE Transactions on Emerging Topics in Computational Intelligence, 2018, doi: 10.1109/TETCI.2018.2849380.
[6] J. Wang, L. Yuan, Z. Zhang, S. Gao, Y. Sun, and Y. Zhou, “Multiobjective multiple neighborhood search algorithms for multiobjective fleet size and mix location-routing problem with time windows,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2019, doi: 10.1109/TSMC.2019.2912194.
[7] P. J. Van Laarhoven and E. H. Aarts, “Simulated annealing,” in Simulated Annealing: Theory and Applications. Springer, 1987, pp. 7–15.
[8] F. Glover and M. Laguna, “Tabu search,” in Handbook of Combinatorial Optimization. Springer, 1998, pp. 2093–2229.
[9] D. S. Weile and E. Michielssen, “Genetic algorithm optimization applied to electromagnetics: A review,” IEEE Transactions on Antennas and Propagation, vol. 45, no. 3, pp. 343–353, 1997.
[10] X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[11] K. V. Price, “Differential evolution,” in Handbook of Optimization. Springer, 2013, pp. 187–214.
[12] Y. Yu, S. Gao, Y. Wang, and Y. Todo, “Global optimum-based search differential evolution,” IEEE/CAA Journal of Automatica Sinica, vol. 6, no. 2, pp. 379–394, 2018.
[13] S. Gao, Y. Yu, Y. Wang, J. Wang, J. Cheng, and M. Zhou, “Chaotic local search-based differential evolution algorithms for optimization,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2019, doi: 10.1109/TSMC.2019.2956121.
[14] M. Dorigo and C. Blum, “Ant colony optimization theory: A survey,” Theoretical Computer Science, vol. 344, no. 2-3, pp. 243–278, 2005.
[15] R. Eberhart and J. Kennedy, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, vol. 4. IEEE, 1995, pp. 1942–1948.
[16] Y. Shi, “Brain storm optimization algorithm,” in International Conference in Swarm Intelligence. Springer, 2011, pp. 303–309.
[17] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “GSA: a gravitational search algorithm,” Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[18] E. Atashpaz-Gargari and C. Lucas, “Imperialist competitive algorithm: an algorithm for optimization inspired by imperialistic competition,” in 2007 IEEE Congress on Evolutionary Computation. IEEE, 2007, pp. 4661–4667.
[19] H.-G. Beyer and H.-P. Schwefel, “Evolution strategies – a comprehensive introduction,” Natural Computing, vol. 1, no. 1, pp. 3–52, 2002.
[20] Z. W. Geem, Music-inspired harmony search algorithm: theory and applications. Springer, 2009, vol. 191.
[21] S. A. Hofmeyr and S. Forrest, “Architecture for an artificial immune system,” Evolutionary Computation, vol. 8, no. 4, pp. 443–473, 2000.
[22] K. M. Passino, “Bacterial foraging optimization,” International Journal of Swarm Intelligence Research, vol. 1, no. 1, pp. 1–16, 2010.
[23] A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems,” Engineering with Computers, vol. 29, no. 1, pp. 17–35, 2013.
[24] D. Karaboga and B. Akay, “A comparative study of artificial bee colony algorithm,” Applied Mathematics and Computation, vol. 214, no. 1, pp. 108–132, 2009.
[25] S. Salcedo-Sanz, J. Del Ser, I. Landa-Torres, S. Gil-López, and J. Portilla-Figueras, “The coral reefs optimization algorithm: a novel metaheuristic for efficiently solving optimization problems,” The Scientific World Journal, vol. 2014, Article ID 739768, 15 pages, 2014.
[26] X.-S. Yang, “Firefly algorithm, Lévy flights and global optimization,” in Research and Development in Intelligent Systems XXVI. Springer, 2010, pp. 209–218.
[27] R. V. Rao, V. Savsani, and J. Balic, “Teaching–learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems,” Engineering Optimization, vol. 44, no. 12, pp. 1447–1462, 2012.
[28] M. Eusuff, K. Lansey, and F. Pasha, “Shuffled frog-leaping algorithm: a memetic meta-heuristic for discrete optimization,” Engineering Optimization, vol. 38, no. 2, pp. 129–154, 2006.
[29] H. Duan and P. Qiao, “Pigeon-inspired optimization: a new swarm intelligence optimizer for air robot path planning,” International Journal of Intelligent Computing and Cybernetics, vol. 7, no. 1, pp. 24–37, 2014.
[30] L. Shan, H. Qiang, J. Li, and Z.-Q. Wang, “Chaotic optimization algorithm based on tent map,” Control and Decision, vol. 20, no. 2, pp. 179–182, 2005.
[31] H. Shah-Hosseini, “The intelligent water drops algorithm: a nature-inspired swarm-based optimization algorithm,” International Journal of Bio-inspired Computation, vol. 1, no. 1-2, pp. 71–79, 2009.
[32] M.-H. Tayarani-N and M. Akbarzadeh-T, “Magnetic optimization algorithms a new synthesis,” in 2008 IEEE Congress on Evolutionary Computation. IEEE, 2008, pp. 2659–2664.
[33] J. Cheng, J. Cheng, M. Zhou, F. Liu, S. Gao, and C. Liu, “Routing in internet of vehicles: A review,” IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 5, pp. 2339–2352, 2015.
[34] Y. Liu, D. Cheng, Y. Wang, J. Cheng, and S. Gao, “A novel method for predicting vehicle state in internet of vehicles,” Mobile Information Systems, vol. 2018, Article ID 9728328, 2018.
[35] S. Gao, Q. Cao, M. Ishii, and Z. Tang, “Local search with probabilistic modeling for learning multiple-valued logic networks,” IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. 94, no. 2, pp. 795–805, 2011.
[36] J. Sun, S. Gao, H. Dai, J. Cheng, M. Zhou, and J. Wang, “Bi-objective elite differential evolution for multivalued logic networks,” IEEE Transactions on Cybernetics, vol. 50, no. 1, pp. 233–246, 2020.
[37] S. Gao, J. Zhang, X. Wang, and Z. Tang, “Multi-layer neural network learning algorithm based on random pattern search method,” International Journal of Innovative Computing, Information and Control, vol. 5, no. 2, pp. 489–502, 2009.
[38] S. Gao, M. Zhou, Y. Wang, J. Cheng, H. Yachi, and J. Wang, “Dendritic neural model with effective learning algorithms for classification, approximation, and prediction,” IEEE Transactions on Neural Networks and Learning Systems, vol. 30, no. 2, pp. 601–614, 2019.
[39] A.-L. Barabási, “Scale-free networks: a decade and beyond,” Science, vol. 325, no. 5939, pp. 412–413, 2009.
[40] D. J. Watts and S. H. Strogatz, “Collective dynamics of ’small-world’ networks,” Nature, vol. 393, no. 6684, pp. 440–442, 1998.
[41] H. Dai, S. Gao, Y. Yang, and Z. Tang, “Effects of rich-gets-richer rule on small-world networks,” Neurocomputing, vol. 73, no. 10-12, pp. 2286–2289, 2010.
[42] R. Bell and P. Dean, “Properties of vitreous silica: Analysis of random network models,” Nature, vol. 212, no. 5068, pp. 1354–1356, 1966.
[43] M. E. Newman, “Analysis of weighted networks,” Physical Review E, vol. 70, no. 5, 056131, 2004.
[44] S. Boccaletti, V. Latora, Y. Moreno, M. Chavez, and D.-U. Hwang, “Complex networks: Structure and dynamics,” Physics Reports, vol. 424, no. 4, pp. 175–308, 2006.
[45] J. Cheng, X. Wu, M. Zhou, S. Gao, Z. Huang, and C. Liu, “A novel method for detecting new overlapping community in complex evolving networks,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2018, doi: 10.1109/TSMC.2017.2779138.
[46] J. Cheng, M. Chen, M. Zhou, S. Gao, C. Liu, and C. Liu, “Overlapping community change point detection in an evolving network,” IEEE Transactions on Big Data, 2018, doi: 10.1109/TBDATA.2018.2880780.
[47] S. Gao, Y. Wang, J. Wang, and J.-J. Cheng, “Understanding differential evolution: A Poisson law derived from population interaction network,” Journal of Computational Science, vol. 21, pp. 140–149, 2017.
[48] B. Dorronsoro and P. Bouvry, “Improving classical and decentralized differential evolution with new mutation operator and population topologies,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 67–98, 2011.
[49] S. Janson and M. Middendorf, “A hierarchical particle swarm optimizer and its adaptive variant,” IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 35, no. 6, pp. 1272–1282, 2005.
[50] Y. Wang, S. Gao, Y. Yu, and Z. Xu, “The discovery of population interaction with a power law distribution in brain storm optimization,” Memetic Computing, vol. 11, no. 1, pp. 65–87, 2019.
[51] Y. Yu, S. Gao, S. Cheng, Y. Wang, S. Song, and F. Yuan, “CBSO: a memetic brain storm optimization with chaotic local search,” Memetic Computing, vol. 10, no. 4, pp. 353–367, 2018.
[52] J. Wang, W. Zhang, and J. Zhang, “Cooperative differential evolution with multiple populations for multiobjective optimization,” IEEE Transactions on Cybernetics, vol. 46, no. 12, pp. 2848–2861, 2016.
[53] J. Wang, Y. Zhou, Y. Wang, J. Zhang, C. P. Chen, and Z. Zheng, “Multiobjective vehicle routing problems with simultaneous delivery and pickup and time windows: formulation, instances, and algorithms,” IEEE Transactions on Cybernetics, vol. 46, no. 3, pp. 582–594, 2016.
[54] S. Gao, Y. Wang, J. Cheng, Y. Inazumi, and Z. Tang, “Ant colony optimization with clustering for solving the dynamic location routing problem,” Applied Mathematics and Computation, vol. 285, pp. 149–173, 2016.
[55] J. L. Payne, M. Giacobini, and J. H. Moore, “Complex and dynamic population structures: synthesis, open questions, and future directions,” Soft Computing, vol. 17, no. 7, pp. 1109–1120, 2013.
[56] E. Alba and M. Tomassini, “Parallelism and evolutionary algorithms,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 5, pp. 443–462, 2002.
[57] N. Lynn, M. Z. Ali, and P. N. Suganthan, “Population topologies for particle swarm optimization and differential evolution,” Swarm and Evolutionary Computation, vol. 39, pp. 24–35, 2018.
[58] M. Tomassini, Spatially structured evolutionary algorithms: artificial evolution in space and time. Springer, 2006.
[59] E. Alba and B. Dorronsoro, Cellular genetic algorithms. Springer Science & Business Media, 2009, vol. 42.
[60] E. Cantú-Paz, Efficient and accurate parallel genetic algorithms. Springer Science & Business Media, 2000, vol. 1.
[61] D. Whitley, “A genetic algorithm tutorial,” Statistics and Computing, vol. 4, no. 2, pp. 65–85, 1994.
[62] W. Fang, J. Sun, H. Chen, and X. Wu, “A decentralized quantum-inspired particle swarm optimization algorithm with cellular structured population,” Information Sciences, vol. 330, pp. 19–48, 2016.
[63] J. Liao, Y. Cai, T. Wang, H. Tian, and Y. Chen, “Cellular direction information based differential evolution for numerical optimization: an empirical study,” Soft Computing, vol. 20, no. 7, pp. 2801–2827, 2016.
[64] I. Arnaldo, I. Contreras, D. Millán-Ruiz, J. I. Hidalgo, and N. Krasnogor, “Matching island topologies to problem structure in parallel evolutionary algorithms,” Soft Computing, vol. 17, no. 7, pp. 1209–1225, 2013.
[65] J. Apolloni, J. García-Nieto, E. Alba, and G. Leguizamón, “Empirical evaluation of distributed differential evolution on standard benchmarks,” Applied Mathematics and Computation, vol. 236, pp. 351–366, 2014.
[66] L. Vanneschi, D. Codecasa, and G. Mauri, “An empirical study of parallel and distributed particle swarm optimization,” in Parallel Architectures and Bioinspired Algorithms. Springer, 2012, pp. 125–150.
[67] F. Zou, D. Chen, and R. Lu, “Hybrid hierarchical backtracking search optimization algorithm and its application,” Arabian Journal for Science and Engineering, vol. 43, no. 2, pp. 993–1014, 2018.
[68] M. Owais and M. K. Osman, “Complete hierarchical multi-objective genetic algorithm for transit network design problem,” Expert Systems with Applications, vol. 114, pp. 143–154, 2018.
[69] D. Sánchez, P. Melin, and O. Castillo, “Optimization of modular granular neural networks using a hierarchical genetic algorithm based on the database complexity applied to human recognition,” Information Sciences, vol. 309, pp. 73–101, 2015.
[70] S. Rastegar, R. Araújo, and J. Mendes, “Online identification of Takagi–Sugeno fuzzy models based on self-adaptive hierarchical particle swarm optimization algorithm,” Applied Mathematical Modelling, vol. 45, pp. 606–620, 2017.
[71] L. Rodríguez, O. Castillo, J. Soria, P. Melin, F. Valdez, C. I. González, G. E. Martínez, and J. Soto, “A fuzzy hierarchical operator in the grey wolf optimizer algorithm,” Applied Soft Computing, vol. 57, pp. 315–328, 2017.
[72] F. Herrera, M. Lozano, and C. Moraga, “Hierarchical distributed genetic algorithms,” International Journal of Intelligent Systems, vol. 14, no. 11, pp. 1099–1121, 1999.
[73] Z. Cao, Y. Shi, X. Rong, B. Liu, Z. Du, and B. Yang, “Random grouping brain storm optimization algorithm with a new dynamically changing step size,” in International Conference in Swarm Intelligence. Springer, 2015, pp. 357–364.
[74] H. Zhu and Y. Shi, “Brain storm optimization algorithms with k-medians clustering algorithms,” in 2015 Seventh International Conference on Advanced Computational Intelligence. IEEE, 2015, pp. 107–110.
[75] Y. Yu, S. Gao, Y. Wang, J. Cheng, and Y. Todo, “ASBSO: An improved brain storm optimization with flexible search length and memory-based selection,” IEEE Access, vol. 6, pp. 36 977–36 994, 2018.
[76] Y. Yu, S. Gao, Y. Wang, Z. Lei, J. Cheng, and Y. Todo, “A multiple diversity-driven brain storm optimization algorithm with adaptive parameters,” IEEE Access, vol. 7, pp. 126 871–126 888, 2019.
[77] D. Zhou, Y. Shi, and S. Cheng, “Brain storm optimization algorithm with modified step-size and individual generation,” in International Conference in Swarm Intelligence. Springer, 2012, pp. 243–252.
[78] Z. Yang and Y. Shi, “Brain storm optimization with chaotic operation,” in 2015 Seventh International Conference on Advanced Computational Intelligence. IEEE, 2015, pp. 111–115.
[79] Y. Yang, Y. Shi, and S. Xia, “Advanced discussion mechanism-based brain storm optimization algorithm,” Soft Computing, vol. 19, no. 10, pp. 2997–3007, 2015.
[80] Z. Jia, H. Duan, and Y. Shi, “Hybrid brain storm optimisation and simulated annealing algorithm for continuous optimisation problems,” International Journal of Bio-Inspired Computation, vol. 8, no. 2, pp. 109–121, 2016.
[81] Z. Cao, X. Hei, L. Wang, Y. Shi, and X. Rong, “An improved brain storm optimization with differential evolution strategy for applications of ANNs,” Mathematical Problems in Engineering, vol. 2015, Article ID 923698, 18 pages, 2015.
[82] S. Cheng, Q. Qin, J. Chen, and Y. Shi, “Brain storm optimization algorithm: a review,” Artificial Intelligence Review, vol. 46, no. 4, pp. 445–458, 2016.
[83] Y. Wang, Y. Yu, S. Gao, H. Pan, and G. Yang, “A hierarchical gravitational search algorithm with an effective gravitational constant,” Swarm and Evolutionary Computation, vol. 46, pp. 118–139, 2019.
[84] M. R. Narimani, A. A. Vahed, R. Azizipanah-Abarghooee, and M. Javidsharifi, “Enhanced gravitational search algorithm for multi-objective distribution feeder reconfiguration considering reliability, loss and operational cost,” IET Generation, Transmission & Distribution, vol. 8, no. 1, pp. 55–69, 2014.
[85] S. Gao, C. Vairappan, Y. Wang, Q. Cao, and Z. Tang, “Gravitational search algorithm combined with chaos for unconstrained numerical optimization,” Applied Mathematics and Computation, vol. 231, pp. 48–62, 2014.
[86] Z. Song, S. Gao, Y. Yu, J. Sun, and Y. Todo, “Multiple chaos embedded gravitational search algorithm,” IEICE Transactions on Information and Systems, vol. 100, no. 4, pp. 888–900, 2017.
[87] S. Mirjalili and S. Z. M. Hashim, “A new hybrid PSOGSA algorithm for function optimization,” in 2010 International Conference on Computer and Information Application. IEEE, 2010, pp. 374–377.
[88] M. Khatibinia and S. Khosravi, “A hybrid approach based on an improved gravitational search algorithm and orthogonal crossover for optimal shape design of concrete gravity dams,” Applied Soft Computing, vol. 16, pp. 223–233, 2014.
[89] B. Shaw, V. Mukherjee, and S. Ghoshal, “Solution of reactive power dispatch of power systems by an opposition-based gravitational search algorithm,” International Journal of Electrical Power & Energy Systems, vol. 55, pp. 29–40, 2014.
[90] M. Soleimanpour-Moghadam, H. Nezamabadi-Pour, and M. M. Farsangi, “A quantum inspired gravitational search algorithm for numerical function optimization,” Information Sciences, vol. 267, pp. 83–100, 2014.
[91] G. Sun, P. Ma, J. Ren, A. Zhang, and X. Jia, “A stability constrained adaptive alpha for gravitational search algorithm,” Knowledge-Based Systems, vol. 139, pp. 200– 213, 2018.
[92] V. K. Bohat and K. Arya, “An effective gbest-guided gravitational search algorithm for real-parameter optimization and its application in training of feedforward neural networks,” Knowledge-Based Systems, vol. 143, pp. 192–207, 2018.
[93] B. González, F. Valdez, P. Melin, and G. Prado-Arechiga, “Fuzzy logic in the gravitational search algorithm for the optimization of modular neural networks in pattern recognition,” Expert Systems with Applications, vol. 42, no. 14, pp. 5839–5847, 2015.
[94] ——, “Fuzzy logic in the gravitational search algorithm enhanced using fuzzy logic with dynamic alpha parameter value adaptation for the optimization of modular neural networks in echocardiogram recognition,” Applied Soft Computing, vol. 37, pp. 245–254, 2015.
[95] P. Haghbayan, H. Nezamabadi-Pour, and S. Kamyab, “A niche GSA method with nearest neighbor scheme for multimodal optimization,” Swarm and Evolutionary Computation, vol. 35, pp. 78–92, 2017.
[96] K. Pal, C. Saha, S. Das, and C. A. C. Coello, “Dynamic constrained optimization with offspring repair based gravitational search algorithm,” in 2013 IEEE Congress on Evolutionary Computation. IEEE, 2013, pp. 2414–2421.
[97] H. Sajedi and S. F. Razavi, “DGSA: discrete gravitational search algorithm for solving knapsack problem,” Operational Research, vol. 17, no. 2, pp. 563–591, 2017.
[98] S. Gao, Y. Todo, T. Gong, G. Yang, and Z. Tang, “Graph planarization problem optimization based on triple-valued gravitational search algorithm,” IEEJ Transactions on Electrical and Electronic Engineering, vol. 9, no. 1, pp. 39–48, 2014.
[99] P. Das, H. S. Behera, and B. K. Panigrahi, “A hybridization of an improved particle swarm optimization and gravitational search algorithm for multi-robot path planning,” Swarm and Evolutionary Computation, vol. 28, pp. 14–28, 2016.
[100] S. Mallick, R. Kar, D. Mandal, and S. Ghoshal, “Optimal sizing of CMOS analog circuits using gravitational search algorithm with particle swarm optimization,” International Journal of Machine Learning and Cybernetics, vol. 8, no. 1, pp. 309– 331, 2017.
[101] R. Storn and K. Price, “Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[102] S. Das and P. N. Suganthan, “Differential evolution: a survey of the state-of-the-art,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 4–31, 2011.
[103] J. H. Holland, Adaptation in natural and artificial systems. MIT Press, Cambridge, MA, 1992.
[104] Y. Yu and Z.-H. Zhou, “A new approach to estimating the expected first hitting time of evolutionary algorithms,” Artificial Intelligence, vol. 172, no. 15, pp. 1809–1832, 2008.
[105] D. E. Goldberg and K. Deb, “A comparative analysis of selection schemes used in genetic algorithms,” Foundations of Genetic Algorithms, vol. 1, pp. 69–93, 1991.
[106] A. Prügel-Bennett and J. L. Shapiro, “Analysis of genetic algorithms using statistical mechanics,” Physical Review Letters, vol. 72, no. 9, 1305, 1994.
[107] B. Dorronsoro and P. Bouvry, “Study of different small-world topology generation mechanisms for genetic algorithms,” in 2012 IEEE Congress on Evolutionary Computation. IEEE, 2012, pp. 1–8.
[108] J. M. Whitacre, R. A. Sarker, and Q. T. Pham, “Effects of adaptive social networks on the robustness of evolutionary algorithms,” International Journal on Artificial Intelligence Tools, vol. 20, no. 5, pp. 783–817, 2011.
[109] C. Echegoyen, A. Mendiburu, R. Santana, and J. A. Lozano, “Estimation of Bayesian networks algorithms in a class of complex networks,” in 2010 IEEE Congress on Evolutionary Computation. IEEE, 2010, pp. 1–8.
[110] R. Salomon, “Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions. a survey of some theoretical and practical aspects of genetic algorithms,” BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
[111] J. Brest, S. Greiner, B. Bošković, M. Mernik, and V. Žumer, “Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646–657, 2006.
[112] J. Vesterstrom and R. Thomsen, “A comparative study of differential evolution, particle swarm optimization, and evolutionary algorithms on numerical benchmark problems,” in Proceedings of the 2004 Congress on Evolutionary Computation, vol. 2. IEEE, 2004, pp. 1980–1987.
[113] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y.-P. Chen, A. Auger, and S. Tiwari, “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” KanGAL Report, vol. 2005005, 2005.
[114] M. Jamil and X.-S. Yang, “A literature survey of benchmark functions for global optimisation problems,” International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013.
[115] Y. Wang, Z. Cai, and Q. Zhang, “Differential evolution with composite trial vector generation strategies and control parameters,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 55–66, 2011.
[116] Z.-H. Zhan, W.-N. Chen, Y. Lin, Y.-J. Gong, Y.-L. Li, and J. Zhang, “Parameter investigation in brain storm optimization,” in 2013 IEEE Symposium on Swarm Intelligence. IEEE, 2013, pp. 103–110.
[117] S. Cheng, Y. Shi, Q. Qin, and S. Gao, “Solution clustering analysis in brain storm optimization algorithm,” in 2013 IEEE Symposium on Swarm Intelligence. IEEE, 2013, pp. 111–118.
[118] J. M. Whitacre, R. A. Sarker, and Q. T. Pham, “The self-organization of interaction networks for nature-inspired optimization,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 2, pp. 220–230, 2008.
[119] A. Clauset, C. R. Shalizi, and M. E. Newman, “Power-law distributions in empirical data,” SIAM Review, vol. 51, no. 4, pp. 661–703, 2009.
[120] J. Derrac, S. García, D. Molina, and F. Herrera, “A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms,” Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[121] R. Jugulum, S. Taguchi et al., Computer-based robust engineering: essentials for DFSS. ASQ Quality Press, 2004.
[122] J. Qi and Z. Rong, “The emergence of scaling laws search dynamics in a particle swarm optimization,” Physica A: Statistical Mechanics and Its Applications, vol. 392, no. 6, pp. 1522–1531, 2013.
[123] S. Mirjalili and A. Lewis, “Adaptive gbest-guided gravitational search algorithm,” Neural Computing and Applications, vol. 25, no. 7-8, pp. 1569–1584, 2014.
[124] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “BGSA: binary gravitational search algorithm,” Natural Computing, vol. 9, no. 3, pp. 727–745, 2010.
[125] ——, “Filter modeling using gravitational search algorithm,” Engineering Applications of Artificial Intelligence, vol. 24, no. 1, pp. 117–122, 2011.
[126] S. Duman, U. Güvenç, Y. Sönmez, and N. Yörükeren, “Optimal power flow using gravitational search algorithm,” Energy Conversion and Management, vol. 59, pp. 86–95, 2012.
[127] J. Liang, B. Qu, P. Suganthan, and A. G. Hernández-Díaz, “Problem definitions and evaluation criteria for the CEC 2013 special session on real-parameter optimization,” Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Nanyang Technological University, Singapore, Technical Report, vol. 201212, no. 34, pp. 281–295, 2013.
[128] N. Awad, M. Ali, J. Liang, B. Qu, and P. Suganthan, “Problem definitions and evaluation criteria for the CEC 2017 special session and competition on single objective bound constrained real-parameter numerical optimization,” Technical Report, 2016.
[129] B. Gu and F. Pan, “Modified gravitational search algorithm with particle memory ability and its application,” International Journal of Innovative Computing, Information and Control, vol. 9, no. 11, pp. 4531–4544, 2013.
[130] A. Zhang, G. Sun, J. Ren, X. Li, Z. Wang, and X. Jia, “A dynamic neighborhood learning-based gravitational search algorithm,” IEEE Transactions on Cybernetics, vol. 48, no. 1, pp. 436–447, 2018.
[131] N. Hansen, S. D. Müller, and P. Koumoutsakos, “Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES),” Evolutionary Computation, vol. 11, no. 1, pp. 1–18, 2003.
[132] G. Zhu and S. Kwong, “Gbest-guided artificial bee colony algorithm for numerical function optimization,” Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.
[133] S. Mirjalili, S. M. Mirjalili, and A. Lewis, “Grey wolf optimizer,” Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[134] S. Mirjalili, “SCA: a sine cosine algorithm for solving optimization problems,” Knowledge-Based Systems, vol. 96, pp. 120–133, 2016.
[135] R. Poli, J. Kennedy, and T. Blackwell, “Particle swarm optimization,” Swarm Intelligence, vol. 1, no. 1, pp. 33–57, 2007.
[136] R. Mendes, J. Kennedy, and J. Neves, “The fully informed particle swarm: simpler, maybe better,” IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 204–210, 2004.
[137] Y. Shi and R. Eberhart, “A modified particle swarm optimizer,” in 1998 IEEE International Conference on Evolutionary Computation Proceedings. IEEE World Congress on Computational Intelligence. IEEE, 1998, pp. 69–73.
[138] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, “Comprehensive learning particle swarm optimizer for global optimization of multimodal functions,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.
[139] Z. H. Zhan, J. Zhang, Y. Li, and Y. H. Shi, “Orthogonal learning particle swarm optimization,” IEEE Transactions on Evolutionary Computation, vol. 15, no. 6, pp. 832–847, 2011.
[140] K. Premalatha and A. Natarajan, “Hybrid PSO and GA for global maximization,” Int. J. Open Problems Compt. Math, vol. 2, no. 4, pp. 597–608, 2009.
[141] C. Zhang, J. Ning, S. Lu, D. Ouyang, and T. Ding, “A novel hybrid differential evolution and particle swarm optimization algorithm for unconstrained optimization,” Operations Research Letters, vol. 37, no. 2, pp. 117–122, 2009.
[142] Y.-J. Gong, J.-J. Li, Y. Zhou, Y. Li, H. S.-H. Chung, Y.-H. Shi, and J. Zhang, “Genetic learning particle swarm optimization,” IEEE Transactions on Cybernetics, vol. 46, no. 10, pp. 2277–2290, 2016.
[143] S. Das and P. N. Suganthan, “Problem definitions and evaluation criteria for CEC 2011 competition on testing evolutionary algorithms on real world optimization problems,” Jadavpur University, Nanyang Technological University, Kolkata, pp. 341–359, 2010.