Neurocomputing 128 (2014) 363–370
Stud krill herd algorithm

Gai-Ge Wang (a,*), Amir H. Gandomi (b), Amir H. Alavi (c)

a School of Computer Science and Technology, Jiangsu Normal University, Xuzhou, Jiangsu 221116, China
b Department of Civil Engineering, The University of Akron, Akron, OH 44325, USA
c Department of Civil and Environmental Engineering, Michigan State University, East Lansing, MI 48824, USA
Article info

Article history: Received 31 January 2013; received in revised form 17 June 2013; accepted 27 August 2013; available online 17 October 2013. Communicated by Prof. D. Liu.

Abstract
Recently, Gandomi and Alavi proposed a meta-heuristic optimization algorithm, called Krill Herd (KH), for global optimization [Gandomi AH, Alavi AH. Krill herd: a new bio-inspired optimization algorithm. Communications in Nonlinear Science and Numerical Simulation, 17(12), 4831–4845, 2012]. This paper presents an optimization method for global optimization using a novel variant of KH, called the Stud Krill Herd (SKH). Similar to the genetic reproduction mechanisms previously added to the KH method, an updated genetic reproduction scheme, called the stud selection and crossover (SSC) operator, is introduced into the krill updating process when dealing with numerical optimization problems. The SSC operator originates from the Stud genetic algorithm: the best krill, the Stud, provides its optimal information to all the other individuals in the population through standard genetic operators instead of stochastic selection. This approach appears to be well capable of solving a wide variety of functions. Several benchmark problems are used to test the SKH method, and the influence of different crossover types on convergence and performance is carefully studied. Experimental results indicate an instructive addition to the portfolio of swarm intelligence techniques. © 2013 Elsevier B.V. All rights reserved.
Keywords: Global optimization problem; Krill herd; Stud genetic algorithm; Stud selection and crossover operator; Multimodal function
1. Introduction

In management, computer science, and artificial intelligence, optimization is, in essence, the selection of a vector that yields an optimal value of the objective function [1]. With the development of science and technology, practical engineering optimization problems are becoming more and more complex, and intelligent stochastic methods are commonly applied to deal with them. A familiar way of categorizing techniques is by the attributes of the methods, which fall primarily into two categories: canonical methods and stochastic methods. Canonical methods always follow the same optimization path; we can repeat the optimization process and obtain the same final solutions if the optimization begins from the same initial conditions [1]. In contrast, modern stochastic methods exhibit some randomness at all times and have no rigorous steps to follow. The optimization process
is not repeatable; such methods follow a new optimization path on each run, and this randomness eventually leads to different solutions even from the same initial value. In most cases, however, both classes of methods can find near-optimal solutions, with only slight differences between them.

Recently, meta-heuristic search approaches have performed effectively in dealing with nonlinear problems. In all meta-heuristic search techniques, much effort is devoted to striking an appropriate trade-off between exploration and exploitation when searching for optimal solutions [2]. A great many robust, nature-inspired meta-heuristic search approaches have been designed to solve complicated engineering problems [3], such as parameter estimation [4], system identification [5], education [6], and engineering optimization [7,8]. The vast majority of meta-heuristic approaches can find optimal or sub-optimal solutions from a population of candidate solutions. In the last two decades, many well-known optimization techniques have been developed, such as artificial bee colony (ABC) [9], genetic programming (GP) [10], ant colony optimization (ACO) [11,12], differential evolution (DE) [13,14], evolutionary strategy (ES) [15], bat algorithm (BA) [16,17], charged system search (CSS) [18,19], biogeography-based optimization (BBO) [20], harmony search (HS) [21,22], cuckoo search (CS) [23,24], particle swarm optimization (PSO) [25–27], the big bang-big crunch algorithm [28–31], population-based incremental learning (PBIL) [32], and,
most recently, the KH algorithm [33], which is based on simulation of the swarm behavior of krill.

In 2012, a swarm intelligence approach, the KH method [33], was first presented for global optimization problems. The KH methodology draws its analogy from the herding behavior of krill individuals in nature. The objective function used in the KH method is mainly determined by the least distance to the food location and the highest swarm density, and the position of each krill is mainly governed by three motions. KH is an effective method in exploitation; however, on occasion it cannot escape local optima in multimodal fitness landscapes, so it may not search globally well [3]. In the regular KH approach, the search relies fully on randomness; therefore, it cannot always converge rapidly.

In the standard genetic algorithm (GA) [34,35], three genetic operators (selection, crossover, and mutation) repeat until a termination condition is satisfied. To improve the performance of the GA, a variety of GA variants have been developed. One of the best-known is the Stud GA (SGA) [36]. In the SGA, instead of stochastic selection, the best individual, the Stud, provides its useful information to all the other individuals in the population through GA operators [36].

In this paper, an effective SKH method combining KH with the SGA is proposed. The aim of SKH is to accelerate convergence. In the first stage of SKH, the basic KH is used to choose a promising solution set. Subsequently, for more accurate modeling of the krill behavior, an updated selection and crossover operation inspired by the SGA, called the stud selection and crossover (SSC) operator, is added to the approach. The SSC operator fine-tunes the chosen promising solutions in order to enhance the reliability and robustness of the method for global optimization. The SSC operator updates each krill's position using roulette wheel selection. The crossover operation within the SSC operator helps avoid premature convergence in the early phase of the run and refines the final solutions in the later phase. The proposed SKH method is verified on 22 benchmarks. Experimental results indicate that SKH performs more efficiently and robustly than KH and 11 other optimization methods.

The remainder of this paper is organized as follows. Sections 2 and 3 briefly describe the KH and SGA methods, respectively. Our SKH approach is presented in Section 4. The superiority of the SKH method is verified on 22 benchmarks in Section 5. Finally, Section 6 summarizes the present work.
2. KH algorithm
KH [33] is a generic stochastic optimization approach for global optimization. It is inspired by the behavior of krill swarms when hunting for food and communicating with each other. The KH approach repeatedly executes three motions and follows search directions that improve the objective function value. The time-dependent position of each krill is mainly determined by three motions:

i. foraging action;
ii. motion induced by other krill individuals;
iii. physical diffusion.

The regular KH approach adopts the Lagrangian model [33]:

$$\frac{dX_i}{dt} = F_i + N_i + D_i \qquad (1)$$
where $F_i$, $N_i$, and $D_i$ denote the foraging motion, the motion induced by other krill, and the physical diffusion of krill $i$, respectively.

The foraging motion $F_i$ comprises two components: the current food location and information about the previous food location. For krill $i$, this motion is formulated as:

$$F_i = V_f \beta_i + \omega_f F_i^{\mathrm{old}} \qquad (2)$$

where

$$\beta_i = \beta_i^{\mathrm{food}} + \beta_i^{\mathrm{best}} \qquad (3)$$
Here $V_f$ is the foraging speed, $\omega_f \in (0,1)$ is the inertia weight of the foraging motion, and $F_i^{\mathrm{old}}$ is the last foraging motion.

The direction $\alpha_i$ of the second motion, $N_i$, is estimated from three effects: a target effect, a local effect, and a repulsive effect. For krill $i$, it can be formulated as:

$$N_i^{\mathrm{new}} = N^{\mathrm{max}} \alpha_i + \omega_n N_i^{\mathrm{old}} \qquad (4)$$

where $N^{\mathrm{max}}$ is the maximum induced speed, $\omega_n \in (0,1)$ is the inertia weight of the induced motion, and $N_i^{\mathrm{old}}$ is the last motion induced by other krill.

For the $i$th krill, the physical diffusion is in fact a random process. This motion involves two components: a maximum diffusion speed and a random oriented vector. It is expressed as:

$$D_i = D^{\mathrm{max}} \delta \qquad (5)$$

where $D^{\mathrm{max}}$ is the maximum diffusion speed, and $\delta$ is the oriented vector whose components are random numbers in $[-1, 1]$.

According to the three motions analyzed above, the time-dependent position from time $t$ to $t + \Delta t$ is given by:

$$X_i(t + \Delta t) = X_i(t) + \Delta t \, \frac{dX_i}{dt} \qquad (6)$$
Most importantly, note that $\Delta t$ is an important parameter that should be tuned for the specific real-world problem. To some extent, this parameter acts as a scale factor on the speed and characterizes the variation of the global-best attraction, so its value is of vital importance in determining both the speed of convergence and how well KH works. More details about the regular KH approach and the three main motions can be found in [3,33].
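To make the update concrete, the following is a minimal Python sketch of one KH position step per Eqs. (1)–(6), assuming the induced direction alpha and the foraging direction beta have already been computed (their full definitions, involving neighbor, food, and best-position effects, appear in [33]); all names and default values here are illustrative, not the authors' reference implementation.

```python
import numpy as np

def kh_position_update(X, N_old, F_old, alpha, beta,
                       N_max=0.01, V_f=0.02, D_max=0.005,
                       omega_n=0.5, omega_f=0.5, dt=0.5):
    """One KH step for a single krill; X, N_old, F_old, alpha, beta
    are d-dimensional vectors, the remaining arguments are scalars."""
    # Eq. (4): motion induced by other krill
    N_new = N_max * alpha + omega_n * N_old
    # Eq. (2): foraging motion
    F_new = V_f * beta + omega_f * F_old
    # Eq. (5): physical diffusion with a random oriented vector in [-1, 1]
    delta = np.random.uniform(-1.0, 1.0, size=X.shape)
    D = D_max * delta
    # Eq. (1) and Eq. (6): Lagrangian velocity and position update
    dX_dt = F_new + N_new + D
    X_new = X + dt * dX_dt
    return X_new, N_new, F_new
```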
3. Genetic algorithm and SGA

The SGA is based on the simple genetic algorithm; therefore, a brief description of the GA is first provided in this section.

3.1. Genetic algorithm

The genetic algorithm (GA) is a canonical stochastic meta-heuristic search method for global optimization over large search spaces. Genetic information is encoded as a genome. Asexual reproduction yields offspring that are genetically identical to the parent, while sexual reproduction exchanges and re-orders chromosomes, producing offspring that combine genetic information from both parents. This operation is frequently called crossover, because the chromosomes cross over when swapping genetic information. To evade premature convergence, mutation is applied to increase the diversity of the population. A general GA proceeds as follows: randomly initialize a population of candidate solutions, then generate new offspring by the genetic operators. The fitness of the newly generated solutions is evaluated, and a selection scheme decides which solutions are carried into the next generation. This process repeats until a fixed number of generations is reached or some other stopping criterion is satisfied.
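As a concrete illustration of this loop, below is a minimal real-coded GA skeleton, assuming truncation survivor selection, single point crossover, and uniform random mutation; all parameter values are illustrative assumptions, not the settings used in the experiments of this paper.

```python
import numpy as np

def ga_minimize(f, d=30, pop_size=50, generations=100,
                lower=-10.0, upper=10.0, pm=0.01):
    # Randomly initialize a population of candidate solutions.
    pop = np.random.uniform(lower, upper, (pop_size, d))
    for _ in range(generations):
        fitness = np.array([f(x) for x in pop])
        order = np.argsort(fitness)            # minimization
        parents = pop[order[:pop_size // 2]]   # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            p1, p2 = parents[np.random.randint(len(parents), size=2)]
            cut = np.random.randint(1, d)      # single point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            mask = np.random.rand(d) < pm      # mutation for diversity
            child[mask] = np.random.uniform(lower, upper, mask.sum())
            children.append(child)
        pop = np.vstack([parents, children])
    fitness = np.array([f(x) for x in pop])
    return pop[np.argmin(fitness)]
```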
Genetic algorithms have been widely used since their development, and the GA has proved successful in solving many benchmark and real-world engineering problems.

3.2. SGA

The SGA [36] is a type of GA that employs the optimal genome for crossover in each generation. The idea of the SGA is to mate the optimal genome with all others to generate new offspring [36]; the SGA does not use stochastic selection. The SGA can be summarized as follows [37]:

i. Initialize a population at random.
ii. Choose the optimal genome (the Stud) for mating.
iii. Perform crossover.
iv. Repeat until a stopping criterion is satisfied.
The crossover operation is the heart of the SGA. In general, the SGA proceeds through the following steps (a sketch follows this list):

- Shuffle two stud elements (selected randomly).
- Check the diversity, measured by the Hamming distance between the shuffled stud and the current mate:
  - if the diversity is larger than a set threshold, perform crossover to generate one offspring;
  - else, mutate the current mate to generate the child [37].
- Repeat for all other mates [36].
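Below is a minimal sketch of one SGA generation for a binary-coded population (the encoding for which the Hamming distance check is natural). The threshold value, the gene-shuffling scheme, and the single-bit mutation are illustrative assumptions rather than the exact operators of [36].

```python
import numpy as np

def sga_generation(pop, fitness, threshold=5, rng=np.random.default_rng()):
    """One SGA generation: the Stud mates with every member of pop
    (a 2-D array of 0/1 integers); minimization is assumed."""
    d = pop.shape[1]
    stud = pop[np.argmin(fitness)].copy()      # the best genome (the Stud)
    new_pop = []
    for mate in pop:
        shuffled_stud = rng.permutation(stud)  # shuffle the stud's genes
        if np.sum(shuffled_stud != mate) > threshold:   # Hamming distance
            cut = rng.integers(1, d)                    # crossover
            child = np.concatenate([stud[:cut], mate[cut:]])
        else:                                           # too similar: mutate
            child = mate.copy()
            child[rng.integers(d)] ^= 1                 # flip one bit
        new_pop.append(child)
    return np.array(new_pop)
```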
4. SKH

Because the search used in the regular KH method relies completely on randomness, KH cannot always find the optimal solutions. Adaptive genetic reproduction mechanisms have already been introduced into the KH algorithm to improve its performance [33]. Nevertheless, on occasion KH still fails to advance toward better solutions on some high-dimensional, complicated problems. Generally, the regular KH method is skillful at exploring the search space extensively and locating the region of the global optimum, but it is poor at refining solutions in a greedy way [3].

In the present study, in order to considerably advance the performance of KH, and similarly to the adaptive genetic reproduction mechanisms of [33], an updated genetic reproduction scheme, called the stud selection and crossover (SSC) operator, is introduced into the KH approach, yielding the SKH approach for optimizing benchmark functions. The SSC operator is inspired by the well-known SGA. That is, the attributes of natural evolution are bestowed on the original krill, giving rise to a variant of "super krill" capable of executing the SSC operator. In SKH, the SSC operator accepts only newly generated better solutions for each krill individual, whereas KH is inclined to accept every updated krill.

The framework of the SSC operator is given in Algorithm 1. As Algorithm 1 shows, the SSC operator actually comprises two minor operators: selection and crossover. As in the SGA, the idea of the SSC operator is to mate the optimal krill (the Stud) with the other krill to generate child krill that replace not-so-good solutions; in essence, no stochastic selection is used in the SSC operator. In Algorithm 1, the Stud, i.e., the optimal krill individual, is first chosen as one parent. Another parent is then selected by roulette wheel selection to mate with the Stud and create children; note that the Stud must not be selected as the second parent. Next, using the two selected parents, a new krill Xi′ is generated by some form of crossover. The crossover operation is the heart of the SSC operator. The quality Fi′ of the newly generated offspring Xi′ is evaluated by the objective function. If Fi′ < Fi, we accept this newly generated krill
individual Xi′ as Xi+1 in the next generation. Otherwise, the time-dependent position of the krill in the search space is updated by Eq. (6) as the new solution Xi+1 in the next generation. This greedy selection strategy allows the whole population to proceed toward better solutions without ever degrading the population.

Algorithm 1. Stud selection and crossover (SSC) operator
Begin
  Perform the selection operator:
    Choose the best krill (the Stud) for mating.
  Implement the crossover operator:
    Generate a new krill Xi′ by crossover.
    Evaluate its quality/fitness Fi′.
  if Fi′ < Fi then
    Accept the newly generated solution Xi′ as Xi+1
  else
    Update the krill by Eq. (6) as Xi+1
  end if
End.

In SKH, the regular KH approach is first applied to narrow the search to a more promising area. Thereafter, the novel SSC operator, a good greedy strategy, accepts only improved solutions to raise the quality of the population. In this way, the proposed SKH approach can search the whole space extensively with the basic KH method and extract useful information with the SSC operator; both the good exploration of the regular KH method and the extraordinary exploitation ability of the SSC operator are fully exerted. In fact, by the construction of SKH, the regular KH component emphasizes global search at the start of the process to escape local optima, while the SSC operator drives local search in the later phase of the run. Through this effective mechanism, the proposed SKH method makes full use of the extensive exploration of KH while compensating for the weak local search of the basic KH approach. Compared with other optimization approaches, this is an advantage of the method, as the simulations below show. Most importantly, this mechanism further eases the long-standing conflict between exploration and exploitation.

By merging the SSC operator analyzed above with the regular KH method, the SKH is obtained; its framework is described in Algorithm 2 (a code sketch follows the algorithm). Here, NP is the size of the parent population P.

Algorithm 2. SKH algorithm
Begin
  Step 1: Initialization. Set the generation counter t = 1; initialize the population P of NP krill; set the foraging speed Vf, the maximum diffusion speed Dmax, the maximum induced speed Nmax, and the crossover probability pc.
  Step 2: Evaluation. Evaluate each krill based on its position.
  Step 3: while t < MaxGeneration do
    Sort all the krill according to their fitness.
    for i = 1 : NP do
      Perform the three motions.
      Update the position of krill i by the SSC operator in Algorithm 1.
      Evaluate krill i based on its new position Xi+1.
    end for
    Sort all the krill and find the current best.
    t = t + 1;
  Step 4: end while
  Step 5: Output the best solution.
End.
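A minimal sketch of the SSC operator of Algorithm 1 is given below for a minimization problem. The inverse-cost weighting inside roulette_wheel_select, the generation of a single candidate per call, and the parameter names are illustrative assumptions; kh_update stands for the position update of Eq. (6).

```python
import numpy as np

def roulette_wheel_select(fitness, exclude, rng):
    # Selection probability inversely related to cost (minimization);
    # the Stud (index `exclude`) must not be chosen as the second parent.
    weights = 1.0 / (fitness - fitness.min() + 1e-12)
    weights[exclude] = 0.0
    return rng.choice(len(fitness), p=weights / weights.sum())

def ssc_operator(f, pop, fitness, i, kh_update, rng=np.random.default_rng()):
    """Return the next-generation position of krill i (Algorithm 1)."""
    d = pop.shape[1]
    stud_idx = np.argmin(fitness)                     # the Stud
    mate_idx = roulette_wheel_select(fitness, stud_idx, rng)
    cut = rng.integers(1, d)                          # single point crossover
    candidate = np.concatenate([pop[stud_idx][:cut], pop[mate_idx][cut:]])
    if f(candidate) < fitness[i]:                     # greedy acceptance
        return candidate
    return kh_update(pop[i])                          # fall back to Eq. (6)
```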
Table 1. Benchmark functions.

| ID  | Name            | ID  | Name          |
|-----|-----------------|-----|---------------|
| F01 | Ackley          | F12 | Rastrigin     |
| F02 | Dixon and Price | F13 | Rosenbrock    |
| F03 | Fletcher–Powell | F14 | Schwefel 2.26 |
| F04 | Griewank        | F15 | Schwefel 1.2  |
| F05 | Levy            | F16 | Schwefel 2.22 |
| F06 | Penalty #1      | F17 | Schwefel 2.21 |
| F07 | Penalty #2      | F18 | Sphere        |
| F08 | Perm #1         | F19 | Step          |
| F09 | Perm #2         | F20 | Sum Squares   |
| F10 | Powell          | F21 | Trid          |
| F11 | Quartic         | F22 | Zakharov      |
5. Simulation experiments

Here, a wide selection of benchmarks is used to investigate the effectiveness of SKH. All the benchmarks were collected from previous research studying various aspects of optimization with stochastic techniques. These benchmarks are listed in Table 1; more details about them can be found in [20,38,39].

5.1. The performance of SKH

In order to investigate the performance of SKH, it is compared with eleven meta-heuristic methods: ABC [9], ACO [11,40], BBO [20], DE [13], ES [15], GA [34,35], HS [21,22], KH [33], PBIL [32], PSO [25,41], and SGA [36]. In [33], among all variants of the KH method, KH II significantly outperformed the other variants, testifying to the robustness of the KH approach; therefore, we use KH II as the basic KH method here. In the simulations below, we use fixed parameters for KH, SGA, and SKH: foraging speed Vf = 0.02, maximum diffusion speed Dmax = 0.005, maximum induced speed Nmax = 0.01, and a single point crossover probability pc = 1 (only for SKH; the reason for selecting single point crossover is given in Section 5.2). For ABC, ACO, BBO, DE, ES, GA, HS, PBIL, and PSO, the parameters are set as in [20,42,43]. We ran each method 200 times on each problem to obtain representative performance. The results of the simulations are shown in Tables 2 and 3, which report the average and the best performance of each method, respectively. The optimal solution achieved by each method for each benchmark is marked in bold. Note that different scales are used to normalize the results, so values are not comparable across functions; the detailed normalization process can be found in [44]. In our work, the dimension used in all the methods is 30 (i.e., d = 30).

From Table 2, on average, SKH is the most effective at finding the objective function minimum on seventeen of the twenty-two benchmarks (F01, F02, F04–F07, F10–F15, and F17–F21). BBO is the second most effective, performing best on benchmarks F03, F16, and F22. DE ranks third and performs best on functions F08 and F09. For the best solutions, Table 3 shows that SKH performs best on twenty of the twenty-two benchmarks (F01–F02 and F04–F21). ACO and BBO are the next most effective, performing best on benchmarks F22 and F03, respectively.

To examine the merits of the SKH approach in more detail, convergence plots of the twelve methods are also provided. Limited by the length of the paper, only the most representative problems are shown in Figs. 1–5. The results are the average optimal objective function values obtained from 200 runs, reported as accurate (not normalized) function values. We write KH for KH II in the legends of the following figures and in the text below. Fig. 1 illustrates that SKH is significantly superior to all the other approaches; among the others, SGA, BBO, and KH rank second, third, and fourth, while ABC, ACO, DE, ES, GA, HS, PBIL, and PSO cannot find the global minimum within 50 generations.
Furthermore, all the approaches start the optimization from the same initial point, yet SKH greatly outstrips the others almost immediately. For this case, PSO finds the best solutions initially among the twelve methods, though it converges slowly later; SKH overtakes it after 3 generations.

From Fig. 2, SKH finds the best function value for the Powell function. SGA, BBO, and KH perform second, third, and fourth best at finding the global optimum, converging more slowly than SKH. SKH clearly converges the fastest and significantly outperforms all other approaches for this case; all the approaches have almost the same starting point, yet SKH leads throughout the whole optimization process.

Studying Table 2 and Fig. 3 carefully, SGA performs slightly better than BBO on this multimodal function, and both are inferior to SKH and KH. Furthermore, ABC, ACO, DE, ES, GA, HS, PBIL, and PSO fail to find satisfactory solutions under the given conditions.

From Fig. 4, it is apparent that SKH is significantly superior to all the others throughout the optimization process. From Table 2 and Fig. 4, BBO, KH, and SGA also perform well and rank second, third, and fourth, respectively; all of them are inferior to the SKH method.

From Fig. 5, SKH performs far better than the other approaches for this problem. Among the other algorithms, SGA, BBO, and KH perform very well and rank second, third, and fourth, respectively.

From Tables 2 and 3 and Figs. 1–5, we can conclude that our SKH approach greatly outperforms the other eleven optimization techniques. In most cases, BBO, KH, and SGA are inferior only to SKH and perform the second best among the twelve approaches. Finally, note that BBO was compared with seven methods on 14 functions and an engineering problem in [20], and those experiments demonstrated the robustness of BBO; this indirectly indicates that our meta-heuristic SKH approach is a more robust and efficient optimization approach than other meta-heuristic search methods.

5.2. Influence of different crossover types

In general, three different crossover types are used in the SGA: single point crossover, two point crossover, and uniform crossover. The choice of crossover type is significant for all kinds of evolutionary algorithms when solving specific problems. To examine the effects of the three crossover types, we ran 200 simulations of SKH on the fourteen most representative test problems. The experimental results are recorded in Tables 4 and 5, which report the best and mean performance of the SKH approach, respectively. Here, single point crossover, two point crossover, and uniform crossover are used in SKH1, SKH2, and SKH3, respectively. From Table 4, for the best solutions, SKH1 clearly outperforms SKH3 and SKH2 in most cases. From Table 5, on average, SKH1 performs a little better than SKH3, while SKH2 performs the worst of the three. In conclusion, SKH1 performs the best among the three SKH variants, so SKH1 (single point crossover) is selected as the basic SKH method in our experiments.
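For reference, minimal sketches of the three crossover types compared here are shown below for real-coded parent vectors of equal length; each returns one child, and the 0.5 gene-selection probability in the uniform variant is an illustrative assumption.

```python
import numpy as np

def single_point(p1, p2, rng=np.random.default_rng()):
    # Split both parents at one random cut point.
    cut = rng.integers(1, len(p1))
    return np.concatenate([p1[:cut], p2[cut:]])

def two_point(p1, p2, rng=np.random.default_rng()):
    # Swap the segment between two random cut points.
    a, b = sorted(rng.choice(np.arange(1, len(p1)), size=2, replace=False))
    return np.concatenate([p1[:a], p2[a:b], p1[b:]])

def uniform(p1, p2, rng=np.random.default_rng()):
    # Draw each gene independently from either parent.
    mask = rng.random(len(p1)) < 0.5
    return np.where(mask, p1, p2)
```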
Table 2. Mean normalized optimization results.

| ID | ABC | ACO | BBO | DE | ES | GA | HS | KH | PBIL | PSO | SGA | SKH |
|----|-----|-----|-----|----|----|----|----|----|------|-----|-----|-----|
| F01 | 10.26 | 10.28 | 6.22 | 9.64 | 11.67 | 10.88 | 11.89 | 3.31 | 11.95 | 10.28 | 7.11 | 1.00 |
| F02 | 2.7E4 | 3.3E4 | 1.3E3 | 1.1E4 | 1.0E5 | 1.4E4 | 8.9E4 | 2.2E3 | 1.1E5 | 2.6E4 | 1.1E3 | 1.00 |
| F03 | 3.06 | 8.63 | 1.00 | 4.01 | 8.70 | 5.01 | 8.22 | 3.87 | 8.03 | 7.44 | 1.25 | 7.54 |
| F04 | 165.36 | 31.50 | 23.56 | 97.72 | 237.51 | 110.04 | 389.58 | 16.71 | 434.74 | 178.54 | 28.21 | 1.00 |
| F05 | 8.36 | 13.21 | 1.56 | 10.23 | 25.37 | 10.49 | 23.85 | 3.49 | 29.06 | 14.10 | 1.53 | 1.00 |
| F06 | 3.5E7 | 4.8E7 | 5.5E5 | 1.4E7 | 1.1E8 | 6.4E6 | 1.5E8 | 2.4E5 | 2.0E8 | 1.8E7 | 7.6E4 | 1.00 |
| F07 | 2.0E6 | 8.0E6 | 6.5E4 | 8.1E5 | 6.5E6 | 5.5E5 | 8.8E6 | 4.1E4 | 1.1E7 | 1.6E6 | 2.2E4 | 1.00 |
| F08 | 1.3E5 | 4.4E4 | 2.8E5 | 1.00 | 107.21 | 7.8E3 | 23.48 | 5.4E4 | 2.7E3 | 500.37 | 3.1E3 | 267.90 |
| F09 | 2.3E4 | 2.1E4 | 9.8E4 | 1.00 | 27.10 | 2.1E3 | 6.49 | 1.7E4 | 1.2E3 | 64.82 | 2.1E3 | 1.7E3 |
| F10 | 692.72 | 2.0E3 | 171.68 | 1.1E3 | 2.7E3 | 575.38 | 2.1E3 | 346.61 | 2.2E3 | 875.41 | 62.12 | 1.00 |
| F11 | 2.0E4 | 2.3E4 | 965.60 | 8.7E3 | 7.4E4 | 9.5E3 | 7.1E4 | 1.5E3 | 8.8E4 | 1.9E4 | 781.75 | 1.00 |
| F12 | 3.46 | 5.96 | 1.30 | 4.82 | 6.47 | 5.18 | 6.33 | 3.05 | 6.52 | 4.90 | 2.28 | 1.00 |
| F13 | 26.93 | 117.65 | 5.42 | 22.22 | 118.90 | 39.11 | 86.60 | 6.40 | 102.62 | 30.06 | 6.98 | 1.00 |
| F14 | 146.62 | 99.66 | 55.76 | 175.20 | 199.75 | 88.94 | 231.32 | 160.91 | 238.14 | 236.25 | 76.49 | 1.00 |
| F15 | 9.22 | 8.71 | 5.67 | 12.23 | 12.72 | 8.77 | 11.78 | 6.56 | 12.50 | 9.40 | 8.05 | 1.00 |
| F16 | 3.12 | 5.72 | 1.00 | 3.49 | 7.85 | 4.38 | 6.39 | 3.30 | 6.34 | 7.99 | 1.73 | 2.40 |
| F17 | 19.39 | 12.66 | 14.61 | 18.56 | 18.26 | 15.63 | 18.68 | 3.26 | 19.28 | 19.83 | 13.21 | 1.00 |
| F18 | 2.0E3 | 3.8E3 | 313.40 | 1.2E3 | 5.5E3 | 2.8E3 | 4.9E3 | 238.67 | 5.8E3 | 2.2E3 | 491.27 | 1.00 |
| F19 | 267.55 | 110.10 | 41.08 | 169.54 | 534.58 | 216.13 | 681.77 | 25.64 | 757.85 | 288.55 | 45.87 | 1.00 |
| F20 | 127.08 | 201.70 | 20.00 | 72.14 | 320.50 | 106.34 | 303.46 | 22.17 | 361.85 | 112.56 | 24.81 | 1.00 |
| F21 | 45.22 | 11.79 | 9.90 | 62.42 | 64.26 | 6.80 | 99.48 | 13.58 | 108.44 | 54.18 | 5.80 | 1.00 |
| F22 | 1.42 | 1.8E6 | 1.00 | 1.82 | 2.09 | 1.54 | 9.77 | 1.20 | 1.82 | 2.06 | 1.31 | 5.83 |
| Total | 0 | 0 | 3 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 17 |
Table 3. Best normalized optimization results.

| ID | ABC | ACO | BBO | DE | ES | GA | HS | KH | PBIL | PSO | SGA | SKH |
|----|-----|-----|-----|----|----|----|----|----|------|-----|-----|-----|
| F01 | 231.31 | 223.86 | 123.44 | 207.13 | 268.93 | 227.34 | 281.00 | 43.98 | 281.40 | 232.66 | 141.42 | 1.00 |
| F02 | 1.5E5 | 1.6E5 | 4.4E3 | 8.1E4 | 8.3E5 | 3.9E4 | 5.8E5 | 1.6E4 | 8.0E5 | 6.0E4 | 4.3E3 | 1.00 |
| F03 | 4.13 | 13.78 | 1.00 | 6.88 | 14.64 | 6.13 | 14.56 | 6.91 | 11.09 | 11.41 | 1.54 | 6.18 |
| F04 | 77.23 | 12.54 | 9.58 | 70.50 | 172.31 | 59.57 | 310.73 | 10.74 | 361.59 | 131.05 | 14.48 | 1.00 |
| F05 | 1.8E3 | 2.6E3 | 351.34 | 1.6E3 | 5.4E3 | 2.1E3 | 6.1E3 | 574.29 | 6.2E3 | 3.1E3 | 288.87 | 1.00 |
| F06 | 2.4E8 | 13.55 | 4.1E3 | 6.1E7 | 1.1E9 | 2.8E6 | 9.7E8 | 5.0E5 | 1.1E9 | 1.1E8 | 365.48 | 1.00 |
| F07 | 1.5E9 | 59.28 | 1.5E7 | 8.8E8 | 6.7E9 | 2.1E8 | 1.1E10 | 3.1E7 | 1.0E10 | 1.3E9 | 2.3E5 | 1.00 |
| F08 | 2.4E25 | 1.1E26 | 1.1E26 | 1.2E16 | 4.6E21 | 1.1E26 | 2.9E21 | 1.0E17 | 1.1E26 | 4.5E22 | 1.1E26 | 1.00 |
| F09 | 1.9E24 | 1.1E26 | 1.1E26 | 1.7E16 | 2.8E21 | 1.1E26 | 8.2E19 | 3.6E7 | 1.1E26 | 1.3E22 | 1.1E26 | 1.00 |
| F10 | 2.0E5 | 6.9E5 | 3.5E4 | 3.5E5 | 5.5E5 | 7.6E4 | 7.6E5 | 7.4E4 | 8.5E5 | 2.7E5 | 1.0E4 | 1.00 |
| F11 | 4.4E10 | 3.6E10 | 1.5E9 | 1.6E10 | 2.6E11 | 1.3E10 | 2.8E11 | 3.4E9 | 4.0E11 | 4.1E10 | 6.3E8 | 1.00 |
| F12 | 37.06 | 64.02 | 12.74 | 51.03 | 72.55 | 42.57 | 67.30 | 30.78 | 70.03 | 53.95 | 20.79 | 1.00 |
| F13 | 17.02 | 149.42 | 7.46 | 24.02 | 136.09 | 33.64 | 99.20 | 7.52 | 116.68 | 21.37 | 7.05 | 1.00 |
| F14 | 1.0E5 | 5.1E4 | 3.9E4 | 1.4E5 | 1.5E5 | 4.9E4 | 1.7E5 | 1.0E5 | 1.7E5 | 1.8E5 | 4.1E4 | 1.00 |
| F15 | 33.33 | 20.25 | 15.65 | 35.98 | 45.97 | 22.46 | 41.91 | 20.88 | 43.24 | 32.03 | 14.60 | 1.00 |
| F16 | 24.85 | 43.78 | 5.17 | 25.53 | 63.63 | 30.91 | 54.26 | 23.78 | 53.10 | 49.72 | 13.41 | 1.00 |
| F17 | 222.33 | 127.28 | 157.51 | 209.73 | 217.83 | 126.95 | 235.66 | 26.34 | 235.76 | 206.09 | 115.19 | 1.00 |
| F18 | 4.0E5 | 7.1E5 | 3.4E4 | 2.0E5 | 1.1E6 | 3.5E5 | 1.1E6 | 4.1E4 | 1.2E6 | 4.3E5 | 8.1E4 | 1.00 |
| F19 | 1.1E4 | 4.3E3 | 1.3E3 | 7.8E3 | 2.6E4 | 5.7E3 | 3.9E4 | 1.2E3 | 4.2E4 | 1.5E4 | 1.1E3 | 1.00 |
| F20 | 1.4E4 | 2.7E4 | 2.7E3 | 1.1E4 | 5.1E4 | 8.7E3 | 4.5E4 | 3.1E3 | 6.7E4 | 1.2E4 | 2.2E3 | 1.00 |
| F21 | 1.0E5 | 1.0E4 | 1.2E4 | 1.2E5 | 1.2E5 | 1.1E4 | 1.2E5 | 1.3E4 | 1.3E5 | 1.1E5 | 1.1E4 | 1.00 |
| F22 | 76.44 | 1.00 | 39.78 | 76.50 | 103.35 | 52.36 | 92.70 | 51.58 | 95.75 | 65.66 | 48.53 | 73.62 |
| Total | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 20 |
The simulation experiments conducted in Sections 5.1 and 5.2 show that the proposed SKH algorithm with single point crossover performs best and most effectively when dealing with global numerical optimization problems.
6. Conclusion

In the present work, the SSC operator has been introduced into the KH approach to produce an improved search method, called SKH, for optimization problems. In SKH, the regular KH method is first used to shrink the search to a limited, promising region. The SSC operator, comprising selection and crossover operations, is then applied to substitute good candidate solutions for not-so-good ones, enhancing the reliability and accuracy of the method on optimization problems. When solving complicated problems, KH may not proceed to better solutions at all times [33]; the SSC operator is then adaptively launched to restart the search through its crossover operator. With both techniques merged, SKH can balance exploration and exploitation and deal with complicated multimodal problems efficiently. Furthermore, from the experimental results, we conclude that SKH considerably improves both the accuracy of the global optimum found and the quality of the solutions. However, like other meta-heuristics, SKH has an inherent limitation: the control parameters must be fine-tuned for each specific real-world problem. In function optimization, a variety of problems still deserve further scrutiny, and more robust optimization approaches should be developed for specific problems.
[Figs. 1–5 show convergence comparisons of the twelve methods (ABC, ACO, BBO, DE, ES, GA, HS, KH, PBIL, PSO, SGA, and SKH): benchmark function value versus number of generations over 50 generations.]

Fig. 1. Performance comparison for the F02 Dixon and Price function.
Fig. 2. Performance comparison for the F10 Powell function.
Fig. 3. Performance comparison for the F19 Step function.
Fig. 4. Performance comparison for the F20 Sum Squares function.
Fig. 5. Performance comparison for the F21 Trid function.
Table 4. Best normalized optimization results with different crossover types.

| ID | SKH1 | SKH2 | SKH3 |
|----|------|------|------|
| F01 | 1.00 | 3.60 | 2.88 |
| F03 | 1.00 | 9.3E4 | 3.9E4 |
| F04 | 1.00 | 1.01 | 1.00 |
| F06 | 1.80 | 1.00 | 1.24 |
| F07 | 1.00 | 11.96 | 10.11 |
| F11 | 9.8E6 | 76.10 | 1.00 |
| F12 | 1.00 | 28.28 | 43.91 |
| F13 | 1.00 | 18.40 | 17.52 |
| F14 | 1.00 | 2.2E3 | 1.9E3 |
| F15 | 1.00 | 36.87 | 35.40 |
| F16 | 1.00 | 15.53 | 2.96 |
| F17 | 1.00 | 3.51 | 2.61 |
| F18 | 495.07 | 1.77 | 1.00 |
| F19 | 1.00 | 162.00 | 37.00 |
| Total | 11 | 1 | 3 |

Table 5. Mean normalized optimization results with different crossover types.

| ID | SKH1 | SKH2 | SKH3 |
|----|------|------|------|
| F01 | 1.23 | 1.14 | 1.00 |
| F03 | 1.00 | 3.0E4 | 2.0E4 |
| F04 | 3.22 | 1.77 | 1.00 |
| F06 | 6.30 | 1.18 | 1.00 |
| F07 | 1.00 | 3.25 | 3.40 |
| F11 | 2.2E5 | 39.41 | 1.00 |
| F12 | 1.00 | 14.38 | 9.23 |
| F13 | 1.00 | 2.31 | 2.13 |
| F14 | 1.00 | 938.92 | 896.62 |
| F15 | 1.00 | 41.02 | 39.66 |
| F16 | 1.00 | 3.51 | 1.43 |
| F17 | 1.18 | 1.11 | 1.00 |
| F18 | 132.20 | 22.42 | 1.00 |
| F19 | 1.00 | 168.03 | 54.03 |
| Total | 8 | 0 | 6 |
Future work can focus on the following issues. Firstly, the proposed SKH method may be applied to practical engineering optimization problems to prove its efficiency on real-world problems. Secondly, a newer
meta-heuristic search technique could be devised to solve more complicated optimization problems more efficiently. Thirdly, in the current work we demonstrated the superiority of SKH only through numerical study; further mathematical analysis, for example using dynamic systems such as Markov chains, could prove and explain the convergence of the proposed method. Finally, there are various ways to evaluate the performance of optimization algorithms; in this work, only the average and best values versus the number of generations were studied. In future work, the performance of SKH may be evaluated in other ways, such as computational complexity in terms of floating-point operations or the number of fitness function evaluations.

References

[1] G. Wang, L. Guo, A novel hybrid bat algorithm with harmony search for global numerical optimization, J. Appl. Math. (2013) 1–21.
[2] X.S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, 2010.
[3] G. Wang, L. Guo, A.H. Gandomi, L. Cao, J. Li, A.H. Alavi, H. Duan, Lévy-flight krill herd algorithm, Math. Probl. Eng. (2013) 1–14.
[4] H.-C. Lu, M.-H. Chang, C.-H. Tsai, Parameter estimation of fuzzy neural network controller based on a modified differential evolution, Neurocomputing 89 (2012) 178–192.
[5] X. Hong, S. Chen, The system identification and control of Hammerstein system using non-uniform rational B-spline neural network and particle swarm optimization, Neurocomputing 82 (2012) 216–223.
[6] H. Duan, W. Zhao, G. Wang, X. Feng, Test-sheet composition using analytic hierarchy process and hybrid metaheuristic algorithm TS/BBO, Math. Probl. Eng. (2012) 1–22.
[7] X.S. Yang, A.H. Gandomi, S. Talatahari, A.H. Alavi, Metaheuristics in Water, Geotechnical and Transport Engineering, Elsevier, Waltham, MA, 2013.
[8] A.H. Gandomi, X.S. Yang, S. Talatahari, A.H. Alavi, Metaheuristic Applications in Structures and Infrastructures, Elsevier, Waltham, MA, 2013.
[9] D. Karaboga, B. Basturk, A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm, J. Global Optim. 39 (2007) 459–471.
[10] A.H. Gandomi, A.H. Alavi, Multi-stage genetic programming: a new strategy to nonlinear system modeling, Inform. Sci. 181 (2011) 5227–5239.
[11] M. Dorigo, T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, 2004.
[12] T.-J. Hsieh, H.-F. Hsiao, W.-C. Yeh, Mining financial distress trend data using penalty guided support vector machines based on hybrid of particle swarm optimization and artificial bee colony algorithm, Neurocomputing 82 (2012) 196–206.
[13] R. Storn, K. Price, Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim. 11 (1997) 341–359.
[14] A.H. Gandomi, X.-S. Yang, S. Talatahari, S. Deb, Coupled eagle strategy and differential evolution for unconstrained and constrained global optimization, Comput. Math. Appl. 63 (2012) 191–200.
[15] H. Beyer, The Theory of Evolution Strategies, Springer, New York, 2001.
[16] X.S. Yang, A.H. Gandomi, Bat algorithm: a novel approach for global engineering optimization, Eng. Comput. 29 (2012) 464–483.
[17] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, Path planning for UCAV using bat algorithm with mutation, Sci. World J. (2012) 1–15.
[18] A. Kaveh, S. Talatahari, A novel heuristic optimization method: charged system search, Acta Mech. 213 (2010) 267–289.
[19] A. Kaveh, S. Talatahari, Charged system search for optimal design of frame structures, Appl. Soft Comput. 12 (2012) 382–393.
[20] D. Simon, Biogeography-based optimization, IEEE Trans. Evolut. Comput. 12 (2008) 702–713.
[21] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2001) 60–68.
[22] D. Zou, L. Gao, J. Wu, S. Li, Novel global harmony search algorithm for unconstrained problems, Neurocomputing 73 (2010) 3308–3318.
[23] A.H. Gandomi, X.-S. Yang, A.H. Alavi, Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems, Eng. Comput. 29 (2013) 17–35.
[24] G. Wang, L. Guo, H. Duan, H. Wang, L. Liu, M. Shao, A hybrid meta-heuristic DE/CS algorithm for UCAV three-dimension path planning, Sci. World J. (2012) 1–11.
[25] J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, IEEE, Perth, Australia, 1995, pp. 1942–1948.
[26] S. Gholizadeh, F. Fattahi, Design optimization of tall steel buildings by a modified particle swarm algorithm, Struct. Des. Tall Spec. (2013), http://dx.doi.org/10.1002/tal.1042.
[27] S. Talatahari, M. Kheirollahi, C. Farahmandpour, A.H. Gandomi, A multi-stage particle swarm for optimum design of truss structures, Neural Comput. Appl. 23 (2013) 1297–1309, http://dx.doi.org/10.1007/s00521-012-1072-5.
[28] O.K. Erol, I. Eksin, A new optimization method: Big Bang-Big Crunch, Adv. Eng. Softw. 37 (2006) 106–111.
[29] A. Kaveh, S. Talatahari, Size optimization of space trusses using Big Bang-Big Crunch algorithm, Comput. Struct. 87 (2009) 1129–1140.
[30] A. Kaveh, S. Talatahari, Optimal design of Schwedler and ribbed domes via hybrid Big Bang-Big Crunch algorithm, J. Constr. Steel Res. 66 (2010) 412–419.
[31] A. Kaveh, S. Talatahari, A discrete Big Bang-Big Crunch algorithm for optimal design of skeletal structures, Asian J. Civil Eng. 11 (2010) 103–122.
[32] B. Shumeet, Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning, Carnegie Mellon University, Pittsburgh, PA, 1994.
[33] A.H. Gandomi, A.H. Alavi, Krill herd: a new bio-inspired optimization algorithm, Commun. Nonlinear Sci. Numer. Simul. 17 (2012) 4831–4845.
[34] D.E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, 1998.
[35] H. He, Y. Tan, A two-stage genetic algorithm for automatic clustering, Neurocomputing 81 (2012) 49–59.
[36] W. Khatib, P. Fleming, The stud GA: a mini revolution?, in: A. Eiben, T. Back, M. Schoenauer, H. Schwefel (Eds.), Proceedings of the 5th International Conference on Parallel Problem Solving from Nature, Springer-Verlag, New York, USA, 1998, pp. 683–691.
[37] V.V.R. Silva, W. Khatib, P.J. Fleming, Performance optimization of gas turbine engine, Eng. Appl. Artif. Intell. 18 (2005) 575–583.
[38] X. Yao, Y. Liu, G. Lin, Evolutionary programming made faster, IEEE Trans. Evolut. Comput. 3 (1999) 82–102.
[39] X.-S. Yang, Z. Cui, R. Xiao, A.H. Gandomi, M. Karamanoglu, Swarm Intelligence and Bio-Inspired Computation, Elsevier, Waltham, MA, 2013.
[40] R. Vatankhah, S. Etemadi, A. Alasty, G.-R. Vossoughi, M. Boroushaki, Active leading through obstacles using ant-colony algorithm, Neurocomputing 88 (2012) 67–77.
[41] Y. Sun, L. Zhang, X. Gu, A hybrid co-evolutionary cultural algorithm based on particle swarm optimization for solving global optimization problems, Neurocomputing 98 (2012) 76–89.
[42] G.-G. Wang, L. Guo, H. Duan, H. Wang, A new improved firefly algorithm for global numerical optimization, J. Comput. Theor. Nanosci. 11 (2014) 477–485, http://dx.doi.org/10.1166/jctn.2014.3383.
[43] G. Wang, L. Guo, H. Wang, H. Duan, L. Liu, J. Li, Incorporating mutation scheme into krill herd algorithm for global numerical optimization, Neural Comput. Appl. (2013), http://dx.doi.org/10.1007/s00521-012-1304-8.
[44] G.-G. Wang, A.H. Gandomi, A.H. Alavi, G.-S. Hao, Hybrid krill herd algorithm with differential evolution for global numerical optimization, Neural Comput. Appl. (2013) 1–10, http://dx.doi.org/10.1007/s00521-013-1485-9.
Gai-Ge Wang obtained his bachelor's degree in computer science and technology from Yili Normal University, Yining, Xinjiang, China, in 2007. He received his master's degree in the field of intelligent planning and planning recognition at Northeast Normal University, Changchun, China. In 2010 he began working on his Ph.D., developing target threat evaluation using computational intelligence techniques, at the Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun, China. He is currently a researcher in the School of Computer Science and Technology at Jiangsu Normal University, Xuzhou, China. Gai-Ge Wang has published over 20 journal and conference papers. His research interests are meta-heuristic optimization methods and their applications in engineering.
Amir H. Gandomi is a pioneer of the Krill Herd algorithm. He was selected as an elite in 2008 by the Iranian National Institute of Elites. He was formerly a lecturer at Tafresh University and a researcher at the National Elites Foundation. He is currently a researcher in the Department of Civil Engineering at the University of Akron, OH. Amir Hossein Gandomi has published over 60 journal papers and several discussion papers, conference papers, and book chapters. He holds two patents and has published three books with Elsevier. His research interests are metaheuristic modeling and optimization.
Amir H. Alavi is a pioneer of the Krill Herd algorithm, proposed in 2012. He graduated in civil and geotechnical engineering from Iran University of Science & Technology (IUST), Tehran, Iran. He was formerly a lecturer at the Eqbal Lahoori Institute of Higher Education and a researcher in the School of Civil Engineering at IUST. He is currently a researcher in the Department of Civil & Environmental Engineering at Michigan State University, MI, USA. He has published over 90 research papers and book chapters. He holds two patents and has published two books with Elsevier. His research interests are engineering modeling and optimization.