Jiuyong Li (Ed.), AI 2010: Advances in Artificial Intelligence, 23rd Australasian Joint Conference, Adelaide, Australia, December 2010, Proceedings. LNAI 6464, pp. 425–434. Springer-Verlag Berlin Heidelberg 2010.

Performance of Infeasibility Empowered Memetic Algorithm (IEMA) on Engineering Design Problems

Hemant K. Singh, Tapabrata Ray, and Warren Smith
School of Engineering and Information Technology, University of New South Wales, Australian Defence Force Academy, Canberra, ACT
{h.singh,t.ray,w.smith}@adfa.edu.au
http://www.unsw.adfa.edu.au

Abstract. Engineering design optimization problems often involve a number of constraints. These constraints may result from factors such as practicality, safety and functionality of the design, and/or limits on time and resources. In addition, for many design problems, each function evaluation may be the result of an expensive computational procedure (such as CFD or FEA), which imposes a limit on the number of function evaluations that can be carried out to find a near optimal solution. Consequently, there is significant interest in the optimization community in developing efficient algorithms for constrained optimization problems. In this paper, a new memetic algorithm is presented, which incorporates two mechanisms to expedite convergence towards the optimum. The first is the use of marginally infeasible solutions to intensify the search near the constraint boundary, where optimum solutions are most likely to be found. The second is performing a local search from promising solutions in order to inject good quality solutions into the population early during the search. The performance of the presented algorithm is demonstrated on a set of engineering design problems using a low computational budget (1000 function evaluations).

Keywords: constraint handling, engineering design, expensive problems.

1 Introduction

In recent years, population-based heuristic algorithms have gained popularity as generic optimizers. This is because they impose no conditions on the continuity or differentiability of the objective functions, and hence are suitable for a wide range of problems. In addition, they can capture the whole Pareto optimal front for multi-objective problems in a single run, as opposed to most single-point methods. Most engineering design optimization problems contain a number of constraints. These constraints usually impose limits on space, time, availability of resources, cost, safety, viability of design, aesthetics, ergonomics, and more. In addition, many engineering design problems are computationally expensive, which means that evaluating each design (a function evaluation) can take a long time. Artificial intelligence and heuristic optimization techniques are increasingly used to solve a variety of real-life optimization problems. However, the usefulness of these applications depends on how efficiently the heuristic methods are able to deal with the constraints, especially when there is a limited budget of function evaluations owing to computational complexity. Consequently, constraint handling has attracted a lot of attention from the evolutionary optimization community.

Some of the earlier proposals for constraint handling include the widely used penalty function approach, where the objective value is degraded by imposing a penalty on solutions that violate any of the constraints. A number of variants of penalty functions have been proposed, including static penalty function models [16], dynamic penalty function models [15], annealing penalty function models [18], adaptive penalty models [11] and death penalty function models [13]. There have also been a number of other proposals, including special representation schemes [3,20], repair strategies [32], and separate ranking of objective and constraint functions [9]. Detailed reviews of the various constraint handling techniques used in conjunction with evolutionary algorithms can be found in [1,19].

Since the final aim of optimization is to reach the feasible optimal solution, a preference for a feasible solution over an infeasible solution is built into the ranking of most evolutionary algorithms. Such a preference tries to drive the population towards the feasible search space before improving the objective function(s). However, the search space may often consist of disconnected feasible regions, and such a preference may then localize the solutions in a sub-optimal region, which is undesirable for convergence. In addition, for most constrained problems, the solution to the optimization problem is likely to lie on the constraint boundary. Therefore, an infeasible solution near the constraint boundary may be more suitable for guiding the search than a feasible solution away from it.
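As a concrete illustration of the penalty idea mentioned above: a static penalty simply adds a weighted sum of constraint violations to the objective, so that infeasible solutions rank worse. The sketch below is a generic example under our own choice of penalty coefficient r, not the method of any of the cited works:

```python
# Static penalty: degrade the objective of infeasible solutions by a
# weighted sum of the violations of constraints g_i(x) >= 0.

def penalized_objective(f, constraints, r=1e3):
    """Return F(x) = f(x) + r * sum(max(0, -g_i(x))) for constraints g_i(x) >= 0."""
    def F(x):
        violation = sum(max(0.0, -g(x)) for g in constraints)
        return f(x) + r * violation
    return F

f = lambda x: (x[0] - 2.0) ** 2   # unconstrained optimum at x0 = 2
g1 = lambda x: 1.0 - x[0]         # feasible region: x0 <= 1
F = penalized_objective(f, [g1])

print(F([0.5]))   # feasible: penalty is zero, F = f = 2.25
print(F([2.0]))   # infeasible: f = 0 but a penalty of 1000 * 1.0 is added
```

The quality of the result depends strongly on the choice of r, which is precisely the weakness that the static, dynamic and adaptive variants cited above try to address.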
Some of the approaches that have exploited information from infeasible solutions to expedite the search include the use of constraints as additional objectives [31,26], explicit parent matching schemes [12], and preferential treatment of the best infeasible solutions [17]. Recently, Singh et al. [30] proposed the infeasibility driven evolutionary algorithm (IDEA), which explicitly maintains marginally infeasible solutions during the search. By maintaining these solutions (in addition to good feasible solutions), the search is intensified near the constraint boundary, where the optimum solution is likely to occur. In addition, the algorithm also delivers some marginally infeasible solutions near the optimum as an output, which can be used for trade-off studies. The benefit of the infeasibility driven approach over the conventional feasible-first ranking procedure was demonstrated on a number of benchmark problems in [30,27].

In the presented work, the algorithm proposed in [30,27] is further refined in order to find near optimal solutions in relatively fewer function evaluations. In the literature, global search methods (such as evolutionary algorithms) are often used in conjunction with local search methods (such as gradient search) to search for the optimum efficiently. Such a hybrid approach is referred to as a memetic algorithm [21]; for a review of memetic algorithms, the reader is referred to [22]. The algorithm presented in this paper is a memetic algorithm that uses IDEA as the global search method. Within each generation, a local search is initiated from a promising solution in the population. The primary purpose of the local search is to inject good quality solutions into the population early during the search.


It is worth mentioning here that the use of surrogate models for the approximation of objective(s) and constraint(s) can be advantageous for solving computationally expensive problems [14], since the search can be guided by approximations in lieu of actual function evaluations. The focus of the present work is to improve upon the existing IDEA algorithm by embedding a local search in it. The algorithm presented here does not involve surrogate assistance; however, the integration of surrogate modeling with this algorithm will be considered in future work for further improvements.

The rest of the paper is organized as follows. Since the proposed algorithm utilizes concepts from the Infeasibility Driven Evolutionary Algorithm (IDEA), a background on IDEA is given in Section 2. The proposed Infeasibility Empowered Memetic Algorithm (IEMA) is described in Section 3. The performance of IEMA on a set of engineering design problems is reported in Section 4. Finally, a summary of the findings is presented in Section 5.

2 Infeasibility Driven Evolutionary Algorithm (IDEA)

The Infeasibility Driven Evolutionary Algorithm (IDEA) was proposed by Singh et al. [30]. It differs significantly from conventional EAs in terms of the ranking and selection of solutions. While most EAs rank feasible solutions above infeasible solutions, IDEA ranks solutions based on the original objectives along with an additional objective representing a constraint violation measure. IDEA explicitly maintains a few infeasible solutions during the search. In addition, "good" infeasible solutions are ranked higher than the feasible solutions, so that the search proceeds through both feasible and infeasible regions, resulting in a greater rate of convergence to optimal solutions. The benefits in convergence obtained by explicitly preserving infeasible solutions motivated the development of IDEA [30,27], where the original problem is reformulated as an unconstrained problem with the "violation measure" of the solutions as an additional objective. The violation measure is a quantity calculated from the constraint violations of the solutions in the population. The studies reported in [30,27] indicate that IDEA has a better rate of convergence than a conventional EA for a number of constrained single- and multi-objective optimization problems.

A generalized single-objective constrained optimization problem can be formulated as shown in (1):

    Minimize    f(x)
    Subject to  g_i(x) ≥ 0,   i = 1, ..., m
                h_j(x) = 0,   j = 1, ..., p        (1)

where x = (x_1, ..., x_n) is the design variable vector bounded by lower and upper bounds, x ∈ S ⊂ ℜ^n. Here, g_i(x) represents an inequality constraint, whereas h_j(x) represents an equality constraint. It is usual practice to convert the equality constraints to inequality constraints using a small tolerance (i.e. h(x) = 0 is converted to |h(x)| ≤ ε). Hence, the discussion presented here assumes the presence of inequality constraints only.
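The tolerance-based conversion of an equality constraint into an inequality can be illustrated as follows (a minimal sketch; the wrapper function and the value of ε are our own illustrative choices):

```python
# Convert an equality constraint h(x) = 0 into an inequality g(x) >= 0
# using a small tolerance eps:  |h(x)| <= eps  <=>  eps - |h(x)| >= 0.

def equality_to_inequality(h, eps=1e-4):
    """Wrap an equality constraint h(x) = 0 as g(x) = eps - |h(x)| >= 0."""
    def g(x):
        return eps - abs(h(x))
    return g

# Example: h(x) = x0 + x1 - 1 = 0
h = lambda x: x[0] + x[1] - 1.0
g = equality_to_inequality(h, eps=1e-4)

print(g([0.5, 0.5]) >= 0)   # exactly on the constraint: feasible
print(g([0.7, 0.5]) >= 0)   # violates by 0.2 > eps: infeasible
```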


To effectively search the design space (including both the feasible and infeasible regions), the original single-objective constrained optimization problem is reformulated as the bi-objective unconstrained optimization problem shown in (2):

    Minimize    f_1(x) = f(x)
                f_2(x) = violation measure        (2)

The additional objective represents a measure of constraint violation, referred to as the "violation measure". It is based on the relative amount of constraint violation among the population members. Each solution in the population is assigned m ranks, corresponding to the m constraints, calculated as follows. To obtain the ranks corresponding to the ith constraint, all solutions are sorted based on their violation of the ith constraint. Solutions that do not violate the constraint are assigned rank 0. The solution with the least constraint violation gets rank 1, and the remaining solutions are assigned increasing ranks in ascending order of their constraint violation values. The process is repeated for all constraints, so that each solution in the population is assigned m ranks. The violation measure of a solution is then the sum of its m ranks.

The main steps of IDEA are outlined in Algorithm 1. IDEA uses simulated binary crossover (SBX) and polynomial mutation operators to generate offspring from a pair of parents selected using binary tournament, as in NSGA-II [8]. Individual solutions in the population are evaluated using the original problem definition (1) and the infeasible solutions are identified. The solutions in the parent and offspring populations are divided into a feasible set (S_f) and an infeasible set (S_inf). The solutions in the two sets are ranked separately using non-dominated sorting and crowding distance sorting [8] based on the two objectives of the modified problem definition (2). The solutions for the next generation are selected from both sets so as to maintain infeasible solutions in the population.
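The rank-based violation measure described above can be sketched as follows (a minimal illustration with our own variable names; constraint values follow the g_i(x) ≥ 0 convention, so the violation amount is max(0, −g_i(x))):

```python
def violation_measure(pop_constraints):
    """pop_constraints[k][i] = value of g_i(x_k); g_i >= 0 means satisfied.

    Returns the summed per-constraint violation ranks for each solution,
    as described for IDEA: rank 0 for non-violating solutions, then
    ranks 1, 2, ... in ascending order of violation amount.
    """
    n = len(pop_constraints)
    m = len(pop_constraints[0])
    measure = [0] * n
    for i in range(m):
        # violation amount of each solution for constraint i
        viol = [max(0.0, -pop_constraints[k][i]) for k in range(n)]
        # solutions that violate constraint i, sorted by increasing violation
        violators = sorted((k for k in range(n) if viol[k] > 0),
                           key=lambda k: viol[k])
        for rank, k in enumerate(violators, start=1):
            measure[k] += rank
    return measure

# Three solutions, two constraints (g >= 0):
pop = [( 0.5,  0.2),   # feasible on both constraints -> measure 0
       (-0.1,  0.3),   # violates g1 least -> rank 1 on g1
       (-0.4, -0.2)]   # violates g1 more (rank 2) and g2 (rank 1)
print(violation_measure(pop))  # [0, 1, 3]
```

Note that the measure is relative to the current population: the same solution can receive a different value in a different population, which is consistent with the description above.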
In addition, some of the infeasible solutions are ranked higher than the feasible solutions to provide a selection pressure to create better infeasible solutions, resulting in an active search through the infeasible search space.

Algorithm 1. Infeasibility Driven Evolutionary Algorithm (IDEA)
Require: N {Population Size}
Require: NG > 1 {Number of Generations}
Require: 0 < α < 1 {Proportion of infeasible solutions}
 1: N_inf = α × N
 2: N_f = N − N_inf
 3: pop_1 = Initialize()
 4: Evaluate(pop_1)
 5: for i = 2 to NG do
 6:   child_pop_(i−1) = Evolve(pop_(i−1))
 7:   Evaluate(child_pop_(i−1))
 8:   (S_f, S_inf) = Split(pop_(i−1) + child_pop_(i−1))
 9:   Rank(S_f)
10:   Rank(S_inf)
11:   pop_i = S_inf(1 : N_inf) + S_f(1 : N_f)
12: end for

A user-defined parameter α is used to maintain a set of infeasible solutions as a fraction of the population size. The numbers N_f and N_inf denote the numbers of feasible and infeasible solutions as determined by the parameter α. If the infeasible set S_inf has more than N_inf solutions, the first N_inf solutions are selected based on their rank; otherwise, all solutions from S_inf are selected. The rest of the solutions are selected from the feasible set S_f, provided there are at least N_f feasible solutions. If S_f has fewer solutions, all the feasible solutions are selected and the remainder are filled with infeasible solutions from S_inf. The solutions are ranked from 1 to N in the order they are selected. Hence, the infeasible solutions selected first are ranked higher than the feasible solutions selected later.
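The selection step described above (step 11 of Algorithm 1, together with the fallback rules) can be sketched as follows (names are ours; S_f and S_inf are assumed to be pre-ranked, best first):

```python
def select_next_population(S_f, S_inf, N, alpha):
    """Select N solutions: up to N_inf = alpha * N infeasible, rest feasible.

    Both lists are assumed already ranked (best first). If either set is
    short, the shortfall is filled from the other set, as in IDEA.
    """
    N_inf = int(alpha * N)
    chosen_inf = S_inf[:N_inf]
    chosen_f = S_f[:N - len(chosen_inf)]
    if len(chosen_inf) + len(chosen_f) < N:
        # feasible set was short: fill the remainder with infeasible solutions
        chosen_inf = S_inf[:N - len(chosen_f)]
    # infeasible solutions come first, i.e. are ranked above the feasible ones
    return chosen_inf + chosen_f

# N = 10, alpha = 0.2 -> 2 infeasible + 8 feasible when both are available
pop = select_next_population(S_f=list("ABCDEFGHIJ"), S_inf=list("xyz"),
                             N=10, alpha=0.2)
print(pop)  # ['x', 'y', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H']
```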

3 Infeasibility Empowered Memetic Algorithm (IEMA)

The proposed algorithm is constructed using IDEA as the baseline algorithm. For single objective problems, a local search can be a very efficient optimization tool; however, its performance depends largely on the starting solution, rendering it unreliable for global optimization. The proposed algorithm tries to exploit the advantages of both approaches, i.e. (a) intensifying the search near the constraint boundary by preserving marginally infeasible solutions, and (b) the effectiveness of local search in expediting convergence in potentially optimal regions of the search space. As mentioned before, an approach that combines global and local search is termed a memetic algorithm; hence, the proposed algorithm is referred to as the Infeasibility Empowered Memetic Algorithm (IEMA). It is outlined in Algorithm 2.

In IEMA, during each generation, apart from the evolution of the solutions using IDEA, a local search is performed from one solution in the population for a prescribed number of function evaluations (set to 20 × nvar in the presented studies, where nvar is the number of design variables). Sequential Quadratic Programming (SQP) [24] has been used for the local search. The starting solution for the local search is determined from the population in the following way:

1. If the local search in the previous generation was able to improve the best solution, then the new best solution is used as the starting solution for the local search.
2. If the local search was unable to improve the best solution in the previous generation, it is evident that the existing best solution is either not a good starting point for the local search, or close enough to an optimum (local or global) that further improvements are difficult. In such a case, a random solution is selected from among the high ranked infeasible solutions and the feasible solutions in the population, in an attempt to improve the objective value further. The high ranked infeasible solutions are the N_inf = α × N solutions (refer to Algorithm 2).

After performing the local search, the worst solution in the population is replaced by the best solution found by the local search. The ranking of solutions is done in the same way as in IDEA. The injection of good quality solutions found using the local search guides the population towards potentially optimal regions of the search space; the evolved solutions in turn act as good starting solutions for the local search in subsequent generations. In this way, IDEA and the local search work together to identify the optimum solution.
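The per-generation local-search step can be sketched as follows. The paper uses SQP for the local search; to keep this sketch self-contained we substitute a naive random-perturbation hill climber with the same 20 × nvar evaluation budget, and all names (and the hill climber itself) are our stand-ins, not the authors' implementation:

```python
import random

def local_search(f, x0, budget):
    """Stand-in for the SQP step: random-perturbation hill climbing
    limited to a fixed evaluation budget."""
    best_x, best_f = list(x0), f(x0)
    for _ in range(budget - 1):  # x0 itself consumed one evaluation
        cand = [xi + random.gauss(0.0, 0.1) for xi in best_x]
        fc = f(cand)
        if fc < best_f:
            best_x, best_f = cand, fc
    return best_x, best_f

def iema_local_step(f, population, prev_improved, alpha=0.2):
    """One per-generation IEMA local-search step on an objective-sorted
    population (best first): choose a start point, improve it for
    20 * nvar evaluations, replace the worst solution."""
    nvar = len(population[0])
    if prev_improved:
        start = population[0]  # previous local search improved: reuse best
    else:
        # stalled: restart from a random high-ranked solution instead
        n_top = max(1, int(alpha * len(population)))
        start = random.choice(population[:n_top])
    x_best, _ = local_search(f, start, budget=20 * nvar)
    population[-1] = x_best  # the worst solution is replaced
    return population

random.seed(0)
sphere = lambda x: sum(v * v for v in x)  # toy unconstrained objective
pop = sorted(([random.uniform(-1, 1) for _ in range(3)] for _ in range(5)),
             key=sphere)
before = sphere(pop[0])
pop = iema_local_step(sphere, pop, prev_improved=True)
after = min(sphere(x) for x in pop)
print(after <= before)  # prints True: the step can only improve the best value
```

Since the hill climber starts from the current best and only accepts improvements, the best objective value in the population cannot get worse after the step, mirroring the replace-worst policy described above.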


Algorithm 2. Infeasibility Empowered Memetic Algorithm (IEMA)
Require: N {Population Size}
Require: NG > 1 {Number of Generations}
Require: 0 < α < 1 {Proportion of infeasible solutions}
 1: N_inf = α × N
 2: N_f = N − N_inf
 3: pop_1 = Initialize()
 4: Evaluate(pop_1)
 5: for i = 2 to NG do
 6:   child_pop_(i−1) = Evolve(pop_(i−1))
 7:   Evaluate(child_pop_(i−1))
 8:   (S_f, S_inf) = Split(pop_(i−1) + child_pop_(i−1))
 9:   Rank(S_f)
10:   Rank(S_inf)
11:   pop_i = S_inf(1 : N_inf) + S_f(1 : N_f)
12:   x ← Choose starting solution in pop_i
13:   x_best ← LocalSearch(x) {x_best is the best solution found by local search from x}
14:   Replace worst solution in pop_i with x_best
15:   Rank(pop_i) {Rank the solutions in pop_i again}
16: end for

4 Numerical Experiments

The performance of the proposed IEMA is reported on four benchmark engineering design problems, viz. Belleville spring design [29], welded beam design [6], car side impact [28] and bulk carrier design [23] (the single objective formulation studied in [30]). These problems have been used by various researchers to test the performance of constraint handling techniques. The results of IEMA are compared with those obtained from two other algorithms:

1. The Non-dominated Sorting Genetic Algorithm (NSGA-II) [8], currently one of the most widely used evolutionary algorithms for optimization.
2. The Infeasibility Driven Evolutionary Algorithm (IDEA) [30,27], the precursor to IEMA.

The aim of comparing these three algorithms is to highlight the benefit obtained over NSGA-II by incorporating the preservation of good infeasible solutions (IDEA), and then by the further incorporation of local search (IEMA). To this end, the percentage improvements in objective values of these two algorithms over NSGA-II are reported in Table 2. In addition, some of the best results reported earlier in the literature for these problems are included for further comparison. A very limited number of function evaluations (only 1000) has been allowed in the studies presented here, in line with the paradigm that function evaluations can often be very expensive for engineering design problems.

4.1 Experimental Setup

The crossover and mutation parameters are kept the same for all three algorithms (NSGA-II, IDEA and IEMA), and are listed in Table 1. Thirty independent runs were performed on each problem with each algorithm.


Table 1. Parameters used for the experiments

Parameter                      Value
Population Size                40
Maximum function evaluations   1000
Crossover Probability          0.9
Crossover index                10
Mutation Probability           0.1
Mutation index                 20
Infeasibility Ratio (α)        0.2

4.2 Results

The results of the three algorithms are summarized in Table 2. IEMA is able to achieve better objective values than both IDEA and NSGA-II on all problems. The results of IDEA are better than those of NSGA-II on all problems except bulk carrier design, for which they are marginally worse (by less than 1%). The percentage improvement of IEMA and IDEA over NSGA-II varies across problems, but an improvement of as much as 20.43% in the best result was obtained using IEMA (for the Belleville spring design). Furthermore, the improvements in the median values indicate that IEMA achieves good objective values very consistently. Again, for the Belleville spring design, a 47.25% improvement in the median value was seen using IEMA compared to NSGA-II. The improvements on the other problems are smaller in magnitude, but still significant and consistent.

Another impressive feature of IEMA's performance on the studied problems is its ability to obtain good objective values in far fewer evaluations than those reported earlier in the literature. The summary in Table 2 also lists the function evaluations used in some of the previous studies, in addition to the best values reported. Except for the recent studies by Isaacs [14], which also use 1000 evaluations, the number of function evaluations used in most other studies is much higher than used here. Even so, the objective values reported here are better than (or very close to) the best reported previously.¹ Also worth mentioning is that the best results reported for the Belleville spring and welded beam designs in [14] use surrogate assisted algorithms, whereas superior results have been obtained in the presented studies without the use of surrogates. This also highlights a further scope of improvement over the current studies, i.e. the inclusion of surrogate assisted techniques in IEMA. The best design vectors found using IEMA are listed in Table 3.

Although the results obtained using the proposed IEMA are very promising, it is not without limitations. The most prominent limitation of IEMA (at least in the current implementation) is its inability to handle discrete variables during the local search. Therefore, experiments have been reported only on problems with continuous variables. This could, however, be resolved with more specialized operators. Secondly, the performance is also likely to deteriorate if the number of variables is very high, because the calculation of gradients itself then becomes computationally expensive.

¹ Note that slight variations in the results might also arise from the different precision of the variables or machines used in previously reported experiments.


Table 2. Results for engineering design problems. The numbers in brackets indicate the percent improvement in the objective values compared to those obtained using NSGA-II. (Note: For Belleville spring design, the thickness of the spring has been considered as a discrete variable in [7], but as a continuous variable in the others, including the presented studies.)

Problem                   Statistic      IEMA                IDEA                NSGA-II
Belleville spring [29]    Best           1.97967 (20.43%)    2.20176 (11.50%)    2.48789
                          Median         1.97967 (47.25%)    3.38646 (9.76%)     3.75291
                          Worst          6.32532             7.2774              6.5233
                          std.           0.788324            1.28952             0.978053
                          Feasible runs  30                  21                  21
Welded beam [6]           Best           2.38096 (5.48%)     2.45567 (2.52%)     2.51916
                          Median         2.38096 (33.55%)    2.81411 (21.46%)    3.58301
                          Worst          4.69066             4.45493             5.11578
                          std.           0.560984            0.545042            0.71464
                          Feasible runs  30                  30                  30
Car side impact [28]      Best           23.5857 (1.26%)     23.6988 (0.79%)     23.8872
                          Median         23.5857 (2.90%)     24.0132 (1.14%)     24.2895
                          Worst          23.5857             25.2929             26.679
                          std.           5.25507e-08         0.368223            0.671836
                          Feasible runs  30                  30                  30
Bulk carrier design [23]  Best           8.60617 (3.47%)     8.93236 (-0.18%)    8.91589
                          Median         8.72483 (9.43%)     9.70404 (-0.73%)    9.63375
                          Worst          13.1018             11.791              11.827
                          std.           1.56552             0.848578            0.889552
                          Feasible runs  24                  30                  30

Other best reported (reference) (evals):
Belleville spring: 2.121964 (Coello [2]) (24K); 2.29 (Isaacs [14]) (1K); 2.16256 (Deb, Goyal [7]) (10K); 1.978715 (Siddall [29]) (infeas.)
Welded beam: 2.3854347 (Ray, Liew [25]) (33K); 2.44 (Isaacs [14]) (1K); 2.38119 (Deb [5]) (40K); 2.38119 (Deb [4]) (320K)
Car side impact: 23.585651 (Saxena, Deb [28]); 23.59 (Gu et al. [10])
Bulk carrier design: 8.6083 (Singh et al. [30]) (25K)

Table 3. The best design vectors found using the proposed IEMA

Problem              x                                                                          f
Belleville spring    (12.01, 10.0305, 0.204143, 0.2)                                            1.97967
Welded beam          (0.244369, 6.21752, 8.29147, 0.244369)                                     2.38096
Car side impact      (0.5, 1.22573, 0.5, 1.20711, 0.875, 0.884189, 0.4, 0.345, 0.192, 0, 0)     23.5857
Bulk carrier design  (280.908, 18.4985, 25.4265, 0.75, 46.8181, 14)                             8.60617

5 Summary and Future Work

In this paper, an Infeasibility Empowered Memetic Algorithm (IEMA) has been presented. IEMA combines the advantages of IDEA, which focuses the search near the constraint boundaries, with local search, an efficient tool for solving single objective continuous problems. In the proposed algorithm, ranking is done as in IDEA, and in each generation the solutions are enhanced by performing a local search from a good quality solution in the population. The performance of IEMA has been studied on a set of constrained engineering design problems with a low number of function evaluations. The proposed IEMA offers significant improvements over NSGA-II in objective values for the problems studied, and its results also compare favorably with others reported in the literature. Further improvements to IEMA, incorporating surrogate assisted techniques and enhancements for handling discrete variables, are currently underway.


Acknowledgment

The presented work was supported by grants from DSARC, UNSW@ADFA, Australia. The authors would also like to thank Dr. Amitay Isaacs for his support in the implementation of the presented algorithm.

References

1. Coello Coello, C.A.: Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Computer Methods in Applied Mechanics and Engineering 191(11-12), 1245–1287 (2002)
2. Coello Coello, C.A.: Treating constraints as objectives for single-objective evolutionary optimization. Engineering Optimization 32(3), 275–308 (2000)
3. Davis, L. (ed.): Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York (1991)
4. Deb, K.: Optimal design of a welded beam via genetic algorithms. AIAA Journal 29(8), 2013–2015 (1991)
5. Deb, K.: An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering 186, 311–338 (2000)
6. Deb, K.: Multi-Objective Optimization using Evolutionary Algorithms. John Wiley and Sons Pvt. Ltd., Chichester (2001)
7. Deb, K., Goyal, M.: A combined genetic adaptive search (GeneAS) for engineering design. Computer Science and Informatics 26, 30–45 (1996)
8. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6, 182–197 (2002)
9. Deb, K.: An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering 186(2), 311–338 (2000)
10. Gu, L., Yang, R., Tho, C., Makowski, M., Faruque, O., Li, Y.: Optimisation and robustness for crashworthiness of side impact. International Journal of Vehicle Design 26(4), 348–360 (2001)
11. Hadj-Alouane, A.B., Bean, J.C.: A genetic algorithm for the multiple-choice integer program. Operations Research 45, 92–101 (1997)
12. Hinterding, R., Michalewicz, Z.: Your brains and my beauty: parent matching for constrained optimisation. In: Proceedings of the 1998 IEEE Conference on Evolutionary Computation, pp. 810–815 (May 1998)
13. Hoffmeister, F., Sprave, J.: Problem-independent handling of constraints by use of metric penalty functions. In: Fogel, L.J., Angeline, P.J., Bäck, T. (eds.) Proceedings of the Fifth Annual Conference on Evolutionary Programming (EP 1996), pp. 289–294. The MIT Press, San Diego (February 1996)
14. Isaacs, A.: Development of optimization methods to solve computationally expensive problems. Ph.D. thesis, University of New South Wales, Australian Defence Force Academy (UNSW@ADFA), Canberra, Australia (2009)
15. Joines, J., Houck, C.: On the use of non-stationary penalty functions to solve nonlinear constrained optimization problems with GAs. In: Fogel, D. (ed.) Proceedings of the First IEEE Conference on Evolutionary Computation, Orlando, Florida, pp. 579–584 (1994)
16. Kuri-Morales, A., Quezada, C.V.: A universal eclectic genetic algorithm for constrained optimization. In: Proceedings of the 6th European Congress on Intelligent Techniques & Soft Computing, EUFIT 1998, Verlag Mainz, Aachen, Germany, pp. 518–522 (September 1998)


17. Mezura-Montes, E., Coello Coello, C.: A simple multimembered evolution strategy to solve constrained optimization problems. IEEE Transactions on Evolutionary Computation 9(1), 1–17 (2005)
18. Michalewicz, Z.: Genetic algorithms, numerical optimization, and constraints. In: Eshelman, L.J. (ed.) Proceedings of the Sixth International Conference on Genetic Algorithms (ICGA 1995), pp. 151–158. University of Pittsburgh, Morgan Kaufmann Publishers, San Mateo, California (July 1995)
19. Michalewicz, Z.: A survey of constraint handling techniques in evolutionary computation methods. In: McDonnell, J.R., Reynolds, R.G., Fogel, D.B. (eds.) Proceedings of the 4th Annual Conference on Evolutionary Programming, pp. 135–155. The MIT Press, Cambridge (1995)
20. Michalewicz, Z.: Genetic Algorithms + Data Structures = Evolution Programs. Springer, Heidelberg (1996)
21. Moscato, P.: On evolution, search, optimization, genetic algorithms and martial arts: towards memetic algorithms. Tech. Rep. C3P report 826, Caltech Concurrent Computation Program, Caltech, California, USA (1989)
22. Ong, Y.S., Lim, M., Chen, X.: Memetic computation: past, present & future [research frontier]. IEEE Computational Intelligence Magazine 5(2), 24–31 (2010)
23. Parsons, M., Scott, R.: Formulation of multicriterion design optimization problems for solution with scalar numerical optimization methods. Journal of Ship Research 48(1), 61–76 (2004)
24. Powell, M.: A fast algorithm for nonlinearly constrained optimization calculations. In: Watson, G. (ed.) Numerical Analysis, pp. 144–157. Springer, Heidelberg (1978)
25. Ray, T., Liew, K.: Society and civilization: an optimization algorithm based on the simulation of social behavior. IEEE Transactions on Evolutionary Computation 7(4), 386–396 (2003)
26. Ray, T., Tai, K., Seow, K.: Multiobjective design optimization by an evolutionary algorithm. Engineering Optimization 33(4), 399–424 (2001)
27. Ray, T., Singh, H.K., Isaacs, A., Smith, W.: Infeasibility driven evolutionary algorithm for constrained optimization. In: Mezura-Montes, E. (ed.) Constraint Handling in Evolutionary Optimization. Studies in Computational Intelligence, pp. 145–165. Springer, Heidelberg (2009)
28. Saxena, D.K., Deb, K.: Trading on infeasibility by exploiting constraint's criticality through multi-objectivization: a system design perspective. In: Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2007), September 25-28, pp. 919–926 (2007)
29. Siddall, J.N.: Optimal Engineering Design: Principles and Applications. Marcel Dekker, Inc., New York (1982)
30. Singh, H.K., Isaacs, A., Ray, T., Smith, W.: Infeasibility Driven Evolutionary Algorithm (IDEA) for engineering design optimization. In: Wobcke, W., Zhang, M. (eds.) AI 2008. LNCS (LNAI), vol. 5360, pp. 104–115. Springer, Heidelberg (2008)
31. Vieira, D.A.G., Adriano, R.L.S., Vasconcelos, J.A., Krahenbuhl, L.: Treating constraints as objectives in multiobjective optimization problems using niched Pareto genetic algorithm. IEEE Transactions on Magnetics 40(2) (March 2004)
32. Xiao, J., Michalewicz, Z., Trojanowski, K.: Adaptive evolutionary planner/navigator for mobile robots. IEEE Transactions on Evolutionary Computation 1(1), 18–28 (1997)